Commit 295de4e

committed
fix image directives
1 parent b995069 commit 295de4e

15 files changed, +42 −42 lines
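The whole commit is one mechanical fix to the AsciiDoc image macros: with a single colon, `image:` is the *inline* macro (the image flows inside a sentence), while with a double colon, `image::` is the *block* macro (the image stands alone as a figure, and a preceding `.Title` line becomes its caption). Since all of these images are standalone captioned figures, the block form is the correct one. A minimal sketch of the two forms (the `play.png` filename is a made-up placeholder; `image4.png` and its caption come from the diff itself):

```asciidoc
// Inline macro (single colon): the image is embedded in the flow of a sentence.
Press the image:play.png[Play icon] button to start.

// Block macro (double colon): the image is a standalone figure;
// the dot-prefixed line right above it becomes the figure caption.
.Translating lines of code to an approximate number of operations
image::image4.png[Operations per line]
```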

book/content/part01/algorithms-analysis.asc (2 additions & 2 deletions)

@@ -10,7 +10,7 @@ How can you do that? Can you time how long it takes to run a program? Of course,
 [big]#⏱#
 However, if you run the same program on a smartwatch, cellphone or desktop computer, it will take different times.

-image:image3.png[image,width=528,height=137]
+image::image3.png[image,width=528,height=137]

 Wouldn't it be great if we can compare algorithms regardless of the hardware where we run them?
 That's what *time complexity* is for!

@@ -93,7 +93,7 @@ Time complexity, in computer science, is a function that describes the number of
 How do you get a function that gives you the number of operations that will be executed? Well, we count line by line and mind code inside loops. Let's do an example to explain this point. For instance, we have a function to find the minimum value on an array called `getMin`.

 .Translating lines of code to an approximate number of operations
-image:image4.png[Operations per line]
+image::image4.png[Operations per line]

 Assuming that each line of code is an operation, we get the following:

book/content/part01/big-o-examples.asc (2 additions & 2 deletions)

@@ -22,7 +22,7 @@ We a going to provide examples for each one of them.
 Before we dive in, here’s a plot with all of them.

 .CPU operations vs. Algorithm runtime as the input size grows
-image:image5.png[CPU time needed vs. Algorithm runtime as the input size increases]
+image::image5.png[CPU time needed vs. Algorithm runtime as the input size increases]

 The above chart shows how the running time of an algorithm is related to the amount of work the CPU has to perform. As you can see, O(1) and O(log n) are very scalable. However, O(n^2^) and worse can make your computer run for years [big]#😵# on large datasets. We are going to give some examples so you can identify each one.

@@ -132,7 +132,7 @@ include::{codedir}/algorithms/sorting/merge-sort.js[tag=merge]
 The merge function combines two sorted arrays in ascending order. Let’s say that we want to sort the array `[9, 2, 5, 1, 7, 6]`. In the following illustration, you can see what each function does.

 .Mergesort visualization. Shows the split, sort and merge steps
-image:image11.png[Mergesort visualization,width=500,height=600]
+image::image11.png[Mergesort visualization,width=500,height=600]

 How do we obtain the running time of the merge sort algorithm? Mergesort divides the array in half each time in the split phase, _log n_, and the merge function joins each split, _n_. The total work is *O(n log n)*. There are more formal ways to reach this runtime, like using the https://adrianmejia.com/blog/2018/04/24/analysis-of-recursive-algorithms/[Master Method] and https://www.cs.cornell.edu/courses/cs3110/2012sp/lectures/lec20-master/lec20.html[recursion trees].

book/content/part02/array.asc (1 addition & 1 deletion)

@@ -23,7 +23,7 @@ Some programming languages have fixed size arrays like Java and C++. Fixed size
 Arrays look like this:

 .Array representation: each value is accessed through an index.
-image:image16.png[image,width=388,height=110]
+image::image16.png[image,width=388,height=110]

 Arrays are a sequential collection of elements that can be accessed randomly using an index. Let’s take a look into the different operations that we can do with arrays.

book/content/part02/linked-list.asc (8 additions & 8 deletions)

@@ -21,7 +21,7 @@ A list (or Linked List) is a linear data structure where each node is "linked" t
 Each element or node is *connected* to the next one by a reference. When a node only has one connection it's called *singly linked list*:

 .Singly Linked List Representation: each node has a reference (blue arrow) to the next one.
-image:image19.png[image,width=498,height=97]
+image::image19.png[image,width=498,height=97]

 Usually, a Linked List is referenced by its first element, called *head* (or *root* node). For instance, if you want to get the `cat` element from the example above, then the only way to get there is using the `next` field on the head node. You would get `art` first, then use the next field recursively until you eventually get the `cat` element.

@@ -31,7 +31,7 @@ Usually, a Linked List is referenced by the first element in called *head* (or *
 When each node has a connection to the `next` item and also the `previous` one, then we have a *doubly linked list*.

 .Doubly Linked List: each node has a reference to the next and previous element.
-image:image20.png[image,width=528,height=74]
+image::image20.png[image,width=528,height=74]

 With a doubly linked list, you can move not only forward but also backward. If you keep a reference to the last element (`cat`), you can step back and reach the middle part.

@@ -121,7 +121,7 @@ Similar to the array, with a linked list you can add elements at the beginning,
 We are going to use the `Node` class to create a new element and stick it at the beginning of the list as shown below.

 .Insert at the beginning by linking the new node with the current first node.
-image:image23.png[image,width=498,height=217]
+image::image23.png[image,width=498,height=217]


 To insert at the beginning, we create a new node whose next reference points to the current first node. Then we make the new node the first one. In code, it would look something like this:

@@ -140,7 +140,7 @@ As you can see, we create a new node and make it the first one.
 Appending an element at the end of the list can be done very effectively if we have a pointer to the `last` item in the list. Otherwise, you would have to iterate through the whole list.

 .Add element to the end of the linked list
-image:image24.png[image,width=498,height=208]
+image::image24.png[image,width=498,height=208]

 .Linked List's add to the end of the list implementation
 [source, javascript]

@@ -171,7 +171,7 @@ art <-> dog <-> cat
 We want to insert the `new` node in the 2^nd^ position. For that, we first create the "new" node and update the references around it.

 .Inserting node in the middle of a doubly linked list.
-image:image25.png[image,width=528,height=358]
+image::image25.png[image,width=528,height=358]

 Take a look into the implementation of https://github.com/amejiarosario/dsa.js/blob/master/src/data-structures/linked-lists/linked-list.js#L83[LinkedList.add]:

@@ -199,7 +199,7 @@ Deleting is an interesting one. We don’t delete an element; we remove all refe
 Deleting the first element (or head) is a matter of removing all references to it.

 .Deleting an element from the head of the list
-image:image26.png[image,width=528,height=74]
+image::image26.png[image,width=528,height=74]

 For instance, to remove the head (“art”) node, we change the variable `first` to point to the second node, “dog”. We also remove the variable `previous` from the "dog" node, so it doesn't point to the “art” node. The garbage collector will get rid of the “art” node when it sees that nothing is using it anymore.

@@ -216,7 +216,7 @@ As you can see, when we want to remove the first node we make the 2nd element th
 Removing the last element from the list would require iterating from the head until we find the last one; that’s O(n). But if we have a reference to the last element, which we do, we can do it in _O(1)_ instead!

 .Removing last element from the list using the last reference.
-image:image27.png[image,width=528,height=221]
+image::image27.png[image,width=528,height=221]


 For instance, to remove the last node, “cat”, we use the last pointer to avoid iterating through the whole list. We check `last.previous` to get the “dog” node, make it the new `last`, and remove its next reference to “cat”. Since nothing points to “cat”, it is out of the list and eventually gets deleted from memory by the garbage collector.

@@ -235,7 +235,7 @@ The code is very similar to `removeFirst`, but instead of first we update `last`
 To remove a node from the middle, we make the surrounding nodes bypass the one we want to delete.

 .Remove the middle node
-image:image28.png[image,width=528,height=259]
+image::image28.png[image,width=528,height=259]


 In the illustration, we remove the middle node, “dog”, by making art’s `next` point to “cat” and cat’s `previous` point to “art”, totally bypassing “dog”.

book/content/part02/queue.asc (1 addition & 1 deletion)

@@ -12,7 +12,7 @@ endif::[]
 A queue is a linear data structure where the data flows in a *First-In-First-Out* (FIFO) manner.

 .Queue data structure is like a line of people: the first in is the first out
-image:image30.png[image,width=528,height=171]
+image::image30.png[image,width=528,height=171]

 A queue is like a line of people at the bank; the person that arrived first is the first to go out as well.

book/content/part02/stack.asc (1 addition & 1 deletion)

@@ -14,7 +14,7 @@ The stack is a data structure that restricts the way you add and remove data. It
 An analogy is to think of the stack as a rod and the data as discs. You can only take out the last one you put in.

 .Stack data structure is like a stack of disks: the last element in is the first element out
-image:image29.png[image,width=240,height=238]
+image::image29.png[image,width=240,height=238]

 // #Change image from https://www.khanacademy.org/computing/computer-science/algorithms/towers-of-hanoi/a/towers-of-hanoi[Khan Academy]#

book/content/part03/binary-search-tree.asc (5 additions & 5 deletions)

@@ -57,7 +57,7 @@ With the methods `add` and `remove` we have to guarantee that our tree always ha
 For instance, let’s say that we want to insert the values 19, 21, 10, 2, 8 in a BST:

 .Inserting values on a BST.
-image:image36.png[image,width=528,height=329]
+image::image36.png[image,width=528,height=329]

 In the last box of the image above, when we are inserting node 18, we start at the root (19). Since 18 is less than 19, we move left. Node 18 is greater than 10, so we move right. There’s an empty spot, and we place it there. Let’s code it up:

@@ -95,7 +95,7 @@ Deleting a node from a BST have three cases.
 Deleting a leaf is the easiest; we look for its parent and set the child to null.

 .Removing node without children from a BST.
-image:image37.png[image,width=528,height=200]
+image::image37.png[image,width=528,height=200]


 Node 18 will be hanging around until the garbage collector runs. However, no node references it, so it won’t be reachable from the tree anymore.

@@ -105,7 +105,7 @@ Node 18, will be hanging around until the garbage collector is run. However, the
 Removing a parent is not as easy since you need to find new parents for its children.

 .Removing node with one child from a BST.
-image:image38.png[image,width=528,height=192]
+image::image38.png[image,width=528,height=192]


 In the example, we removed node `10` from the tree, so its child (node 2) needs a new parent. We made node 19 the new parent for node 2.

@@ -115,7 +115,7 @@ In the example, we removed node `10` from the tree, so its child (node 2) needs
 Removing a parent of two children is the trickiest of all cases because we need to find new parents for two children. (This sentence sounds tragic out of context 😂)

 .Removing node with two children from a BST.
-image:image39.png[image,width=528,height=404]
+image::image39.png[image,width=528,height=404]


 In the example, we delete the root node 19. This deletion leaves two orphans (node 10 and node 21). There are no more parents because node 19 was the *root* element. One way to solve this problem is to *combine* the left subtree (node 10 and descendants) into the right subtree (node 21). The final result is that node 21 is the new root.

@@ -163,7 +163,7 @@ That’s all we need to remove elements from a BST. Check out the complete BST i
 As we insert and remove nodes from a BST, we could end up like the tree on the left:

 .Balanced vs. Unbalanced Tree.
-image:image40.png[image,width=454,height=201]
+image::image40.png[image,width=454,height=201]

 The tree on the left is unbalanced. It looks like a Linked List and has the same runtime! Searching for an element would be *O(n)*, yikes! However, on a balanced tree, the search time is *O(log n)*, which is pretty good! That’s why we always want to keep the tree balanced. In further chapters, we are going to explore how to keep a tree balanced after each insert/delete.

book/content/part03/graph.asc (7 additions & 7 deletions)

@@ -33,14 +33,14 @@ The connection between two nodes is called *edge*.
 Also, a node might be called a *vertex*.

 .Graph is composed of vertices/nodes and edges
-image:image42.png[image,width=305,height=233]
+image::image42.png[image,width=305,height=233]

 ===== Directed Graph vs Undirected

 A graph can be either *directed* or *undirected*.

 .Graph: directed vs undirected
-image:image43.jpg[image,width=469,height=192]
+image::image43.jpg[image,width=469,height=192]


 An *undirected graph* has edges that are a *two-way street*. E.g., in the undirected example, you can traverse from the green node to the orange one and vice versa.

@@ -52,7 +52,7 @@ A *directed graph (digraph)* has edges that are *one-way street*. E.g., On the d
 A graph can have *cycles* or not.

 .Cyclic vs Acyclic Graphs.
-image:image44.jpg[image,width=444,height=194]
+image::image44.jpg[image,width=444,height=194]

 (((Cyclic Graph)))
 A *cyclic graph* is one where you can pass through a node more than once.

@@ -68,7 +68,7 @@ The *Directed Acyclic Graph (DAG)* is unique. It has many applications like sche
 ===== Connected vs Disconnected vs Complete Graphs

 .Different kinds of graphs: disconnected, connected, and complete.
-image:image45.png[image,width=1528,height=300]
+image::image45.png[image,width=1528,height=300]

 A *disconnected graph* is one that has one or more subgraphs. In other words, a graph is *disconnected* if two nodes don’t have a path between them.

@@ -81,7 +81,7 @@ A *complete graph* is where every node is adjacent to all the other nodes in the
 Weighted graphs have labels in the edges (a.k.a. *weight* or *cost*). The link weight can represent many things like distance, travel time, or anything else.

 .Weighted Graph representing USA airports distance in miles.
-image:image46.png[image,width=528,height=337]
+image::image46.png[image,width=528,height=337]

 For instance, a weighted graph can have a distance between nodes. So, algorithms can use the weight and optimize the path between them.

@@ -120,7 +120,7 @@ There are two main ways to graphs one is:
 Representing graphs as an adjacency matrix is done using a two-dimensional array. For instance, let’s say we have the following graph:

 .Graph and its adjacency matrix.
-image:image47.png[image,width=438,height=253]
+image::image47.png[image,width=438,height=253]

 The number of vertices |V| defines the size of the matrix. In the example, we have five vertices, so we have a 5x5 matrix.

@@ -167,7 +167,7 @@ The space complexity of the adjacency matrix is *O(|V|^2^)*, where |V| is the nu
 Another way to represent a graph is by using an adjacency list. This time, instead of using an array (matrix), we use a list.

 .Graph represented as an Adjacency List.
-image:image48.png[image,width=528,height=237]
+image::image48.png[image,width=528,height=237]

 If we want to add a new node to the list, we can do it by adding one element to the end of the array of nodes *O(1)*. In the next section, we are going to explore the running times of all operations in an adjacency list.

book/content/part03/hashmap.asc (1 addition & 1 deletion)

@@ -24,7 +24,7 @@ How are the keys mapped to their values?
 Using a hash function. Here’s an illustration:

 .Internal HashMap representation
-image:image41.png[image,width=528,height=299]
+image::image41.png[image,width=528,height=299]


 .This is the main idea:

book/content/part03/tree-intro.asc (6 additions & 6 deletions)

@@ -10,7 +10,7 @@ endif::[]
 A tree is a non-linear data structure where a node can have zero or more connections. The topmost node in a tree is called the *root*. The nodes linked to the root are called *children* or *descendants*.

 .Tree Data Structure: root node and descendants.
-image:image31.jpg[image,width=404,height=240]
+image::image31.jpg[image,width=404,height=240]

 As you can see in the picture above, this data structure resembles an inverted tree, hence the name. It starts with a *root* node, *branches* off with its descendants, and finally ends in *leaves*.

@@ -50,7 +50,7 @@ Simple! Right? But there are some constraints that you have to keep at all times
 * The *depth of a tree* is the distance (edge count) from the root to the farthest leaf.

 .Tree anatomy
-image:image31.jpg[image]
+image::image31.jpg[image]

 ==== Types of Binary Trees

@@ -62,7 +62,7 @@ There are different kinds of trees depending on the restrictions. E.g. The trees
 The binary tree restricts its nodes to at most two children. Trees, in general, can have 3, 4, 23 or more children, but not binary trees.

 .Binary tree has at most 2 children while non-binary trees can have more.
-image:image32.png[image,width=321,height=193]
+image::image32.png[image,width=321,height=193]

 Binary trees are one of the most used kinds of tree, and they are used to build other data structures.

@@ -81,7 +81,7 @@ The Binary Search Tree (BST) is a specialization of the binary tree. BST has the
 > BST: left ≤ parent < right

 .BST or ordered binary tree vs. non-BST.
-image:image33.png[image,width=348,height=189]
+image::image33.png[image,width=348,height=189]


 ===== Binary Heap

@@ -93,15 +93,15 @@ image:image33.png[image,width=348,height=189]
 The heap (max-heap) is a type of binary tree where the parent's value is higher than the children's. As opposed to the BST, the left child doesn’t have to be smaller than the right child.

 .Heap vs BST
-image:image34.png[image,width=325,height=176]
+image::image34.png[image,width=325,height=176]

 The (max) heap has the maximum value in the root, while the BST doesn’t.

 There are two kinds of heaps: min-heap and max-heap.
 For a *max-heap*, the root has the highest value. The heap guarantees that as you move away from the root, the values get smaller. The opposite is true for a *min-heap*. In a min-heap, the lowest value is at the root, and as you go down to the descendants, the values keep increasing.

 .Max-heap keeps the highest value at the top while min-heap keeps the lowest at the root.
-image:image35.png[image,width=258,height=169]
+image::image35.png[image,width=258,height=169]


 .Heap vs. Binary Search Tree

book/content/part03/tree-search-traversal.asc (1 addition & 1 deletion)

@@ -110,7 +110,7 @@ We can also implement it as recursive functions are we are going to see in the <
 We can visually see the difference between how DFS and BFS search for nodes:

 .Depth-First Search vs. Breadth-First Search
-image:depth-first-search-dfs-breadth-first-search-bfs.jpg[]
+image::depth-first-search-dfs-breadth-first-search-bfs.jpg[]

 As you can see, DFS in two iterations is already at one of the farthest nodes from the root, while BFS searches nearby nodes first.

book/content/part04/backtracking.asc (1 addition & 1 deletion)

@@ -15,7 +15,7 @@ it stops and steps back (backtracks) to try another alternative.
 Some examples that use backtracking are solving Sudoku/crossword puzzles and graph operations.

 ifndef::backend-pdf[]
-image:Sudoku_solved_by_bactracking.gif[]
+image::Sudoku_solved_by_bactracking.gif[]
 endif::backend-pdf[]

 Listing all possible solutions might sound like brute force.

book/content/part04/merge-sort.asc (1 addition & 1 deletion)

@@ -14,7 +14,7 @@ Merge Sort is an efficient sorting algorithm that uses <
 indexterm:[Divide and Conquer]
 The merge sort algorithm splits the array into halves until 2 or fewer elements are left. It sorts these two elements and then merges all the halves back until the whole collection is sorted.

-image:image11.png[Mergesort visualization,width=500,height=600]
+image::image11.png[Mergesort visualization,width=500,height=600]

 ===== Merge Sort Implementation

book/content/part04/selection-sort.asc (1 addition & 1 deletion)

@@ -15,7 +15,7 @@ The selection sort is a simple sorting algorithm. As its name indicates, it _sel
 . Find the minimum item in the rest of the array. If a new minimum is found, swap them.
 . Repeat steps #1 and #2 with the next element until the last one.

-image:selection-sort.gif[]
+image::selection-sort.gif[]

 ===== Selection sort implementation
 For implementing the selection sort, we need two indexes.
