
Commit fee24f8

before refactoring files
1 parent f850428 commit fee24f8

15 files changed: +78 -21 lines changed

book/book.adoc

Lines changed: 0 additions & 1 deletion
@@ -185,7 +185,6 @@ include::chapters/greedy-algorithms--knapsack-problem.adoc[]
 
 include::chapters/divide-and-conquer--intro.adoc[]
 
-
 :leveloffset: +1
 
 

book/chapters/algorithms-intro.adoc

Lines changed: 3 additions & 3 deletions
@@ -4,7 +4,7 @@ We are going to start with sorting and searching algorithms, then you are going
 IMPORTANT: There's not a single approach to solve all problems but knowing well-known techniques can help you build your own faster.
 
 .We are going to discuss the following approaches for solving algorithms problems:
-- <<Greedy Algorithms>>: makes choices at each step trying to find the optimal solution.
-- <<Dynamic Programming>>: use memoization for repated subproblems.
+- <<Greedy Algorithms>>: makes greedy choices using heuristics to find the best solution without looking back.
+- <<Dynamic Programming>>: technique for solving problems with *overlapping subproblems*. It uses *memoization* to avoid duplicated work.
+- <<Divide and Conquer>>: *divide* problems into smaller pieces, *conquer* each subproblem, and then *join* the results.
 - <<Backtracking>>: search all possible combinations until it finds the solution.
-- <<Divide and Conquer>>: break problems into smaller pieces and then join the results.
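To make the greedy idea in the list above concrete, here is a minimal coin-change sketch. It is a hypothetical illustration, not code from the book's chapters, and the name greedyCoinChange is made up: the function always grabs the largest coin that still fits and never reconsiders earlier choices.

[source, typescript]
----
// Greedy coin change: always take the largest coin that still fits.
// This "best choice right now, never look back" strategy works for
// canonical denominations like [25, 10, 5, 1] but can fail for others.
function greedyCoinChange(amount: number, coins: number[]): number[] {
  const sorted = [...coins].sort((a, b) => b - a); // try biggest coins first
  const picked: number[] = [];
  let remaining = amount;
  for (const coin of sorted) {
    while (remaining >= coin) {
      picked.push(coin);   // greedy choice at this step
      remaining -= coin;
    }
  }
  return picked;
}

greedyCoinChange(41, [25, 10, 5, 1]); // => [25, 10, 5, 1]
----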

book/chapters/backtracking.adoc

Whitespace-only changes.

book/chapters/cheatsheet.adoc

Lines changed: 1 addition & 1 deletion
@@ -106,7 +106,7 @@ This section covers Binary Search Tree (BST) time complexity (Big O).
 | Selection sort | O(n^2^) | O(1) | Yes | Yes | Yes | Yes |
 | Bubble sort | O(n^2^) | O(1) | Yes | Yes | Yes | Yes |
 | Merge sort | O(n log n) | O(n) | Yes | No | No | No |
-| Quick sort | O(n log n) | O(log n) | Yes | No | No | No |
+| Quicksort | O(n log n) | O(log n) | Yes | No | No | No |
 // | Tim sort | O(n log n) | O(log n) | Yes | No | No | Yes | Hybrid of merge and insertion sort
 |===
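The Quicksort row above lists O(log n) extra space; a rough in-place sketch (an illustration only, not the book's implementation, and the function name quicksort is assumed) shows why: the only extra memory is the recursion stack, and the long-range swaps around the pivot are also what make it unstable.

[source, typescript]
----
// Quicksort sketch: partition in place around a pivot, then recurse on each side.
function quicksort(arr: number[], lo = 0, hi = arr.length - 1): number[] {
  if (lo >= hi) return arr;
  const pivot = arr[hi];                       // simple pivot choice: last element
  let i = lo;                                  // next slot for a value <= pivot
  for (let j = lo; j < hi; j++) {
    if (arr[j] <= pivot) {
      [arr[i], arr[j]] = [arr[j], arr[i]];     // long-range swap (breaks stability)
      i++;
    }
  }
  [arr[i], arr[hi]] = [arr[hi], arr[i]];       // place the pivot in its final spot
  quicksort(arr, lo, i - 1);                   // sort the smaller-than-pivot side
  quicksort(arr, i + 1, hi);                   // sort the bigger-than-pivot side
  return arr;
}

quicksort([3, 7, 1, 9, 2]); // => [1, 2, 3, 7, 9]
----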

Lines changed: 15 additions & 0 deletions
@@ -0,0 +1,15 @@
+Divide and conquer is another strategy for solving algorithmic problems.
+It splits the input into manageable parts recursively and finally joins the solved pieces to form the end result.
+
+.Examples of divide and conquer algorithms:
+- <<Merge Sort>>: *divides* the input into pairs, sorts them, and then *joins* all the pieces in ascending order.
+- <<Quicksort>>: *splits* the data by a random number called the "pivot", then moves everything smaller than the pivot to the left and anything bigger to the right. Repeat the process on the left and right sides. Note: since this works in place, it doesn't need a join step.
+- <<Binary Search>>: finds a value in a sorted collection by *splitting* the data in half until it finds the value.
+- <<Permutations>>: *takes out* the first element from the input and solves permutations for the remainder of the data recursively, then *joins* the results and appends the elements that were taken out.
+
+We can solve problems using divide and conquer with the following steps.
+
+.Divide and conquer algorithms steps:
+1. *Divide* data into subproblems.
+2. *Conquer* each subproblem.
+3. *Combine* solutions.
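As a rough sketch of those three steps (not taken from the book's code; mergeSort and merge are illustrative names), merge sort divides the array, conquers each half recursively, and combines the sorted halves:

[source, typescript]
----
// Merge sort: divide the array in half, conquer each half recursively,
// then combine the two sorted halves by merging them in ascending order.
function mergeSort(arr: number[]): number[] {
  if (arr.length <= 1) return arr;             // base case: already sorted
  const mid = Math.floor(arr.length / 2);
  const left = mergeSort(arr.slice(0, mid));   // divide + conquer left half
  const right = mergeSort(arr.slice(mid));     // divide + conquer right half
  return merge(left, right);                   // combine
}

function merge(left: number[], right: number[]): number[] {
  const result: number[] = [];
  let i = 0;
  let j = 0;
  while (i < left.length && j < right.length) {
    result.push(left[i] <= right[j] ? left[i++] : right[j++]);
  }
  return result.concat(left.slice(i), right.slice(j)); // append the leftovers
}

mergeSort([9, 2, 5, 1, 7]); // => [1, 2, 5, 7, 9]
----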

book/chapters/divide-and-conquer.adoc

Whitespace-only changes.

book/chapters/dynamic-programming--intro.adoc

Lines changed: 1 addition & 1 deletion
@@ -1,4 +1,4 @@
-Dynamic programming (dp) is a way to solve algorithmic problems by finding the base case and building a solution from the ground-up. We also keep track of previous answers to avoid re-computing the same operations.
+Dynamic programming (dp) is a way to solve algorithmic problems with *overlapping subproblems*. Algorithms using dp find the base case and build a solution from the ground up. They also _keep track_ of previous answers to avoid re-computing the same operations.
 
 // https://twitter.com/amejiarosario/status/1103050924933726208
 // https://www.quora.com/How-should-I-explain-dynamic-programming-to-a-4-year-old/answer/Jonathan-Paulson
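For example (a minimal sketch, not the chapter's own example; fib and the memo map are illustrative names), the Fibonacci sequence has overlapping subproblems, and caching previous answers turns the exponential recursion into linear work:

[source, typescript]
----
// Fibonacci with memoization: each subproblem fib(n) is computed once,
// cached, and reused, so overlapping recursive calls become cheap lookups.
function fib(n: number, memo: Map<number, number> = new Map()): number {
  if (n <= 1) return n;                     // base cases: fib(0) = 0, fib(1) = 1
  if (memo.has(n)) return memo.get(n)!;     // reuse a previously computed answer
  const result = fib(n - 1, memo) + fib(n - 2, memo);
  memo.set(n, result);                      // keep track of this answer
  return result;
}

fib(50); // => 12586269025, computed quickly because each n is solved only once
----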

book/chapters/greedy-algorithms.adoc

Whitespace-only changes.

book/chapters/linear-data-structures-outro.adoc

Lines changed: 2 additions & 2 deletions
@@ -16,11 +16,11 @@ To sum up,
 
 .Use a Queue when:
 * You need to access your data in a first-come, first served basis (FIFO).
-* You need to implement a <<Breadth First Search>>
+* You need to implement a <<Breadth-First Search for Binary Tree, Breadth-First Search>>
 
 .Use a Stack when:
 * You need to access your data as last-in, first-out (LIFO).
-* You need to implement a <<Depth First Search>>
+* You need to implement a <<Depth-First Search for Binary Tree, Depth-First Search>>
 
 .Time Complexity of Linear Data Structures (Array, LinkedList, Stack & Queues)
 |===
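To illustrate the queue/BFS pairing above (a hypothetical sketch, not the book's implementation; TreeNode and bfs are made-up names), a breadth-first traversal dequeues a node, visits it, and enqueues its children:

[source, typescript]
----
// Breadth-first search over a binary tree using a queue (FIFO):
// nodes are visited level by level, in the order they were enqueued.
interface TreeNode {
  value: number;
  left?: TreeNode;
  right?: TreeNode;
}

function bfs(root: TreeNode | undefined): number[] {
  const visited: number[] = [];
  const queue: TreeNode[] = root ? [root] : [];
  while (queue.length > 0) {
    const node = queue.shift()!;            // dequeue: first-come, first-served
    visited.push(node.value);
    if (node.left) queue.push(node.left);   // enqueue children for a later visit
    if (node.right) queue.push(node.right);
  }
  return visited;
}
----

Swapping the queue for a stack (pop instead of shift) turns the same loop into a depth-first traversal, which is why the stack list points at DFS.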

book/chapters/output.adoc

Lines changed: 1 addition & 1 deletion
@@ -42,7 +42,7 @@ The entire input must be iterated through, and this must occur O(log(n))
 times (the input can only be halved O(log(n)) times). n items iterated
 log(n) times gives O(n log(n)).
 
-== Quick Sort
+== Quicksort
 
 Body text
 