book/D-interview-questions-solutions.asc
The red connections are critical; if we remove any, some servers won't be reachable.
We can solve this problem in one pass using DFS. But for that, we keep track of the nodes that are part of a loop (strongly connected components). We use the time of visit (or depth in the recursion) of each node.
For example C, if we start on `c0`, it belongs to group 0; then we move to `c1`, `c2`, and `c3`, increasing the depth counter. Each one will be in its own group since there's no loop.
For example B, we can start at `b0`, and then we move to `b1` and `b2`. However, `b2` circles back to `b0`, which is in group 0. We can update the group of `b1` and `b2` to 0 since they are all connected in a loop.
For an *undirected graph*, if we find a node on our DFS that we have previously visited, we have found a loop! We can mark all of them with the lowest group number. We know we have a critical path when a connection links two different groups. For example A, all nodes will belong to group 0, since they are all in a loop. For example B, we will have `b0`, `b1`, and `b2` in the same group, while `b3` will be in a different group.
*Algorithm*:
* Build the graph as an adjacency list (map + array)
* Run DFS on any node, e.g. `0`.
** Keep track of the nodes that you have seen using a `group` array. But instead of marking them as seen or not, mark them with the `depth`.
** Visit all the adjacent nodes that are NOT the parent.
** If we see a node that we haven't visited yet, do a DFS on it and increase the depth.
** If the adjacent node has a lower grouping number, update the current node with it.
** If the adjacent node has a higher grouping number, then we found a critical path.
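The steps above can be sketched in JavaScript. This is a minimal sketch of the bridge-finding idea (Tarjan-style); the function name `criticalConnections` and its input format (a node count plus an edge list) are assumptions for illustration, not from the original.

[source, javascript]
----
function criticalConnections(n, connections) {
  // Build the graph as an adjacency list (map + array).
  const graph = new Map(Array.from({ length: n }, (_, i) => [i, []]));
  for (const [u, v] of connections) {
    graph.get(u).push(v);
    graph.get(v).push(u);
  }

  const group = new Array(n).fill(-1); // depth (then lowest group) per node; -1 = unseen
  const critical = [];

  function dfs(node, parent, depth) {
    group[node] = depth;
    for (const adj of graph.get(node)) {
      if (adj === parent) continue; // visit only adjacent nodes that are NOT the parent
      if (group[adj] === -1) dfs(adj, node, depth + 1); // unseen: DFS deeper
      if (group[adj] < group[node]) group[node] = group[adj]; // adopt the lower group
      else if (group[adj] > depth) critical.push([node, adj]); // links two groups: critical
    }
  }

  dfs(0, -1, 0);
  return critical;
}
----

For the network in example B (a triangle `b0`-`b1`-`b2` plus a node `b3` hanging off `b1`), only the edge to `b3` is reported as critical.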
We are asked to sort an array with 3 possible values. If we use the standard sorting method `Array.sort`, that will be `O(n log n)`. However, there's a requirement to solve it in linear time and constant space complexity.
The concept of quicksort can help here. We can choose `1` as a pivot and move everything less than 1 to the left and everything greater than 1 to the right.
*Algorithm*:
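A minimal sketch of this partitioning (the Dutch national flag approach, one pass, `O(1)` space); the function name `sortColors` is an assumption:

[source, javascript]
----
function sortColors(nums) {
  let low = 0, mid = 0, high = nums.length - 1;
  while (mid <= high) {
    if (nums[mid] < 1) { // 0: swap into the left section
      [nums[low], nums[mid]] = [nums[mid], nums[low]];
      low++; mid++;
    } else if (nums[mid] > 1) { // 2: swap into the right section
      [nums[mid], nums[high]] = [nums[high], nums[mid]];
      high--; // re-examine the value swapped into `mid` on the next pass
    } else {
      mid++; // 1 (the pivot): already in place
    }
  }
  return nums;
}
----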
book/content/part04/sorting-algorithms.asc
=== Sorting Algorithms
Sorting is one of the most common solutions when we want to extract some insights about data.
We can sort to get the maximum or minimum value, and many algorithmic problems can benefit from sorting.
.We are going to explore three basic sorting algorithms _O(n^2^)_ which have low overhead:
- <<part04-algorithmic-toolbox#bubble-sort>>
==== Sorting Properties
Sorting implementations with the same time complexity might manipulate the data differently. We want to understand these differences to be aware of the side effects they will have on data or extra resources they will require. For instance, some solutions will need auxiliary memory to store temporary data while sorting, while others can do it in place.
Sorting properties are stable, adaptive, online, and in-place. Let's go one by one.
===== Stable
(((Sorting, stable)))
A ((stable sorting)) algorithm keeps the relative order of items with the same comparison criteria.
This is incredibly useful when you want to sort in multiple phases.
.Let's say you have the following data:
[source, javascript]
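----
// Hypothetical stand-in for the listing elided in this diff: two users share
// the same `age`, so a stable sort by `age` preserves their original order.
const users = [
  { name: 'Bart', age: 10 },
  { name: 'Lisa', age: 8 },
  { name: 'Homer', age: 10 },
];
----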
===== In-place
(((Sorting, in-place)))
An ((in-place sorting)) algorithm would have a _space complexity_ of O(1). In other words, it does not use any other auxiliary memory because it moves the items in the collection itself.
Not requiring extra memory for sorting is especially useful for large datasets and memory-constrained environments like robotics, smart devices, or embedded systems in appliances.
===== Online
(((Sorting, online)))
<<<
==== Summary
We explored the most common sorting algorithms, some of which are simple and others more performant. We also covered the properties of sorting algorithms such as stable, in-place, online, and adaptive.
(((Tables, Algorithms, Sorting Complexities)))
(((Tables, Algorithms, Sorting Summary)))
// end::sorting-q-merge-intervals[]
// _Seen in interviews at: Facebook, Amazon, Bloomberg._