Book Summary: Grokking Algorithms

Chapter 1

  • Logs are the flip of exponentials
  • Big O notation tells you the worst-case run time, measured in the number of operations rather than in seconds.
  • Linear time means the maximum number of operations (e.g., guesses in Simple Search) equals the size of the list at worst: O(n).
    • n: the size of the input; the running time grows in proportion to it.
  • Logarithmic (log) time - O(log n) - e.g., Binary Search (see the sketch after this list)
  • Linear time - O(n) - e.g., Simple Search
  • O(n log n) - e.g., Quick Sort (average case)
  • O(n^2) - e.g., Selection Sort
  • O(n!) - e.g., Traveling Salesman Problem
  • It is not enough to know how long an algorithm takes to run - you need to know how the running time increases as the list size increases.
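
A minimal sketch of the two searches compared above (written here as an illustration, not copied from the book; binary search assumes the list is already sorted):

    def simple_search(items, target):
        # Check every element in order: up to n comparisons for a list of size n.
        for i, item in enumerate(items):
            if item == target:
                return i
        return None

    def binary_search(sorted_items, target):
        # Halve the search range each step: about log2(n) comparisons.
        low, high = 0, len(sorted_items) - 1
        while low <= high:
            mid = (low + high) // 2
            if sorted_items[mid] == target:
                return mid
            if sorted_items[mid] < target:
                low = mid + 1
            else:
                high = mid - 1
        return None

    print(simple_search([3, 5, 7, 9], 7))   # 2
    print(binary_search([3, 5, 7, 9], 7))   # 2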

Chapter 2

  • Types of Access:
    • Random Access (Arrays)
    • Sequential Access (Linked Lists)
  • Linked lists are great if you're going to read all the items one at a time, BUT if you're going to keep jumping around, linked lists are terrible.
  • Arrays are great if you want to read random elements.
  • Arrays have fast reads and slow inserts, linked lists have slow reads and fast inserts.

In Summary

  • Reading: Arrays: O(1), Linked lists: O(n)
  • Insertion: Arrays: O(n), Linked lists: O(1)
  • Deletion: Arrays: O(n), Linked lists: O(1)
  • Linked lists are better if you want to insert elements into the middle.
  • Linked lists are better if you want to delete elements.
  • Insertion and deletion take O(1) in linked lists only if you can instantly access the element to be inserted or deleted (the first and last elements, since we usually track these two with pointers). The sketch after this summary illustrates the trade-off.
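
A minimal sketch of the read/insert trade-off summarized above; the `Node` class is an illustrative stand-in for a linked list, and a Python list stands in for an array:

    # Array (Python list): O(1) random access, but O(n) insert at the front
    # because every later element has to shift over.
    arr = [10, 20, 30]
    print(arr[1])       # O(1) read by index
    arr.insert(0, 5)    # O(n) insert at the front

    # Linked list: O(1) insert at the head (a tracked pointer),
    # but reading the i-th item means walking the chain: O(n).
    class Node:
        def __init__(self, value, next=None):
            self.value = value
            self.next = next

    head = Node(20, Node(30))
    head = Node(10, head)       # O(1) insert at the head

    node = head
    while node is not None:     # O(n) sequential access
        print(node.value)
        node = node.next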

Chapter 3

  • Recursion breaks the problem down into:
    • Base case (when to stop)
    • Recursive case
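
A minimal sketch of both parts, using factorial as a stand-in example (not tied to the chapter's own code):

    def factorial(n):
        if n <= 1:                        # base case: when to stop
            return 1
        return n * factorial(n - 1)       # recursive case: shrink the problem

    print(factorial(5))   # 120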

"Loops may achieve a performance gain for your program. Recursion may achieve a performance gain for your programmer. Choose which is more important in your situation"

  • Because a recursive function calls itself, it's easy to write it incorrectly so that it never reaches its base case and ends up in an infinite loop.
  • A stack has only 2 actions (see the stack sketch after this list):
    • Push (insert)
    • Pop (remove and read)
  • When you call a function from another function, the calling function is paused in a partially completed state (suspended state).
  • The stack used to save the variables for multiple functions is called the "Call Stack"
  • Using the stack is convenient, but there's a cost: saving the information for all those function calls can take up a lot of memory if your stack gets too tall. At that point you have 2 options:
    • You can rewrite your code to use a loop instead (see the loop sketch after this list).
    • You can use something called "Tail Recursion" (not all languages support it).
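
A minimal sketch of the two stack actions, using a plain Python list as the stack:

    stack = []
    stack.append("first call")    # push: insert on top
    stack.append("second call")   # push
    top = stack.pop()             # pop: remove and read the top item
    print(top)                    # 'second call' - last pushed, first popped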
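
And a minimal sketch of the first option: rewriting a recursive countdown as a loop so the call stack stays flat (function names here are illustrative):

    def countdown_recursive(i):
        print(i)
        if i <= 0:                      # base case
            return
        countdown_recursive(i - 1)      # every call adds a frame to the call stack

    def countdown_loop(i):
        while i >= 0:                   # same work, constant stack depth
            print(i)
            i -= 1

    countdown_loop(3)   # prints 3 2 1 0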

Chapter 4

  • When you get a new problem, you don't have to be stumped. Instead, you can ask, "Can I solve this if I use divide-and-conquer?"
  • D&C algorithms are recursive algorithms, and the call stack keeps track of the state of each recursive call.
  • According to D&C, with every recursive call, you have to reduce your problem.
  • To solve a problem using D&C, there are 2 steps:
    • Figure out the base case. This should be the simplest possible case. [simplest input to your solution]
    • Divide or decrease your problem until it becomes the base case.
  • Euclid's Algorithm (see the GCD sketch after this list):
    • The Euclidean algorithm is based on the principle that the greatest common divisor of two numbers does not change if the larger number is replaced by its difference with the smaller number. For example, 21 is the GCD of 252 and 105 (as 252 = 21 × 12 and 105 = 21 × 5), and the same number 21 is also the GCD of 105 and 252 − 105 = 147. Since this replacement reduces the larger of the two numbers, repeating this process gives successively smaller pairs of numbers until the two numbers become equal. When that occurs, they are the GCD of the original two numbers.
  • When you're writing a recursive function involving an array, the base case is often an empty array, or an array with one element. If you're stuck, try that first.
  • Functional Programming:
    • Functional programming languages like Haskell don't have loops, so you have to use recursion to write such functions.
  • Note about Binary Search (a recursive sketch follows this list):
    • The base case for binary search is: an array with one item. If the item you're looking for matches the item in the array, you found it! Otherwise, it's not in the array.
    • The recursive case for binary search is: you split the array in half, throw away one half, and call binary search on the other half.
  • Quick sort uses divide-and-conquer.
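
A minimal sketch of Euclid's algorithm as described above: the subtraction form mirrors the description, and the remainder form phrases the same idea as a base case plus a recursive case (both assume positive integers):

    def gcd_subtraction(a, b):
        # Keep replacing the larger number with the difference
        # until the two numbers are equal; that value is the GCD.
        while a != b:
            if a > b:
                a -= b
            else:
                b -= a
        return a

    def gcd_recursive(a, b):
        if b == 0:                       # base case
            return a
        return gcd_recursive(b, a % b)   # recursive case: a strictly smaller problem

    print(gcd_subtraction(252, 105))   # 21
    print(gcd_recursive(252, 105))     # 21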
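
And a minimal sketch of binary search written recursively, following the base case and recursive case described in the note above (the earlier Chapter 1 sketch was iterative; slicing copies the halves, which is fine for a sketch):

    def binary_search_recursive(sorted_items, target):
        if len(sorted_items) <= 1:       # base case: one (or zero) items left
            return len(sorted_items) == 1 and sorted_items[0] == target
        mid = len(sorted_items) // 2
        if sorted_items[mid] == target:
            return True
        if target < sorted_items[mid]:
            return binary_search_recursive(sorted_items[:mid], target)     # keep the left half
        return binary_search_recursive(sorted_items[mid + 1:], target)     # keep the right half

    print(binary_search_recursive([3, 5, 7, 9], 7))   # True
    print(binary_search_recursive([3, 5, 7, 9], 4))   # False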

Nice Programming Questions on Recursion:

  • Write a recursive function to get the sum of a list of numbers

      def sum(list):
          if list == []:                       # base case: empty list
              return 0
          return list[0] + sum(list[1:])       # head + sum of the rest
    
  • Write a recursive function to count the number of items in a list

      def count(list):
          if list == []:                       # base case: empty list
              return 0
          return 1 + count(list[1:])           # 1 for the head + count of the rest
    
  • Find the maximum number in a list

      def max(list):
          if len(list) == 1:                   # base case: a single element is the max
              return list[0]
          sub_max = max(list[1:])              # max of the rest of the list
          return list[0] if list[0] > sub_max else sub_max