Basic Algorithms

Some basic algorithms for coding interviews. The code here is written in Python-style pseudocode; actual, tested code will be put in my GitHub repo.

Sorting

  • Insertion sort
    • Alg:
      Assume the head of the array is sorted. For each element after it, insert the element into its proper location within the sorted head. (See the sketches after this list.)
    • Time: $O(n^2)$ average, $O(n^2)$ worst, $O(n)$ best
  • Bubble sort
    • Alg: Repeatedly traverse the array. On each pass, compare adjacent elements; if A[i] > A[i+1], swap them, then increment i. Repeat passes (at most n) until a traversal needs no swap. (See the sketches after this list.)
    • Time: $O(n^2)$ average, $O(n^2)$ worst, $O(n)$ best.
  • Counting sort (bucket sort)
    • Alg: If A[i] equals M, place it in the M-th bucket. Then traverse the buckets in order to read all numbers back sorted. (Radix sort applies this idea digit by digit.)
    • Time: O(n + M), where M is the maximum key value
    • Space: O(M)
  • Merge sort
    • Alg: Recursively divide the array into two halves, sort each half, then merge the two sorted halves to replace the original. (See the sketches after this list.)
    • Time: $O(n \log n)$ average, $O(n \log n)$ worst, $O(n \log n)$ best.
  • Quick sort
    • Alg: Pick a pivot and rearrange the array into two subarrays so that everything in the left subarray <= pivot <= everything in the right subarray. The pivot's final position is now fixed. Apply the same procedure recursively to each subarray. (See the sketches after this list.)
    • Time: $O(n \log n)$ average, $O(n^2)$ worst, $O(n \log n)$ best.
  • Heap sort
    • Alg: Construct a heap (O(n)). Repeatedly extract the root and append it to the sorted array (O(n log n) in total).
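
A minimal Python sketch of the two simple quadratic sorts above (in-place, ascending order; the function names are my own):

    def insertion_sort(A):
        # Grow a sorted prefix; insert each new element into its place.
        for i in range(1, len(A)):
            key = A[i]
            j = i - 1
            while j >= 0 and A[j] > key:
                A[j + 1] = A[j]   # shift larger elements to the right
                j -= 1
            A[j + 1] = key
        return A

    def bubble_sort(A):
        # Keep passing over the array, swapping adjacent out-of-order pairs,
        # until a full pass needs no swap.
        swapped = True
        while swapped:
            swapped = False
            for i in range(len(A) - 1):
                if A[i] > A[i + 1]:
                    A[i], A[i + 1] = A[i + 1], A[i]
                    swapped = True
        return A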
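
And a sketch of the two divide-and-conquer sorts; for brevity these return new lists instead of sorting in place, which is a simplification:

    def merge_sort(A):
        # Split, sort each half, then merge the two sorted halves.
        if len(A) <= 1:
            return A
        mid = len(A) // 2
        left, right = merge_sort(A[:mid]), merge_sort(A[mid:])
        merged, i, j = [], 0, 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                merged.append(left[i])
                i += 1
            else:
                merged.append(right[j])
                j += 1
        return merged + left[i:] + right[j:]

    def quick_sort(A):
        # Partition around a pivot, then recurse on each side.
        if len(A) <= 1:
            return A
        pivot = A[len(A) // 2]
        left = [x for x in A if x < pivot]
        middle = [x for x in A if x == pivot]
        right = [x for x in A if x > pivot]
        return quick_sort(left) + middle + quick_sort(right)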

ADT

  • Linked list
    • Insertion / Deletion: O(1) time, given a reference to the node (finding a node by value takes O(n))
    • Implementation
  • Stack
    • Property: LIFO
    • Implementation in Python: use a list with list.append(x) to push and list.pop() to pop
  • Queue
    • Property: FIFO
    • Implementation in Python: use collections.deque with append(x) to enqueue and popleft() to dequeue (a plain list also works via list.pop(0), but that is O(n))
  • Hash Table
    • Property: O(1) average-case insertion and lookup
    • Libraries available: Python dict, Java HashMap.
  • Tree
    Following descriptions for tree traversal are for binary trees; they can be generalized to other trees.

    • Level-order traversal
      • Iterative:
        Dequeue a node, visit it, enqueue its left child, then its right child; repeat until the queue is empty. (See the sketch after this list.)
    • Pre-order traversal

      • Recursive
        def preOrder(A):
            if A is None:
                return
            print(A.data)
            preOrder(A.left)
            preOrder(A.right)
      • Iterative
        Pop a node and print it, push its right child, then push its left child (so the left subtree is processed first). Repeat until the stack is empty. (See the sketch after this list.)
    • In-order traversal

      • Recursive
        def inOrder(A):
            if A is None:
                return
            inOrder(A.left)
            print(A.data)
            inOrder(A.right)
      • Iterative
        (1) Initialize a stack and set current to the root.
        (2) While current is not null, push current and set current = current.left.
        (3) When current is null: if the stack is empty, finish; otherwise pop a node, print it, set current to its right child, and go to (2).
        (See the sketch after this list.)
    • Post-order traversal

      • Recursive
        def postOrder(A):
            if A is None:
                return
            postOrder(A.left)
            postOrder(A.right)
            print(A.data)
      • Iterative
        (1) Initialize two stacks, s1 and s2. Push the root to s1.
        (2) While s1 is not empty: pop n from s1 and push it to s2; push n.left to s1; push n.right to s1.
        (3) When s1 is empty, pop everything from s2; this yields the post-order sequence. (See the sketch after this list.)
  • Graph
    • Representation: a set of vertices (nodes) and a set of edges, commonly stored as an adjacency list or an adjacency matrix.
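
Hedged sketches of the iterative traversals described above, assuming tree nodes with data, left, and right attributes (the function names are my own); collections.deque serves as the queue for level order:

    from collections import deque

    def level_order(root):
        # Dequeue a node, print it, then enqueue its children.
        if root is None:
            return
        q = deque([root])
        while q:
            node = q.popleft()
            print(node.data)
            if node.left:
                q.append(node.left)
            if node.right:
                q.append(node.right)

    def pre_order_iterative(root):
        # Pop and print, pushing right before left so the left subtree is handled first.
        if root is None:
            return
        stack = [root]
        while stack:
            node = stack.pop()
            print(node.data)
            if node.right:
                stack.append(node.right)
            if node.left:
                stack.append(node.left)

    def in_order_iterative(root):
        # Walk left while pushing nodes; on pop, print and move to the right child.
        stack, current = [], root
        while stack or current:
            while current:
                stack.append(current)
                current = current.left
            current = stack.pop()
            print(current.data)
            current = current.right

    def post_order_iterative(root):
        # Two-stack variant: s2 ends up holding the nodes in reverse post-order.
        if root is None:
            return
        s1, s2 = [root], []
        while s1:
            node = s1.pop()
            s2.append(node)
            if node.left:
                s1.append(node.left)
            if node.right:
                s1.append(node.right)
        while s2:
            print(s2.pop().data)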

Special Trees

  • Heap
    • Peek (read the root): O(1)
    • Insert: O(log n)
    • ExtractMax (remove the root): O(log n)
    • Heapsort: O(n log n)
    • A heap is the standard implementation of a priority queue (see the heapq sketch after this list).
  • Binary Search Tree (BST)

    • Search(k): O(h), where h is the tree height (O(log n) when balanced)
    • Insert(k): O(h)
    • Delete(k): O(h)
  • AVL tree

    • The balance factor (BF) of a node is the height of its right subtree minus the height of its left subtree.
    • A tree satisfies the AVL invariant if every node has a balance factor in [-1, 1].
    • An AVL tree is a BST that satisfies the AVL invariant.
    • To insert an element into an AVL tree within O(h):
    def insert(D, Dparent, k):
        if D is None:
            # Attach the new key on the correct side of its parent.
            if Dparent.data < k:
                Dparent.right = Node(k)
            else:
                Dparent.left = Node(k)
            return
        if D.data < k:
            insert(D.right, D, k)
        else:
            insert(D.left, D, k)
        AVLify(D)  # needs only to handle an imbalance at node D, with BF either -2 or 2.
                   # If BF = -2, perform either a right rotation or a left-right rotation;
                   # if BF = 2, perform either a left rotation or a right-left rotation.

    def right_rotation(D):
        root = D.left
        D.left = root.right
        root.right = D
        return root  # the caller must re-attach root where D used to hang

    def left_rotation(D):
        root = D.right
        D.right = root.left
        root.left = D
        return root  # the caller must re-attach root where D used to hang

    def left_right(D):
        D.left = left_rotation(D.left)
        return right_rotation(D)

    def right_left(D):
        D.right = right_rotation(D.right)
        return left_rotation(D)
  • Red-black tree
  • Trie
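
Python's built-in heapq module implements a min-heap on a plain list, which covers the heap operations and the priority-queue use above; below is a small usage sketch (for a max-heap, a common trick is to push negated keys):

    import heapq

    data = [5, 1, 9, 3, 7]
    heapq.heapify(data)          # build a heap in O(n)

    print(data[0])               # Peek: the minimum sits at index 0, O(1)
    heapq.heappush(data, 2)      # Insert: O(log n)
    print(heapq.heappop(data))   # Extract the root (minimum): O(log n)

    # Heapsort in O(n log n): heapify once, then repeatedly extract the root.
    def heapsort(items):
        heap = list(items)
        heapq.heapify(heap)
        return [heapq.heappop(heap) for _ in range(len(heap))]

    print(heapsort([5, 1, 9, 3, 7]))  # [1, 3, 5, 7, 9]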

Graph Algorithms

  • Breadth-first Search (BFS)
    • Iterative solution
      For each node dequeued, mark and enqueue all of its unvisited neighbours. Repeat until the queue is empty. (See the sketch after this list.)
    • Notes:
      (1) BFS does not visit an edge twice.
      (2) In a graph with unit-length edges, if node u is visited before v, then d(s,u)<=d(s,v), where d denotes the shortest distance and s is the source node.
      (3) The time and space complexity of BFS are both O(|V|+|E|).
  • Depth-first Search (DFS)
    • Recursive
      def DFS(v):
          visit(v)  # visit() is assumed to also mark v as visited
          for n in v.neighbours:
              if not n.isVisited():
                  DFS(n)
    • Iterative
      Initialize a stack and push the source. Pop a node from the stack to visit it (skip it if already visited), then push all of its unvisited neighbours. Repeat until the stack is empty.
    • Notes:
      (1) In CLRS, DFS is restarted from every unvisited node, whereas here DFS traverses only from a single source.
      (2) DFS does not visit an edge twice.
      (3) The time and space complexity of DFS are both O(|V|+|E|). However, if you are DFS-ing a tree with branching factor b and depth k, the time complexity is still O(b^k) (the number of nodes), but the space complexity is only O(bk).
  • Shortest Path: Dijkstra
    • Alg:
      (1) Initialization: set dist of all vertices to $\infty$ and mark all nodes unvisited.
      (2) Set dist[start] = 0. Then repeat:
          Pick the unvisited node u with the least dist.
          Relax all neighbours of u:
          for v in u.neighbours:
              v.dist = min(v.dist, u.dist + W(u, v))
          Then remove u from the unvisited set.
      (3) Stop when the destination node is visited, or when the chosen u has infinite distance.
      (See the sketch after this list.)
    • Proof of correctness: a greedy/induction argument; when u is removed from the unvisited set, dist[u] already equals the shortest distance from the source to u (assuming non-negative edge weights).
    • Time complexity: O(|E| + |V| log |V|) with a Fibonacci heap, or O((|V| + |E|) log |V|) with a binary heap.
  • Shortest paths with negative edge weights: Bellman-Ford

  • All-pairs shortest paths: Floyd-Warshall

  • All-pairs shortest paths with negative edge weights (sparse graphs): Johnson
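
A hedged sketch of the BFS described above, assuming the graph is given as an adjacency-list dict (that representation and the function name are my own choices):

    from collections import deque

    def bfs(graph, source):
        # graph: dict mapping each vertex to a list of neighbours.
        # Returns the number of edges on a shortest path from source to each reachable vertex.
        dist = {source: 0}
        q = deque([source])
        while q:
            u = q.popleft()
            for v in graph[u]:
                if v not in dist:        # not yet visited
                    dist[v] = dist[u] + 1
                    q.append(v)
        return dist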
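
And a sketch of Dijkstra using heapq as the priority queue (a binary heap, hence the O((|V|+|E|) log |V|) bound rather than the Fibonacci-heap one); the adjacency-list format with (neighbour, weight) pairs is an assumption:

    import heapq

    def dijkstra(graph, source):
        # graph: dict mapping each vertex u to a list of (v, weight) pairs.
        # Lazy-deletion variant: outdated heap entries are skipped when popped.
        dist = {source: 0}
        heap = [(0, source)]
        while heap:
            d, u = heapq.heappop(heap)
            if d > dist.get(u, float('inf')):
                continue                 # stale entry
            for v, w in graph[u]:
                nd = d + w
                if nd < dist.get(v, float('inf')):
                    dist[v] = nd
                    heapq.heappush(heap, (nd, v))
        return dist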

Network Flow Algorithms

  • Flow Network Definitions

  • Finding maximum flow: Ford-Fulkerson

  • Improved maximum flow: Edmonds-Karp

  • Faster maximum flow: Push-relabel

  • Applications of network flow: bipartite matching

Linear Programming

  • Definitions

  • Simplex Algorithm

Complexity Theory

  • Definitions: CNF, P, NP

  • Definitions: NPC

Approximation Algorithms