Classical Algorithms

2 Pointers

[schematic: fast and slow pointers (/projects/ccs/dsa/classical/2pointers.gif)]

this paradigm works well on arrays, reducing the time complexity of some popular problems from $\mathcal{O}(n^2)$ to $\mathcal{O}(n)$.

there are a few flavours of this technique (fast and slow; opposite ends), but ultimately we keep track of two indices, left and right, often abbreviated to l and r in code.

common problems that this algorithm works well on include:

  • finding cycles in a linked list (fast and slow pointers)
  • finding pairs
  • comparing elements from opposite ends of a data structure
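as a hedged sketch of the opposite-ends flavour: two-sum on a sorted array, the classic case where two pointers turn $\mathcal{O}(n^2)$ into $\mathcal{O}(n)$ (the function name here is mine):

```python
def pair_with_sum(nums, target):
    """find indices of a pair summing to target in a SORTED array, in O(n)."""
    l, r = 0, len(nums) - 1
    while l < r:
        s = nums[l] + nums[r]
        if s == target:
            return l, r
        if s < target:
            l += 1   # sum too small: move the left pointer right
        else:
            r -= 1   # sum too big: move the right pointer left
    return None      # no such pair
```

the sortedness is what licenses moving only one pointer at a time: every skipped pair is provably wrong.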

Sliding Window

this is a special case of the two-pointers strategy above: we maintain a window over the array whose length is either fixed, or grows and shrinks as we scan.

it can be used to solve problems of the sort: "longest substring without repeating characters" or "maximum sum of a subarray of length k".

it is often combined with hashmaps to keep track of the elements within the window
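a hedged sketch of the hashmap-plus-window combination, on the longest-substring-without-repeats problem (a variable-size window; the dict remembers where each character was last seen):

```python
def longest_unique_substring(s):
    """length of the longest substring of s with no repeated characters."""
    last_seen = {}       # char -> index of its most recent occurrence
    best = left = 0
    for right, ch in enumerate(s):
        # if ch already appears inside the window, slide left past it
        if ch in last_seen and last_seen[ch] >= left:
            left = last_seen[ch] + 1
        last_seen[ch] = right
        best = max(best, right - left + 1)
    return best
```

each index enters and leaves the window at most once, so the whole scan is $\mathcal{O}(n)$.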

Breadth First Search

this algorithm is good for finding the shortest path between two nodes in an unweighted graph.

it is worth realising that a tree is just a connected graph without cycles.

BFS uses a queue data structure; in python we can build one with collections.deque, which gives $\mathcal{O}(1)$ pops from the left:

  from collections import deque

  def bfs_by_queue(root, ok):
      # explore nodes level by level: the queue always holds the current frontier
      queue = deque([root])
      while queue:
          node = queue.popleft()
          if ok(node):                 # ok is a predicate for the node we want
              return node
          queue.extend(node.children)
      return None                      # not found

see leetcode problem 102
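problem 102 asks for exactly this level-by-level grouping. a hedged sketch (the Node class is my stand-in; leetcode's version uses a binary TreeNode with left/right):

```python
from collections import deque

class Node:
    def __init__(self, val, children=()):
        self.val = val
        self.children = list(children)

def level_order(root):
    """group node values by depth: everything queued at once is one level."""
    levels, queue = [], deque([root])
    while queue:
        level = []
        for _ in range(len(queue)):      # snapshot the current level's size
            node = queue.popleft()
            level.append(node.val)
            queue.extend(node.children)
        levels.append(level)
    return levels
```

the only change from plain BFS is taking `len(queue)` before draining it, which marks the boundary between levels.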

Backtracking

this is an extension of DFS, with the caveat that there is no longer a pre-built structure to traverse: the underlying tree of candidates is built implicitly as we explore and abandon partial solutions.

see leetcode problem 17
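problem 17 (letter combinations of a phone number) makes the "implicit tree" concrete: each digit is a level, each keypad letter a branch. a hedged sketch using the standard telephone keypad mapping:

```python
PHONE = {"2": "abc", "3": "def", "4": "ghi", "5": "jkl",
         "6": "mno", "7": "pqrs", "8": "tuv", "9": "wxyz"}

def letter_combinations(digits):
    """all letter strings spelled by a digit string, via backtracking."""
    if not digits:
        return []
    out = []
    def backtrack(i, path):
        if i == len(digits):          # reached a leaf of the implicit tree
            out.append("".join(path))
            return
        for ch in PHONE[digits[i]]:
            path.append(ch)           # choose
            backtrack(i + 1, path)    # explore
            path.pop()                # un-choose, restoring state
    backtrack(0, [])
    return out
```

the choose / explore / un-choose shape is the backtracking template; the pop is what distinguishes it from plain DFS over a stored tree.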

Dynamic Programming

this is one of the more difficult patterns (allegedly). there are two types:

  • top-down: recursion + memoisation
  • bottom-up: solve the smallest subproblems first and fill out a table
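both flavours can be sketched on fibonacci, the smallest example with overlapping subproblems:

```python
from functools import lru_cache

# top-down: plain recursion, with memoisation caching repeated subproblems
@lru_cache(maxsize=None)
def fib_td(n):
    return n if n < 2 else fib_td(n - 1) + fib_td(n - 2)

# bottom-up: fill out a table from the smallest subproblems upwards
def fib_bu(n):
    table = [0, 1] + [0] * max(0, n - 1)
    for i in range(2, n + 1):
        table[i] = table[i - 1] + table[i - 2]
    return table[n]
```

both are $\mathcal{O}(n)$; without the cache the top-down version would be exponential.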

Depth First Search

this algorithm is better for finding a route to a vertex that is deep within the search-tree.

we use a stack to execute this algorithm, but in practice we can often just use recursion and let the call stack do the bookkeeping!

DFS is good for:

  • exploring all paths
  • detecting cycles
  • memory-efficient searches
  • deep searching

see leetcode problem 200
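problem 200 ("number of islands") is a classic grid DFS. a hedged sketch, using recursion in place of an explicit stack (note it sinks islands in place, mutating the grid):

```python
def count_islands(grid):
    """count connected regions of "1"s by sinking each one with a recursive DFS."""
    rows, cols = len(grid), len(grid[0])

    def sink(r, c):
        # stop at the grid edge or at water; otherwise flood outwards
        if 0 <= r < rows and 0 <= c < cols and grid[r][c] == "1":
            grid[r][c] = "0"                       # mark visited
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                sink(r + dr, c + dc)

    count = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == "1":   # an unvisited island: count it, then sink it
                count += 1
                sink(r, c)
    return count
```

overwriting cells to "0" doubles as the visited set, which is what keeps the search memory-efficient.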

Heap / Priority Queue

heaps are the standard implementation of priority queues.

this data structure is used to solve problems of the type "top K", and is slightly unintuitive: to find the K largest elements we keep a min-heap of size K (its root is the smallest element still in contention, the next to be evicted), and to find the K smallest we keep a max-heap of size K.
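a hedged sketch of the min-heap trick for the K largest, using python's heapq (which is a min-heap):

```python
import heapq

def k_largest(nums, k):
    """the k largest elements of nums, in descending order, in O(n log k)."""
    heap = []
    for x in nums:
        heapq.heappush(heap, x)
        if len(heap) > k:
            heapq.heappop(heap)   # evict the smallest: only the top k survive
    return sorted(heap, reverse=True)
```

for the K smallest, heapq has no max-heap, so the usual workaround is to push negated values and negate them back on the way out.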

see leetcode problems:

Divide and Conquer

Trie

this refers to the data-structure, but algorithms for insertion, deletion and search can be documented here.

there is also a relation with the Radix Tree, which is a compressed version of a trie: chains of single-child nodes are merged into a single edge.
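a hedged sketch of the insert and search algorithms on an uncompressed trie (each node is a dict of children plus a word-end flag):

```python
class Trie:
    def __init__(self):
        self.children = {}     # char -> child Trie node
        self.is_word = False   # does a full word end at this node?

    def insert(self, word):
        node = self
        for ch in word:
            # walk down, creating missing nodes on the way
            node = node.children.setdefault(ch, Trie())
        node.is_word = True

    def search(self, word):
        node = self
        for ch in word:
            node = node.children.get(ch)
            if node is None:
                return False   # path falls off the trie: word absent
        return node.is_word    # present only if a word ENDS here
```

both operations are $\mathcal{O}(L)$ in the word length, independent of how many words are stored.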

Union Find

again, I am not sure where to draw the line between data-structure and algorithm here.

it seems that this is a data structure for handling a collection of disjoint sets, efficiently supporting two primary operations:

  • union: merging two sets and;
  • find: determining which set an element belongs to
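a hedged minimal sketch of both operations, with path compression (the path-halving variant) to keep find nearly constant-time; rank/size balancing is omitted for brevity:

```python
class UnionFind:
    def __init__(self, n):
        self.parent = list(range(n))   # each element starts as its own set

    def find(self, x):
        """root of x's set, compressing the path as we walk up."""
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x

    def union(self, a, b):
        """merge the sets containing a and b."""
        ra, rb = self.find(a), self.find(b)
        if ra != rb:
            self.parent[ra] = rb
```

two elements are in the same set exactly when their finds return the same root.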

Greedy

Graph

we have already covered a couple of "graph" algorithms on this page: breadth first search and depth first search. but there are many more such algorithms:

  • prim
  • kruskal
  • dijkstra

[animation: dijkstra failing where the bellman-ford algorithm succeeds (/projects/ccs/dsa/classical/bellman-ford.gif)]
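a hedged sketch of dijkstra with a binary heap; the graph representation (dict of adjacency lists of (neighbour, weight) pairs) is my choice. it assumes non-negative edge weights, which is exactly where it can fail and bellman-ford cannot:

```python
import heapq

def dijkstra(graph, source):
    """shortest distances from source; graph: node -> [(neighbour, weight)]."""
    dist = {source: 0}
    heap = [(0, source)]                  # (distance-so-far, node)
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                      # stale entry, already improved
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd              # found a shorter route to v
                heapq.heappush(heap, (nd, v))
    return dist
```

rather than decreasing keys in the heap, this pushes duplicates and skips stale entries on pop, which is the idiomatic heapq workaround.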

Linear Programming

a "sledgehammer" of the trade according to Dasgupta

Recursion

recursion probably deserves at least a small amount of recognition as a concept in its own right, and a designated place for notes.

this algorithmic technique goes hand-in-hand with trees and divide and conquer
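merge sort is perhaps the cleanest example of that pairing: divide the array, conquer each half recursively, combine by merging. a hedged sketch:

```python
def merge_sort(xs):
    """sort a list in O(n log n): divide, recursively conquer, then merge."""
    if len(xs) <= 1:
        return xs                      # base case stops the recursion
    mid = len(xs) // 2
    left, right = merge_sort(xs[:mid]), merge_sort(xs[mid:])
    # combine: merge two sorted halves in linear time
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]   # append whichever half remains
```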

Searching

searching is one of the main concerns of graph algorithms: finding a route to a particular vertex or finding the value of a vertex itself.

we have covered breadth first search and depth first search, but we have not yet made notes about:

  • heuristic searches
  • A* search
  • monte carlo

Sorting

we have not covered sorting at all on this page; it seems to be one of the less examined topics on leetcode. nonetheless, it is an important and interesting application of the data structures and algorithms we know.

time complexities [1]

  Algorithm   Best                 Average              Worst                   Space (worst)
  quicksort   $\Omega(n\log n)$    $\Theta(n\log n)$    $\mathcal{O}(n^2)$      $\mathcal{O}(\log n)$
  mergesort   $\Omega(n\log n)$    $\Theta(n\log n)$    $\mathcal{O}(n\log n)$  $\mathcal{O}(n)$
  timsort     $\Omega(n)$          $\Theta(n\log n)$    $\mathcal{O}(n\log n)$  $\mathcal{O}(n)$
  heap        $\Omega(n\log n)$    $\Theta(n\log n)$    $\mathcal{O}(n\log n)$  $\mathcal{O}(1)$
  bubble      $\Omega(n)$          $\Theta(n^2)$        $\mathcal{O}(n^2)$      $\mathcal{O}(1)$
  insertion   $\Omega(n)$          $\Theta(n^2)$        $\mathcal{O}(n^2)$      $\mathcal{O}(1)$
  selection   $\Omega(n^2)$        $\Theta(n^2)$        $\mathcal{O}(n^2)$      $\mathcal{O}(1)$
  bucket      $\Omega(n+k)$        $\Theta(n+k)$        $\mathcal{O}(n^2)$      $\mathcal{O}(n)$
  radix       $\Omega(nk)$         $\Theta(nk)$         $\mathcal{O}(nk)$       $\mathcal{O}(n+k)$

Footnotes


[1] these time complexities, along with a number of others, were taken from https://bigocheatsheet.com.