# Notes about algorithms and data structures

## Data structures

### Basic data structures

#### Static

• Array

Provides:

• constant time access
• space efficiency
• memory locality

Arrays have fixed size, unless the programming language of choice provides dynamic arrays.

#### Dynamic

• List

Provides:

• dynamic size (no overflow)
• simpler insert/delete operations

Lists require extra space (for pointers) and do not support efficient random access. Nodes are allocated separately, so memory locality is poor.

• Stack

Implements a last in, first out management policy. Useful when retrieval order is not important.

• Queue

Implements a first in, first out management policy. Useful when retrieval order is important.
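Both policies can be shown with Python's built-ins; a minimal sketch (the values pushed are arbitrary):

```python
from collections import deque

# Stack: last in, first out -- append and pop from the same end.
stack = []
stack.append(1)
stack.append(2)
stack.append(3)
assert stack.pop() == 3  # the most recently added item comes out first

# Queue: first in, first out -- append on one end, pop from the other.
# deque gives O(1) pops from the left; list.pop(0) would be O(n).
queue = deque()
queue.append(1)
queue.append(2)
queue.append(3)
assert queue.popleft() == 1  # the oldest item comes out first
```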

• Dictionary

Provides data access by key.

• Binary search tree

A binary search tree consists of a root with left and right subtrees, and maintains the invariant:

`val(left) < val(root) < val(right)`
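The invariant above drives both insertion and lookup: compare with the root, then recurse left or right. A minimal sketch:

```python
class Node:
    """A binary search tree node: val(left) < val(root) < val(right)."""
    def __init__(self, val):
        self.val = val
        self.left = None
        self.right = None

def insert(root, val):
    if root is None:
        return Node(val)
    if val < root.val:
        root.left = insert(root.left, val)
    elif val > root.val:
        root.right = insert(root.right, val)
    return root  # duplicates are ignored

def search(root, val):
    if root is None or root.val == val:
        return root
    if val < root.val:
        return search(root.left, val)
    return search(root.right, val)

root = None
for v in (5, 3, 8, 1):
    root = insert(root, v)
assert search(root, 3) is not None  # present
assert search(root, 7) is None      # absent
```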

• Priority queue

Items have a priority value associated.
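Python's `heapq` module provides a min-heap, a common way to implement a priority queue; the tasks below are illustrative:

```python
import heapq

# heapq is a min-heap: the item with the smallest priority is popped first.
pq = []
heapq.heappush(pq, (2, "write notes"))
heapq.heappush(pq, (1, "fix bug"))
heapq.heappush(pq, (3, "refactor"))

priority, task = heapq.heappop(pq)
assert task == "fix bug"  # lowest priority value comes out first
```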

## Combinatorial search & Heuristic methods

### Backtracking

Systematic iteration through all possible configurations of the search space. It avoids repetitions and missed configurations, and finds the optimal value for small problems.

Construct a tree of partial solutions. At each step:

• start from a partial solution `S`
• add an element to `S`
• check if `S` is still a solution

If `S` is a partial solution that can't be extended, drop it.
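The steps above can be sketched with the classic n-queens problem: a partial solution is a list of queen columns, one per row, and partials that can't be extended are dropped:

```python
def solve_n_queens(n):
    """Count placements of n non-attacking queens via backtracking."""
    solutions = []

    def is_valid(partial, col):
        row = len(partial)
        for r, c in enumerate(partial):
            # same column or same diagonal -> queens attack each other
            if c == col or abs(c - col) == abs(r - row):
                return False
        return True

    def extend(partial):
        if len(partial) == n:            # complete solution
            solutions.append(partial[:])
            return
        for col in range(n):
            if is_valid(partial, col):   # keep only extendable partials
                partial.append(col)
                extend(partial)
                partial.pop()            # undo the extension, try the next
    extend([])
    return solutions

# a 4x4 board has exactly 2 solutions
assert len(solve_n_queens(4)) == 2
```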

### Heuristic search methods

These methods need a cost function to evaluate how good a solution is, and a representation of the solution space.

#### Random sampling

Pick random solutions and evaluate them; report the best one among the lot.

Use when there's a high proportion of acceptable solutions and the solution space is not coherent.

#### Local search

Pick a solution from the solution space and search its neighbouring solutions for an improvement (a neighbour is one transition away from the picked solution).

Use when the solution space is coherent and incremental evaluation is cheaper than global evaluation. The method stalls as soon as a local optimum is found.
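A minimal first-improvement hill-climbing sketch; the cost function and neighbourhood below are toy assumptions, not part of the notes:

```python
def hill_climb(start, cost, neighbours, max_steps=10_000):
    """Greedy local search: move to a better neighbour until stuck."""
    current = start
    for _ in range(max_steps):
        improved = False
        for candidate in neighbours(current):
            if cost(candidate) < cost(current):  # incremental improvement
                current = candidate
                improved = True
                break
        if not improved:  # local optimum: no neighbour is better
            return current
    return current

# Minimise (x - 7)^2 over the integers; neighbours are x - 1 and x + 1.
best = hill_climb(0, cost=lambda x: (x - 7) ** 2,
                  neighbours=lambda x: [x - 1, x + 1])
assert best == 7
```

On this convex toy cost the local optimum is also global; on a multimodal cost the same code would stop at whatever local optimum it reaches first.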

#### Simulated Annealing

Pick a solution from the solution space, define a threshold delta, and move to a neighbouring solution if it is better or worse by less than delta. Decrease delta and repeat the process.

Spends most of the time on good solutions and avoids getting trapped in the same local optimum.
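A common formulation replaces the fixed delta with a temperature `t` that shrinks over time; worse moves are accepted with probability `exp(-delta / t)`. A sketch under that assumption (cost function and neighbour move are illustrative):

```python
import math
import random

def simulated_annealing(start, cost, neighbour,
                        t0=10.0, cooling=0.99, steps=5000):
    """Accept worse moves with probability exp(-delta/t); t shrinks over time."""
    current, t = start, t0
    for _ in range(steps):
        candidate = neighbour(current)
        delta = cost(candidate) - cost(current)
        # always accept improvements; sometimes accept a worse solution,
        # which lets the search escape local optima
        if delta < 0 or random.random() < math.exp(-delta / t):
            current = candidate
        t *= cooling  # decrease the temperature
    return current

random.seed(0)
best = simulated_annealing(0, cost=lambda x: (x - 7) ** 2,
                           neighbour=lambda x: x + random.choice([-1, 1]))
```

As `t` approaches zero the acceptance rule degenerates into plain local search, so most of the running time is spent refining good solutions.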

## Dynamic programming

Dynamic programming implements a recursive algorithm by storing partial results. It is useful when the same values are calculated over and over, and it replaces recursive calls with table lookups.

It can be applied if the problem observes the principle of optimality: a partial solution can be extended regardless of how it was obtained.

Filling in the partial results left to right (bottom-up) improves the efficiency of the algorithm, since every value is ready before it is needed.
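The idea can be sketched with the Fibonacci numbers, a classic toy example: the table is filled left to right and each entry reuses two stored partial results:

```python
def fib_table(n):
    """Bottom-up DP: fill the table left to right, each value computed once."""
    if n < 2:
        return n
    table = [0] * (n + 1)
    table[1] = 1
    for i in range(2, n + 1):
        # each entry extends previously stored partial results
        table[i] = table[i - 1] + table[i - 2]
    return table[n]

assert fib_table(10) == 55
```

The naive recursive version recomputes the same values exponentially many times; the table makes the whole computation linear in `n`.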

## Graphs

A graph is made of a set of vertices and edges, and can have the following properties:

• directed (undirected): direction of edges is important
• weighted (unweighted): each vertex/edge has a weight associated to it
• non simple (simple): has self-loops and multi-edges
• sparse (dense): only a small fraction of the possible edges are present
• cyclic (acyclic): contains cycles
• embedded (topological): vertices and edges are assigned geometric positions
• implicit (explicit): graph is constructed as it is used
• labeled (unlabeled): vertices are assigned a unique label

### Data structures

Graphs can be represented with the following data structures:

• Adjacency matrix

A `V x V` matrix `M` which satisfies the property:

`M[i,j] = 1` if `(i,j)` is an edge of `G` connecting vertex `i` and vertex `j`.

• Adjacency list

An array of linked lists, one per vertex, each storing that vertex's neighbours.
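Both representations can be built from an edge list; a minimal sketch on a small undirected graph (the edges are arbitrary):

```python
# Undirected graph with vertices 0..3 and edges (0,1), (0,2), (1,2), (2,3).
edges = [(0, 1), (0, 2), (1, 2), (2, 3)]
n = 4

# Adjacency matrix: n x n, symmetric for an undirected graph.
matrix = [[0] * n for _ in range(n)]
for i, j in edges:
    matrix[i][j] = matrix[j][i] = 1

# Adjacency list: one list of neighbours per vertex.
adjacency = [[] for _ in range(n)]
for i, j in edges:
    adjacency[i].append(j)
    adjacency[j].append(i)

assert matrix[0][1] == 1 and matrix[1][3] == 0
assert adjacency[2] == [0, 1, 3]
```

The matrix costs `O(V^2)` space but answers "is `(i,j)` an edge?" in constant time; the list costs `O(V + E)` space and suits sparse graphs.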

### Traversal

The basic idea is to classify vertices as:

• undiscovered: initial state;
• discovered: found but no edges checked;
• processed: all edges checked.

Then check edges:

• consider only edges that go to an undiscovered vertex;
• undirected edges considered twice;
• directed edges considered from source vertex only.

#### Breadth-first search

Use a queue: find all adjacent vertices of `v` and repeat for each unprocessed vertex.
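A minimal sketch of the queue-based traversal, using the discovered/processed bookkeeping above (the example graph is arbitrary):

```python
from collections import deque

def bfs(adjacency, start):
    """Breadth-first traversal; returns vertices in discovery order."""
    discovered = {start}
    order = []
    queue = deque([start])
    while queue:
        v = queue.popleft()
        order.append(v)               # v is now processed: all edges checked
        for u in adjacency[v]:
            if u not in discovered:   # only edges to undiscovered vertices
                discovered.add(u)
                queue.append(u)
    return order

graph = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}
assert bfs(graph, 0) == [0, 1, 2, 3]
```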

Some applications:

• find a path;
• find connected components;
• vertex coloring problem.

#### Depth-first search

Use a stack: move to an unprocessed adjacent vertex `u` of `v` or go back if all adjacent vertices of `v` are processed.
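The same traversal skeleton with an explicit stack instead of a queue; a minimal sketch (the example graph is arbitrary):

```python
def dfs(adjacency, start):
    """Depth-first traversal using an explicit stack; returns visit order."""
    discovered = set()
    order = []
    stack = [start]
    while stack:
        v = stack.pop()
        if v in discovered:
            continue
        discovered.add(v)
        order.append(v)
        # push neighbours; the last one pushed is explored first (deepest path)
        for u in reversed(adjacency[v]):
            if u not in discovered:
                stack.append(u)
    return order

graph = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}
assert dfs(graph, 0) == [0, 1, 3, 2]
```

Swapping the stack for a queue turns this back into breadth-first search; the data structure alone decides the visit order.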

Some applications:

• finding cycles in undirected graphs;
• find articulation vertices;
• topological sorting (applies on Directed Acyclic Graphs only; vertices are sorted with edges going from left to right).

### Weighted graph algorithms

#### Minimum spanning tree

A spanning tree is a subset of the graph edges that forms a tree connecting all vertices. Since the edges are weighted, the minimum spanning tree is the one with the smallest total weight.
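Prim's algorithm is one classic way to build a minimum spanning tree: grow the tree from an arbitrary vertex, always adding the cheapest edge that reaches a new vertex. A sketch using a min-heap (the example graph is illustrative):

```python
import heapq

def prim_mst(adjacency, start=0):
    """Prim's algorithm; returns the total weight of the MST."""
    visited = set()
    total = 0
    heap = [(0, start)]  # (edge weight, vertex)
    while heap:
        weight, v = heapq.heappop(heap)
        if v in visited:
            continue      # a cheaper edge already reached v
        visited.add(v)
        total += weight
        for w, u in adjacency[v]:
            if u not in visited:
                heapq.heappush(heap, (w, u))
    return total

# weighted undirected graph: adjacency[v] = [(weight, neighbour), ...]
graph = {
    0: [(1, 1), (4, 2)],
    1: [(1, 0), (2, 2), (6, 3)],
    2: [(4, 0), (2, 1), (3, 3)],
    3: [(6, 1), (3, 2)],
}
assert prim_mst(graph) == 6  # edges (0,1)=1, (1,2)=2, (2,3)=3
```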

#### Shortest path

It's the shortest sequence of edges connecting two vertices.
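Dijkstra's algorithm is the standard method when edge weights are non-negative; a single-source sketch with a min-heap (the example graph is illustrative):

```python
import heapq

def dijkstra(adjacency, source):
    """Shortest-path distances from `source`; weights must be non-negative."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, v = heapq.heappop(heap)
        if d > dist.get(v, float("inf")):
            continue  # stale entry: v was already settled via a shorter path
        for w, u in adjacency[v]:
            nd = d + w
            if nd < dist.get(u, float("inf")):
                dist[u] = nd
                heapq.heappush(heap, (nd, u))
    return dist

# weighted directed graph: adjacency[v] = [(weight, neighbour), ...]
graph = {
    0: [(1, 1), (4, 2)],
    1: [(2, 2), (6, 3)],
    2: [(3, 3)],
    3: [],
}
assert dijkstra(graph, 0) == {0: 0, 1: 1, 2: 3, 3: 6}
```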

## Sorting and Searching

### Divide and Conquer

Split the problem into two smaller subproblems of the same type, solve the subproblems recursively, and join the results if needed.
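Merge sort is the textbook instance of this scheme: split in half, sort each half recursively, then join by merging. A minimal sketch:

```python
def merge_sort(items):
    """Divide and conquer: split, sort each half recursively, merge results."""
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    # join step: merge two sorted halves in linear time
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]

assert merge_sort([5, 2, 8, 1, 9, 3]) == [1, 2, 3, 5, 8, 9]
```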