UW-Madison
Computer Sciences Dept.

International Collegiate Programming Contest

Basic Graph Algorithms

This session focuses on graphs and basic graph algorithms. If you are not familiar with the relevant concepts and algorithms, or you need a refresher, you can use the resources below.

Problem Set (#1)

  1. Vertex
  2. Poor Trade Advisor
  3. Dark Roads
  4. Robot
  5. Wormholes
  6. The Orc Attack
  7. Highest Paid Toll
  8. Challenge Problem (I will be very impressed if you are able to solve this!): Routing

Note: All problems expect I/O to be done using standard input and standard output. The solution to each problem must be written in a single file. If you are programming in Java, the application class should be called "Main" and should be in the default package.

Resources

The graphs here are collections of vertices, joined by (possibly directed, possibly weighted) edges. For notions like neighborhoods, paths, cycles, connectedness, trees, and so on, the Glossary of graph theory on Wikipedia is a good source of information. This Wikipedia page has some information on the representation of graphs. The basic algorithmic procedures that are most relevant for the ICPC include:

  • graph traversal,
  • finding connected components,
  • finding shortest paths, and
  • finding minimum spanning trees.

Graph Traversal

There are two basic types of searches in a graph: breadth-first search (BFS) and depth-first search (DFS). Both produce a spanning tree (or forest, if the graph is disconnected) of an undirected graph.
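
As a concrete reference, here is a minimal sketch of both traversals on a small undirected graph stored as adjacency lists. The class name and the hard-coded example graph are illustrative only; in an actual submission the class would be called Main and the graph would be read from standard input, as noted above. The BFS additionally records each vertex's distance (in edges) from the start, which becomes useful again in the Shortest Paths section.

    import java.util.*;

    public class Traversal {
        // Adjacency list: adj.get(u) holds the neighbors of vertex u.
        static List<List<Integer>> adj;

        // Breadth-first search from s: visits vertices in order of distance from s
        // and records that distance (number of edges) in dist[]; -1 means unreachable.
        static int[] bfs(int s, int n) {
            int[] dist = new int[n];
            Arrays.fill(dist, -1);
            Deque<Integer> queue = new ArrayDeque<>();
            dist[s] = 0;
            queue.add(s);
            while (!queue.isEmpty()) {
                int u = queue.poll();
                for (int v : adj.get(u)) {
                    if (dist[v] == -1) {        // first time we reach v
                        dist[v] = dist[u] + 1;
                        queue.add(v);
                    }
                }
            }
            return dist;
        }

        // Depth-first search from u, marking every vertex reachable from u.
        static void dfs(int u, boolean[] visited) {
            visited[u] = true;
            for (int v : adj.get(u)) {
                if (!visited[v]) dfs(v, visited);
            }
        }

        public static void main(String[] args) {
            int n = 5;                           // tiny hard-coded example graph
            adj = new ArrayList<>();
            for (int i = 0; i < n; i++) adj.add(new ArrayList<>());
            int[][] edges = {{0, 1}, {1, 2}, {3, 4}};
            for (int[] e : edges) {              // undirected: add both directions
                adj.get(e[0]).add(e[1]);
                adj.get(e[1]).add(e[0]);
            }
            System.out.println(Arrays.toString(bfs(0, n)));   // [0, 1, 2, -1, -1]
            boolean[] visited = new boolean[n];
            dfs(0, visited);
            System.out.println(Arrays.toString(visited));     // [true, true, true, false, false]
        }
    }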

Connectivity

One important notion for solving problems involving graphs is that of connected components. Connected components can be computed using depth-first or breadth-first search. It might be necessary to keep track of additional information, such as the number of vertices in the subtrees of the BFS or DFS trees. Sometimes one has to maintain connectivity information as the graph is generated dynamically, with edges arriving one at a time; this is what the union-find (disjoint-set) data structure is for.
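
When edges do arrive one at a time, the following union-find sketch is the usual tool; the overall shape (path compression plus union by size) is standard, but the names and the tiny example in main are made up. The same structure is reused for Kruskal's algorithm further down.

    import java.util.*;

    public class UnionFind {
        int[] parent, size;
        int components;                          // current number of connected components

        UnionFind(int n) {
            parent = new int[n];
            size = new int[n];
            components = n;                      // initially every vertex is its own component
            for (int i = 0; i < n; i++) { parent[i] = i; size[i] = 1; }
        }

        // Find the representative of x's component, compressing the path along the way.
        int find(int x) {
            while (parent[x] != x) {
                parent[x] = parent[parent[x]];   // path halving
                x = parent[x];
            }
            return x;
        }

        // Merge the components of a and b; returns false if they were already the same.
        boolean union(int a, int b) {
            a = find(a); b = find(b);
            if (a == b) return false;
            if (size[a] < size[b]) { int t = a; a = b; b = t; }   // union by size
            parent[b] = a;
            size[a] += size[b];
            components--;
            return true;
        }

        public static void main(String[] args) {
            UnionFind uf = new UnionFind(5);
            uf.union(0, 1);
            uf.union(1, 2);
            System.out.println(uf.components);             // 3: {0,1,2}, {3}, {4}
            System.out.println(uf.find(0) == uf.find(2));  // true: same component
        }
    }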

Shortest Paths

Often, a graph is weighted, with each edge having a real number (often positive) as some sort of cost associated with it. The classic example is of a network of roads joining cities, with the weight of each edge being the length of the corresponding road. A natural problem is to find a shortest path between two vertices on the graph. There are a number of algorithms for finding shortest paths. Three important ones are Dijkstra's algorithm, Bellman-Ford, and Floyd-Warshall.

  • Dijkstra and Bellman-Ford

    Dijkstra's algorithm finds, for a given source vertex, the length of a shortest path from it to each other vertex in the graph, provided that all edge weights are nonnegative. The Wikipedia page for Dijkstra's Algorithm explains the basic algorithm. A naive implementation of Dijkstra's algorithm runs in O(|V|^2) time, where |V| is the number of vertices. More sophisticated implementations (using a priority queue) give a runtime of O(|E| log |V|) or even O(|E| + |V| log |V|), where |E| is the number of edges, which is significantly better if the graph is sparse (has far fewer than |V|^2 edges). Note that Dijkstra's algorithm can fail if the graph has edges with negative weights. In that case, a different algorithm, such as Bellman-Ford, can be used.

    We recommend memorizing Dijkstra's algorithm (you will need to for the placement test). For Bellman-Ford, an implementation can be found here. Sketches of both appear after this list.

  • Floyd-Warshall

    The Floyd-Warshall algorithm computes more information: shortest paths between every pair of vertices in the graph. Of course, this comes at the cost of running time -- the Floyd-Warshall algorithm runs in O(|V|^3) time. The Wikipedia article for Floyd-Warshall has more information.

    The 3-line Floyd-Warshall algorithm should be memorized and used in place of Dijkstra when the input is small (at most 500 vertices or so); a sketch appears after this list. To recover actual paths with Floyd-Warshall, the code is more complicated, but it can be found here.
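
Here is a minimal sketch of the priority-queue version of Dijkstra's algorithm described above; the variable names and the small graph in main are illustrative only. A prev[] array is maintained so that an actual shortest path, and not just its length, can be recovered (see the note on path reconstruction below).

    import java.util.*;

    public class Dijkstra {
        // Shortest distances from source s in a graph given as adjacency lists of
        // {neighbor, weight} pairs; all weights must be nonnegative.
        // prev[v] is set to the predecessor of v on a shortest path from s (-1 if none).
        static long[] dijkstra(List<long[]>[] adj, int s, int[] prev) {
            int n = adj.length;
            long[] dist = new long[n];
            Arrays.fill(dist, Long.MAX_VALUE);
            Arrays.fill(prev, -1);
            dist[s] = 0;
            // Priority queue of {distance, vertex}, smallest distance first.
            PriorityQueue<long[]> pq = new PriorityQueue<>(Comparator.comparingLong(a -> a[0]));
            pq.add(new long[]{0, s});
            while (!pq.isEmpty()) {
                long[] top = pq.poll();
                long d = top[0];
                int u = (int) top[1];
                if (d > dist[u]) continue;           // stale queue entry, skip it
                for (long[] edge : adj[u]) {
                    int v = (int) edge[0];
                    long w = edge[1];
                    if (dist[u] + w < dist[v]) {     // found a shorter path to v
                        dist[v] = dist[u] + w;
                        prev[v] = u;
                        pq.add(new long[]{dist[v], v});
                    }
                }
            }
            return dist;
        }

        public static void main(String[] args) {
            int n = 4;
            List<long[]>[] adj = new List[n];        // unchecked warning, fine for contest code
            for (int i = 0; i < n; i++) adj[i] = new ArrayList<>();
            // Directed weighted edges u -> v with weight w, hard-coded for illustration.
            adj[0].add(new long[]{1, 5});
            adj[0].add(new long[]{2, 2});
            adj[2].add(new long[]{1, 1});
            adj[1].add(new long[]{3, 3});
            int[] prev = new int[n];
            System.out.println(Arrays.toString(dijkstra(adj, 0, prev)));   // [0, 3, 2, 6]
        }
    }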
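
For comparison, a Bellman-Ford sketch follows (this is not necessarily the implementation the link above points to): relax every edge |V|-1 times, then make one extra pass to detect a negative cycle reachable from the source. The edge-list format and names are again just one possible choice.

    import java.util.*;

    public class BellmanFord {
        // Edges are {u, v, w} triples: a directed edge u -> v of weight w (possibly negative).
        // Returns shortest distances from s, or null if a negative cycle is reachable from s.
        static long[] bellmanFord(int n, long[][] edges, int s) {
            final long INF = Long.MAX_VALUE / 4;     // large sentinel that cannot overflow when added to
            long[] dist = new long[n];
            Arrays.fill(dist, INF);
            dist[s] = 0;
            // Relax every edge n-1 times; after that every shortest path has been found.
            for (int i = 0; i < n - 1; i++) {
                for (long[] e : edges) {
                    int u = (int) e[0], v = (int) e[1];
                    long w = e[2];
                    if (dist[u] < INF && dist[u] + w < dist[v]) dist[v] = dist[u] + w;
                }
            }
            // One more pass: if anything still improves, a negative cycle is reachable.
            for (long[] e : edges) {
                int u = (int) e[0], v = (int) e[1];
                long w = e[2];
                if (dist[u] < INF && dist[u] + w < dist[v]) return null;
            }
            return dist;
        }

        public static void main(String[] args) {
            long[][] edges = {{0, 1, 4}, {0, 2, 5}, {1, 2, -2}};
            System.out.println(Arrays.toString(bellmanFord(3, edges, 0)));   // [0, 4, 2]
        }
    }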
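
Finally, a sketch of the three-loop Floyd-Warshall computation itself; the distance matrix is hard-coded for illustration, with a large sentinel value standing in for "no edge".

    import java.util.*;

    public class FloydWarshall {
        public static void main(String[] args) {
            final long INF = Long.MAX_VALUE / 4;   // "no edge"; large, but safe to add to itself
            int n = 3;
            long[][] dist = {                      // dist[u][v] = weight of the edge u -> v, 0 on the diagonal
                {0,   4,   INF},
                {INF, 0,   1},
                {2,   INF, 0},
            };
            // After the triple loop, dist[u][v] is the length of a shortest path from u to v.
            for (int k = 0; k < n; k++)
                for (int i = 0; i < n; i++)
                    for (int j = 0; j < n; j++)
                        if (dist[i][k] + dist[k][j] < dist[i][j])
                            dist[i][j] = dist[i][k] + dist[k][j];
            for (long[] row : dist) System.out.println(Arrays.toString(row));
            // Prints [0, 4, 5], [3, 0, 1], [2, 6, 0].
        }
    }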

Two observations. One often-useful special case is when all the edge weights are equal to 1. In this case, there is an even easier and faster algorithm -- breadth-first search, which runs in linear time! Just keep track of how far you are from your starting vertex when you first reach each vertex. Another important observation is the following: the simplest versions of these algorithms only find the length of a shortest path. If you want an actual shortest path and not just its length, you need to keep track of enough information to allow you to extract it! Fortunately, this is not difficult to do and doesn't significantly increase the running time of the algorithms.
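
For example, given a predecessor array like the prev[] kept in the Dijkstra sketch above (prev[v] is the vertex from which v was reached, or -1 for the source and for unreached vertices), an actual path can be recovered by walking backwards from the target. The array in main is made up purely to show the idea.

    import java.util.*;

    public class RecoverPath {
        // Walk predecessor pointers from t back to the source s and reverse the result.
        // Returns an empty list if t was never reached.
        static List<Integer> pathTo(int[] prev, int s, int t) {
            List<Integer> path = new ArrayList<>();
            if (t != s && prev[t] == -1) return path;     // t is unreachable from s
            for (int v = t; v != -1; v = prev[v]) path.add(v);
            Collections.reverse(path);
            return path;
        }

        public static void main(String[] args) {
            // Hypothetical prev[] from a shortest-path run with source 0:
            // vertex 2 was reached from 0, vertex 1 from 2, vertex 3 from 1.
            int[] prev = {-1, 2, 0, 1};
            System.out.println(pathTo(prev, 0, 3));       // [0, 2, 1, 3]
        }
    }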

Minimum Spanning Trees

Another important notion is that of a minimum spanning tree. If a graph is connected, there is guaranteed to be at least one tree which is a subgraph of the original graph and contains every vertex; this is a spanning tree. In a weighted graph, a minimum spanning tree is a spanning tree with the minimum possible (over all spanning trees) sum of edge weights.

  • Prim

    Prim's algorithm builds a minimum spanning tree by starting with an arbitrary vertex and greedily connecting, at each step, the cheapest new vertex to the tree grown so far. The resulting code is very similar to that for Dijkstra's algorithm.

  • Kruskal

    Kruskal's algorithm iterates over the edges from smallest to largest weight, and includes an edge unless it would create a cycle. The subgraph may only become connected at the very end. In order to implement Kruskal efficiently, one uses an efficient union-find data structure to keep track of the connected components of the subgraph constructed thus far; a sketch appears below.

Both Prim and Kruskal run in the same time complexity, O(|E| log |V|), so it suffices to memorize and use one of them. The greedy nature of MST construction is another notion worth recognizing, as it may help in problem variants.
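
Here is a minimal Kruskal sketch; it folds in a compact version of the union-find structure from the Connectivity section so that the example is self-contained. The edge format and the small graph in main are illustrative only, and the graph is assumed to be connected.

    import java.util.*;

    public class Kruskal {
        // Tiny union-find with path compression, as sketched earlier.
        static int[] parent;
        static int find(int x) {
            while (parent[x] != x) { parent[x] = parent[parent[x]]; x = parent[x]; }
            return x;
        }

        // Edges are {w, u, v} triples; returns the total weight of a minimum spanning
        // tree of the (assumed connected) graph on vertices 0..n-1.
        static long kruskal(int n, long[][] edges) {
            parent = new int[n];
            for (int i = 0; i < n; i++) parent[i] = i;
            Arrays.sort(edges, Comparator.comparingLong(e -> e[0]));   // smallest weight first
            long total = 0;
            for (long[] e : edges) {
                int u = find((int) e[1]), v = find((int) e[2]);
                if (u != v) {          // edge joins two different components: keep it
                    parent[u] = v;
                    total += e[0];
                }                      // otherwise it would create a cycle: skip it
            }
            return total;
        }

        public static void main(String[] args) {
            long[][] edges = {{1, 0, 1}, {2, 1, 2}, {3, 0, 2}, {4, 2, 3}};
            System.out.println(kruskal(4, edges));   // keeps weights 1, 2, 4 and prints 7
        }
    }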

Searching State Spaces

Imagine you are playing a game. At any point the game has some state, and some moves you could follow to go to one of a number of different states. You need to know whether, or how fast, you can get from a given start state to a given end state. How do you figure this out?

In general this is a difficult problem, and unless your game has some very nice properties that allow you to greatly simplify it, there is not much else you can do besides "brute force". The (intelligent) implementation of such a brute force tactic often takes the form of a breadth-first search on the space of states of the game.

You can imagine the states of the game being the vertices of a graph, with directed edges corresponding to moves that take you from one state to another. Often this graph will be very large -- so large that computing all the vertices and edges in the graph is prohibitively expensive -- but you are still interested in trying to find paths through the graph. The general approach is to search, breadth-first, along the edges of the graph, finding edges dynamically by computing how applying each move would change the current state. Since it is often too difficult to keep all possible states in memory, instead one can store encodings of all the previously-visited states in a self-balancing tree (or a hash set) in order to relatively quickly check whether a state has been visited before, so as to avoid wasting time repeating already-completed searching.
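
As a toy illustration (the puzzle itself is invented for this example), suppose a state is a single positive integer, a move either doubles it or subtracts one, and the question is the minimum number of moves from a start value to a target. A breadth-first search over states, generating moves on the fly and remembering visited states in a set, looks like the sketch below. A HashSet is used for simplicity; a TreeSet (a self-balancing tree, as described above) would work the same way, and for more complicated games the state would be encoded as, say, a string before being stored.

    import java.util.*;

    public class StateSearch {
        // Minimum number of moves to turn start into target, where a move is
        // "double the number" or "subtract one".
        static int minMoves(int start, int target) {
            int bound = 2 * Math.max(start, target);      // crude cap to keep this toy search finite
            Set<Integer> visited = new HashSet<>();       // states already queued
            Deque<int[]> queue = new ArrayDeque<>();      // {state, moves made so far}
            visited.add(start);
            queue.add(new int[]{start, 0});
            while (!queue.isEmpty()) {
                int[] cur = queue.poll();
                int state = cur[0], moves = cur[1];
                if (state == target) return moves;
                // Generate neighboring states on the fly instead of building the whole graph.
                int[] next = {state * 2, state - 1};
                for (int s : next) {
                    if (s > 0 && s <= bound && visited.add(s)) {
                        queue.add(new int[]{s, moves + 1});
                    }
                }
            }
            return -1;                                    // not reachable within the cap
        }

        public static void main(String[] args) {
            System.out.println(minMoves(3, 10));   // 3 -> 6 -> 5 -> 10, so 3 moves
        }
    }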

 