Floyd/Warshall algorithm to compute the min cost path between any two vertices.

n is the number of vertices. C is the cost matrix, where C[i, j] is the cost of
taking the edge from vertex i to vertex j; if there is no edge, the cost is
infinite. Costs are always non-negative, and C[i, i] = 0.

M[l, i, j] is the min cost of going from i to j using l or fewer edges. Note that:

a) M[0, i, i] = 0, and for i != j, M[0, i, j] = infinity
b) M[1, i, j] = C[i, j]
c) for l >= n, M[l-1, i, j] = M[l, i, j], since with non-negative costs a min
   cost path never needs more than n-1 edges (it never has to revisit a vertex).

So M[n, i, j] is the min cost of a path from vertex i to vertex j in a graph
with n vertices. The relationship between M for increasing l is as follows:

    M[l+1, i, j] = min over k of M[l, i, k] + C[k, j]

Computing this in the naive way is O(n^4) and needs two n x n matrices:

    for l in range(1, n):
        for i in range(n):
            for j in range(n):
                M[l+1, i, j] = min over k of M[l, i, k] + C[k, j]

Note C[k, j] = M[1, k, j], so we can rewrite the min as:

    M[l+1, i, j] = min over k of M[l, i, k] + M[1, k, j]

This suggests that M is computed using a kind of matrix multiplication where,
instead of the usual (+, *) inner product, we use the (min, +) inner product:

    M[n] = M[1] x M[1] x ... x M[1]    (n factors)

So we can use the repeated-squaring trick, the same one used for modular
exponentiation in the Diffie-Hellman key exchange, to compute M[n] by about
log2(n) squarings of M[1]:

    l = 1
    while l < n:
        for i in range(n):
            for j in range(n):
                M[2l, i, j] = min over k of M[l, i, k] + M[l, k, j]
        l = 2 * l

Each squaring doubles l, and once l >= n, note (c) tells us the matrix has
converged to M[n]. The time complexity is O(n^3 log n).

This is still not particularly good. Consider a graph of 1000 vertices:
n^3 log2(n) is on the order of 10^10 operations.
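
To make the two versions concrete, here is a minimal runnable sketch in Python.
It is an illustrative implementation of the scheme above, not code from the
original text: the names min_plus_product, apsp_naive, and apsp_by_squaring,
and the list-of-lists matrix format with math.inf for "no edge", are all
assumptions made for this example.

    import math

    INF = math.inf  # cost of a missing edge

    def min_plus_product(A, B):
        # (min, +) matrix product: R[i][j] = min over k of A[i][k] + B[k][j].
        n = len(A)
        R = [[INF] * n for _ in range(n)]
        for i in range(n):
            for j in range(n):
                best = INF
                for k in range(n):
                    cand = A[i][k] + B[k][j]
                    if cand < best:
                        best = cand
                R[i][j] = best
        return R

    def apsp_naive(C):
        # M[l+1] = M[l] (min,+) C, applied n-1 times to reach M[n]: O(n^4).
        M = C
        for _ in range(len(C) - 1):
            M = min_plus_product(M, C)
        return M

    def apsp_by_squaring(C):
        # Repeated squaring: each pass doubles the edge budget l, so about
        # log2(n) passes reach l >= n-1, where M has converged by note (c).
        # O(n^3 log n) total.
        n = len(C)
        M = C  # M[1]
        l = 1
        while l < n - 1:
            M = min_plus_product(M, M)  # M[2l] from M[l]
            l *= 2
        return M

    # Example: 4 vertices, C[i][j] = INF means no edge, C[i][i] = 0.
    C = [[0,   3,   INF, 7  ],
         [8,   0,   2,   INF],
         [5,   INF, 0,   1  ],
         [2,   INF, INF, 0  ]]
    assert apsp_naive(C) == apsp_by_squaring(C)
    print(apsp_by_squaring(C)[0][2])  # 5: the path 0 -> 1 -> 2 costs 3 + 2

The assert cross-checks the O(n^4) and O(n^3 log n) versions against each
other; they agree because by note (c) both have converged to the min cost
matrix once the edge budget reaches n-1.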