Define A[k,i,j] = the cost of the cheapest path from vertex i to j using only vertices 0 up to k as intermediates. Then A[n-1,i,j] = min cost of a path from vertex i to j.

Consider A[k,i,j]. The least-cost path between i and j either includes k or it doesn't. If it includes k, then A[k,i,j] = A[k-1,i,k] + A[k-1,k,j]. If it does not include k, then A[k,i,j] = A[k-1,i,j]. So,

    A[k,i,j] = min( A[k-1,i,j], A[k-1,i,k] + A[k-1,k,j] )

                       k
                  ~  ~   ~  ~
       A[k-1,i,k]            A[k-1,k,j]
             ~                   ~
            i ~~~~~~~~~~~~~~~~~ j
                  A[k-1,i,j]

We can define the base case A[-1,i,j] = C[i,j], that is, the cost of going directly from i to j (i.e. not passing through any intermediate vertex).

Now, unlike the previous algorithm, where one needs two matrices to store the current best cost for paths of length l edges and the new updated cost for paths of length l+1, this requires only one matrix A. The key is observing that A[k,i,j] <= A[k-1,i,j]; that is, the cost can only get lower as you increase k. Also, A[k,i,k] = A[k-1,i,k] and A[k,k,j] = A[k-1,k,j]: the kth version of A does not change entries where i=k or j=k. So, rather than iterating over k in the inner loop, we move k to the outer loop and admit one new intermediate vertex (k) on each pass over the cost matrix. From the diagram above, for each new k we have

    A[k,i,j] = min( A[k-1,i,j], A[k-1,i,k] + A[k-1,k,j] )

This expression behaves nicely when i=k or j=k, since A[k-1,i,i] = 0 and A[k-1,k,k] = 0: in those cases the min leaves the entry unchanged. This means that updating the [i,j] entry is always independent of the current values of the [i,k] and [k,j] entries, so you can actually do the update in place. So our loops go:

    for k in range(n):
        for i in range(n):
            for j in range(n):
                A[i,j] = min( A[i,j], A[i,k] + A[k,j] )

which takes O(n^3) time and O(n^2) space. This is still not a big improvement.
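To make the loops above concrete, here is a minimal runnable sketch of the in-place update. The representation is an assumption: the direct-cost matrix C is a list of lists with math.inf for missing edges and C[i][i] = 0, and the function name floyd_warshall is just illustrative.

```python
import math

def floyd_warshall(C):
    """All-pairs shortest path costs via the recurrence above.

    C[i][j] is the direct edge cost from i to j, math.inf if there is
    no edge, and C[i][i] = 0. Assumes no negative cycles. Returns a
    matrix A with A[i][j] = min cost of a path from i to j.
    """
    n = len(C)
    # Base case A[-1,i,j] = C[i,j]; copy so the input is not modified.
    A = [row[:] for row in C]
    for k in range(n):          # admit vertex k as an intermediate
        for i in range(n):
            for j in range(n):
                # Best path through k vs. best path avoiding k.
                # Safe in place: row k and column k do not change
                # during pass k, since A[k][k] = 0.
                if A[i][k] + A[k][j] < A[i][j]:
                    A[i][j] = A[i][k] + A[k][j]
    return A

inf = math.inf
C = [[0,   3,   inf, 7],
     [8,   0,   2,   inf],
     [5,   inf, 0,   1],
     [2,   inf, inf, 0]]
A = floyd_warshall(C)
```

On this example graph, A[0][3] comes out to 6 (the path 0 -> 1 -> 2 -> 3, cost 3 + 2 + 1), beating the direct edge of cost 7.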