I just finished working on another implementation of Kruskal’s algorithm. This version depends more on user input to generate the minimum spanning tree.


# Kruskal’s Algorithm

I’ve just written a script that executes Kruskal’s algorithm on a randomly generated graph.

Given a weighted graph, we are often interested in finding a minimum spanning tree (MST) for that graph. MSTs have several applications in areas like transportation and the network simplex method. We already discussed Prim’s algorithm; another method for generating minimum spanning trees is Kruskal’s algorithm. A spanning tree is a subset of the edges of a graph that connects every vertex but contains no cycles. This spanning tree is called a minimum spanning tree if, in addition, the sum of the weights of the edges included in this tree is less than or equal to the sum of the weights of the edges of any other spanning tree for this graph.

Kruskal’s algorithm works by the following procedure.

1. Initially each vertex is a stand-alone tree, so for each *v* in *V* we define the tree T_{v} = {*v*}. The set of selected edges *E** is initially empty.

2. Find the edge *e* = (*u*, *v*) of minimum weight such that *u* and *v* belong to different trees. If no such edge exists, go to step 6.

3. Merge the trees T_{lookup(u)} and T_{lookup(v)}, where lookup(*x*) gives the tree that currently contains vertex *x*.

4. Add the edge *e* to *E** and remove the edge *e* from the graph.

5. If the size of *E** is less than *n* – 1, go to step 2; else go to step 7.

6. If you reached this step, then the graph is not connected.

7. If you reached this step, then *E** is the edge set of a minimum spanning tree.
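The steps above can be sketched in Python. The lookup and merge bookkeeping in steps 1 and 3 is typically done with a union-find (disjoint-set) structure; this is a minimal sketch of the procedure, not the script from the post:

```python
def kruskal(n, edges):
    """Kruskal's algorithm.

    n: number of vertices, labeled 0..n-1
    edges: list of (weight, u, v) tuples
    Returns (mst_edges, total_cost), or None if the graph is not connected.
    """
    parent = list(range(n))  # step 1: each vertex is a stand-alone tree

    def find(x):
        """Look up the root of the tree a vertex currently belongs to."""
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x

    mst, total = [], 0
    for w, u, v in sorted(edges):      # step 2: cheapest edge first
        ru, rv = find(u), find(v)
        if ru != rv:                   # endpoints are in different trees
            parent[ru] = rv            # step 3: merge the two trees
            mst.append((u, v))         # step 4: add e to E*
            total += w
            if len(mst) == n - 1:      # step 5: spanning tree complete
                return mst, total
    return None                        # step 6: graph is not connected
```

Sorting the edge list once up front replaces the repeated "find the cheapest remaining edge" scan in step 2 without changing which edges are chosen.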

For example, consider the graph represented by the following adjacency matrix:

|  | 0 | 1 | 2 | 3 | 4 |
| --- | --- | --- | --- | --- | --- |
| 0 | – | – | – | 13 | 12 |
| 1 | – | – | – | – | 16 |
| 2 | – | – | – | – | 24 |
| 3 | 13 | – | – | – | 13 |
| 4 | 12 | 16 | 24 | 13 | – |

Initially we have 5 distinct trees and E* = {}:

T_{0} = {0}

T_{1} = {1}

T_{2} = {2}

T_{3} = {3}

T_{4} = {4}.

The first step of Kruskal’s algorithm says to find the cheapest edge such that its two endpoints belong to different trees. This will be the edge (0, 4) with a cost of 12. So E* = {(0, 4)}. We then merge the two trees so that our trees are now:

T_{0} = {0, 4}

T_{1} = {1}

T_{2} = {2}

T_{3} = {3}

Again, we look for the cheapest edge whose endpoints are in different trees. There are two edges with a cost of 13 (either (0, 3) or (3, 4)), so we will arbitrarily choose (0, 3) and add it to our tree. So E* = {(0, 4), (0, 3)}. We again merge the associated trees, which results in the following trees:

T_{0} = {0, 3, 4}

T_{1} = {1}

T_{2} = {2}

The cheapest edge that has endpoints in distinct trees will be the edge (1, 4) with a cost of 16. We add this edge to our tree. So E* = {(0, 4), (0, 3), (1, 4)}. Once we merge the associated trees we have the following:

T_{0} = {0, 1, 3, 4}

T_{2} = {2}

The cheapest remaining edge that has endpoints in distinct trees will be the edge (2, 4) with a cost of 24. This makes E* = {(0, 4), (0, 3), (1, 4), (2, 4)}. We merge the associated trees and arrive at:

T_{0} = {0, 1, 2, 3, 4}

Because T_{0} contains all the nodes in the graph it is a spanning tree. Its total cost is 12 + 13 + 16 + 24 = 65.
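The walk-through above can be checked mechanically. This short sketch (mine, assuming the same 5×5 matrix with `None` marking a missing edge) extracts the edge list, sorts it, and greedily takes cheap edges whose endpoints lie in different trees:

```python
# adjacency matrix from the example; None marks a missing edge
matrix = [
    [None, None, None, 13, 12],
    [None, None, None, None, 16],
    [None, None, None, None, 24],
    [13, None, None, None, 13],
    [12, 16, 24, 13, None],
]
n = 5
# collect each undirected edge once (upper triangle of the matrix)
edges = sorted((matrix[u][v], u, v)
               for u in range(n) for v in range(u + 1, n)
               if matrix[u][v] is not None)

parent = list(range(n))  # union-find: every vertex starts as its own tree

def find(x):
    while parent[x] != x:
        x = parent[x]
    return x

mst, cost = [], 0
for w, u, v in edges:            # cheapest edge first
    ru, rv = find(u), find(v)
    if ru != rv:                 # endpoints lie in different trees
        parent[ru] = rv          # merge the trees
        mst.append((u, v))
        cost += w
print(mst, cost)  # [(0, 4), (0, 3), (1, 4), (2, 4)] 65
```

Because Python sorts the `(weight, u, v)` tuples lexicographically, the tie between (0, 3) and (3, 4) at weight 13 is broken in favor of (0, 3), matching the arbitrary choice made in the text.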

To learn more and see more examples, view Kruskal’s Algorithm at LEARNINGlover.com

# Prim’s Algorithm

I have just written a script that executes Prim’s Algorithm that finds the minimum spanning tree on a randomly generated graph.

Given a weighted graph, we are often interested in finding a minimum spanning tree (MST) for that graph. This has many applications, including the very important network simplex method. Prim’s algorithm is a greedy method that finds this MST. A spanning tree is a subset of the edges of a graph that connects every vertex but contains no cycles. This spanning tree is called a minimum spanning tree if, in addition, the sum of the weights of the edges included in this tree is less than or equal to the sum of the weights of the edges of any other spanning tree for this graph.

Prim’s algorithm works by the following procedure.

1. Let *Tree_{V}* be the set of vertices included in the tree, and *Tree_{E}* be the set of edges included in the tree. Initially *Tree_{V}* and *Tree_{E}* are empty.

2. Add an arbitrary vertex to *Tree_{V}* (*Tree_{E}* is still empty).

3. Find the edge *e* of minimum weight such that one vertex is in *Tree_{V}* and the other vertex is not in *Tree_{V}*. Add the associated vertex to *Tree_{V}*, and add *e* to *Tree_{E}*.

4. If no edge was found in step 3, go to step 5; else go to step 6.

5. If the number of vertices in *Tree_{V}* is less than the number of vertices in the original graph, then the graph is not connected and thus does not contain a minimum spanning tree. Go to step 8.

6. If the number of vertices in *Tree_{V}* is less than the number of vertices in the original graph, go to step 3; else go to step 7.

7. Output “The Minimum Spanning Tree is ”, *Tree_{E}*.

8. Output “This graph does not have a minimum spanning tree because it is not connected.”

For example, consider the graph represented by the following adjacency matrix:

|  | 0 | 1 | 2 | 3 | 4 |
| --- | --- | --- | --- | --- | --- |
| 0 | – | – | – | 13 | 12 |
| 1 | – | – | – | – | 16 |
| 2 | – | – | – | – | 24 |
| 3 | 13 | – | – | – | 13 |
| 4 | 12 | 16 | 24 | 13 | – |

Initially our tree T_{v} is empty. The first step says to choose an arbitrary vertex and add it to the tree, so let’s choose vertex 2.

Iteration 1: Now our tree contains the vertex 2 (i.e. T_{v} = {2}), and T_{E} contains the candidate edges leaving T_{v}, that is, the edges with exactly one endpoint in T_{v}. Thus T_{E} = {(2, 4)}.

We want to choose the cheapest edge in T_{E}. Since T_{E} contains only one edge, we select it; this edge has a cost of 24.

Iteration 2: Our tree now contains the vertices 2 and 4 (i.e. T_{v} = {2, 4}), and T_{E} again contains the candidate edges leaving T_{v}. Thus T_{E} = {(0, 4), (1, 4), (3, 4)}.

Again, we want to choose the cheapest edge that has one endpoint in T_{v} and one endpoint not in T_{v}. This will be the edge (0, 4) which has a cost of 12.

Iteration 3: Now T_{v} = {0, 2, 4} and T_{E} = {(1, 4), (3, 4), (0, 3)}. The cheapest of these three edges is the edge (0, 3) with a cost of 13, which means we will add it to our tree.

Iteration 4: Now T_{v} = {0, 2, 3, 4} and T_{E} = {(1, 4)}. Since (1, 4) is the only edge connected to our tree we add it and it has a cost of 16.

Iteration 5: Now T_{v} = {0, 1, 2, 3, 4} and T_{E} = {}. Because our tree contains all the vertices of the graph, it is now a spanning tree. The cost of this spanning tree is 24 + 12 + 13 + 16 = 65.
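The five iterations above can be traced mechanically. In this sketch (mine, not the post’s script), T_{v} is the set of tree vertices and the frontier set T_{E} is recomputed each round as the edges with exactly one endpoint in T_{v}, exactly as the iterations list them:

```python
matrix = [
    [None, None, None, 13, 12],
    [None, None, None, None, 16],
    [None, None, None, None, 24],
    [13, None, None, None, 13],
    [12, 16, 24, 13, None],
]
n = 5
tv = {2}                # start from vertex 2, as in the example
cost = 0
while len(tv) < n:      # assumes the graph is connected
    # T_E: edges with exactly one endpoint in the tree
    frontier = [(matrix[u][v], u, v)
                for u in tv for v in range(n)
                if v not in tv and matrix[u][v] is not None]
    w, u, v = min(frontier)   # cheapest frontier edge
    print(f"add ({u}, {v}) with cost {w}")
    tv.add(v)
    cost += w
print(sorted(tv), cost)  # [0, 1, 2, 3, 4] 65
```

Tuple comparison in `min` breaks the weight-13 tie in iteration 3 in favor of (0, 3), matching the walk-through.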

To learn more and see more examples, view Prim’s Algorithm at LEARNINGlover.com