
Graph Theory: Splitting a Graph

I have this problem: given a graph of n nodes, I want to split it into two subgraphs of x nodes and n-x nodes so that the number of remaining edges is maximized (equivalently, so that the number of edges cut is minimized).

Not sure if that makes sense. I'm not a graph theory person, but this is the abstract version of my problem. What algorithms should I look at that might help me?

This is NOT a homework problem. Interesting problem though I think!

I plan on implementing in C.

asked Apr 17 '12 by JoshDG


2 Answers

The special case where x = n - x is called the minimum bisection problem and is NP-hard, which makes your problem NP-hard as well. There are several heuristics described in the Wikipedia article on graph partitioning, including local search (e.g., start with a random cut and repeatedly swap pairs of vertices across it whenever doing so decreases the size of the cut) and spectral methods (e.g., compute and threshold the second eigenvector of the graph Laplacian). If n is small, integer programming is also a possibility.
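To make the local-search idea concrete, here is a minimal C sketch, under the assumption that the graph fits in an adjacency matrix; the 6-node cycle, the part size, and the names (`cut_size`, `improve_once`) are my own illustration, not code from the answer. Swapping only cross-partition pairs keeps both part sizes fixed, so the x / n-x split is preserved.

```c
#include <stdbool.h>
#include <stdio.h>

#define N 6   /* total number of nodes (example size) */
#define X 3   /* size of the first part */

static int adj[N][N];  /* adjacency matrix */
static int side[N];    /* side[v] is 0 or 1: which part v is in */

/* Number of edges crossing the current partition. */
static int cut_size(void)
{
    int cut = 0;
    for (int u = 0; u < N; u++)
        for (int v = u + 1; v < N; v++)
            if (adj[u][v] && side[u] != side[v])
                cut++;
    return cut;
}

/* Try every cross-partition pair (u, v) and commit the swap that
   reduces the cut the most. Returns true if a swap improved things. */
static bool improve_once(void)
{
    int best = cut_size(), bu = -1, bv = -1;
    for (int u = 0; u < N; u++)
        for (int v = u + 1; v < N; v++) {
            if (side[u] == side[v])
                continue;
            side[u] ^= 1; side[v] ^= 1;   /* tentative swap */
            int c = cut_size();
            if (c < best) { best = c; bu = u; bv = v; }
            side[u] ^= 1; side[v] ^= 1;   /* undo */
        }
    if (bu < 0)
        return false;
    side[bu] ^= 1; side[bv] ^= 1;         /* commit best swap */
    return true;
}

int main(void)
{
    /* example graph: the 6-cycle 0-1-2-3-4-5-0 */
    for (int i = 0; i < N; i++)
        adj[i][(i + 1) % N] = adj[(i + 1) % N][i] = 1;

    /* deliberately bad initial partition (alternating sides) */
    for (int i = 0; i < N; i++)
        side[i] = i % 2;

    while (improve_once())
        ;  /* repeat until no swap improves the cut */

    printf("local optimum, cut size %d\n", cut_size());
    return 0;
}
```

Note this only reaches a local optimum; restarting from several random partitions and keeping the best result is the usual way to get more out of it.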

answered Sep 28 '22 by uty


Perhaps a depth-first search over nodes: assign nodes to one side or the other one at a time and count the cut edges so far. If that count exceeds the best solution's count, abort this branch and backtrack.

  1. Given the full set of nodes, let S and S' be sets of nodes, initially empty, and let K be the number of cut edges, initially 0.
  2. Let S*, K* be the best known solution and its number of cut edges, with K* initially infinity.
  3. Pick a node N to start with and assign it to S.
  4. For each unassigned node M:
    1. Assign M to S', and add the number of edges from M to nodes in S to K.
    2. If K > K*, then no solution based on this one will beat the best, so move M to S instead: subtract from K the edges from M to nodes in S (counted in step 4.1) and add the edges from M to nodes in S'. If K still exceeds K*, backtrack.
    3. If |S| > X or |S'| > n - X, then one side has grown too big; backtrack.
    4. Otherwise, recurse from step 4.
  5. If all nodes are assigned and K < K*, then let S* = S and K* = K.

I've been imagining this as a Prolog-type algorithm, but doing it in C shouldn't be too hard. Backtracking just means unassigning the last assigned node.
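As a rough sketch of how that could look in C (my own translation of the steps above, not the answerer's code), assuming an adjacency matrix and nodes assigned in index order; the bound test mirrors step 4.2 and the size tests mirror step 4.3:

```c
#include <limits.h>
#include <stdio.h>

#define N 6   /* total number of nodes (example size) */
#define X 3   /* target size of S */

static int adj[N][N];         /* adjacency matrix */
static int side[N];           /* -1 unassigned, 0 in S, 1 in S' */
static int best_cut = INT_MAX;
static int best_side[N];

/* Cut edges created by putting node v on side s: edges from v
   to already-assigned nodes on the opposite side. */
static int extra_cut(int v, int s)
{
    int c = 0;
    for (int u = 0; u < N; u++)
        if (side[u] == 1 - s && adj[u][v])
            c++;
    return c;
}

/* Assign nodes v, v+1, ... in index order.  cut is the number of
   cut edges so far; in_s is how many nodes are already in S. */
static void search(int v, int cut, int in_s)
{
    if (cut >= best_cut)      /* bound: this branch cannot win */
        return;
    if (v == N) {             /* complete assignment: record it */
        best_cut = cut;
        for (int i = 0; i < N; i++)
            best_side[i] = side[i];
        return;
    }
    if (in_s < X) {           /* try v in S, if there is room */
        side[v] = 0;
        search(v + 1, cut + extra_cut(v, 0), in_s + 1);
    }
    if (v - in_s < N - X) {   /* try v in S', if there is room */
        side[v] = 1;
        search(v + 1, cut + extra_cut(v, 1), in_s);
    }
    side[v] = -1;             /* backtrack: unassign v */
}

int main(void)
{
    /* example graph: the 6-cycle 0-1-2-3-4-5-0 */
    for (int i = 0; i < N; i++) {
        adj[i][(i + 1) % N] = adj[(i + 1) % N][i] = 1;
        side[i] = -1;
    }
    search(0, 0, 0);
    printf("minimum cut: %d\n", best_cut);
    for (int i = 0; i < N; i++)
        printf("node %d -> %s\n", i, best_side[i] ? "S'" : "S");
    return 0;
}
```

The node ordering matters a lot in practice: assigning high-degree nodes first tends to grow the cut count early, so the bound prunes sooner.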

answered Sep 28 '22 by Edmund