I want to know the cyclomatic complexity of two sections of code.
IF((A>B) AND (C>D))
{ a=a+b;c=c+d;}
As far as I know, the cyclomatic complexity of the above code is 2 + 1 = 3.
Here is the other code section:
IF((A>B) OR (C>D))
{a=a+b;c=c+d;}
The complexity of the above code is 4 + 1 = 5.
Are the above complexities correct or not?
Cyclomatic complexity V(G) = P + 1, so V(G) = 2 + 1 = 3, where P is the number of predicate nodes. Nodes 1 and 2 are predicate nodes because these are the nodes from which the decision of which path to follow is taken.
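As a sketch of where the two predicate nodes come from (assuming C-like short-circuit evaluation and reusing the variable names from the question), the compound AND predicate behaves like two nested ifs:

if (A > B) {        /* predicate node 1 */
    if (C > D) {    /* predicate node 2 */
        a = a + b;  /* single basic block for the body */
        c = c + d;
    }
}
/* exit node */

With P = 2 predicate nodes, V(G) = 2 + 1 = 3.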
Cyclomatic complexity of a code section is the quantitative measure of the number of linearly independent paths in it. It is a software metric used to indicate the complexity of a program. It is computed using the Control Flow Graph of the program.
Cyclomatic complexity is a metric that indicates the possible number of paths inside a code artifact, e.g., a function, class, or whole program. Thomas J. McCabe Sr. developed this metric, first describing it in a 1976 paper.
Cyclomatic complexity is a measurement developed by Thomas McCabe to determine the stability and level of confidence in a program. It measures the number of linearly-independent paths through a program module. Programs with lower Cyclomatic complexity are easier to understand and less risky to modify.
I think they have the same cyclomatic complexity of 3; this can be shown using De Morgan's Law.
IF((A>B) OR (C>D)) {a=a+b;c=c+d;} ELSE {}
IF(!((A>B) OR (C>D))) {} ELSE {a=a+b;c=c+d;}
IF(!(A>B) AND !(C>D)) {} ELSE {a=a+b;c=c+d;}
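To make the De Morgan step concrete, here is a minimal self-contained check (a sketch in C; p and q are stand-ins for the results of A>B and C>D) showing that the OR form and its rewritten form agree on every truth combination:

#include <stdio.h>

int main(void) {
    for (int p = 0; p <= 1; p++) {
        for (int q = 0; q <= 1; q++) {
            int or_form = (p || q);      /* IF ((A>B) OR (C>D))       */
            int dm_form = !(!p && !q);   /* IF (!(!(A>B) AND !(C>D))) */
            printf("p=%d q=%d or=%d de-morgan=%d %s\n",
                   p, q, or_form, dm_form,
                   or_form == dm_form ? "equal" : "DIFFERENT");
        }
    }
    return 0;
}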
Another way of looking at it is to take the graph, swap the conditional block and the exit point, and reverse the edge between them; this transforms the AND into the OR without changing the number of nodes or edges.
Both complexities are the same and equal 3, and this can be counted in four ways. I agree with Neil on using De Morgan's Law to prove they are the same; the same can also be seen from the graphs, which is where it matters for complexity counting.
Let's start with graphs for both code pieces.
A word of explanation: the first piece is effectively an "if and if" and the second an "if or if", so in both cases there are two ifs; that is why the second condition is drawn as a separate node. As you can see, between the two code pieces there is no difference in the numbers: nodes, edges, and regions are all the same. The difference is which node connects to which node, and that comes from how short-circuiting works. Obviously, for languages without short-circuiting the graphs would need to be different.
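A sketch of how both pieces expand under short-circuit evaluation (assuming C-like semantics); the comments describe the nodes and edges of each control flow graph:

/* AND version under short-circuiting: */
if (A > B)            /* node 1: true -> node 2, false -> exit */
    if (C > D) {      /* node 2: true -> node 3, false -> exit */
        a = a + b;    /* node 3: the body, then -> exit        */
        c = c + d;
    }

/* OR version under short-circuiting (the body is duplicated only in
   the text; in the graph it is one node with two incoming edges): */
if (A > B) {          /* node 1: true -> node 3, false -> node 2 */
    a = a + b;        /* node 3 */
    c = c + d;
} else if (C > D) {   /* node 2: true -> node 3, false -> exit   */
    a = a + b;
    c = c + d;
}
/* node 4: exit; both graphs have 4 nodes and 5 edges */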
There is more than one way to count it.
Complexity = Edges - Nodes + 2 * P (where P is the number of connected components, here the single exit point):
Edges = 5; Nodes = 4; exits = 1;
Complexity = 5 - 4 + 2*(1) = 3 in both cases. This definition doesn't need a strongly connected graph, so we drop the added edge.
Complexity = Edges - Nodes + P, computed on the strongly connected graph (with an extra edge added from the exit back to the entry):
Edges = 6; Nodes = 4; exits = 1;
Complexity = 6 - 4 + 1 = 3 in both cases. This definition came about because it makes more sense topologically and is simpler to think of in terms of cycle counting (graph-wise). It is more useful when you count the complexity of many routines or functions, such as all the methods in a class; it then makes sense to think that a function may be called in a loop. But I digress.
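As a quick sanity check of both formulas, here is a small standalone program (a sketch in C; the numbers are simply the counts read off the graphs above, not derived from the code):

#include <stdio.h>

int main(void) {
    int nodes = 4, exits = 1;
    int edges = 5;      /* graph without the added exit->entry edge */
    int edges_scc = 6;  /* graph made strongly connected            */

    printf("E - N + 2*P = %d\n", edges - nodes + 2 * exits);  /* prints 3 */
    printf("E - N + P   = %d\n", edges_scc - nodes + exits);  /* prints 3 */
    return 0;
}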
Regions: 3.
This comes from Euler's formula: Regions + Nodes - Edges = 2. Rewording this: Regions = Edges - Nodes + 2.
So the number of regions is the same as the complexity (assuming one exit point). This was meant to simplify counting complexity from the graph of a single-exit subroutine.
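Plugging in the counts used above: Regions = Edges - Nodes + 2 = 5 - 4 + 2 = 3, which matches the edge/node calculations.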
McCabe himself noted that
in practice compound predicates such as IF "c1 AND c2" THEN are treated as contributing two to complexity since without the connective AND we would have IF c1 THEN IF c2 THEN which has two predicates. For this reason and for testing purposes it has been found to be more convenient to count conditions instead of predicates when calculating complexity
In both code pieces we have one compound conditional made of two simple conditions, so Decisions = 2;
Complexity = 2 + 1 = 3.
It's worth noting that cyclomatic complexity started out as counting cycles but ended up as counting conditionals for practical purposes.
First try McCabe's paper itself: http://www.literateprogramming.com/mccabe.pdf
Wikipedia has a good article that relies on the paper, though I found it insufficient without also following the articles on basic blocks and connected components.
I found a concise yet good summary at Chambers' page: http://www.chambers.com.au/glossary/mc_cabe_cyclomatic_complexity.php
The book "Integrated Approach to Software Engineering" in chapter 8 has an example illustrating complexity calculation (though I think they ate one edge on a graph, figure 8.7).
http://books.google.pl/books?id=M-mhFtxaaskC&lpg=PA385&ots=jB8P0avJU7&d&hl=pl&pg=PR1#v=onepage