There has already been a question on What is Cyclomatic Complexity?
However, there is another term called Essential Cyclomatic Complexity.
What are the differences and similarities between these two code metrics? What are their typical accepted values? Also, I have learned that Essential Cyclomatic Complexity is the more relevant metric for understanding code, whereas Cyclomatic Complexity is the more relevant metric from an implementation point of view. If that is so, why?
Essential complexity is a measurement developed by Thomas McCabe to determine how well a program is structured. It measures the number of entry points, termination points, and nonreducible (unstructured) nodes left after the structured constructs have been removed from the control-flow graph. The closer this value is to 1, the more well structured the program is.
Cyclomatic complexity is a measurement developed by Thomas McCabe to determine the stability and level of confidence in a program. It measures the number of linearly-independent paths through a program module. Programs with lower Cyclomatic complexity are easier to understand and less risky to modify.
Cognitive Complexity is a measure of how difficult a unit of code is to intuitively understand. Unlike Cyclomatic Complexity, which determines how difficult your code will be to test, Cognitive Complexity tells you how difficult your code will be to read and understand.
For example, if the source code contains no control-flow statements, its cyclomatic complexity is 1: there is a single path through it. If the source code contains one if condition, the cyclomatic complexity is 2, because there are two paths: one for the condition being true and one for it being false.
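As a rough sketch of that in Java (the method names here are invented for illustration), the first method below has cyclomatic complexity 1 and the second has 2:

```java
class ComplexityExamples {
    // Cyclomatic complexity 1: no control-flow statements, a single path.
    static int square(int x) {
        return x * x;
    }

    // Cyclomatic complexity 2: the single if gives two paths,
    // one for the condition being true and one for it being false.
    static int absoluteValue(int x) {
        int result = x;
        if (x < 0) {
            result = -x;
        }
        return result;
    }
}
```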
Cyclomatic Complexity, as you know, effectively measures the number of possible independent paths through a method or function. This tells us how complex the method is to test.
Essential cyclomatic complexity, however, tells us how much complexity is left once we have removed the well-structured complexity. An example of well-structured complexity is a for loop whose exit condition is stated at the start of the loop. If we break out of that loop with a break statement somewhere in the middle of the body, we break the structured construct. The same happens when a single function contains several return statements.
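To illustrate (a Java sketch; the method names and array parameter are made up for the example), the first loop below is fully structured, while the second breaks out from the middle of the body, which is exactly the kind of construct essential complexity counts:

```java
class LoopStructure {
    // Structured: the loop's only exit condition sits in the for header,
    // so the construct reduces away and adds nothing to essential complexity.
    static int countPositives(int[] values) {
        int count = 0;
        for (int i = 0; i < values.length; i++) {
            if (values[i] > 0) {
                count++;
            }
        }
        return count;
    }

    // Unstructured: the break adds a second exit in the middle of the loop
    // body, so the loop no longer reduces to a single structured construct
    // and contributes to essential complexity.
    static int indexOfFirstNegative(int[] values) {
        int index = -1;
        for (int i = 0; i < values.length; i++) {
            if (values[i] < 0) {
                index = i;
                break;
            }
        }
        return index;
    }
}
```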
So what does this tell us?
Well, imagine we have a function with a high CC: a function that is difficult to test. If this function also has a low essential CC, it is fairly easy to break it up into smaller functions that are easier to test individually. When the essential complexity is high, this refactoring is more difficult, because the complexity itself is harder to understand.
So code with high essential complexity is harder to maintain and understand; we can say it is of lower quality. Code with high cyclomatic complexity is harder to test, but in general we can do something about that more easily when the essential complexity is low.
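To make that concrete, here is a rough Java sketch (the class and the discount rules are invented for illustration): the method has one branch per customer type, so its CC is 4, but every branch is a clean structured block, so its essential complexity stays at 1 and each branch extracts easily into a smaller, individually testable method.

```java
class DiscountCalculator {
    // High cyclomatic complexity (one branch per customer type) but low
    // essential complexity: every branch is a self-contained structured
    // block, so each can be pulled out into its own small method.
    static double discountFor(String customerType, double amount) {
        double discount = 0.0;
        if ("STUDENT".equals(customerType)) {
            discount = amount * 0.10;
        } else if ("SENIOR".equals(customerType)) {
            discount = amount * 0.15;
        } else if ("EMPLOYEE".equals(customerType)) {
            discount = amount * 0.25;
        }
        return discount;
    }
}
```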
What values to use is always up for argument and depends somewhat on the type of application and the language used. For example, throwing an exception inside a function makes that function unstructured, yet exceptions, when used properly, are deemed good practice. Similarly, validating a parameter at the top of a function and returning immediately is a common practice that (in my opinion) can result in clearer code; again, this is an unstructured construct. So we can expect to accept a basic level of essential complexity.
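As an illustration of that last point (a Java sketch; the method and its behaviour are invented for the example), the early return below is technically an unstructured exit, yet many would argue it reads better than nesting the whole body inside the if:

```java
class GuardClauseExample {
    // The guard clause returns immediately on invalid input. That second
    // exit point counts against essential complexity, but it keeps the
    // main logic un-nested and, arguably, easier to read.
    static double safeDivide(double numerator, double denominator) {
        if (denominator == 0.0) {
            return 0.0; // bail out early rather than wrapping the rest in an else
        }
        return numerator / denominator;
    }
}
```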
My personal limits for an enterprise-style application in .NET or Java would be:
CC <= 16 and ECC <= 6
For more 'complex' applications, say in C/C++, I would propose tighter limits:
CC <= 10 and ECC <= 4