I have a piece of code with a Cognitive Complexity of 21:
for (String item1 : itemList1) {
    for (String item2 : itemList2) {
        for (String item3 : itemList3) {
            for (String item4 : itemList4) {
                for (String item5 : itemList5) {
                    for (String item6 : itemList6) {
                        methodToRun(item1, item2, item3, item4, item5, item6);
                    }
                }
            }
        }
    }
}
Our linter enforces a maximum Cognitive Complexity of 15, so by the standards we've been following I should reduce this.
Can anyone suggest an alternative solution to this piece of code? Or is leaving it like this acceptable despite the complexity being too high?
I know this may come down to opinion, but I'm looking for genuine solutions or answers from people who have faced similar situations before.
EDIT: I cannot access many libraries and packages from the dev machine I'm working on. I do have access to some (too many to list), so please take note of this before suggesting one.
You can go for a recursive solution. It is arguably less readable, but has a much smaller level of nesting, which reduces the complexity measure:
static void recursiveRun(List<List<String>> list, int pos, String[] item) {
    if (pos == 6) {
        // all six positions filled: invoke the original method
        methodToRun(item[0], item[1], item[2], item[3], item[4], item[5]);
    } else {
        for (String s : list.get(pos)) {
            item[pos] = s;  // fix the current position, recurse on the rest
            recursiveRun(list, pos + 1, item);
        }
    }
}
The initial call looks like this:
recursiveRun(
        Arrays.asList(itemList1, itemList2, itemList3, itemList4, itemList5, itemList6),
        0,
        new String[6]
);
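For a quick sanity check, here is a self-contained sketch of the same recursion, trimmed to two lists, with a counter and a printout standing in for methodToRun (the class name and list contents are illustrative, not from the original answer). Note that comparing pos against lists.size() instead of the hard-coded 6 makes the same method handle any number of lists:

```java
import java.util.Arrays;
import java.util.List;

public class RecursiveProductDemo {

    // Runs the recursion over the given lists and returns how many
    // combinations were visited (a stand-in for methodToRun).
    static int run(List<List<String>> lists) {
        int[] calls = {0};
        recursiveRun(lists, 0, new String[lists.size()], calls);
        return calls[0];
    }

    static void recursiveRun(List<List<String>> lists, int pos, String[] item, int[] calls) {
        if (pos == lists.size()) {          // one full tuple assembled
            System.out.println(String.join(",", item));
            calls[0]++;
        } else {
            for (String s : lists.get(pos)) {
                item[pos] = s;              // fix this position, recurse on the rest
                recursiveRun(lists, pos + 1, item, calls);
            }
        }
    }

    public static void main(String[] args) {
        List<List<String>> lists = Arrays.asList(
                Arrays.asList("a", "b"),
                Arrays.asList("1", "2", "3"));
        System.out.println(run(lists) + " tuples");  // 2 * 3 = 6 tuples
    }
}
```

The recursion scores lower on the metric because each level of nesting in the original loops adds an increasing penalty, whereas the recursive method contains only one loop and one conditional.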
Here is an iterator-based solution.
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;
import java.util.NoSuchElementException;

class CartesianProductIterator<T> implements Iterator<List<T>>, Iterable<List<T>> {
    private final List<List<T>> data;
    private final int size;
    private final int[] sizes;
    private final int[] cursors;
    private boolean done;

    public CartesianProductIterator(List<List<T>> data) {
        this.data = data;
        this.size = data.size();
        this.sizes = new int[this.size];
        this.cursors = new int[this.size];
        setSizes(data);
    }

    @Override
    public boolean hasNext() {
        return !done;
    }

    @Override
    public List<T> next() {
        if (!hasNext()) {
            throw new NoSuchElementException();
        }
        List<T> tuple = new ArrayList<>();
        for (int i = 0; i < size; i++) {
            tuple.add(data.get(i).get(cursors[i]));
        }
        updateCursors();
        return tuple;
    }

    // Advance the cursors like an odometer: increment the rightmost cursor,
    // carrying over into the next position when a list is exhausted.
    private void updateCursors() {
        for (int i = size - 1; i >= 0; i--) {
            if (cursors[i] < sizes[i] - 1) {
                cursors[i]++;
                break;
            } else {
                cursors[i] = 0;
                if (i == 0) {
                    done = true;  // every cursor wrapped around: product exhausted
                }
            }
        }
    }

    private void setSizes(List<List<T>> data) {
        for (int i = 0; i < size; i++) {
            sizes[i] = data.get(i).size();
            if (sizes[i] == 0) {
                done = true;  // an empty list makes the whole product empty
            }
        }
    }

    @Override
    public void remove() {
        throw new UnsupportedOperationException("remove is not supported here");
    }

    @Override
    public Iterator<List<T>> iterator() {
        return this;
    }
}
The iterator can be used to create Cartesian products on demand:
List<List<String>> data = new ArrayList<>();
data.add(Arrays.asList("a", "b", "c"));
data.add(Arrays.asList("1", "2"));
data.add(Arrays.asList("A", "B", "C"));
Iterator<List<String>> dataIterator = new CartesianProductIterator<String>(data);
while (dataIterator.hasNext()) {
    System.out.println(dataIterator.next());
}
Since the class implements both Iterable and Iterator, it can alternatively be used in a for-each loop:
for (List<String> combination : new CartesianProductIterator<>(data)) {
    System.out.println(combination);
}
Once your data is packed into a List<List<String>>, you can use the n-ary Cartesian product implemented in Google Guava (Lists.cartesianProduct), which preserves the lexicographical order of the elements.
import com.google.common.collect.ImmutableList;
import com.google.common.collect.Lists;

List<List<String>> input = Arrays.asList(
        ImmutableList.of("Mary", "Alice"),
        ImmutableList.of("Smith", "Darcy", "Brown"),
        ImmutableList.of("Ford", "Saab")
);
List<List<String>> result = Lists.cartesianProduct(input); // Cognitive Complexity of 0
for (List<String> combination : result) {
    System.out.println(String.join(",", combination));
}
... produces:
Mary,Smith,Ford
Mary,Smith,Saab
Mary,Darcy,Ford
Mary,Darcy,Saab
Mary,Brown,Ford
Mary,Brown,Saab
Alice,Smith,Ford
Alice,Smith,Saab
Alice,Darcy,Ford
Alice,Darcy,Saab
Alice,Brown,Ford
Alice,Brown,Saab
Here is a quick solution with hard-coded values for 3 lists, which could potentially be generalized without incurring too much of a complexity penalty. It basically does some neat (a.k.a. hard to follow) index calculations.
String[] list0 = new String[] {"0", "1"};
String[] list1 = new String[] {"4", "5"};
String[] list2 = new String[] {"8", "9"};
int[] indexes = new int[3];
long totalCombinations = (long) list0.length * list1.length * list2.length;
for (long i = 0; i < totalCombinations; i++) {
    // Decompose i into three "digits", one index per list (list0 varies fastest)
    indexes[0] = (int) (i % list0.length);
    indexes[1] = (int) ((i / list0.length) % list1.length);
    indexes[2] = (int) ((i / (list0.length * list1.length)) % list2.length);
    System.out.println(list0[indexes[0]] + "," + list1[indexes[1]] + "," + list2[indexes[2]]);
}
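The index arithmetic above can be sketched in a generalized form for any number of lists: each combination index i is decomposed digit by digit in a mixed-radix number system whose "bases" are the list sizes. This is a hedged illustration, not part of the original answer; the class and method names are hypothetical.

```java
import java.util.Arrays;
import java.util.List;

public class IndexedProductDemo {

    // Prints every combination and returns how many were visited.
    // Generalizes the hard-coded three-list index arithmetic: tuple position k
    // is the k-th mixed-radix "digit" of i, with the first list varying fastest.
    static int countAll(List<String[]> lists) {
        long total = 1;
        for (String[] l : lists) {
            total *= l.length;  // note: could overflow for very large inputs
        }
        int visited = 0;
        for (long i = 0; i < total; i++) {
            long rest = i;
            String[] tuple = new String[lists.size()];
            for (int k = 0; k < lists.size(); k++) {
                String[] list = lists.get(k);
                tuple[k] = list[(int) (rest % list.length)];
                rest /= list.length;  // shift to the next "digit"
            }
            System.out.println(String.join(",", tuple));
            visited++;
        }
        return visited;
    }

    public static void main(String[] args) {
        List<String[]> lists = Arrays.asList(
                new String[] {"0", "1"},
                new String[] {"4", "5"},
                new String[] {"8", "9"});
        System.out.println(countAll(lists) + " combinations");  // 2 * 2 * 2 = 8
    }
}
```

The generalization keeps the flat, single-loop shape that scores well on the metric, but as the discussion below notes, the arithmetic is still harder to follow than the nested loops it replaces.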
The pure-Java solution is a perfect example of keeping the metric happy while actually increasing the real complexity and hurting maintainability.
That whole index calculation is, frankly, quite hard to follow and took a few attempts to get right. A general solution would most likely pay a penalty anyway, since iteration is still required. Other solutions I have found on the web (including recursive and functional ones) are no clearer than the bunch of nested loops.
Hand-rolled, invented-here Cartesian product routines will, in my opinion, be harder to comprehend even if they score lower on the complexity metric.
Software should build on abstractions, and using an open, well-designed third-party dependency makes the whole issue go away nicely.