
How reduce the complexity of the searching in two lists algorithm?

Tags:

java

algorithm

I have to find some common items in two lists. I cannot sort them; the order is important. I have to find how many elements from secondList occur in firstList. Currently it looks like this:

int[] firstList;
int[] secondList;
int iterator = 0;
for (int i : firstList) {
    // guard against running past the end of secondList
    while (iterator < secondList.length
            && i <= secondList[iterator] /* two more conditions */) {
        iterator++;
        // some actions
    }
}

The complexity of this algorithm is O(n²). I am trying to reduce it, but I don't know how else to compare the elements. Any advice?

EDIT: Example: A = 5,4,3,2,3 and B = 1,2,3. We look for pairs B[i], A[j]. Condition: when

B[i] < A[j]
         j++ 

when

B[i] >= A[j]
         return B[i], A[j-1]

The next iteration then goes through list A up to element j-1 (i.e. for(int z=0; z<j-1; z++)). I'm not sure I've made myself clear. Duplicates are allowed.

Janusz Lece asked Dec 26 '22

1 Answer

My approach would be: put all the elements from the first array into a HashSet, then iterate over the second array. This reduces the complexity to the sum of the lengths of the two arrays, i.e. O(n + m). The downside is the additional memory, but without using extra memory I don't think you can improve on your brute-force solution.
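A minimal sketch of this idea (the class and method names are my own, and it assumes only presence matters, not how many times an element repeats):

```java
import java.util.HashSet;
import java.util.Set;

public class CommonItems {
    // Counts how many elements of second are present in first.
    static int countCommon(int[] first, int[] second) {
        Set<Integer> seen = new HashSet<>();
        for (int x : first) {
            seen.add(x);                // O(n) to build the set
        }
        int count = 0;
        for (int y : second) {
            if (seen.contains(y)) {     // O(1) average lookup
                count++;
            }
        }
        return count;                   // overall O(n + m)
    }

    public static void main(String[] args) {
        int[] a = {5, 4, 3, 2, 3};
        int[] b = {1, 2, 3};
        System.out.println(countCommon(a, b)); // prints 2 (2 and 3 occur in a)
    }
}
```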

EDIT: to avoid further dispute on the matter: if duplicates are allowed in the first array and you actually care how many times an element in the second array matches an element in the first one, use a multiset instead (e.g. Guava's HashMultiset).
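If you'd rather not pull in Guava, the same multiset idea can be sketched with a plain HashMap of counts (countMatches is a hypothetical helper, not part of the original answer):

```java
import java.util.HashMap;
import java.util.Map;

public class MultisetCount {
    // Counts matches with multiset semantics: each occurrence in first
    // can satisfy at most one occurrence in second.
    static int countMatches(int[] first, int[] second) {
        Map<Integer, Integer> counts = new HashMap<>();
        for (int x : first) {
            counts.merge(x, 1, Integer::sum); // tally occurrences in first
        }
        int matches = 0;
        for (int y : second) {
            Integer c = counts.get(y);
            if (c != null && c > 0) {
                counts.put(y, c - 1);         // consume one copy
                matches++;
            }
        }
        return matches;                       // still O(n + m)
    }

    public static void main(String[] args) {
        int[] a = {5, 4, 3, 2, 3};
        int[] b = {3, 3, 3};
        System.out.println(countMatches(a, b)); // prints 2: a has only two 3s
    }
}
```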

Ivaylo Strandjev answered Mar 01 '23