I was doing a CodeSignal practice problem and had the following question:
You are given an array of integers a and an integer k. Your task is to calculate the number of ways to pick two different indices i < j, such that a[i] + a[j] is divisible by k. The constraints are: 1 ≤ a.length ≤ 10^5, 1 ≤ a[i] ≤ 10^9, 1 ≤ k ≤ 10^9.
For a = [1, 2, 3, 4, 5] and k = 3, the answer should be 4.
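To sanity-check the example, a straightforward brute force over all pairs (just a reference sketch, far too slow for the real constraints) confirms the count of 4; the valid pairs are (1, 2), (1, 5), (2, 4), and (4, 5):

from itertools import combinations

def brute_force(a, k):
    # O(n^2) reference: test every pair with i < j directly
    return sum((x + y) % k == 0 for x, y in combinations(a, 2))

print(brute_force([1, 2, 3, 4, 5], 3))  # 4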
I found a solution that uses buckets and runs in a single pass over the array (O(n) time, plus O(k) to allocate the buckets):
def solution(a, k):
    buckets = [0 for _ in range(k)]  # buckets[r] = how many elements seen so far have remainder r
    res = 0
    for num in a:
        mod_value = num % k
        # every previously seen element with the complementary remainder forms a valid pair
        res += buckets[(k - mod_value) % k]
        buckets[mod_value] += 1
    return res
But I was getting a time-limit-exceeded error (9/11 test cases passed). I thought it was a Python issue, so I implemented the same solution in Java, but there I got a runtime error. Investigating with a custom test case with k = 10^9 produced an out-of-memory error, suggesting the failing test cases use extreme input values.
I am not sure if there is an even better solution, or if this problem is bugged on CodeSignal's side.
You should use a hash table instead of a bucket array. Since k can be up to 10^9, the bucket array would consist mostly of empty slots and its memory footprint can grow to roughly 1 GB or more. With a hash table, memory usage is bounded by the input size n, which is at most 10^5 and therefore much smaller than 10^9.

Try this:
import collections

def solution(a, k):
    store = collections.defaultdict(list)  # remainder -> indices of elements seen so far with that remainder
    total = 0
    for i in range(len(a)):
        complement = (k - a[i] % k) % k
        # every earlier element with the complementary remainder pairs with a[i]
        total += len(store[complement])
        store[a[i] % k].append(i)
    return total
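If you only need the number of pairs, storing index lists is not required; a dictionary of counts (a small variation, not from the original answer) does the same job with one integer per distinct remainder:

import collections

def solution_counts(a, k):
    counts = collections.defaultdict(int)  # counts[r] = how many elements seen so far have remainder r
    total = 0
    for num in a:
        r = num % k
        total += counts[(k - r) % k]  # pair with earlier elements whose remainder is the complement
        counts[r] += 1
    return total

print(solution_counts([1, 2, 3, 4, 5], 3))  # 4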