I have a C# BitArray that is fairly large (500,000 bits), and I am trying to get the indexes of all the set bits in the array. Currently I am achieving this by:
public int[] GetIndexesForPositives()
{
    var idIndexes = new int[GetPositiveCount + 1];
    var idx = 0;
    for (var i = 0; i < Length; i++)
    {
        if (Get(i))
        {
            idIndexes[idx++] = i;
        }
    }
    return idIndexes;
}
I create an empty array sized to the known count of set bits, then I loop over the BitArray and add each index to the return array.
This means I have to perform 500,000 iterations over the array, and it's not exactly fast (it takes around 15 ms).
I know the BitArray uses an integer array under the covers (I used that to write the GetPositiveCount function, via an algorithm I got off Stack Overflow), so I wonder if there is an algorithm to do this as well?
If you can swap out the BitArray from the BCL in favour of a "roll your own", you can do better than that. Here's a few things you can do:

1. use an array of `ulong` instead of `int`, so each word covers 64 bits,
2. skip over words that are zero,
3. keep a smaller "summary" bitarray in which each bit is the OR of a whole word of the main array (applied recursively), so empty regions can be skipped in bulk,
4. iterate only the set bits of a word: clear the lowest one with `x & (x - 1)` and find its index with your favourite fast log2 (using the naive 64-step method won't give any kind of speedup).

All four of these only help if the bitarray is expected to be sparse, and the worst case is still O(n) if it isn't sparse. If bullet 3 is applied until the top is a single ulong then it can in O(1) determine whether the entire bitarray is empty or not.
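For illustration, here's a minimal sketch of the `x & (x - 1)` idea over a raw `ulong[]`. The names are mine, and `BitOperations.TrailingZeroCount` assumes .NET Core 3.0 or later; on older frameworks you'd substitute a de Bruijn log2:

```csharp
using System;
using System.Collections.Generic;
using System.Numerics; // BitOperations (assumes .NET Core 3.0+)

static class SparseBits
{
    // Walk only the set bits of each 64-bit word: x & (x - 1) clears the
    // lowest set bit, and TrailingZeroCount gives its index (typically a
    // single TZCNT instruction).
    public static List<int> GetSetBitIndexes(ulong[] words)
    {
        var result = new List<int>();
        for (int w = 0; w < words.Length; w++)
        {
            ulong x = words[w];
            while (x != 0)                  // a zero word costs one compare
            {
                int bit = BitOperations.TrailingZeroCount(x);
                result.Add(w * 64 + bit);
                x &= x - 1;                 // clear the lowest set bit
            }
        }
        return result;
    }
}
```

The work done per word is proportional to the number of set bits in it, which is exactly why this only pays off on sparse data.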
If you are able to get at the int array underlying the BitArray, this should provide much better performance.
Assuming you don't know the number of bits that are set:
public static int[] GetIndexesForPositives(BitArray data)
{
    var idIndexes = new List<int>();
    // BitArray stores its bits in a private int[] field named "m_array"
    System.Reflection.FieldInfo field = data.GetType().GetField("m_array",
        System.Reflection.BindingFlags.NonPublic | System.Reflection.BindingFlags.Instance);
    int[] values = (int[])field.GetValue(data);
    for (var i = 0; i < values.Length; i++)
    {
        int _i = values[i];
        if (_i != 0) // skip words with no bits set
        {
            for (var j = 0; j < 32; j++)
            {
                if ((_i & (1 << j)) != 0)
                {
                    idIndexes.Add(i * 32 + j);
                }
            }
        }
    }
    return idIndexes.ToArray();
}
If you do know the number of bits that are set you can do this instead:
public static int[] GetIndexesForPositives(BitArray data, int length)
{
    var idIndexes = new int[length]; // length = number of set bits
    var idx = 0;
    System.Reflection.FieldInfo field = data.GetType().GetField("m_array",
        System.Reflection.BindingFlags.NonPublic | System.Reflection.BindingFlags.Instance);
    int[] values = (int[])field.GetValue(data);
    for (var i = 0; i < values.Length; i++)
    {
        int _i = values[i];
        if (_i != 0)
        {
            for (var j = 0; j < 32; j++)
            {
                if ((_i & (1 << j)) != 0)
                {
                    idIndexes[idx++] = i * 32 + j;
                }
            }
        }
    }
    return idIndexes;
}
In my tests both of these work faster than your method, even the one that doesn't know how large the return array will be up front.
My results, tested using a random BitArray of 50 million bits:
1) 25001063 records found in 50000000, took 1415.5752ms
2) 25001063 records found in 50000000, took 1099.67ms
3) 25001063 records found in 50000000, took 1045.6862ms
4) 25001063 records found in 50000000, took 745.7762ms
1) is your code, but using a List&lt;int&gt; instead of calling `GetPositiveCount` to size the output array
2) is your code
3) is my (revised) first example
4) is my (revised) second example
edit: furthermore, it is worth pointing out that this is a problem that could really benefit from being made multi-threaded. Break the BitArray up into 4 parts and you have 4 threads that could scan the data at once.
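As a rough sketch of that idea (names, chunking, and the partition count are mine, not from the answer): split the underlying int[] into ranges, scan each range on its own thread into a private list, then stitch the lists together in order so the result stays sorted.

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

static class ParallelScan
{
    // Scan each chunk of the word array on its own thread; per-partition
    // lists avoid any locking, and concatenating them in partition order
    // keeps the final indexes sorted.
    public static int[] GetIndexesForPositives(int[] values, int partitions = 4)
    {
        var parts = new List<int>[partitions];
        int chunk = (values.Length + partitions - 1) / partitions;

        Parallel.For(0, partitions, p =>
        {
            var local = new List<int>();
            int end = Math.Min((p + 1) * chunk, values.Length);
            for (int i = p * chunk; i < end; i++)
            {
                int v = values[i];
                if (v == 0) continue;       // skip empty words
                for (int j = 0; j < 32; j++)
                {
                    if ((v & (1 << j)) != 0) local.Add(i * 32 + j);
                }
            }
            parts[p] = local;
        });

        var result = new List<int>();
        foreach (var part in parts) result.AddRange(part);
        return result.ToArray();
    }
}
```

Whether this pays off depends on the array being large enough to amortise the thread overhead; for a 500,000-bit array the single-threaded loop may well win.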
Edit: I know this is already accepted, but here's another tweak to improve performance if you know that most of the time your list will be very sparse:
for (var j = 0; j < 32; j++)
{
    if (_i == 0)
        break;                 // no more set bits in this word
    if ((_i & 1) != 0)
    {
        idIndexes.Add(i * 32 + j);
    }
    _i = (int)((uint)_i >> 1); // logical shift; an arithmetic shift would
                               // sign-extend when bit 31 is set and the
                               // early exit above would never fire
}
It is slightly slower when the list is more than ~40% populated, but if you know the list is always going to be about 10% ones and 90% zeros then this will run even faster for you.