I am trying to "chunk" up the bytes of an image. This will allow me to upload a large image in portions. I have the image currently stored as one large byte[]. I would like to split the byte array into byte[]'s with a maxlength of 512 elements. However, I'm not sure how to do this in the most efficient way. 
Does anyone know how I can do this in the most efficient manner?
I wrote an extension for this, originally for strings, but decided to make it generic.
    public static T[] CopySlice<T>(this T[] source, int index, int length, bool padToLength = false)
    {
        int n = length;
        T[] slice = null;
        if (source.Length < index + length)
        {
            // the source ends before a full slice; copy only what remains
            n = source.Length - index;
            if (padToLength)
            {
                // keep the requested length; the tail stays default(T)
                slice = new T[length];
            }
        }
        if (slice == null) slice = new T[n];
        Array.Copy(source, index, slice, 0, n);
        return slice;
    }
    public static IEnumerable<T[]> Slices<T>(this T[] source, int count, bool padToLength = false)
    {
        // walk the array in steps of `count`, yielding one slice per step
        for (var i = 0; i < source.Length; i += count)
            yield return source.CopySlice(i, count, padToLength);
    }
Basically, you can use it like so:
byte[] myBytes; // original byte array
foreach(byte[] copySlice in myBytes.Slices(10))
{
    // do something with each slice
}
Edit: I also provided an answer on SO using Buffer.BlockCopy here, but BlockCopy only works on byte[] arrays, so a generic version for strings wouldn't be possible.
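For comparison, a byte[]-only version of Slices using Buffer.BlockCopy might look like the sketch below (BlockSlices is a hypothetical name, not the code from that answer):

    // Sketch: byte[]-only slicing via Buffer.BlockCopy.
    // For byte arrays, offsets and counts are measured in bytes.
    public static IEnumerable<byte[]> BlockSlices(this byte[] source, int count)
    {
        for (int i = 0; i < source.Length; i += count)
        {
            int n = Math.Min(count, source.Length - i);
            byte[] slice = new byte[n];
            Buffer.BlockCopy(source, i, slice, 0, n);
            yield return slice;
        }
    }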
The most efficient method would be: not to. If you already have the image as a single byte[], then for local code just specifying the offset and length (perhaps as an ArraySegment<byte>) is usually sufficient. If your upload API only takes byte[], then you still shouldn't chunk it completely; just use a single 512-byte buffer and use Buffer.BlockCopy to load it with successive pieces of the data. You may need to resize (Array.Resize) the final chunk, but at most 2 arrays should be needed.
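A rough sketch of that approach (Upload is a hypothetical method standing in for whatever the upload API expects):

    // Sketch: reuse one 512-byte buffer; only the final chunk is resized,
    // so at most 2 arrays are ever allocated.
    static void UploadInChunks(byte[] data)
    {
        byte[] buffer = new byte[512];
        for (int offset = 0; offset < data.Length; offset += 512)
        {
            int n = Math.Min(512, data.Length - offset);
            if (n != buffer.Length) Array.Resize(ref buffer, n); // final chunk only
            Buffer.BlockCopy(data, offset, buffer, 0, n);
            Upload(buffer); // hypothetical upload call
        }
    }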
Even better; avoid needing a byte[] in the first place: consider loading the data via a streaming API (this will work well if the data is coming from a file); just use Read (in a loop, processing the returned value, etc.) to populate chunks of at most 512 bytes. For example (untested, just for illustration):
byte[] buffer = new byte[512];
while(true) {
    int space = 512, read, offset = 0;
    while(space > 0 && (read = stream.Read(buffer, offset, space)) > 0) {
        space -= read;
        offset += read;
    }
    // either a full buffer, or EOF
    if(space != 0) { // EOF - final
       if(offset != 0) { // something to send
         Array.Resize(ref buffer, offset);
         Upload(buffer);
       }
       break;
    } else { // full buffer
       Upload(buffer);
    }
}
    // Requires using System.Linq for Skip/Take.
    public static IEnumerable<byte[]> Split(this byte[] value, int bufferLength)
    {
        int countOfArray = value.Length / bufferLength;
        if (value.Length % bufferLength > 0)
            countOfArray++;
        for (int i = 0; i < countOfArray; i++)
        {
            yield return value.Skip(i * bufferLength).Take(bufferLength).ToArray();
        }
    }
This is the extension method I used.
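A quick usage sketch (the input file name is hypothetical):

    byte[] data = File.ReadAllBytes("image.jpg"); // hypothetical input
    foreach (byte[] chunk in data.Split(512))
    {
        // each chunk holds at most 512 bytes
    }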
I know this is old, but I needed the same solution and the following works perfectly for me. I hope this helps someone.
private byte[][] ByteArrayToChunks(byte[] byteData, long bufferSize)
{
    byte[][] chunks = byteData
        .Select((value, index) => new { PairNum = Math.Floor(index / (double)bufferSize), value })
        .GroupBy(pair => pair.PairNum)
        .Select(grp => grp.Select(g => g.value).ToArray())
        .ToArray();
    return chunks;
}
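Calling it might look like this (input is hypothetical):

    byte[] data = File.ReadAllBytes("image.jpg"); // hypothetical input
    byte[][] chunks = ByteArrayToChunks(data, 512);
    // every chunk is 512 bytes except possibly the last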