Different encryption results between platforms, using OpenSSL

I'm working on a piece of cross-platform (Windows and Mac OS X) C code that needs to encrypt/decrypt blobs using AES-256 in CBC mode with a 128-bit block size. Among the various libraries and APIs available, I've chosen OpenSSL.

This piece of code will then upload the blob using a multipart-form PUT to a server which then decrypts it using the same settings in .NET's crypto framework (Aes, CryptoStream, etc...).

The problem I'm facing is that the server decryption works fine when the local encryption is done on Windows, but it fails when the encryption is done on Mac OS X - the server throws a "Padding is invalid and cannot be removed" exception.

I've looked at this from many perspectives:

  1. I verified that the transport is correct - the byte array received by the server's decrypt method is exactly the same as the one sent from Mac OS X and Windows
  2. The actual content of the encrypted blob, for the same key, differs between Windows and Mac OS X. I tested this using a hardcoded key and ran this patch on Windows and Mac OS X for the same blob
  3. I'm sure the padding is correct, since it is taken care of by OpenSSL and since the same code works on Windows. Even so, I tried implementing the padding scheme as it is in Microsoft's reference source for .NET, but still no go
  4. I verified that the IV is the same on Windows and Mac OS X (I thought maybe there was a problem with some of the special characters, such as ETB, that appear in the IV, but there wasn't)
  5. I've tried LibreSSL and mbedtls, with no positive results. With mbedtls I also had to implement the padding myself because, as far as I know, padding is the responsibility of the API's user (see the sketch after this list)
  6. I've been at this problem for almost two weeks now and I'm starting to pull my (ever scarcer) hair out

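For item 5, this is roughly the PKCS#7 scheme I mean (a minimal, illustrative sketch rather than my actual code):

#include <stdlib.h>
#include <string.h>

/* Minimal PKCS#7 padding sketch: the pad value equals the number of bytes
 * added, and a full extra block is appended when len is already a multiple
 * of the block size (16 for AES). */
static unsigned char *
pkcs7_pad(const unsigned char *in, size_t len, size_t blocklen, size_t *outlen)
{
    size_t pad = blocklen - (len % blocklen);   /* always 1..blocklen */
    unsigned char *out = (unsigned char *) malloc(len + pad);

    if (out == NULL)
        return NULL;

    memcpy(out, in, len);
    memset(out + len, (int) pad, pad);          /* every padding byte holds the pad length */
    *outlen = len + pad;
    return out;
}
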
As a frame of reference, I'll post the C client's code for encrypting and the server's C# code for decrypting. Some minor details on the server side will be omitted (they don't interfere with the crypto code).

Client:

/*++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++*/
void
__setup_aes(EVP_CIPHER_CTX *ctx, const char *key, qvr_bool encrypt)
{
    static const char *iv = ""; /* for security reasons, the actual IV is omitted... */

    if (encrypt)
        EVP_EncryptInit(ctx, EVP_aes_256_cbc(), key, iv);
    else
        EVP_DecryptInit(ctx, EVP_aes_256_cbc(), key, iv);
}

/*++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++*/
void
__encrypt(void *buf,
    size_t buflen,
    const char *key,
    unsigned char **outbuf,
    size_t *outlen)
{
    EVP_CIPHER_CTX ctx;
    int blocklen = 0;
    int finallen = 0;
    int remainder = 0;

    __setup_aes(&ctx, key, QVR_TRUE);

    EVP_CIPHER *c = ctx.cipher;
    blocklen = EVP_CIPHER_CTX_block_size(&ctx);

    //*outbuf = (unsigned char *) malloc((buflen + blocklen - 1) / blocklen * blocklen);
    remainder = buflen % blocklen;
    *outlen = remainder == 0 ? buflen : buflen + blocklen - remainder;
    *outbuf = (unsigned char *) calloc(*outlen, sizeof(unsigned char));

    EVP_EncryptUpdate(&ctx, *outbuf, outlen, buf, buflen);
    EVP_EncryptFinal_ex(&ctx, *outbuf + *outlen, &finallen);

    EVP_CIPHER_CTX_cleanup(&ctx);
    //*outlen += finallen;
}

Server:

static Byte[] Decrypt(byte[] input, byte[] key, byte[] iv)
    {
        try
        {
            // Check arguments.
            if (input == null || input.Length <= 0)
                throw new ArgumentNullException("input");
            if (key == null || key.Length <= 0)
                throw new ArgumentNullException("key");
            if (iv == null || iv.Length <= 0)
                throw new ArgumentNullException("iv");

            byte[] unprotected;


            using (var encryptor = Aes.Create())
            {
                encryptor.Key = key;
                encryptor.IV = iv;
                using (var msInput = new MemoryStream(input))
                {
                    msInput.Position = 0;
                    using (
                        var cs = new CryptoStream(msInput, encryptor.CreateDecryptor(),
                            CryptoStreamMode.Read))
                    using (var data = new BinaryReader(cs))
                    using (var outStream = new MemoryStream())
                    {
                        byte[] buf = new byte[2048];
                        int bytes = 0;
                        while ((bytes = data.Read(buf, 0, buf.Length)) != 0)
                            outStream.Write(buf, 0, bytes);

                        return outStream.ToArray();
                    }
                }
            }
        }
        catch (Exception ex)
        {
            throw ex;
        }

    }

Does anyone have any clue as to why this could possibly be happening? For reference, this is the .NET method from Microsoft's reference source .sln that (I think) does the decryption: https://gist.github.com/Metaluim/fcf9a4f1012fdeb2a44f#file-rijndaelmanagedtransform-cs

asked Oct 08 '15 by Ricardo Ferreira


1 Answer

OpenSSL version differences are messy. First, I suggest you explicitly force and verify the key lengths, keys, IVs and encryption modes on both sides; I don't see that in the code. Then I would suggest you decrypt on the server side without padding. That will always succeed, and you can then inspect the last block to see whether it is what you expect.

Do this with both the Windows-encrypted and the Mac OS X-encrypted variants and you will see a difference, most likely in the padding.
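
To make that concrete, here is a rough sketch of the no-padding decryption using the OpenSSL 1.0-style API from the question (on the .NET side the equivalent is setting the Aes instance's Padding property to PaddingMode.None before creating the decryptor); it is illustrative only:

#include <stdio.h>
#include <openssl/evp.h>

/* Diagnostic sketch: decrypt with padding disabled so the final call always
 * succeeds, then dump the last block - with PKCS#7 its final N bytes must
 * all equal N (e.g. 04 04 04 04). */
static int
__decrypt_no_padding(const unsigned char *in, int inlen,
    const unsigned char *key, const unsigned char *iv,
    unsigned char *out, int *outlen)
{
    EVP_CIPHER_CTX ctx;
    int updatelen = 0;
    int finallen = 0;
    int blocklen = 0;
    int i;

    EVP_CIPHER_CTX_init(&ctx);
    EVP_DecryptInit_ex(&ctx, EVP_aes_256_cbc(), NULL, key, iv);
    EVP_CIPHER_CTX_set_padding(&ctx, 0);        /* keep the raw padding bytes */
    blocklen = EVP_CIPHER_CTX_block_size(&ctx);

    if (!EVP_DecryptUpdate(&ctx, out, &updatelen, in, inlen) ||
        !EVP_DecryptFinal_ex(&ctx, out + updatelen, &finallen)) {
        EVP_CIPHER_CTX_cleanup(&ctx);
        return 0;
    }
    *outlen = updatelen + finallen;

    for (i = *outlen - blocklen; i < *outlen; i++)  /* print the last block */
        printf("%02x ", out[i]);
    printf("\n");

    EVP_CIPHER_CTX_cleanup(&ctx);
    return 1;
}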

The outlen/padding handling in the C code looks odd. With PKCS#7 padding, encrypting a 16-byte plaintext results in 32 bytes of ciphertext, but you only provide a 16-byte buffer. This will not work: you write out of bounds. Maybe it works by chance on Windows because of a more generous memory layout and fails on Mac OS X.
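
As an illustration of the fix (a sketch built on the question's own __setup_aes helper and QVR_TRUE constant, untested): reserve a full extra block for the padding and take the real ciphertext length from the int counters OpenSSL fills in.

#include <stdlib.h>
#include <openssl/evp.h>

/* Sketch of corrected buffer handling: PKCS#7 always appends 1..blocklen
 * bytes, so the ciphertext needs up to buflen + blocklen bytes. */
void
__encrypt_fixed(void *buf, size_t buflen, const char *key,
    unsigned char **outbuf, size_t *outlen)
{
    EVP_CIPHER_CTX ctx;
    int blocklen = 0;
    int updatelen = 0;
    int finallen = 0;

    __setup_aes(&ctx, key, QVR_TRUE);           /* helper from the question */
    blocklen = EVP_CIPHER_CTX_block_size(&ctx);

    /* room for the plaintext plus a full padding block */
    *outbuf = (unsigned char *) calloc(buflen + blocklen, sizeof(unsigned char));

    /* EVP expects int* output lengths; don't pass the size_t* directly */
    EVP_EncryptUpdate(&ctx, *outbuf, &updatelen, buf, buflen);
    EVP_EncryptFinal_ex(&ctx, *outbuf + updatelen, &finallen);

    *outlen = (size_t) updatelen + (size_t) finallen;

    EVP_CIPHER_CTX_cleanup(&ctx);
}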

answered Sep 17 '22 by Johannes Overmann