I'm trying to get simple encryption/decryption working with AesManaged, but I keep getting an exception when trying to close the decryption stream. The string here gets encrypted and decrypted correctly, and then I get the CryptographicException "Padding was invalid and cannot be removed" after Console.WriteLine prints the correct string.
Any ideas?
MemoryStream ms = new MemoryStream();
byte[] rawPlaintext = Encoding.Unicode.GetBytes("This is annoying!");

using (Aes aes = new AesManaged())
{
    aes.Padding = PaddingMode.PKCS7;
    aes.Key = new byte[128/8];
    aes.IV = new byte[128/8];

    using (CryptoStream cs = new CryptoStream(ms, aes.CreateEncryptor(), CryptoStreamMode.Write))
    {
        cs.Write(rawPlaintext, 0, rawPlaintext.Length);
        cs.FlushFinalBlock();
    }

    ms = new MemoryStream(ms.GetBuffer());

    using (CryptoStream cs = new CryptoStream(ms, aes.CreateDecryptor(), CryptoStreamMode.Read))
    {
        byte[] rawData = new byte[rawPlaintext.Length];
        int len = cs.Read(rawData, 0, rawPlaintext.Length);
        string s = Encoding.Unicode.GetString(rawData);
        Console.WriteLine(s);
    }
}
Padding is a way to take data that may or may not be a multiple of the block size for a cipher and extend it out so that it is. This is required for many block cipher modes as they require the data to be encrypted to be an exact multiple of the block size.
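For a concrete picture of PKCS7 (the scheme used below), here is a rough sketch of what the padder does; the Pkcs7Pad helper is hypothetical and only for illustration, since AesManaged applies this automatically when Padding = PaddingMode.PKCS7:

// Hypothetical helper, for illustration only: pad up to the next multiple of
// the block size, with every pad byte equal to the number of bytes added.
static byte[] Pkcs7Pad(byte[] data, int blockSizeBytes)
{
    int padLen = blockSizeBytes - (data.Length % blockSizeBytes);
    byte[] padded = new byte[data.Length + padLen];
    Buffer.BlockCopy(data, 0, padded, 0, data.Length);
    for (int i = data.Length; i < padded.Length; i++)
        padded[i] = (byte)padLen;
    return padded;   // e.g. 34 plaintext bytes -> 48 bytes, ending in 14 bytes of value 0x0E
}

On decryption, the last byte tells the algorithm how many pad bytes to strip; if that byte doesn't describe valid padding, you get exactly the "Padding is invalid and cannot be removed" exception.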
AesManaged uses a symmetric-key algorithm (Rijndael/AES) to encrypt and decrypt data. As long as the encryption and decryption routines use the same parameters to generate the keys, the keys are guaranteed to be the same.
The trick is to use MemoryStream.ToArray(). I also changed your code so that it uses the CryptoStream to Write in both encrypting and decrypting. And you don't need to call CryptoStream.FlushFinalBlock() explicitly, because you have it in a using() statement, and that flush will happen on Dispose(). The following works for me.
byte[] rawPlaintext = System.Text.Encoding.Unicode.GetBytes("This is all clear now!");

using (Aes aes = new AesManaged())
{
    aes.Padding = PaddingMode.PKCS7;
    aes.KeySize = 128;            // in bits
    aes.Key = new byte[128/8];    // 16 bytes for 128 bit encryption
    aes.IV = new byte[128/8];     // AES needs a 16-byte IV
    // Should set Key and IV here. Good approach: derive them from
    // a password via Cryptography.Rfc2898DeriveBytes

    byte[] cipherText = null;
    byte[] plainText = null;

    using (MemoryStream ms = new MemoryStream())
    {
        using (CryptoStream cs = new CryptoStream(ms, aes.CreateEncryptor(), CryptoStreamMode.Write))
        {
            cs.Write(rawPlaintext, 0, rawPlaintext.Length);
        }
        cipherText = ms.ToArray();
    }

    using (MemoryStream ms = new MemoryStream())
    {
        using (CryptoStream cs = new CryptoStream(ms, aes.CreateDecryptor(), CryptoStreamMode.Write))
        {
            cs.Write(cipherText, 0, cipherText.Length);
        }
        plainText = ms.ToArray();
    }

    string s = System.Text.Encoding.Unicode.GetString(plainText);
    Console.WriteLine(s);
}
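As an aside, the original exception comes from GetBuffer(): it returns the stream's entire internal buffer, including unused capacity, so the decryptor also processes trailing zero bytes and the final block fails the padding check. A minimal sketch of the difference (the exact buffer size may vary by runtime):

using System;
using System.IO;

class GetBufferVsToArray
{
    static void Main()
    {
        var ms = new MemoryStream();
        ms.Write(new byte[48], 0, 48);               // pretend these 48 bytes are ciphertext

        Console.WriteLine(ms.GetBuffer().Length);    // whole internal buffer, e.g. 256
        Console.WriteLine(ms.ToArray().Length);      // only the 48 bytes actually written
    }
}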
Also, I guess you know you will want to explicitly set the Mode of the AesManaged instance, and use System.Security.Cryptography.Rfc2898DeriveBytes to derive the Key and IV from a password and salt.
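A minimal sketch of that, assuming a placeholder password, salt, and iteration count (in real code, use a random salt that you persist alongside the ciphertext):

byte[] salt = { 1, 2, 3, 4, 5, 6, 7, 8 };                 // placeholder; use a random, persisted salt
var kdf = new Rfc2898DeriveBytes("my password", salt, 10000);

using (Aes aes = new AesManaged())
{
    aes.Mode = CipherMode.CBC;                            // set the mode explicitly
    aes.Padding = PaddingMode.PKCS7;
    aes.KeySize = 128;
    aes.Key = kdf.GetBytes(aes.KeySize / 8);              // 16 bytes
    aes.IV  = kdf.GetBytes(aes.BlockSize / 8);            // 16 bytes

    // ...create the encryptor/decryptor exactly as above...
}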
see also:
- AesManaged
This exception can be caused by a mismatch of any one of a number of encryption parameters.
I used the Security.Cryptography.Debug interface to trace all parameters used in the encrypt/decrypt methods.
Finally I found out that my problem was that I set the KeySize property after setting the Key, which caused the class to regenerate a random key instead of using the key that I had initially set.
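A minimal sketch of that pitfall, assuming a 128-bit key; the order of the two assignments is what matters:

byte[] myKey = new byte[16];   // the key you actually want to use

using (Aes aes = new AesManaged())
{
    // Problem order: assigning KeySize after Key discards the key you just set,
    // so a new random key is generated the next time Key is read.
    aes.Key = myKey;
    aes.KeySize = 128;

    // Safe order: set KeySize first, then assign the Key.
    aes.KeySize = 128;
    aes.Key = myKey;
}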