Thanks to Loading an ECC private key in .NET, I'm able to load ECC private keys into .NET Core 3 and perform signing operations with them.
I have, however, run into one key that cannot be loaded by ECDsa.ImportECPrivateKey. What's weird is that simply running it through openssl rewrites the key bytes into something that .NET Core 3 can understand.
Code to import the failing private key (this is the actual key that fails):
var ecdsa = ECDsa.Create();
// Base64 payload of the PEM file below (the DER-encoded ECPrivateKey structure)
var pem = "MHYCAQEEH5t2Xlmsw5uqw3W9+/3nosFi6i3V901uW6ZzUpvVM0qgCgYIKoZIzj0DAQehRANCAASck2UuMxfyDYBdJC0mHNeToqMBhJuMZYSgkUNbK/xzD7e3cwr5okPx0pZdSMfDmyi1dBujtIIxFK9va1bdVAR9";
var derArray = Convert.FromBase64String(pem);
ecdsa.ImportECPrivateKey(derArray, out _);
The ImportECPrivateKey call fails with a System.Security.Cryptography.CryptographicException: "ASN1 corrupted data", thrown from System.Security.Cryptography.EccKeyFormatHelper.FromECPrivateKey(ReadOnlyMemory`1 keyData, AlgorithmIdentifierAsn& algId, ECParameters& ret).
The original PEM file looks like this:
$ cat private_key_cert_265.pem
-----BEGIN EC PRIVATE KEY-----
MHYCAQEEH5t2Xlmsw5uqw3W9+/3nosFi6i3V901uW6ZzUpvVM0qgCgYIKoZIzj0D
AQehRANCAASck2UuMxfyDYBdJC0mHNeToqMBhJuMZYSgkUNbK/xzD7e3cwr5okPx
0pZdSMfDmyi1dBujtIIxFK9va1bdVAR9
-----END EC PRIVATE KEY-----
Running it through openssl converts the private key to something else:
$ openssl ec -in private_key_cert_265.pem
read EC key
writing EC key
-----BEGIN EC PRIVATE KEY-----
MHcCAQEEIACbdl5ZrMObqsN1vfv956LBYuot1fdNblumc1Kb1TNKoAoGCCqGSM49
AwEHoUQDQgAEnJNlLjMX8g2AXSQtJhzXk6KjAYSbjGWEoJFDWyv8cw+3t3MK+aJD
8dKWXUjHw5sotXQbo7SCMRSvb2tW3VQEfQ==
-----END EC PRIVATE KEY-----
Using this form of the PEM file, .NET Core 3 can import the private key.
My question is: what is going on? Why is openssl changing the private key to another format (and how can I tell which format is which?), and why can .NET Core 3 understand one format but not the other?
The initial key is incorrectly formatted: the private value S has not been left-padded correctly. S is transmitted as an unsigned, big-endian integer inside an ASN.1 OCTET STRING, and by definition of the format this octet string (byte array) must have exactly the key size in bytes. That means it must be 32 bytes for the 256-bit curve you are using (secp256r1, also known as prime256v1 and NIST P-256).
So your code and the .NET code are correct. OpenSSL is simply more liberal in what it accepts, but it writes out the value S as it should be.
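You can confirm the required length from .NET itself: import the OpenSSL-corrected key and export its parameters, and the private scalar D comes back as a full 32 bytes with the leading zero byte in place. A minimal sketch of that check (the class name is mine; the key is the corrected one from the question):

using System;
using System.Security.Cryptography;

class PaddingCheck
{
    static void Main()
    {
        // The OpenSSL-corrected key from the question; this one imports fine.
        const string fixedBase64 =
            "MHcCAQEEIACbdl5ZrMObqsN1vfv956LBYuot1fdNblumc1Kb1TNKoAoGCCqGSM49" +
            "AwEHoUQDQgAEnJNlLjMX8g2AXSQtJhzXk6KjAYSbjGWEoJFDWyv8cw+3t3MK+aJD" +
            "8dKWXUjHw5sotXQbo7SCMRSvb2tW3VQEfQ==";

        using var ecdsa = ECDsa.Create();
        ecdsa.ImportECPrivateKey(Convert.FromBase64String(fixedBase64), out _);

        // For P-256 the private scalar D always exports as a full 32 bytes,
        // including the leading 0x00 that the broken encoding dropped.
        ECParameters p = ecdsa.ExportParameters(includePrivateParameters: true);
        Console.WriteLine(p.D.Length);        // 32
        Console.WriteLine($"0x{p.D[0]:X2}");  // 0x00
    }
}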
Below is the initial private key, decoded as ASN.1:
SEQUENCE (4 elem)
  INTEGER 1
  OCTET STRING (31 byte) 9B765E59ACC39BAAC375BDFBFDE7A2C162EA2DD5F74D6E5BA673529BD5334A
  [0] (1 elem)
    OBJECT IDENTIFIER 1.2.840.10045.3.1.7 prime256v1 (ANSI X9.62 named elliptic curve)
  [1] (1 elem)
    BIT STRING (520 bit) 0000010010011100100100110110010100101110001100110001011111110010000011…
and here is the OpenSSL-corrected one:
SEQUENCE (4 elem)
  INTEGER 1
  OCTET STRING (32 byte) 009B765E59ACC39BAAC375BDFBFDE7A2C162EA2DD5F74D6E5BA673529BD5334A
  [0] (1 elem)
    OBJECT IDENTIFIER 1.2.840.10045.3.1.7 prime256v1 (ANSI X9.62 named elliptic curve)
  [1] (1 elem)
    BIT STRING (520 bit) 0000010010011100100100110110010100101110001100110001011111110010000011…
Note the size of the OCTET STRING and the leading zero byte in the corrected version.
This private key has now been compromised, of course, since it was posted publicly. Note that a key can also be missing two leading 00 bytes, or three, and so on, just with smaller probability each time: roughly 1 in 256 random keys starts with one zero byte, about 1 in 65,536 with two, etc.
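If running every such key through openssl is not convenient, this particular mis-encoding can also be repaired in code before the import. Below is a minimal sketch, not a general ASN.1 fixer: it assumes exactly the layout shown in the dumps above (short-form lengths, version INTEGER 1, named curve) and left-pads the private-scalar OCTET STRING, patching the two affected length bytes. The class and method names are mine:

using System;

static class EcKeyFixer
{
    // Left-pads the private scalar of a SEC1 ECPrivateKey to the curve's
    // field size. Assumes the layout from the dumps above:
    // SEQUENCE(short len) { INTEGER 1, OCTET STRING(short len), [0], [1] }.
    public static byte[] LeftPadScalar(byte[] der, int fieldSizeBytes = 32)
    {
        if (der.Length < 7 || der[0] != 0x30 || der[5] != 0x04)
            throw new ArgumentException("Unexpected ECPrivateKey layout.");

        int pad = fieldSizeBytes - der[6];
        if (pad <= 0)
            return der; // scalar already has the full size, nothing to fix

        var result = new byte[der.Length + pad];
        Array.Copy(der, 0, result, 0, 7);   // header, up to the scalar length
        result[1] = (byte)(der[1] + pad);   // patch the outer SEQUENCE length
        result[6] = (byte)fieldSizeBytes;   // patch the OCTET STRING length
        // result[7 .. 7+pad) stays 0x00; shift the scalar and the rest right
        Array.Copy(der, 7, result, 7 + pad, der.Length - 7);
        return result;
    }
}

With the key from the question, LeftPadScalar turns the MHYCAQEE… blob into exactly the MHcCAQEE… blob that openssl produced, after which ImportECPrivateKey succeeds.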