I am working on replicating C# SignedCms functionality in Java using the Bouncy Castle libraries. The problem is that the signature I get from Java differs from the one generated with SignedCms.
X509Certificate2 certificate = new X509Certificate2("myCertPath", "myPass");
String text = "text";
ContentInfo contentInfo = new ContentInfo(System.Text.Encoding.UTF8.GetBytes(text));
SignedCms cms = new SignedCms(contentInfo, false);
CmsSigner signer = new CmsSigner(certificate);
signer.IncludeOption = X509IncludeOption.None;
signer.DigestAlgorithm = new Oid("SHA1");
cms.ComputeSignature(signer, false);
byte[] signature = cms.Encode();
Console.WriteLine(BitConverter.ToString(signature).Replace("-", ""));
Security.addProvider(new BouncyCastleProvider());
char[] password = "myPass".toCharArray();
String text = "text";
FileInputStream fis = new FileInputStream("myCertPath");
KeyStore ks = KeyStore.getInstance("pkcs12");
ks.load(fis, password);
String alias = ks.aliases().nextElement();
PrivateKey pKey = (PrivateKey)ks.getKey(alias, password);
X509Certificate cert = (X509Certificate)ks.getCertificate(alias);
java.util.List<X509Certificate> certList = new ArrayList<>();
Store certs = new JcaCertStore(certList);
CMSSignedDataGenerator gen = new CMSSignedDataGenerator();
JcaSimpleSignerInfoGeneratorBuilder builder = new JcaSimpleSignerInfoGeneratorBuilder().setProvider("BC").setDirectSignature(true);
gen.addSignerInfoGenerator(builder.build("SHA1withRSA", pKey, cert));
gen.addCertificates(certs);
CMSTypedData msg = new CMSProcessableByteArray(text.getBytes(StandardCharsets.UTF_8));
CMSSignedData s = gen.generate(msg, false);
System.out.println(Hex.toHexString(s.getEncoded()));
Neither output includes the X.509 certificates (IncludeOption is None on the C# side, and the Java certificate store is empty).
length=434 308201AE06092A864886F70D010702A082019F3082019B020101310B300906052B0E03021A0500301306092 A864886F70D010701A006040474657874318201723082016E0201013081CB3081B6310B3009060355040613 02555331173015060355040A130E566572695369676E2C20496E632E311F301D060355040B1316566572695 369676E205472757374204E6574776F726B313B3039060355040B13325465726D73206F6620757365206174 2068747470733A2F2F7777772E766572697369676E2E636F6D2F7270612028632930393130302E060355040 31327566572695369676E20436C617373203320436F6465205369676E696E6720323030392D322043410210 1763F9A88334A01FFB3B7BAB384A9B93300906052B0E03021A0500300D06092A864886F70D0101010500048 1800B866A9A7045E3C86E5DB69CDAD5CED211A4A2362BCC4DDB2742BF0CDB65BC88556C97A6C08D68F8070D 89CC78ACD84A636F15B40D166E461411C6A04D5EC379283988DA4258B684FFEF9F08B293A03A0B40900E245 874D8C0587BBD58BDD915A50D27456E6EEB883846CAC485853BA5E22E45D333C940A958E641A00C9602B9
length=428 308006092A864886F70D010702A0803080020101310B300906052B0E03021A0500308006092A864886F70D0 107010000318201723082016E0201013081CB3081B6310B300906035504061302555331173015060355040A 130E566572695369676E2C20496E632E311F301D060355040B1316566572695369676E205472757374204E6 574776F726B313B3039060355040B13325465726D73206F66207573652061742068747470733A2F2F777777 2E766572697369676E2E636F6D2F7270612028632930393130302E06035504031327566572695369676E204 36C617373203320436F6465205369676E696E6720323030392D3220434102101763F9A88334A01FFB3B7BAB 384A9B93300906052B0E03021A0500300D06092A864886F70D01010105000481800B866A9A7045E3C86E5DB 69CDAD5CED211A4A2362BCC4DDB2742BF0CDB65BC88556C97A6C08D68F8070D89CC78ACD84A636F15B40D16 6E461411C6A04D5EC379283988DA4258B684FFEF9F08B293A03A0B40900E245874D8C0587BBD58BDD915A50 D27456E6EEB883846CAC485853BA5E22E45D333C940A958E641A00C9602B9000000000000
I am stuck on this issue.
The Java output was BER encoded, but I needed a DER encoded signature. To convert BER to DER I used:
ByteArrayOutputStream bOut = new ByteArrayOutputStream();
DEROutputStream dOut = new DEROutputStream(bOut);
dOut.writeObject(s.toASN1Structure().toASN1Primitive());
dOut.close();
byte[] encoded = bOut.toByteArray();
Now the outputs are the same.
The good news: nothing is wrong.
Take a look at the beginning of both resulting DER encodings:
C#: 308201AE...
Java: 3080...
The C# encoding is in definite length form: 30 indicates a SEQUENCE, 82 indicates a definite length encoding whose length is held in the next two bytes, and 01AE is the actual length value, 430. The 430 bytes that follow plus the 4 read so far make up the total of 434 bytes.
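The length arithmetic above can be sketched in a few lines. This is a hypothetical helper, not part of any library; the header bytes are taken directly from the start of the C# output (30 82 01 AE):

```java
public class DerLengthDemo {
    // Decodes the content length from a TLV header, assuming the tag is a
    // single byte and the length is in either short form (one octet) or
    // long form (0x8N followed by N length octets), as in the C# output.
    static int readDefiniteLength(byte[] header) {
        int first = header[1] & 0xFF;
        if ((first & 0x80) == 0) {
            return first;                 // short form: length fits in one octet
        }
        int numOctets = first & 0x7F;     // long form: low bits give octet count
        int length = 0;
        for (int i = 0; i < numOctets; i++) {
            length = (length << 8) | (header[2 + i] & 0xFF);
        }
        return length;
    }

    public static void main(String[] args) {
        byte[] header = {0x30, (byte) 0x82, 0x01, (byte) 0xAE};
        int contentLength = readDefiniteLength(header);
        // 430 content bytes + 1 tag byte + 3 length bytes = 434 bytes total
        System.out.println(contentLength);     // 430
        System.out.println(contentLength + 4); // 434
    }
}
```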
The Java encoding, on the other hand, indicates an indefinite length encoding (the 80). Strictly speaking, this is no longer a DER encoding but a BER encoding. No explicit length is given for the element; instead, the element ends with a special END OF CONTENTS element, which is encoded as 0000. You'll notice quite a few of them at the end of the Java encoding. More about the details in this guide to BER/DER.
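Telling the two forms apart programmatically only requires looking at the length octet following the outer SEQUENCE tag. A minimal sketch (the byte values are copied from the two outputs above; the helper is illustrative, not a library API):

```java
public class BerFormDemo {
    // The indefinite form is signalled by the single length octet 0x80;
    // any other value starts a definite (short or long form) length.
    static boolean isIndefinite(byte[] encoding) {
        return (encoding[1] & 0xFF) == 0x80;
    }

    public static void main(String[] args) {
        byte[] csharpStart = {0x30, (byte) 0x82, 0x01, (byte) 0xAE}; // definite length
        byte[] javaStart   = {0x30, (byte) 0x80};                    // indefinite length
        System.out.println(isIndefinite(csharpStart)); // false: DER-style definite
        System.out.println(isIndefinite(javaStart));   // true: BER indefinite
    }
}
```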
The rest of the two structures is exactly identical, even the signature value itself. It's just that the Java version uses indefinite lengths while the C# version uses definite lengths. If the verifying party understands both BER and DER encodings, the two signatures will be identical up to encoding. And the encoding won't play a role in the signature verification process. Here's what the CMS RFC says with regard to this:
With signedAttrs
present:
Specifically, the initial input is the encapContentInfo eContent OCTET STRING to which the signing process is applied. Only the octets comprising the value of the eContent OCTET STRING are input to the message digest algorithm, not the tag or the length octets.
Without signedAttrs
:
When the signedAttrs field is absent, only the octets comprising the value of the SignedData encapContentInfo eContent OCTET STRING (e.g., the contents of a file) are input to the message digest calculation. This has the advantage that the length of the content being signed need not be known in advance of the signature generation process.
In other words: Only the bytes comprising the actual value of the eContent
are hashed, and really only those. Neither its tag nor its length and also not the tags and lengths of its chunks (in the case of an indefinite constructed encoding) may be hashed in the process. I'll admit, there are implementations that get this wrong and it's clearly a quite complicated issue.
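The RFC's rule can be demonstrated with plain java.security.MessageDigest, independent of Bouncy Castle. The sketch below hashes the raw value octets of "text" and compares that against hashing the value extracted from a definite-length OCTET STRING wrapper (04 04 74 65 78 74); since the tag and length are stripped before digesting, the results match:

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.util.Arrays;

public class EContentDigestDemo {
    public static void main(String[] args) throws Exception {
        byte[] value = "text".getBytes(StandardCharsets.UTF_8);

        // Definite-length OCTET STRING encoding of "text": tag 04, length 04, then the value
        byte[] wrapped = {0x04, 0x04, 0x74, 0x65, 0x78, 0x74};
        // Per the RFC, strip tag and length octets before hashing
        byte[] valueFromWrapped = Arrays.copyOfRange(wrapped, 2, wrapped.length);

        byte[] d1 = MessageDigest.getInstance("SHA-1").digest(value);
        byte[] d2 = MessageDigest.getInstance("SHA-1").digest(valueFromWrapped);

        // Same value octets, same digest, regardless of how the wrapper was encoded
        System.out.println(Arrays.equals(d1, d2)); // true
    }
}
```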
While it adds a lot of complexity and interoperability issues, it makes sense for one reason (besides being a few bytes smaller): If you produce 'attached signatures' (the ones where the original document is embedded within the EncapContentInfo
element), choosing indefinite lengths allows you to create and verify the signature in a streaming manner: you can read or write chunk by chunk. Whereas for definite lengths you have to read/write the whole thing at once because you need to know the length in advance in order to create the final Tag-Length-Value format of DER encoding. The idea of being able to do streaming IO is very powerful in this context: imagine you want to create an attached signature of a log file several GB large - any non-streaming approach will quickly run out of memory.
The Java version of Bouncy Castle added streaming support in the context of CMS a while ago, chances are high that it won't be too long until the C# version picks it up.