Black Hat 2024

The Hack@DAC Story: Learnings from Organizing the World's Largest Hardware Hacking Competition

Black Hat · 825 views · 37:55 · about 1 year ago

This talk details the methodology and educational impact of the Hack@DAC hardware capture-the-flag (CTF) competition, which focuses on identifying vulnerabilities in open-source system-on-chip (SoC) designs. The presenters demonstrate how hardware security flaws, such as improper finite state machine implementations and key leakage, can be exploited by software. The session highlights the importance of a 'shift-left' security mindset in hardware development and the need for security-aware design automation tools. It also discusses the integration of these findings into the MITRE Common Weakness Enumeration (CWE) database.

Hardware Security is Broken: Why Your SoC Design is Leaking Secrets

TL;DR: Hardware security is often ignored until it is too late, but research from the Hack@DAC competition shows that SoC vulnerabilities like key leakage and privilege escalation are readily exploitable from software. By shifting security testing left into the RTL design phase, developers can catch these flaws before they are baked into silicon. This post breaks down how to identify these hardware-level weaknesses and why your next penetration test should include a review of the underlying hardware architecture.

Hardware security has long been treated as a black box. Pentesters and researchers focus on the software stack, assuming the silicon underneath is a trusted foundation. That assumption is a liability. When a system-on-chip (SoC) is designed with improper finite state machine (FSM) logic or insecure debug interfaces, the entire security model of the device collapses. The Hack@DAC competition, which has been running for seven years, provides a masterclass in how these hardware-level bugs are not just theoretical—they are practical, exploitable, and often trivial to trigger once you understand the hardware's internal state.

The Anatomy of a Hardware Vulnerability

Most hardware security research focuses on side-channel attacks or physical fault injection. While those are valid, the most dangerous bugs are often logic errors in the register transfer level (RTL) code. These are the "software-exploitable hardware bugs" that allow an attacker to bypass security features without needing a soldering iron or a laser.

Consider the common requirement for key clearing. When a device enters a debug mode, it should ideally wipe sensitive cryptographic keys from memory to prevent extraction. A common implementation error occurs when the hardware logic fails to check the debug state for every single register bank. If the logic for key_big0 and key_big1 correctly checks the debug_mode signal, but the logic for key_big2 omits that check, the key remains readable. An attacker with basic software access can simply read the memory-mapped register associated with that key, effectively bypassing the entire hardware-based security objective.
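The flawed logic described above can be sketched as a small Python model. The register and signal names (`debug_mode`, `key_big0` through `key_big2`) follow the prose; the actual design would be Verilog or SystemVerilog, and this is only an illustrative model of the bug, not code from any real SoC.

```python
# Python model of the flawed key-clearing logic: two register banks are
# correctly gated on debug_mode, the third (copy-pasted) one is not.

class KeyRegisterBank:
    def __init__(self):
        # Three memory-mapped key registers holding secret material.
        self.key_big0 = 0xDEADBEEF
        self.key_big1 = 0xCAFEBABE
        self.key_big2 = 0xFEEDFACE
        self.debug_mode = False

    def enter_debug_mode(self):
        """Model the transition into debug mode, where keys must be wiped."""
        self.debug_mode = True
        # key_big0 and key_big1 correctly check debug_mode and clear...
        self.key_big0 = 0
        self.key_big1 = 0
        # BUG: the logic for key_big2 omits the debug_mode check entirely,
        # so the key survives and remains software-readable.

    def read(self, name):
        """Any software with access to the MMIO window can read a register."""
        return getattr(self, name)

bank = KeyRegisterBank()
bank.enter_debug_mode()
print(hex(bank.read("key_big0")))  # 0x0 -- cleared as intended
print(hex(bank.read("key_big2")))  # 0xfeedface -- key leaked in debug mode
```

From the attacker's side, the exploit is nothing more than the final `read` call: a plain register read after forcing the device into debug mode.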

This is not a hypothetical scenario. It is a direct result of design complexity. As SoCs grow in size, engineers often copy-paste logic blocks. If the original block had a security flaw, or if the integration of that block into the larger fabric misses a specific signal check, the vulnerability propagates. You can find the full list of these hardware-specific weaknesses in the MITRE Hardware CWE database, which now includes a dedicated section for hardware design flaws.

Shifting Security Left in Hardware

The cost of fixing a bug in hardware is exponential compared to software. Once a chip is fabricated, a logic error is permanent. This is why the industry is pushing for a "shift-left" approach, where security verification happens at the RTL design phase.

For a pentester, this means your engagement should start earlier. If you are auditing an embedded device, don't just look at the firmware; study the underlying silicon architecture. Open-source projects such as OpenTitan and Pulpino are excellent references for how modern SoCs are built. By analyzing their Verilog or SystemVerilog code, you can learn to spot potential FSM flaws or insecure access-control logic before you even touch the physical hardware.
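As a sketch of the kind of FSM flaw worth hunting for, consider a transition table with a leftover debug path. The states, inputs, and the `dbg_force` event below are hypothetical, invented for illustration; they do not come from OpenTitan, Pulpino, or any real design.

```python
# Hypothetical model of an access-control FSM with a flawed transition
# table: a leftover debug transition skips the key check entirely.

LOCKED, CHALLENGE, UNLOCKED = "LOCKED", "CHALLENGE", "UNLOCKED"

# Intended flow: LOCKED -> CHALLENGE (on request) -> UNLOCKED (on valid key).
TRANSITIONS = {
    (LOCKED, "request"): CHALLENGE,
    (CHALLENGE, "valid_key"): UNLOCKED,
    (CHALLENGE, "bad_key"): LOCKED,
    # BUG: a debug transition left in the table lets an attacker jump
    # straight to UNLOCKED without ever presenting a key.
    (LOCKED, "dbg_force"): UNLOCKED,
}

def step(state, event):
    # Missing-entry behavior: hold the current state (a common RTL
    # default branch in a case statement).
    return TRANSITIONS.get((state, event), state)

state = LOCKED
state = step(state, "dbg_force")  # attacker-controlled input
print(state)  # UNLOCKED -- the security objective is bypassed
```

Formal verification tools catch exactly this class of bug: you assert that `UNLOCKED` is only reachable through `valid_key`, and the tool produces the `dbg_force` counterexample.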

If you want to test your own tooling, you can use the Hack@DAC framework. It provides a "buggy" SoC design that mimics real-world flaws. You can run static analysis tools or formal verification methods against this RTL to see if your scanners can catch the key leakage bug mentioned earlier.

The Pentester’s Toolkit for Hardware

Testing hardware logic requires a different mindset than testing a web application. You are looking for state-machine transitions that shouldn't happen and signals that aren't properly gated.

When you are on an engagement, look for the debug interfaces. If you can access the JTAG or UART ports, you are already halfway to the internal registers. Use tools like OpenOCD to interact with the processor core. If the hardware designer failed to implement proper privilege separation, you might find that an unprivileged user-mode process can access the same memory-mapped registers as the kernel. This is a classic privilege escalation path that exists entirely in the hardware logic.
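The privilege-escalation path above comes down to an address decoder that never consults the core's privilege level. The model below is hypothetical: the base address, register contents, and `mmio_read` interface are invented for illustration, not taken from any real memory map.

```python
# Hypothetical model of a flawed MMIO address decoder. The hardware is
# supposed to gate the crypto-key window on the core's privilege level,
# but the check was never wired in.

KEY_BASE = 0x40000000
REGISTERS = {KEY_BASE: 0xFEEDFACE}  # sensitive crypto-key register

def mmio_read(addr, privileged):
    # Intended behavior: reject unprivileged reads of the key window.
    # BUG: `privileged` is never consulted, so a user-mode process sees
    # exactly the same registers as the kernel.
    return REGISTERS.get(addr, 0)

print(hex(mmio_read(KEY_BASE, privileged=False)))  # 0xfeedface
```

No firmware patch can add the missing comparator; at best, software can try to trap the access earlier in the stack, which is exactly why these bugs are so valuable to find.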

The OWASP Embedded Application Security project provides a good starting point for understanding the threat landscape, but it often stops at the firmware level. You need to go deeper. If you can identify a vulnerability in the hardware, you have found a flaw that no firmware patch can ever truly fix.

Why This Matters Now

We are seeing a surge in custom silicon for AI and edge computing. These chips are being designed at breakneck speeds, often by teams that prioritize performance over security. When you combine that speed with the complexity of modern RISC-V architectures, you get a recipe for disaster.

The next time you are tasked with assessing a device, ask for the hardware design specifications. If the vendor has no threat model for their SoC, they are already vulnerable. Start by looking for the simple things: are the debug interfaces locked down? Is there a clear path from user space to sensitive cryptographic registers? If debug is open or such a path exists, you have found your entry point. Hardware security is no longer a niche field for electrical engineers; it is the new frontier for every serious security researcher. Stop treating the silicon as a black box and start auditing the logic that runs the world.
