Is Defense Winning? Measuring if Cyberspace is Becoming More Defensible and Resilient
This talk presents a framework for measuring the effectiveness of cybersecurity defenses by analyzing threat, vulnerability, and impact indicators over time. It argues that the industry lacks a standardized, time-series-based approach to determine if defensive efforts are actually shifting the advantage away from adversaries. The speaker proposes using metrics like mean-time-to-detect (MTTD), zero-day price indices, and software vulnerability trends to create a more disciplined, data-driven assessment of global cyber resilience. The presentation serves as a call to action for the security community to move beyond anecdotal evidence and adopt rigorous, longitudinal data analysis.
Measuring Cyber Resilience: Why Your Current Metrics Are Failing
TLDR: Most security metrics track inputs like headcount or blocked scans rather than actual defensive outcomes. This talk proposes a shift toward longitudinal, time-series data—such as mean-time-to-detect (MTTD) and zero-day price indices—to determine if we are actually gaining ground against adversaries. By adopting a standardized framework for measuring threat, vulnerability, and impact, we can move from anecdotal security to a data-driven understanding of global cyber resilience.
Security teams are drowning in data but starving for insight. Every year, we see the same cycle: a new high-severity vulnerability drops, the industry panics, we patch, and then we wait for the next one. We report on how many patches were applied or how many phishing emails were blocked, but these are vanity metrics. They track activity, not effectiveness. If you are a pentester or a researcher, you know the reality: the adversary only needs to be right once, and they are constantly evolving their tactics, techniques, and procedures (TTPs).
The Problem with Current Security Measurement
For decades, we have lacked a consistent way to measure whether the defensive side of the house is actually winning. We treat security as a series of isolated incidents rather than a long-term, systemic struggle. When we look at the OWASP Top 10, we see categories that have persisted for years. If our defenses were truly improving, we would see these categories shrink or disappear. Instead, we see them shift.
The core issue is that our metrics are optimized for those with purchase authority—CISOs and boards—rather than for those who actually understand the threat landscape. We report on "security posture" or "threat actors," terms that sound professional in a slide deck but provide zero actionable intelligence for a red teamer or a developer. We need to stop counting how many raindrops our umbrella is stopping and start measuring whether we are actually staying dry.
Moving Toward Longitudinal Data
To measure progress, we need to look at data that reflects the actual friction we impose on attackers. The speaker highlights several key indicators that, when viewed over time, provide a much clearer picture of the battlefield:
- Mean-Time-to-Detect (MTTD): This is a direct measure of the cat-and-mouse game between attacker stealth and defender visibility. If the time it takes to detect an intrusion is decreasing across the industry, that is a win. We have seen this trend in recent reports from Mandiant, where dwell times have dropped significantly over the last few years.
- Zero-Day Price Indices: Just as economists track the price of a basket of goods to measure inflation, we should track the cost of exploits. If the cost of a high-quality exploit chain is rising, it means we are successfully raising the cost of entry for attackers.
- Software Vulnerability Trends: Using data from sources like Veracode’s State of Software Security, we can track the prevalence of specific vulnerability classes. A downward trend in memory-safety issues in a major codebase is a concrete, measurable improvement in security.
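The trend check behind the first two indicators can be as simple as fitting a slope to a yearly series. The sketch below uses invented numbers (not real Mandiant dwell-time figures or actual exploit-market prices) purely to show the shape of the analysis: a negative MTTD slope and a positive exploit-price slope both mean defenders are imposing more friction.

```python
# Sketch: is an indicator trending the right way over time?
# All numbers below are fabricated for illustration; they are not
# real Mandiant dwell-time or exploit-market data.

def slope(series):
    """Least-squares slope of evenly spaced yearly observations."""
    n = len(series)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(series) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, series))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

# Hypothetical median dwell time in days, one value per year.
mttd_days = [78, 56, 24, 21, 16, 10]
# Hypothetical price index for a basket of exploit chains,
# normalized to 100 in the base year (the "basket of goods" idea).
zero_day_index = [100, 115, 140, 180, 260, 310]

print(f"MTTD slope: {slope(mttd_days):+.1f} days/year")         # negative = defenders gaining
print(f"Exploit price slope: {slope(zero_day_index):+.1f}/year") # positive = attacker costs rising
```

A single year of either number tells you nothing; it is the sign and steepness of the slope across years that answers "are we winning?"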
Why This Matters for Pentesters
When you are on an engagement, you are essentially testing the efficacy of these metrics in real-time. If you find that you are consistently able to use the same TTPs—like T1078 (Valid Accounts) or T1566 (Phishing)—against a target year after year, the organization is not improving. They are just maintaining a status quo.
The goal of this research is to build a framework that allows us to see if the "attack surface" is actually becoming more resilient. For example, if we look at the NVD (National Vulnerability Database) for specific software components, we should be able to see if the frequency of critical bugs is decreasing as the development team adopts more secure coding practices. If the data shows no change, the team is not "shifting left"; they are just moving in circles.
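That NVD frequency check is easy to operationalize. The sketch below assumes you have already exported records for one component into `(cve_id, year, cvss_score)` tuples; the records shown are fabricated placeholders, not real CVEs.

```python
from collections import Counter

# Fabricated NVD-style records for one hypothetical component:
# (CVE id, publication year, CVSS base score). Not real CVEs.
records = [
    ("CVE-2020-0001", 2020, 9.8), ("CVE-2020-0002", 2020, 9.1),
    ("CVE-2020-0003", 2020, 7.5), ("CVE-2021-0004", 2021, 9.0),
    ("CVE-2021-0005", 2021, 6.5), ("CVE-2022-0006", 2022, 9.4),
    ("CVE-2023-0007", 2023, 5.3),
]

CRITICAL = 9.0  # CVSS v3 "critical" severity threshold

def critical_per_year(recs):
    """Count critical-severity CVEs per publication year, including zero years."""
    counts = Counter(year for _, year, score in recs if score >= CRITICAL)
    years = range(min(y for _, y, _ in recs), max(y for _, y, _ in recs) + 1)
    return {y: counts.get(y, 0) for y in years}

print(critical_per_year(records))  # {2020: 2, 2021: 1, 2022: 1, 2023: 0}
```

A flat or rising series here is the data-driven version of "moving in circles": the secure-coding initiative is not changing the output, whatever the activity metrics say.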
A Call for Discipline
We cannot wait another decade for perfect, academic-grade data. We need to start applying discipline to the data we already have. This means:
- Standardizing our definitions: We need to agree on what constitutes a "catastrophic incident" or a "successful exploit" so we can compare apples to apples.
- Reporting in time-series: A snapshot of your security status is useless. You need to see the trend line. Are you getting better, or are you just getting lucky?
- Focusing on the ecosystem: We are all connected. A vulnerability in a core internet infrastructure component affects everyone. We need to measure the health of the supply chain, not just our own internal perimeter.
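The first two points above can be sketched concretely: pin the severity definitions down in code so every team means the same thing by "catastrophic," then report the metric as a time series rather than a snapshot. The schema and incident data below are hypothetical, one possible shape for such a framework.

```python
from dataclasses import dataclass
from datetime import date
from enum import Enum

class Severity(Enum):
    """Shared vocabulary: every team maps incidents to the same scale."""
    ROUTINE = 1        # contained by automated controls
    SIGNIFICANT = 2    # required manual response, no data loss
    CATASTROPHIC = 3   # confirmed data loss or sustained outage

@dataclass(frozen=True)
class Incident:
    detected: date
    occurred: date
    severity: Severity

    @property
    def days_to_detect(self) -> int:
        return (self.detected - self.occurred).days

def quarterly_mttd(incidents):
    """Mean time-to-detect per (year, quarter) — a trend line, not a snapshot."""
    buckets = {}
    for i in incidents:
        key = (i.detected.year, (i.detected.month - 1) // 3 + 1)
        buckets.setdefault(key, []).append(i.days_to_detect)
    return {k: sum(v) / len(v) for k, v in sorted(buckets.items())}

# Fabricated incidents for illustration.
log = [
    Incident(date(2024, 2, 10), date(2024, 1, 1), Severity.SIGNIFICANT),
    Incident(date(2024, 5, 5), date(2024, 4, 20), Severity.ROUTINE),
    Incident(date(2024, 8, 3), date(2024, 7, 30), Severity.CATASTROPHIC),
]
print(quarterly_mttd(log))  # {(2024, 1): 40.0, (2024, 2): 15.0, (2024, 3): 4.0}
```

Once the definitions are frozen in a shared schema, two organizations (or two years of the same organization) can finally be compared apples to apples.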
We are in this field to make a difference, not just to collect a paycheck. If we want to leave the world in a better place for the next generation of researchers, we need to stop relying on gut feelings and start building the data-driven, longitudinal models that prove our work is actually moving the needle. The next time you are writing a report or filing a bug, ask yourself: does this finding represent a systemic failure that we can measure and fix, or is it just another drop in the bucket? Start tracking the trends, and you will quickly see which one it is.