Purple Teaming and Adversary Emulation
This panel discussion explores the strategic and operational implementation of purple teaming and adversary emulation within enterprise security environments. The speakers emphasize the importance of breaking down silos between red and blue teams to improve detection capabilities and security posture. Key takeaways include the necessity of using metrics to demonstrate value to leadership and the importance of aligning emulation exercises with real-world threat intelligence. The discussion also touches on the role of automation and the balance between manual testing and automated security tools.
Beyond the Red Team: Why Your Emulation Strategy is Failing
TLDR: Most organizations treat adversary emulation as a checkbox exercise, running isolated scripts that fail to reflect real-world attacker behavior. This post breaks down why siloed red and blue team operations create blind spots and how to shift toward a collaborative purple team model. By focusing on relevant threat intelligence and actionable metrics, you can move from testing theoretical vulnerabilities to actually hardening your detection stack.
Adversary emulation has become the industry standard for validating security controls, yet most teams are doing it wrong. We see organizations running massive, automated suites of tests that generate noise without providing any real insight into their defensive capabilities. If your red team is dropping payloads that your blue team never sees, or if your blue team is tuning alerts based on outdated threat models, you are wasting time. The goal of any engagement should be to identify the gap between what you think you can detect and what you can actually catch when a real actor is on the wire.
The Problem with Siloed Emulation
Red teaming often devolves into a game of "gotcha" where the goal is to bypass EDR or exfiltrate data without being caught. While this is useful for testing specific controls, it rarely helps the blue team improve. If the red team doesn't share the exact TTPs (Tactics, Techniques, and Procedures) used during the engagement, the blue team remains in the dark.
Effective emulation requires a shared language. When you run a simulation, the blue team should be watching the logs in real-time, verifying that the telemetry is hitting the SIEM as expected. If you are using tools like Caldera or Atomic Red Team, you have the framework to map these actions directly to the MITRE ATT&CK framework. The failure occurs when these tools are treated as black boxes. You need to understand the underlying mechanics of the technique, not just the output of the script.
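The coverage check described above can be sketched in a few lines. This is a minimal illustration, not the Caldera or Atomic Red Team API: the `EmulationResult` record and its fields are hypothetical stand-ins for whatever result format your tooling emits, and the technique IDs are just examples.

```python
from dataclasses import dataclass

@dataclass
class EmulationResult:
    """One executed test, tagged with its MITRE ATT&CK technique ID."""
    technique_id: str     # e.g. "T1059.001" (Command and Scripting Interpreter: PowerShell)
    test_name: str
    telemetry_seen: bool  # did the expected events actually land in the SIEM?

def coverage_gaps(results: list[EmulationResult]) -> list[str]:
    """Return technique IDs that executed but produced no telemetry."""
    return sorted({r.technique_id for r in results if not r.telemetry_seen})

results = [
    EmulationResult("T1059.001", "encoded PowerShell download cradle", telemetry_seen=True),
    EmulationResult("T1567", "exfil over web service", telemetry_seen=False),
]
print(coverage_gaps(results))  # ['T1567'] — executed, but invisible to the blue team
```

The point of structuring results this way is that "the script ran" and "the SIEM saw it" become two separate facts, and the gap between them is exactly what the purple team exercise exists to close.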
Moving to a Purple Team Workflow
A purple team approach isn't about merging two departments; it’s about synchronizing their efforts. During an exercise, the red team should act as a coach for the blue team. If a specific technique like T1567.002 (Exfiltration to Cloud Storage) is being tested, the red team should be transparent about the payload delivery and the C2 infrastructure.

This transparency allows the blue team to focus on the "why" behind the detection. Are you alerting on the process execution, the network connection, or the file modification? If you are only alerting on the file hash, you are already behind. Attackers will change the hash in seconds. You need to be looking for the behavioral indicators that persist across different iterations of the same attack.
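The hash-versus-behavior distinction can be made concrete with a toy comparison. Everything here is illustrative: the event fields, the parent-process list, and the exfiltration domains are hypothetical examples of behavioral indicators, not a production detection rule.

```python
def hash_based_alert(event: dict, known_bad_hashes: set) -> bool:
    # Brittle: fires only on a hash the attacker controls and can rotate.
    return event.get("sha256") in known_bad_hashes

def behavioral_alert(event: dict) -> bool:
    # Keys on what the process *does*: an Office application making an
    # outbound connection to a cloud-storage service is suspicious
    # regardless of what binary delivered the payload.
    suspicious_parents = {"winword.exe", "excel.exe", "outlook.exe"}
    exfil_domains = ("dropbox.com", "drive.google.com", "mega.nz")
    return (event.get("parent_process", "").lower() in suspicious_parents
            and event.get("dest_domain", "").endswith(exfil_domains))

# The attacker recompiles the payload: the hash changes, the behavior doesn't.
event = {
    "sha256": "freshly-rotated-unknown-hash",
    "parent_process": "WINWORD.EXE",
    "dest_domain": "content.dropbox.com",
}
print(hash_based_alert(event, {"old-known-bad-hash"}))  # False — already stale
print(behavioral_alert(event))                          # True — behavior persists
```

This is the "why" behind the detection: the behavioral rule keeps firing across iterations of the attack because it describes the technique, not one artifact of it.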
Metrics That Actually Matter
Stop reporting on the number of "successful" red team engagements. That metric is vanity. Instead, focus on the mean time to detect (MTTD) and mean time to respond (MTTR) for specific, high-priority TTPs. If you can show your leadership that you reduced the detection time for a common T1566.001 (Spearphishing Attachment) scenario from four hours to ten minutes, you have a compelling story to tell.
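Computing MTTD for a given TTP is just averaging the gap between execution and first alert across exercises. A minimal sketch, with hypothetical timestamps chosen to mirror the four-hours-to-ten-minutes example above:

```python
from datetime import datetime, timedelta

def mean_time_to_detect(events: list[tuple]) -> timedelta:
    """events: (technique_id, attack_start, first_alert) per exercise run."""
    deltas = [first_alert - start for _, start, first_alert in events]
    return sum(deltas, timedelta()) / len(deltas)

# Two spearphishing (T1566.001) exercises: four hours, then ten minutes.
events = [
    ("T1566.001", datetime(2024, 1, 10, 9, 0), datetime(2024, 1, 10, 13, 0)),
    ("T1566.001", datetime(2024, 3, 5, 14, 0), datetime(2024, 3, 5, 14, 10)),
]
print(mean_time_to_detect(events))  # 2:05:00 — and trending sharply downward
```

Tracked per technique over time, this is the kind of number leadership can act on, unlike a count of "successful" engagements.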
When you present these findings, keep it simple. Executives don't need to see the raw packet captures. They need to know that the investment in your EDR or SIEM is paying off. If you can demonstrate that your team identified a gap in coverage—for example, a lack of visibility into T1071.001 (Web Protocols)—and then closed that gap through better configuration, you have provided tangible value.
The Role of Automation
Automation is a force multiplier, but it is not a replacement for human intuition. Use automated tools to handle the repetitive, low-level testing. This frees up your senior researchers to focus on the complex, multi-stage attacks that require manual intervention.
If you are testing against a legacy environment, like Windows 7, you are likely dealing with a different set of risks than a modern cloud-native stack. Tailor your emulation to the environment. Don't waste resources testing for vulnerabilities that don't exist in your infrastructure. Use your threat intelligence to prioritize the techniques that are most likely to be used against your specific industry and geographic location.
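One way to operationalize that prioritization is a simple scoring pass over candidate techniques: boost what your threat intelligence says relevant actors actually use, and heavily penalize anything that cannot run in your environment. The weights, technique entries, and platform tags below are hypothetical; real intel feeds and asset inventories would supply them.

```python
def prioritize(techniques: list[dict], environment: set, actor_ttps: set) -> list[dict]:
    """Rank candidate TTPs by intel relevance and environmental applicability."""
    def score(t: dict) -> int:
        relevance = 2 if t["id"] in actor_ttps else 0          # seen in relevant intrusions?
        applies = 1 if environment & set(t["platforms"]) else -5  # can it even run here?
        return relevance + applies
    return sorted(techniques, key=score, reverse=True)

techniques = [
    {"id": "T1071.001", "platforms": ["windows", "linux"]},  # web-protocol C2
    {"id": "T1547.006", "platforms": ["macos", "linux"]},    # kernel module persistence
]
environment = {"windows"}                 # your actual fleet
actor_ttps = {"T1071.001"}                # from intel on actors targeting your sector
ranked = prioritize(techniques, environment, actor_ttps)
print([t["id"] for t in ranked])  # ['T1071.001', 'T1547.006']
```

Even a crude score like this forces the conversation the post is arguing for: why are we testing this technique here, against this threat model?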
Practical Steps for Your Next Engagement
Start small. Pick one TTP that you are confident you can detect, and one that you are fairly certain you will miss. Run the emulation. If you catch the first one, great—verify the alert quality. If you miss the second one, don't panic. That is the point of the exercise. Document the failure, identify the missing telemetry, and work with the blue team to implement a new detection rule.
Remember that the goal is to build a feedback loop. Every emulation exercise should result in a more resilient environment. If you aren't changing your defensive posture after an engagement, you aren't doing purple teaming; you're just running a script. Keep the focus on the technical reality of the attack, maintain transparency between teams, and always prioritize the metrics that demonstrate a measurable improvement in your ability to defend the network. The next time you run an engagement, ask yourself: did we just prove we could be hacked, or did we prove we could stop it?