For the third time this week, Mark found himself staring at his screen at 2 AM, watching another alert cascade across his security dashboard. He couldn’t shake the feeling that he was fighting a war he would never win.
Sound familiar?
If you’re in cybersecurity, you’ve probably been there. That moment when you realize you’re not just protecting data — you’re carrying the weight of your entire organization’s digital existence on your shoulders. And frankly, it’s crushing.
The Exodus Nobody’s Talking About
The modern cybersecurity professional has an average tenure of just 2–4 years in their role. Nearly half are considering leaving the industry entirely due to high stress levels, according to Deep Instinct’s Voice of SecOps Report 2022.
Here’s the thing: it’s not about money. Cybersecurity professionals often command premium salaries, with median compensation exceeding $100,000 annually and senior roles earning considerably more.
So what’s driving this exodus? We’ve created a profession where failure isn’t just possible — it’s inevitable. And we’re asking humans to bear the psychological burden of that inevitability.
When Perfect Systems Meet Imperfect Reality
In my previous article, I explored how our entire cybersecurity approach is fundamentally flawed. We’re spending billions while breach numbers skyrocket. We’re deploying AI on top of broken processes. We’re treating symptoms instead of causes.
But here’s what those industry statistics don’t capture: the toll this systemic failure takes on the humans asked to make it work.
The 2023 IBM report revealed a stunning disconnect: while 95% of the organizations studied had experienced more than one breach, breached organizations were more likely to pass incident costs on to consumers (57%) than to increase their security investments (51%).
Think about that. Organizations would rather pass the cost to customers than invest in preventing the problem. Meanwhile, we’re asking security professionals to plug holes in a ship where the captain has decided it’s cheaper to let it leak.
The Weight of Impossible Expectations
Remember those staggering statistics? The 30,458 security incidents and 10,626 confirmed breaches in 2023 — a two-fold increase over 2022. The 34% increase in vulnerability exploitation. The fact that 68% of breaches involve human error.
These numbers represent more than security failures. They represent thousands of security professionals who had to deliver bad news to their executives, who worked around the clock during incident response, who faced the inevitable question: “How did you let this happen?”
When ransoms are paid, only 16% of organizations have no further issues. Imagine being a security professional in that scenario. You’ve been breached despite implementing all the “best practices.” The organization pays the ransom, executives think the problem is solved, but you know that 84% of the time, it’s not actually over.
You’re sitting there, waiting for the other shoe to drop, knowing that when it does, everyone will look at you and ask, “How did this happen again?”
Fighting Against Human Nature
We’ve built security systems that fundamentally conflict with how humans actually work. As Andrew Odlyzko noted in his research on cryptographic abundance: “People do not fit easily into the formal structures that any security framework requires. A key problem with strong information security in an office environment is that it would stop secretaries from forging their bosses’ signatures.”
Then we wonder why security professionals burn out trying to enforce the unenforceable while being held responsible for inevitable failures.
As a security professional, you’re not just defending against sophisticated AI-powered attacks. You’re defending against every employee who might click the wrong link, every contractor who might misconfigure a system, every executive who might demand an exception to security policy because “it’s urgent.”
It’s like being a lifeguard at a beach where 68% of the drownings happen because people forgot they can’t swim, while you’re simultaneously blamed for not keeping anyone from forgetting.
The False Promise of Technology (Again)
The 2024 IBM Cost of a Data Breach Report noted that organizations applying AI and automation to security prevention saw the biggest impact, saving an average of $2.22 million over those that didn’t deploy these technologies.
That sounds promising, until you realize what it really means: even with AI and automation, breaches still happen. The technology just makes them less expensive. We’re not preventing the problem; we’re just making it more economically manageable.
Meanwhile, Gartner’s research suggests that by 2030, 75% of SOC teams will experience erosion in foundational security analysis skills due to overdependence on automation and AI. We’re creating a future where security professionals are simultaneously more dependent on technology and less capable of understanding it.
The Personal Cost of Systemic Problems
You’re constantly fighting against business priorities that put speed ahead of security. Every security measure you implement slows something down. Every policy you create frustrates someone. Every breach that happens reflects on your competence.
The psychological weight of being the person who has to say “no” all the time, while simultaneously being blamed when the broken system inevitably fails, is enormous.
From an organizational perspective, this turnover is devastating. Training a cybersecurity professional takes years. When they leave after 2–4 years, organizations lose institutional knowledge precisely when they need it most.
Breaking the Cycle
Based on the research, several approaches emerge:
Acknowledge the Systemic Nature of the Problem
Instead of expecting security professionals to overcome systemic failures through individual heroics, organizations should acknowledge that security failures often reflect broken systems, not incompetent people.
Align Incentives with Reality
Ross Anderson’s work suggests we need to align incentives properly. Instead of expecting security professionals to overcome economic misalignment through sheer force of will, organizations should structure incentives so that security becomes the economically rational choice.
Design for Human Psychology, Not Against It
Stop expecting perfection from both users and security professionals. Start building resilience that accounts for human limitations.
Redefine Success Metrics
If 95% of organizations experience multiple breaches, we need to redefine what success looks like. Perhaps success isn’t “never getting breached” — perhaps it’s “detecting quickly, responding effectively, and recovering completely.”
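One way to make that shift concrete is to measure detection and recovery time instead of counting on a breach-free year. The sketch below is a minimal illustration, not a prescription: it computes mean time to detect and mean time to recover from a few hypothetical incident records, and the field names and figures are made up for the example rather than drawn from any of the reports cited above.

```python
from datetime import datetime
from statistics import mean

# Hypothetical incident records: when the intrusion began, when it was
# detected, and when normal operations were restored. In practice this
# data would come from your ticketing or incident-response platform.
incidents = [
    {"start": "2024-03-01 02:10", "detected": "2024-03-01 06:40", "recovered": "2024-03-02 11:00"},
    {"start": "2024-05-14 22:05", "detected": "2024-05-15 01:15", "recovered": "2024-05-15 19:30"},
    {"start": "2024-08-09 13:50", "detected": "2024-08-09 14:20", "recovered": "2024-08-10 09:00"},
]

def hours_between(earlier: str, later: str) -> float:
    """Elapsed hours between two 'YYYY-MM-DD HH:MM' timestamps."""
    fmt = "%Y-%m-%d %H:%M"
    delta = datetime.strptime(later, fmt) - datetime.strptime(earlier, fmt)
    return delta.total_seconds() / 3600

# Mean time to detect: how long attackers dwell before anyone notices.
mttd = mean(hours_between(i["start"], i["detected"]) for i in incidents)

# Mean time to recover: how long until the business is back to normal.
mttr = mean(hours_between(i["detected"], i["recovered"]) for i in incidents)

print(f"MTTD: {mttd:.1f} hours, MTTR: {mttr:.1f} hours")
```

Metrics like these change the question from “why were we breached?” to “how fast did we catch it, and how completely did we recover?”, which is a question a security team can actually win.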
Distribute Responsibility, Not Just Blame
Security needs to become everyone’s job, not just the CISO’s problem. But this requires systemic changes to make security feasible for non-experts, not just more training.
Invest in Human-Centric Resilience
The IBM research shows that 75% of the increase in average breach costs was due to lost business and post-breach response activities. Maybe we should spend less time trying to prevent every possible attack with broken tools and more time building systems — and supporting the people who operate them — that can fail gracefully.
The Path Forward
The cybersecurity industry is at a crossroads. We can continue burning through talented professionals at an unsustainable rate while pursuing fundamentally flawed strategies, or we can acknowledge that both our technical and human systems are broken and work to fix them together.
This isn’t just about individual burnout — it’s about the security of our entire digital infrastructure. When we burn out the people responsible for protecting it through systemically impossible expectations, we all become less secure.
The solution isn’t to find more resilient people or better technology in isolation. The solution is to build more resilient systems — technical systems, economic systems, and human systems that work together rather than against each other.
What You Can Do
If you’re a leader in cybersecurity or technology, start by acknowledging that systemic issues are creating unsustainable human costs. Ask your security team not “why did this happen?” but “what can we learn from this?” and “how can we build better systems?”
If you’re a security professional feeling the weight of impossible expectations, know that the problem isn’t you — it’s the system. The widespread burnout in our field isn’t a sign of individual weakness; it’s evidence that we’re asking humans to succeed within fundamentally broken frameworks.
The first step toward building better security is acknowledging that the people responsible for it are human, operating within systems that are often designed to set them up for failure.
It’s time we start respecting those limits while also fixing the systems that make those limits so painful to navigate.