Introduction

It started with a single click.

A finance associate, rushing to process invoices before quarter-end, opened an email that looked like it came from her manager. It wasn’t. Within minutes, credentials were compromised and funds were on their way overseas.

When the incident was analyzed, the conclusion was familiar: “Human error.”

But here’s what really bugged me about that assessment — it’s lazy. It’s the easy answer.

Your people aren’t the weakest link. I’ve seen this play out dozens of times in my career. The weakest link isn’t the employee who made a mistake. It’s the organization that made that mistake inevitable.

The real problem isn’t carelessness. It’s that the organization’s systems and culture made secure behavior harder than insecure behavior. Think about it — no one wakes up wanting to screw up security. But if the secure path takes 10 minutes and the insecure path takes 30 seconds, guess which one wins? Every time.

The real challenge for leaders isn’t trying to control people (spoiler: you can’t). It’s empowering them to make the right choices because those choices actually make their jobs easier.

Fear Fades. Ownership Lasts.

I’ve sat through too many security awareness training sessions where the approach is basically “scare them straight.”

Flash breach statistics. Show dramatic red warning slides. End with “Don’t click suspicious links.”

Here’s what happens: employees forget it. All of it. Within days.

But ownership? That sticks.

I worked with an organization that completely reframed their training approach. Instead of “here’s all the ways you’ll screw up,” they said “you’re part of our defense team.” They even gave each department a leaderboard showing who reported the most phishing attempts.

Within a month, reporting went through the roof. Not because people were scared of punishment. Because they took pride in defending their colleagues.

This wasn’t some revolutionary insight. It’s basic human motivation. But I was shocked by how many organizations still don’t get it.

The 2023 MOVEit Transfer breach actually illustrated this perfectly. In some organizations, employees caught unusual file transfer behavior early because they understood why it mattered. They reported it. They stopped breaches in their tracks. In others? The data walked out the door. Same vulnerability. Different cultures.

Empowerment > Enforcement. Full stop.

Culture Isn’t a Poster — It’s the Operating System

You can’t just bolt security culture onto a company like you’re installing software. It’s not an app. It’s the operating system everything runs on.

Most executives think culture is handled by putting posters in the break room or running annual training. I’ve literally seen this. A bunch of security posters. Everyone ignores them.

Real culture? It shows up in the small moments:

  • Do your people feel safe admitting mistakes, or do they hide them?
  • When the CEO bypasses security because they’re in a hurry, what message does that send?
  • When someone reports a security issue, do they get thanked or blamed?

I consult with a lot of organizations, and I can usually tell within the first 15 minutes if they have a real security culture. It’s not the number of controls they have. It’s whether people actually trust the system.

The Microsoft Exchange Server incidents in 2022 made this crystal clear. Organizations where IT teams felt comfortable raising concerns early? They contained things quickly. Organizations where people were afraid to speak up? The attackers were in the walls for months.

The Psychology of Mistakes — and What It Teaches Us

Here’s something psychologists have known for decades but cybersecurity somehow missed: most “errors” aren’t actually carelessness. They’re workarounds.

People don’t bypass your security controls because they’re stupid. They do it because your controls are in the way of actually getting work done.

Let me give you a real example from my consulting. A finance team needed to verify wire transfers through a secure portal. Sounds good, right? Except the portal was slow. Ten minutes per request. Meanwhile, email approvals took 30 seconds.

Guess what happened? People started using email.

When they got breached, management pointed at the employees. “They broke policy!”

But here’s the thing: the real fix wasn’t blaming anyone. It came when they redesigned the process to be faster than the workaround. And compliance went up, not down. Because people weren’t fighting the system anymore.

I saw this exact same dynamic play out with the 2024 Change Healthcare attack. An employee’s compromised credentials were the entry point. Everyone wants to blame the employee. But if you dig into the root cause, it’s the organization’s own access management failures that created the vulnerability in the first place.

The MOVEit incidents, whose fallout stretched well into 2024, told the same story. The teams that had trained employees to notice unusual behavior? They caught things early. The teams operating in a blame culture where people hide incidents? By the time anyone knew what happened, it was too late.

The real lesson: Fix the friction, not the people.

From Awareness to Action: Building Real Security Engagement

Traditional security awareness campaigns are basically the cybersecurity equivalent of flu shots. You do them because you’re supposed to. They might help. But they’re not going to change much.

Real resilience comes from actually engaging people, not just checking compliance boxes.

I’ve seen what works and what doesn’t. Here’s the honest breakdown:

What Actually Works:

  • Short, practical lessons tied to real stuff that just happened at your company (not generic examples)
  • Recognizing people for good security behavior — people respond to that
  • Sharing real stories from your own organization, not theoretical scenarios
  • Making it dead simple to report suspicious stuff — ideally one click (a rough sketch of what that can look like follows this list)
  • Creating actual feedback loops so people know they made a difference
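
To make the “one click” point concrete, here’s a minimal, hypothetical sketch of the back end of a report button. The names (handle_report, REPORTS) and the in-memory queue are illustrative assumptions, not any specific product’s API. The two things that matter: reporting takes a single action, and the reporter gets immediate, positive feedback.

```python
# Hypothetical sketch of a "report phishing" button handler.
# handle_report and REPORTS are illustrative names, not a real product API.
import json
import uuid
from datetime import datetime, timezone

REPORTS = []  # stand-in for a real queue, ticketing system, or shared mailbox


def handle_report(reporter: str, subject: str, raw_message: str) -> str:
    """Accept a one-click report, queue it for the security team,
    and immediately thank the reporter (the feedback loop)."""
    report = {
        "id": str(uuid.uuid4()),
        "reporter": reporter,
        "subject": subject,
        "raw_message": raw_message,
        "received_at": datetime.now(timezone.utc).isoformat(),
        "status": "queued_for_triage",
    }
    REPORTS.append(report)  # in practice: push to SOAR, ticketing, or a mailbox
    # The reporter gets an instant, positive acknowledgement, not silence.
    return (f"Thanks, {reporter}. Report {report['id'][:8]} is with the security "
            f"team. You'll hear back once it's triaged.")


if __name__ == "__main__":
    print(handle_report("a.finance.associate", "Invoice overdue!!", "<raw headers here>"))
    print(json.dumps(REPORTS[0], indent=2))
```

In practice the queue would be whatever ticketing or SOAR tool you already run, but the shape of the interaction stays the same: one click in, one thank-you out.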

What’s a Waste of Time:

  • Long mandatory modules that everyone hates
  • Publicly embarrassing people who make mistakes
  • Treating training like a compliance checkbox
  • Generic content that has nothing to do with people’s actual jobs

Here’s the thing I’ve learned: empowerment starts when your people actually believe you trust them. Not when they believe you’re testing them.

According to Mimecast’s State of Human Risk 2025 report, organizations that built continuous, contextual learning into daily workflows saw 40% higher incident reporting. That’s not a small number. That’s a culture shift.

When Leadership Leads by Example

I’ve been doing this long enough to know: culture flows downward. Period.

If your C-suite treats cybersecurity like it’s an IT problem, that’s what your employees will think. If your leaders visibly champion it? People follow.

And here’s what’s interesting — you don’t need to be a technical expert to lead on this. You just need to actually participate:

  • Mention security wins in company meetings. Seriously, just do it.
  • Join the phishing simulations. Get tested like everyone else. Share your results.
  • Talk openly about mistakes. I almost fell for a spoofed email last week. And I’ll tell people about it.
  • Put actual time and resources toward security training. Your calendar shows your priorities.

When leaders humanize security, it stops being this scary abstract thing. It becomes relatable. And when employees see vulnerability from leadership? They respond with honesty and ownership.

The 2023 Okta incident showed this really well. In organizations where security leadership communicated openly about what happened and what they were doing about it, people actually felt more confident in security. In the organizations that went dark? Trust was shot for months.

The Business Case for Human-Centered Security

Let me cut to the chase: the numbers support this approach.

According to Mimecast’s State of Human Risk 2025 report, insider-driven breaches cost organizations $13.9 million on average. That’s almost 3x a typical breach. But here’s the kicker — just 8% of employees cause 80% of incidents. So the risk is actually concentrated and manageable.

Organizations that actually invest in empowerment see real returns:

  • 87% report that security awareness programs improve threat detection
  • Those that embed empowerment and psychological safety? They see incident rates drop 30–50%

According to KnowBe4’s 2024 Security Culture Report, organizations with strong security culture had 60% fewer incidents than those with weak culture. Same technology investments. Different culture. That’s the difference.

Now the other side of the equation: what does a breach cost?

According to IBM’s 2024 Cost of a Data Breach report, the average breach costs $4.44 million. One click. One mistake. $4.44 million.

But here’s what keeps me up at night — organizations spend millions on technology while penny-pinching on culture. And then they’re shocked when a single employee causes a multi-million dollar incident.

The math is simple: spend $100K building real security culture and prevent a $4M breach. That’s 40x ROI. Most CFOs would kill for those numbers.
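
If you want to pressure-test that math against your own numbers, here’s a back-of-the-envelope sketch. The $100K program cost and $4.44M breach cost are the figures from this article; the before-and-after breach probabilities are assumptions you would replace with your own risk estimates.

```python
# Back-of-the-envelope version of the ROI claim above.
# Program cost and breach cost come from the article; the annual breach
# probabilities below are illustrative assumptions, not data.
program_cost = 100_000          # annual spend on culture/training (from the text)
breach_cost = 4_440_000         # average breach cost cited from IBM (from the text)

p_breach_before = 0.30          # assumed annual likelihood without the program
p_breach_after = 0.15           # assumed likelihood after reducing human-driven risk

expected_loss_before = p_breach_before * breach_cost
expected_loss_after = p_breach_after * breach_cost
avoided_loss = expected_loss_before - expected_loss_after

roi = (avoided_loss - program_cost) / program_cost
print(f"Avoided expected loss: ${avoided_loss:,.0f}")
print(f"ROI on the program:    {roi:.1f}x")
# Even under these conservative assumptions the program pays for itself several
# times over; if it prevents even one full breach outright, the return is far higher.
```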

According to Verizon’s 2024 Data Breach Investigations Report, 68% of breaches involved a human element. But here’s what’s important — 32% of those could have been detected in the first 24 hours if people had proper training and reporting channels.

Tools catch anomalies. People catch intent. And that human intuition? It’s still our best defense.

Real-World Incidents: When Empowerment Worked — and When It Didn’t

I work with enough organizations and review enough incident reports to see the patterns clearly. Culture matters. A lot.

Case Study 1: The SolarWinds Supply Chain Attack (2020)

SolarWinds hit thousands of organizations. But here’s what’s interesting — some companies caught the lateral movement early. Why? Because employees spotted something unusual on the network and reported it.

But — and this is the big but — only the organizations with strong reporting cultures and psychological safety actually contained the damage quickly. In shame-based cultures? People were too afraid to raise alerts. By then the attackers were everywhere.

Case Study 2: Twitch Security Breach (2021)

An employee flagged security misconfigurations months before the breach. Flagged them. To management.

You know what happened? They got pushed back on. “Not our priority right now.”

Post-breach analysis showed that Twitch had all the technical controls in place. They just had a culture where employees didn’t feel safe escalating concerns. That’s not a technical failure. That’s a culture failure.

Case Study 3: MGM Resorts Ransomware (2023)

Security teams later said several employees noticed weird authentication patterns days before the full attack. Days.

But the reporting mechanisms were slow. Management response was slow. It was like watching a disaster in slow motion.

A culture where people could rapidly escalate and where management actually responded? That could have changed the timeline significantly.

Case Study 4: Change Healthcare Attack (2024)

This one hit me hard because it was so preventable. Employee credentials were compromised — everyone focused on that. But the real story? The organization’s access management was a mess. Multiple systems weren’t talking to each other. Credentials weren’t being rotated.

Organizations with similar configurations but better employee training and reporting culture? They detected the same attack attempts within hours, not days.

The pattern is so obvious once you see it: technical controls matter, but employee engagement determines speed and scope of damage.

The Regulatory Landscape: Why Employee Security Is Now a Compliance Priority

This is where things get real for boards and executives.

Regulatory bodies have figured out what I’ve been saying for years: human security culture isn’t optional. It’s now a legal requirement.

The EU’s NIS2 Directive, which member states were required to transpose into national law by October 2024, mandates that organizations demonstrate “security awareness and training.” This isn’t a nice-to-have. It’s a compliance requirement.

GDPR Article 32 requires “ongoing training” for anyone handling personal data. And regulators now check — they don’t just count training completion. They audit whether culture actually shifted.

ISO/IEC 27001:2022 explicitly requires documented evidence that organizations are building security culture, not just checking training boxes.

I’ve watched organizations get dinged by regulators not because they didn’t do training, but because they treated it like a compliance checkbox rather than a real culture change initiative.

NIST Cybersecurity Framework (2024 update) elevated organizational culture to foundational governance level. Not an optional nice-to-have. Foundational.

Here’s the real consequence: organizations treating employee security as a “check-the-box” compliance item now face regulatory scrutiny. But those building genuine culture? They get credit. They gain competitive advantage in audits.

Challenges Organizations Face When Building Security Culture

I need to be honest about this: building real security culture is hard. Really hard. Understanding the obstacles helps.

Challenge 1: The Blame-to-Culture Transition

Some organizations have spent decades blaming employees. That conditioning runs deep. Reversing it requires sustained commitment and visible leadership behavior change. And employees? They’re skeptical. They’ve heard “we’re changing” before.

I’ve worked with organizations where it took 18-24 months before people actually believed the culture was shifting. And even in those organizations, underreporting continued in the early stages because people didn’t trust the new approach was real.

Challenge 2: Scale and Consistency

One manager praising security vigilance works great. Scaling that across thousands of employees, multiple geographies, different management styles? That’s exponentially harder.

Inconsistent messaging creates confusion about what security actually means at your company.

Challenge 3: Remote and Distributed Workforces

The shift to hybrid/remote work removed a lot of the informal touchpoints where culture actually happens. Water cooler conversations about security incidents. Peer mentoring. Visible leadership. You lose all that with remote work.

Virtual training often loses the engagement factor that made in-person sessions work.

Challenge 4: Competing Priorities

Your salespeople are focused on quota. Your engineers are focused on velocity. Your product team is focused on features. For them, security feels like overhead that gets in the way.

Leaders need to reframe security as enabling business, not inhibiting it. But that requires alignment between security and business teams that a lot of organizations don’t have.

Challenge 5: Measuring Culture Change

Training completion rates are easy to measure. But meaningless. Actual behavior change? That’s what matters. And it’s hard to track.

This creates a measurement gap where security leaders struggle to justify culture investments to CFOs who want ROI metrics.

Challenge 6: Fatigue and Desensitization

Continuous phishing simulations. Mandatory trainings every quarter. Security alerts all day. At some point people tune it out.

You get the “boy who cried wolf” effect where genuine threats are treated as noise because people are just tired of the alerts.

People Are the Firewalls of Trust

Technology can lock down your infrastructure. Policies can define what should happen. But your people — making hundreds of micro-decisions every single day — those are your real security defense.

If you keep telling them they’re the weakest link, they’ll eventually behave like it.

If you treat them as partners in defense? They rise to the challenge.

Security Culture in 2025: The Emerging Landscape

Here’s what I’m actually seeing in organizations that are ahead of the curve:

Trend 1: AI-Assisted, Human-Verified Workflows

Instead of replacing human judgment, smart organizations are using AI to augment it. “This email looks suspicious” — AI flags it. But humans make the decision. Best of both worlds.
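
Here’s a minimal sketch of what that division of labor can look like, assuming you already have something that scores messages. Everything in it (the Email and ReviewQueue names, the 0.5 and 0.9 thresholds) is an illustrative assumption: the machine acts on the obvious cases, and the ambiguous middle goes to a person.

```python
# Minimal sketch of an "AI flags, human decides" triage loop.
# The scoring model is assumed to exist elsewhere; thresholds and names
# are illustrative, not taken from any specific product.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Email:
    sender: str
    subject: str
    suspicion_score: float  # 0.0 (benign) .. 1.0 (almost certainly malicious)


@dataclass
class ReviewQueue:
    items: List[Email] = field(default_factory=list)

    def triage(self, email: Email) -> str:
        if email.suspicion_score >= 0.9:
            return "auto-quarantine"            # machine handles the obvious cases
        if email.suspicion_score >= 0.5:
            self.items.append(email)            # ambiguous: a human makes the call
            return "flagged-for-human-review"
        return "delivered"


queue = ReviewQueue()
print(queue.triage(Email("ceo@examp1e.com", "Urgent wire transfer", 0.72)))
print(queue.triage(Email("newsletter@vendor.com", "October update", 0.10)))
```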

Trend 2: Security as Business Enabler

The messaging is changing from “don’t make mistakes” to “here’s how to move fast securely.” This removes the false choice between speed and security that’s plagued us for years.

Trend 3: Psychological Safety as Competitive Advantage

Organizations are beginning to compete on this. They recruit talent by saying “we actually have a security culture people trust.” They win RFPs by demonstrating real security culture, not just compliance.

Trend 4: Continuous, Contextual Learning

The era of “annual training” is dead. Next-gen approaches embed learning into daily workflows, triggered by actual threats and behaviors, not calendar dates.

Trend 5: Regulatory Mandates for Culture Documentation

NIS2 and emerging regulations now require documentation of culture change. Not just training completion. Culture change. This creates accountability at the board level.

A Call to Action for Decision-Makers

If you’re running an organization, your cybersecurity posture isn’t determined by your tools. I don’t care how much you spent on your SIEM or your endpoint protection.

Your security posture is determined by whether your people actually think like defenders.

Ask yourself one question this week:

“Where are we making secure behavior harder than insecure behavior?”

Then actually listen to the answer. Don’t defend the system. Fix it.

That one small change — removing one friction point for secure behavior — will strengthen your defenses more than most major security investments.

Here’s a simple implementation approach:

  1. Actually audit your current security culture (use NIST or ISO frameworks as reference)
  2. Find one process that creates friction for secure behavior
  3. Simplify it
  4. Measure employee reporting rates before and after (a simple measurement sketch follows this list)
  5. Share results with leadership
  6. Scale what works
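
For step 4, here’s the simplest possible version of that measurement as a sketch. The counts are made up for illustration; in practice you’d pull them from your phishing-report mailbox or ticketing system.

```python
# Hypothetical example of step 4: compare reporting rates before and after
# removing one friction point. The counts below are made up for illustration.
def reporting_rate(reports: int, headcount: int) -> float:
    """Reports per 100 employees over the measurement window."""
    return 100.0 * reports / headcount


before = reporting_rate(reports=42, headcount=1_200)   # quarter before the change
after = reporting_rate(reports=96, headcount=1_200)    # quarter after the change

change_pct = 100.0 * (after - before) / before
print(f"Before: {before:.1f} reports per 100 employees")
print(f"After:  {after:.1f} reports per 100 employees")
print(f"Change: {change_pct:+.0f}%")
# A rising reporting rate after you *reduce* friction is the signal you want:
# people are engaging, not hiding.
```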

Your biggest security asset is sitting in your office right now. They’re just waiting for permission to help.

Executive Takeaways

  • Empowerment scales better than enforcement
  • Fix friction, not people
  • Leadership behavior defines culture
  • People don’t fail security — systems fail people
  • Security culture is now a regulatory compliance requirement
  • Human-centered security delivers measurable ROI
Gurdeep Singh
Sr. Governance, Risk & Compliance (GRC) Analyst

Gurdeep Singh, CISSP, CISM, is a senior cybersecurity professional specializing in audit readiness, risk management, and AI-enabled compliance. He helps global organizations strengthen ISO 27001, SOC 2, and NIST programs through automation and continuous assurance frameworks that bridge governance, technology, and business risk management.
