
Cybersecurity Isn’t a Knowledge Problem — It’s a Fabric Problem
I was discussing cybersecurity with Paul Kelly, and why the problem persists despite so much investment; here are some insights from that conversation.
Organisations don’t fall victim to cyberattacks because their people are stupid.
They fall victim because, under real-world pressure, smart people behave in predictable ways — ways that are shaped by systems, incentives, culture, and fear. And no amount of training alone can change that.
Each year, businesses spend millions on cybersecurity training. Modules are completed. Quizzes passed. Posters printed. Awareness campaigns launched. And yet, the breaches keep coming. Not because employees don’t know what to do — but because in critical moments, they don’t do it.
Why?
Because we’ve confused information with transformation. And because we’ve failed to understand the true nature of behaviour in large organisations.
The Decay of Training
Training decay is the term we use for the rapid decline in the impact of one-off training when it isn’t reinforced through daily behaviour, system design, and leadership modelling: the steady erosion of knowledge that isn’t repeated, contextualised, or turned into habit.
Most cyber training operates on a compliance model: tell people what to do, test their memory, and move on. But information delivered in a one-off module doesn’t change what happens when someone’s inbox is overflowing, their manager is chasing them for speed, and an email from a “supplier” asks them to click something urgently. People remember the slide, but not what to do when 200 unread emails are stacked in their inbox and a suspicious message slips through.
That’s when training collapses. Not because the training was wrong — but because the moment wasn’t safe, supported, or designed for better behaviour.
And this is where leadership matters deeply.
Training decay isn’t about ignorance. It’s about disconnection. The training may be technically sound — but unless the organisation’s fabric supports the behaviour (through habits, incentives, and leadership), the learning withers.
Awareness, without repetition or relevance, decays into background noise.
The Missing Layer: Leadership Modelling
Leaders in many organisations talk about cybersecurity. But far fewer model it.
They don’t share their own near misses. They don’t ask about vigilance in team meetings. They don’t recognise employees who raise false alarms; often they don’t even hear about them. In some cases, the first time a senior leader hears about a cyber risk is when it’s already a crisis.
When leaders treat cybersecurity as a technical or compliance issue, rather than a behaviour and trust issue, they signal that it’s someone else’s responsibility. And people follow those signals. Always.
If leaders don’t make vulnerability safe, if they never say, “I almost clicked that too,” then others won’t speak up either. Silence becomes self-protection.
In organisations like this, the culture subtly rewards people for keeping quiet, not for raising concerns. That’s how threats are buried. That’s how attackers get through.
The Behaviour Is Rational — For the Wrong System
In one incident FabricShift analysed, a staff member clicked a phishing email. They’d done the training. They understood the risk. But the context made that knowledge irrelevant.
They had a backlog of 200 emails. Their manager was on their case. The inbox was full of supplier invoices, and this one looked normal enough. They clicked.
That one click led to malware, stolen credentials, and widespread system compromise.
And worse, when alerts started firing, no one flagged them clearly. Everyone hesitated. Because the systems were noisy, the team was overwhelmed, and the fear of getting it wrong was greater than the safety of speaking up.
This wasn’t stupidity. It was rational behaviour in a badly designed system.
The training had been delivered. But the fabric — the daily routines, team norms, system design, leadership tone, and cultural signals — all pointed away from vigilance and toward silence, speed, and risk suppression.
What Needs to Change?
Cybersecurity will never be solved by training alone. Not even good training. Because it’s not just about knowing what to do. It’s about doing it consistently, especially under pressure.
The first step is recognising that habits drive behaviour — not knowledge. In moments of stress or pressure, people don’t rely on what they’ve been told once in a training session. They default to what they’ve done often, what feels normal, and what’s been reinforced over time. If the right behaviours haven’t become habits, they won’t show up when it matters most.
Habits are built through repetition, social reinforcement, and meaningful feedback. They’re shaped by what leaders model, what teams talk about, what systems make easy or hard, and what people feel safe to do. And without those reinforcing conditions, even the best training decays into forgotten advice.
In the organisations we work with, this looks like:
- Leaders modelling vigilance — not just talking about it, but living it and being open about their own near misses or questions.
- Managers building rituals into their teams — weekly “cyber minutes,” visible habit tracking, open discussions of mistakes.
- Building repeatable habits — supporting employees to adopt small, consistent cyber behaviours that can be practised daily, reinforced by teams, and tracked over time.
- Systems that reward reporting — recognising people for spotting threats, even if they turn out to be false alarms.
- KPIs that include behaviours — measuring not just security outcomes, but the habits that prevent failure.
And most importantly: a fabric where people feel safe enough to act. Because in the end, every policy, tool, and firewall depends on human action. And human action depends on how the organisation makes people feel.
A Final Thought
The next breach your organisation faces probably won’t be because someone didn’t know what to do.
It will be because, in that moment, doing the right thing felt too difficult, too risky, or too unsupported.
Fixing that isn’t about adding another training module. It’s about rewiring the environment people work in every day. It’s about fabric.
Until we understand that and lead accordingly, we’ll keep repeating the same breach story, just with different names.