The 5 Psychological Barriers to Cybersecurity, a.k.a. Why people don't care about security
- Ashraf Aboukass
- Sep 26
- 5 min read
Dysfunction in the Brain's Motivation & Reward Circuitry
The Barrier: "Security doesn't feel rewarding."
Motivation relies on the brain's reward system (cost vs. reward). If security doesn't seem rewarding, people tend to deprioritise it, even if they know what to do. Cybersecurity falls into this trap because it's about preventing something that hasn't happened. Not being hacked feels like nothing happened, with no dopamine spike and no sense of reward.
Instead, the brain registers the small costs (extra login steps, forced updates) and weighs them against the invisible, abstract reward (no breach). The scales don't tip in favour of prevention; the tangible burden of extra security outweighs a benefit the brain never feels.
Example: Employees often avoid enabling MFA because the benefit (a reduced risk of future attacks) is abstract, while the cost (time and inconvenience) is immediate and tangible.
To address this, use "Behavioural Reinforcement" to give small, quick rewards for safe behaviour. Try some of these ideas:
- Gamification (badges) to make security feel rewarding, rather than invisible.
- Display positive personal metrics, such as "Your team blocked 40 phishing attempts this week."
- Public recognition to elevate security "champions" and provide social rewards.
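The "positive personal metrics" idea can be sketched as a tiny aggregation job. This is a minimal illustration only: the event shape (dicts with `team` and `action` keys) and the `reported_phish` label are hypothetical assumptions, not any particular product's schema.

```python
from collections import Counter

def weekly_security_digest(events, team):
    """Build a positive, personal metric message from hypothetical
    security-event records (dicts with 'team' and 'action' keys)."""
    blocked = Counter(
        e["action"] for e in events if e["team"] == team
    )["reported_phish"]
    return f"Your team blocked {blocked} phishing attempts this week."

# Illustrative sample data.
events = [
    {"team": "finance", "action": "reported_phish"},
    {"team": "finance", "action": "reported_phish"},
    {"team": "hr", "action": "reported_phish"},
]
print(weekly_security_digest(events, "finance"))
# → Your team blocked 2 phishing attempts this week.
```

The point of the design is the framing: the message reports what the team *did* ("blocked"), not what it failed to do, so the feedback lands as a reward rather than a reprimand.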
Key Takeaway: This taps into the desire to feel important, special, and respected by peers. The reward provides immediate, tangible feedback (a dopamine spike) that the action was valuable to both the individual and the group.
(Inspiration: Le Heron, C., Apps, M. A., & Husain, M. (2018). The Computational Anatomy of Motivation Deficits in Neurological Disease. Trends in Cognitive Sciences.)
Reward Devaluation / Diminishing Value
The Barrier: "Warnings lose their impact over time."
Humans quickly adapt to repeated stimuli, a phenomenon called "hedonic adaptation" or habituation. Neuroscience shows that the motivational pull of a stimulus declines when it becomes predictable or repetitive. That's why security training fades. The first warning catches attention. By the fifth or tenth, the brain perceives it as routine and stops paying attention.
Example: After six months of monthly phishing simulations, click rates rise again, not because staff don't know better, but because the exercise has lost its urgency and novelty.
To address this, maintain psychological novelty by keeping security messaging fresh through variation and surprise.
- Integrate "surprise and delight" into messaging by rotating formats (video, interactive module, comic) and delivering campaigns unannounced.
- Vary phishing scenarios dramatically (invoices, HR notices, package deliveries, vendor alerts).
- Use real-world breach stories or case studies to restore urgency and emotional relevance.
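Scenario variation is easy to automate. Here is a minimal sketch that rotates simulation themes so no theme ever repeats back-to-back; the `SCENARIOS` pool and the `next_scenario` helper are hypothetical names for illustration, not part of any real phishing-simulation tool.

```python
import random

# Hypothetical pool of simulation themes; vary these so campaigns stay novel.
SCENARIOS = ["invoice", "HR notice", "package delivery", "vendor alert"]

def next_scenario(history, pool=SCENARIOS):
    """Pick the next phishing-simulation theme at random, excluding the
    most recent one so consecutive campaigns never feel routine."""
    candidates = [s for s in pool if not history or s != history[-1]]
    choice = random.choice(candidates)
    history.append(choice)
    return choice

history = []
for _ in range(4):
    print(next_scenario(history))
```

Even this trivial constraint (never repeat the last theme) works against habituation: the brain is forced to evaluate each campaign afresh instead of pattern-matching it to the previous one.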
Key Takeaway: Novelty breaks routine and forces the brain to process new information, bypassing habituation.
(Inspiration: Schultz, W. (2016). Dopamine reward prediction error coding. Dialogues in Clinical Neuroscience.)
Cognitive Inertia / Resistance to Change
The Barrier: "Old habits die hard."
Even when people are aware of the risks, a well-documented phenomenon called "cognitive inertia" often prevents them from taking action. This isn't laziness; it's the brain's tendency to conserve energy. Switching habits, adopting new workflows, or updating mental models takes effort, so people default to what's familiar unless the incentive to change is overwhelmingly strong. Inertia drives the reuse of passwords, the sharing of accounts, and the use of shadow IT. Training can raise awareness, but it doesn't dissolve inertia; change requires designing environments where the secure choice is the easiest one to make.
Example: Employees continue to reuse weak passwords year after year, not because they don't understand the risk, but because updating dozens of logins feels overwhelming.
To address this, make the secure option the easiest by removing friction and emphasising design over willpower.
- Reduce friction by using password managers, SSO, and auto-patching to automate security measures.
- Apply "choice architecture" by enforcing secure defaults, such as auto-enabling MFA or automatically generating strong passwords.
- Don't count on people's willpower; make being secure the easiest choice.
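The "automatically generating strong passwords" default can be sketched with Python's standard `secrets` module, which is designed for cryptographic randomness. The 16-character length and the character-class policy below are illustrative assumptions, not a recommended standard.

```python
import secrets
import string

ALPHABET = string.ascii_letters + string.digits + string.punctuation

def generate_default_password(length=16):
    """Generate a strong password as the secure default, so users
    never have to invent (and then reuse) their own."""
    while True:
        pw = "".join(secrets.choice(ALPHABET) for _ in range(length))
        # Resample until at least one lower, upper, and digit appear.
        if (any(c.islower() for c in pw)
                and any(c.isupper() for c in pw)
                and any(c.isdigit() for c in pw)):
            return pw

print(generate_default_password())
```

The design choice matters more than the code: because the strong password is generated *for* the user, the secure option requires zero willpower, which is exactly the point of choice architecture.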
Key Takeaway: Reduce friction and mental effort (Tranquillity) and enforce a predictable, stable routine (Order/Certainty). This overcomes status quo bias by making the new path less effortful than the old one.
(Inspiration: Samuelson, W., & Zeckhauser, R. (1988). Status quo bias in decision-making. Journal of Risk and Uncertainty.)
Limited Attention & Cognitive Load
The Barrier: "The brain can only process so much information."
The human brain has limited working memory and a short attention span. When a person is confronted with too many tasks, alerts, or complex demands, performance declines rapidly; this state is known as "Cognitive Overload". The brain's survival strategy under load is to prioritise simple, familiar actions and ignore anything that requires deliberate, high-effort attention. Cybersecurity constantly increases this load with every complex password, required two-factor code, and security warning pop-up. When employees are already multitasking or rushing to meet deadlines, they are hard-wired to focus only on the primary work goal, making it highly probable they will ignore, bypass, or make mistakes on security steps.
Example: A user trying to provision a new cloud resource rapidly is presented with seven security configuration options. To save time and reduce mental strain, they skip the unfamiliar settings and choose the default or easiest pathway, even if it's less secure.
To address this, minimise cognitive friction and prioritise information to reduce the mental effort required for secure behaviour.
- Minimise the number of required decisions by setting secure configurations as the non-negotiable default.
- Use simple reporting buttons or prompts only where the risk is high.
- Offload the cognitive work by using automation and Single Sign-On (SSO) to streamline complex security steps, freeing up the user's limited attention for their primary job responsibilities.
Key Takeaway: Minimise pain and stress by removing the need for self-control when energy is low. Structure and automation increase the feeling of stability and control (Certainty).
(Inspiration: Baumeister, R. F., Bratslavsky, E., Muraven, M., & Tice, D. M. (1998). Ego depletion: Is the active self a limited resource? Journal of Personality and Social Psychology.)
Compassion Fade & Emotional Numbing
The Barrier: "The problem feels too big to matter."
Humans are wired to respond emotionally to individuals, not to large groups or masses. A phenomenon known as "Compassion Fade" suggests that as the number of people affected increases, our empathy and willingness to act often decrease. In cybersecurity, this manifests as breach fatigue and emotional desensitisation. After repeated headlines about massive breaches, people feel hacking is inevitable and lose motivation to stay vigilant.
Example: After hearing about dozens of global breaches, employees shrug off a phishing simulation: "Everything gets hacked anyway, so why should my behaviour matter?"
To address this, make security personal and relatable by humanising the impact to restore motivation.
- Show human-scale consequences and frame security as protecting family photos, salary deposits, or personal bank accounts.
- Focus on local success stories, such as "Sarah detected a phishing attempt that saved us days of recovery," rather than global failures.
- Replace abstract numbers with personal narratives that bring the impact closer to home.
Key Takeaway: Shift focus from abstract numbers to a personal narrative, tapping into the fundamental need for closeness and belonging (Love & Connection) and the desire to help and protect one's immediate circle (Contribution).
(Inspiration: Cameron, C. D., & Payne, B. K. (2011). Escaping affect: How motivated emotion regulation creates insensitivity to mass suffering. Journal of Personality and Social Psychology.)
Conclusion: The Design Imperative
Indifference is the brain's predictable response, not proof that people don't care about security. Brains react to rewards, fatigue, and overload. Leaders should not rely solely on reminders or punishments as motivation; instead, they should design systems that align with human cognitive patterns by following these five pillars.
The 5 Pillar Framework for Human-Centric Cyber Design (Five P's):
- Positive Reinforcement: Reward secure actions immediately.
- Psychological Novelty: Keep awareness fresh and surprising.
- Path of Least Resistance: Make the secure path the easiest path.
- Preserve Cognitive Resources: Reduce mental load and fatigue.
- Personalise the Impact: Humanise the consequences.

