Why Better Tools Alone Won’t Achieve Perfect Cybersecurity


Key Takeaways

  • Investment in cybersecurity tools often overlooks the human element, creating a gap between technology spend and actual security outcomes.
  • Phishing and social‑engineering attacks succeed because they exploit how people process information under stress, not merely because users are careless or untrained.
  • Behavioral science models such as COM‑B (Capability, Opportunity, Motivation → Behavior) reveal that security actions depend on environment and motivation, not just knowledge.
  • Effective cybersecurity programs treat employees as partners, soliciting their feedback to uncover work‑arounds, pressure points, and hidden vulnerabilities.
  • Practical first steps for technologists and leaders include informal conversations with staff, listening without defensiveness, and using those insights to make small, iterative adjustments to processes and tools.
  • Building trust—where mistakes can be reported without fear—turns employee behavior into a security asset rather than a liability.

Understanding the Cybersecurity Investment Gap
Terry Gerton opens the discussion by noting that despite frequent conversations about cybersecurity tools, training, and regulations, phishing and social‑engineering attacks continue to grow more sophisticated and breaches persist across all levels of government. He asks Nicole Togno, senior director for civilian experience and policy research at Fors Marsh, to explain why substantial investments in technology have not yielded proportionate improvements in security outcomes. This framing sets the stage for a deeper look at the mismatch between spending on technical defenses and the reality of human‑centric threats.


Behavioral Science as the Missing Lens
Nicole responds that the core issue is that agencies have been solving the wrong problem. While they pour resources into tools that manage systems effectively, they have under‑invested in understanding the people who operate those systems. Humans are complex, busy, and constantly making judgment calls under pressure. Attackers design phishing campaigns to exploit precisely those cognitive tendencies—especially when individuals are stressed or rushing to meet deadlines—rather than relying on user ignorance or negligence. Thus, the gap is not a technological shortfall but a failure to account for how human behavior actually shapes security risk.


Introducing the COM‑B Framework
To make this concept concrete, Nicole references the COM‑B model from behavioral science: Capability, Opportunity, and Motivation interact to produce Behavior. She emphasizes that behavior is rarely a logical outcome of knowledge alone; instead, it hinges on whether the environment makes the desired action easy or difficult and whether individuals are genuinely motivated to perform it. In cybersecurity terms, simply telling employees what a phishing email looks like (increasing capability) does not guarantee they will avoid clicking it if the opportunity to bypass security measures exists or if motivation to stay secure is low amid competing priorities.
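The interaction Nicole describes can be loosely sketched in code. The scoring scale, threshold, and function names below are invented for illustration and are not part of the formal COM‑B model; the point is simply that the weakest of the three components dominates the outcome, so training (capability) alone cannot guarantee secure behavior.

```python
# Loose illustrative sketch of the COM-B idea (Capability, Opportunity,
# Motivation -> Behavior). The 0-1 scores and the threshold are invented
# for illustration; COM-B itself is a qualitative behavioral model.
from dataclasses import dataclass


@dataclass
class SecurityContext:
    capability: float   # does the person know what a phish looks like? (0-1)
    opportunity: float  # does the environment make the secure action easy? (0-1)
    motivation: float   # is staying secure a priority right now? (0-1)


def likely_secure_behavior(ctx: SecurityContext, threshold: float = 0.5) -> bool:
    """Secure behavior emerges only when all three components are present;
    the weakest component caps the outcome."""
    return min(ctx.capability, ctx.opportunity, ctx.motivation) >= threshold


# A well-trained employee (high capability) rushing to meet a deadline
# (low motivation to slow down) still tends toward the insecure path.
trained_but_rushed = SecurityContext(capability=0.9, opportunity=0.6, motivation=0.2)
print(likely_secure_behavior(trained_but_rushed))  # prints False
```

Under this toy model, raising capability from 0.9 to 1.0 changes nothing; only improving the environment (opportunity) or the incentives (motivation) moves the result, which is the gap Nicole argues tool-centric investment misses.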


Why Traditional Training Falls Short
Terry recalls clicking through mandatory phishing‑awareness trainings and wonders if that is sufficient. Nicole agrees that such trainings are often compliance exercises: they convey knowledge but do not translate into lasting behavioral change. Decades of research show that information alone rarely alters behavior, especially when the surrounding workflow incentivizes speed over caution. When faced with a login prompt that adds several steps during a tight deadline, employees naturally seek work‑arounds—not because they disregard security, but because their cognition is optimized for task completion. Placing the entire burden of security on the individual therefore becomes ineffective and unfair.


Shifting from Compliance to Partnership
Given that most cybersecurity professionals are technologists rather than behavioral scientists, Terry asks what practical mindset shift is needed. Nicole advises treating employees as partners in the security process rather than as assets to be managed or risks to be mitigated. This shift requires leaders to view staff as valuable sources of insight into where real vulnerabilities lie. By recognizing that the people using the systems daily understand the friction points and work‑arounds, agencies can harness that knowledge to improve defenses rather than merely policing behavior.


Listening as a First Practical Step
To operationalize this partnership, Nicole recommends a simple, informal approach: sit down with a handful of employees from different roles and ask two questions: Where does security get in the way of your work? and What do you do when that happens? The key is to listen without defending existing policies or explaining procedures. That restraint surfaces valuable, often uncomfortable truths: processes that have been quietly abandoned, secure steps that feel too burdensome, and informal shortcuts employees have adopted to keep their work flowing. Such qualitative data reveals where the security program breaks down under real‑world conditions, beyond what penetration tests or technical audits can show.


Turning Feedback into Action
After gathering this feedback, the next step is to bring the insights back to the technology team. Nicole suggests acknowledging the elegantly designed system they have built, then presenting the newly discovered work‑arounds and pain points. Armed with this knowledge, technologists can decide what small adjustments are needed, such as streamlining a multi‑factor authentication step, redesigning a confusing alert, or simplifying a reporting channel, to align security controls with actual workflows. The process does not require a massive overhaul; iterative tweaks grounded in real user experience can progressively close the gap between intended security and lived practice.


Building Trust as a Security Asset
A crucial cultural element that emerges from these conversations is trust. Nicole stresses that employees must feel safe reporting mistakes without fear of punitive repercussions. In many workplaces today, an employee who clicks a malicious link simply hopes to go unnoticed, a climate that ultimately undermines security because threats go unreported and unaddressed. By cultivating trust, both leadership‑to‑staff and peer‑to‑peer, organizations turn honest reporting into a proactive defense mechanism. Trust is not a “soft” nicety; in this context, it functions as a tangible security asset that strengthens detection, response, and resilience.


Mindset Shift for Decision‑Makers
Terry acknowledges that adopting this approach may feel like swimming upstream for a CISO accustomed to purchasing tools or adding checklists. Nicole agrees that the shift begins with a mindset change: moving from viewing security as something imposed on people to seeing it as something co‑created with them. Agencies often overlook that their own staff are a rich source of intelligence about where real vulnerabilities exist. Empowering technologists to engage directly with employees, learn about cumbersome processes, and surface unspoken work‑arounds unlocks actionable insights that internal penetration tests alone cannot provide. This collaborative mindset yields fresh data and innovative solutions that are more aligned with how work actually gets done.


Starting Small: Talk, Listen, Adjust
For federal CIOs or CISOs unsure where to begin, Nicole’s advice is straightforward: start with informal conversations. No need for formal town halls or lengthy surveys—just a few candid chats across roles, asking the two core questions, and truly listening. The discomfort that may arise from hearing about existing shortcuts or policy bypasses is precisely the signal needed to improve. Those insights become the foundation for small, evidence‑based adjustments—whether tweaking a workflow, clarifying a guideline, or simplifying a technical control. Over time, these incremental changes accumulate into a security posture that is both technically sound and behaviorally realistic, reducing the persistent gap between investment and outcome.


By reframing cybersecurity as a behavioral challenge and leveraging employee insight, government agencies can transform their greatest perceived vulnerability—people—into a decisive strength in defending against ever‑evolving cyber threats.
