Employees will naturally follow their cognitive biases unless organizations proactively engage them in security processes, according to Georgia Crossland, a Ph.D. researcher at Royal Holloway’s Centre for Doctoral Training in Cyber Security. In an article for Infosecurity Magazine, Crossland describes two common cognitive biases that can increase an organization’s cyber risk. The first is optimism bias, and the second is fatalistic thinking.
“Optimism bias is sometimes used interchangeably with ‘overconfidence’, and refers to the phenomenon whereby individuals believe they are less likely than others to experience a negative event,” Crossland says. “This particular bias is said to transcend age, race and gender.... A recent poll of 2,000 remote workers by Promon revealed that 77% said that they weren’t worried about security while working at home. This also extends to organizational contexts, where individuals believe their own organization to be at relatively lower risk to information security threats than other competitor organizations.”
Fatalistic thinking, meanwhile, is when employees resign themselves to the idea that cyberattacks will occur no matter what they do, so there’s no point in worrying or spending time and effort on preventative measures.
“Fatalistic thinking refers to an outlook where individuals may believe they have no power to influence risks personally, as risks are controlled by external forces,” Crossland says. “In information security, this might mean believing there is nothing you can personally do to prevent a phishing attack, because you’re going to fall victim to a phishing attack anyway. Or believing that everything is ‘hackable’ and so there’s little point in protection efforts. This feeling may augment with home working, as employees are distanced from usual organizational support.”
Crossland explains that while these two mindsets seem incompatible, a person can simultaneously “be optimistic about their own risk and believe that they have no power to reduce the risk anyway.” She adds that there are ways to overcome both of these biases if people are aware of them.
“It may help to take a ‘human as a solution’ approach to information security,” she writes. “In information security, humans are often viewed as the biggest issue. Therefore, efforts are made to exclude and control them. This removes the opportunity for individuals to contribute to their organization’s cybersecurity. It is really no surprise individuals demonstrate perceptual biases if they are made to feel like the weakest link. Instead, organizations should learn from and involve employees in information security.”
Crossland adds that organizations should be mindful of these biases when they provide training for their employees.
“Understanding biases may also help organizations tailor information and training,” she concludes. “Training people to understand and cope with the risk should be at the forefront. Especially in the case of fatalistic thinking, organizations might endeavor to remove fear appeals as a method for communication, and increasing feelings of morale and employees’ abilities to cope with threats.”
These biases arise when people are scared or feel as if there’s nothing they can do. New-school security awareness training can give your employees a realistic idea of the threats they face and the measures they can implement to protect themselves.
Infosecurity Magazine has the story: https://www.infosecurity-magazine.com/next-gen-infosec/biases-perceptions-threats/