Running phishing tests is a proven way to improve employees' cybersecurity awareness and behavior, but using misleading tactics to simulate malicious attacks could damage employee morale, according to new research.
In phishing tests, security and IT professionals create and send employees a mock phishing e-mail to train them to recognize malicious links that, if clicked, could cause workers to inadvertently leak sensitive data or expose company systems to damage.
"Helping employees learn to avoid costly and damaging phishing attacks, through training and internal phishing tests, is vital in protecting a business," said Clark Collett, creative manager for learning and development at ESET North America, an Internet security software firm in San Diego.
Recently published research, however, found that some common phishing tactics, such as dangling financial perks or bonuses as a lure or unfairly tricking employees and then shaming them, can do more harm than good.
"In a large-scale field experiment, we found evidence that phishing tests can indeed cause users to view cybersecurity as an agent of harm, which, in turn, evokes feelings of betrayal," said Ryan T. Wright, a professor of IT in the McIntire School of Commerce at the University of Virginia in Charlottesville.
He cited the example of website hosting company GoDaddy sending its employees an e-mail offering a link to a $650 holiday bonus. The e-mail, however, was a phishing test, and those who clicked the link were rewarded with additional training instead of the expected cash.
Another approach that may frustrate employees is mimicking work communications and practices with minor phishing clues hidden in the document or e-mail, and then penalizing people for not catching the clues, Wright said.
"The problem is when the objective of the IT security department is not providing a learning opportunity but rather tricking the employees and then shaming them when they get it wrong," he said. "Unfortunately, there are too many examples of the 'stick instead of the carrot' approach."
Wright added that incentivizing secure behaviors works better than catching people making mistakes. "It's better to celebrate people for doing something right than catch them for doing something wrong," he said. "Now, there will be fails and necessary course corrections to adhere to policies, but people are focused on doing their jobs, not on the lookout for phishing e-mails."
Collett believes phishing tests should resemble real-world phishing e-mails as much as possible and said many actual phishing e-mails do promise perks and bonuses.
"That said," explained Tony Anscombe, chief security evangelist at ESET, "the tests should be fair and have similar 'tells' that an employee could spot in a phishing e-mail, such as links that have a suspicious URL when you hover over them, or senders claiming to be someone from a particular company but mailing from a different domain. Trying to trick an employee into clicking on links or providing information in ways that a cybercriminal could never achieve should be avoided."
Collett said cybersecurity training "should be engaging and impactful in order to change behavior, but it should not be belittling or heavy-handed. If the tone of the training is positive, reinforcing and fun, the employee will feel enlightened instead of chastised."
Wright and his research co-authors offered three recommendations to help employers keep running phishing tests without hurting employee morale.
Test teams, not individuals. "If your organization is team-focused, which many are, you want to support people asking security questions of their teammates," Wright said. "If you let them know that there will be some kind of an incentive if 80 percent of the team gets the phishing test correct, then positive peer pressure is created, and they will start asking each other about suspicious e-mails, which is what you want."
On the other hand, Collett cautioned that phishing and social engineering scams are typically aimed at individuals, not teams, so it's important that an employee demonstrates an ability to independently avoid risky actions. "Discussions and live training, however, can benefit from being done as a team so that everyone can learn from each other," he said.
Encourage, don't embarrass. Wright said employers need to foster a culture of information sharing rather than embarrass employees who click on the wrong link. "Outcomes of phishing tests can often be punitive," he said. "For example, we know of one organization that gives a rubber chicken to people that get caught. Instead of awarding a rubber chicken for failing a phishing test, recognizing employees with a free coffee for correctly reporting the test to IT security and alerting their team can win buy-in for the importance of the task at hand."
Wright added that data security should be made part of an employee's performance management and development. "For most companies, failing a phishing test is separate from performance management. Even if you click on a real malicious e-mail, you just turn your laptop in to IT and get a new one. There isn't any positive development."
Gamify and reward. Some companies have turned to team-based competitions to create positive cybersecurity cultures. "Nobody remembers boring, unengaging training," Collett said. "Gamification … encourages employees to pay attention in order to do well and achieve badges and recognition. Poor-performing users should not be called out, but instead encouraged privately to learn from mistakes and go through training and challenges again with a chance to also receive rewards and recognition."
Wright said phishing detection tournaments could be held during Cybersecurity Awareness Month in October. "It's fun—you build awareness and you reward the team that did the best at identifying phishing messages."