The Human Factor: Using Behavioral Science to Counter Insider Threats
October 2012
Topics: Social Behavior, Computer Security, Safeguard and Secure Cyberspace
In 2010, when the whistleblower website Wikileaks published hundreds of thousands of State Department and military files, experts blasted the lax cybersecurity protocols that allowed for the leak. Critics pointed out that far too many users had far too easy access to material stored on classified networks. In response to the criticism, the government invested considerable effort and money in strengthening its cybersecurity systems.
But questions remain: If so many people had the opportunity to steal files from the government's secure networks, why is it that only one—alleged leaker Private First Class Bradley Manning—appears to have done so? Should identifying the weaknesses in a computer system be more important than identifying the type of person who might be tempted to exploit those weaknesses? Is countering insider threats purely a technical challenge?
Deanna Caputo, a behavioral scientist involved in MITRE's cybersecurity program, believes insider threat defenses need to address both computer systems and their users. Her work is part of the multi-pronged approach MITRE applies to the thorny challenge of protecting our nation's information infrastructure.
Behavioral scientists like Caputo, who has a doctorate in social and personality psychology, study the actions and reactions of human beings, seeking to understand individual and societal human behavior. By understanding why employees might be tempted to commit cyber-espionage against their employers, employers can put in place better systems to deter, detect, and mitigate insider threats.
A Spectrum of Responses
Caputo takes the term "deter, detect, and mitigate" from the National Insider Threat Task Force. President Obama called for the taskforce's creation in October 2011, in response to the Wikileaks scandal. The taskforce's goal is to establish governmental standards and guidance for all federal insider-threat detection programs.
Deterring insider threats may seem like an insignificant goal compared to preventing them. But you'll rarely hear Caputo or her MITRE colleagues using the term "prevent."
"Prevention is the Holy Grail of cybersecurity, but in many ways it is unreachable," she says. "So instead we try to deter employees from making bad decisions by first letting them know we're watching them and that they're likely to be caught. That's the detection part. We then let them know their employer cares if they're having a rough time and has programs in place to assist them. That's the mitigation part."
Detecting insider threats often requires cybersecurity engineers to design systems that audit user activity and flag anything that seems evasive or malicious. Behavioral scientists can provide data on what acts might constitute such negative behavior and what acts might camouflage it.
"Using what we know about human behavior," says Caputo, "behavioral scientists can advise engineers on defining normal, baseline user behavior. We can help them identify abnormal changes to that baseline and explain what those changes might mean."
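The idea of a baseline and deviations from it can be sketched in code. The following is a minimal illustration, not any system described in the article: it summarizes a user's normal daily activity (here, a hypothetical file-copy count) as a mean and spread, then flags days that fall far outside that norm. The data and threshold are invented for the example.

```python
from statistics import mean, stdev

def build_baseline(daily_counts):
    """Summarize a user's normal activity as mean and standard deviation."""
    return mean(daily_counts), stdev(daily_counts)

def is_abnormal(todays_count, baseline, threshold=3.0):
    """Flag activity more than `threshold` standard deviations above the norm."""
    mu, sigma = baseline
    if sigma == 0:
        return todays_count != mu
    return (todays_count - mu) / sigma > threshold

# Hypothetical history: a user normally copies 5-15 files a day,
# then suddenly copies 400.
history = [8, 12, 5, 10, 15, 9, 11]
baseline = build_baseline(history)
print(is_abnormal(400, baseline))  # True
print(is_abnormal(13, baseline))   # False
```

A real monitoring system would track many such signals per user; the point is only that "abnormal" is always defined relative to a per-user baseline.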
Triggers: Indicators of Suspicious Behavior
Those deviations from normal behavior are often called "triggers." Common triggers might include employees' use of removable media (Bradley Manning allegedly burned files onto a disc designed to look like a Lady Gaga CD), their use of printers or copiers far from their office, or employees logging onto the computer system in a building despite not being signed in at that building.
Cybersecurity engineers rely on triggers when designing automated monitoring systems. "Engineers are excellent at collecting lots of data neatly into one place," says Caputo. "Our job as behavioral scientists is to help them make meaning of that data. We help them define the triggers that will tell the monitoring systems which data to bubble up to human eyes."
Unfortunately, monitoring systems based on single triggers often produce false alarms. Caputo is conducting research that will lead to monitoring based on multiple triggers. "Instead of receiving alerts because someone spent too long at the wrong printer, we want to be able to tell the system, 'Don't alert me until someone hits these three triggers.'"
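The multi-trigger rule Caputo describes can be sketched as a simple conjunction over independent indicators. The trigger names and event fields below are hypothetical, chosen to echo the examples earlier in the article (removable media, a distant printer, a badge/login mismatch):

```python
# Hypothetical triggers for illustration; a real program would define
# its own indicators and data feeds.
TRIGGERS = {
    "removable_media": lambda e: e.get("usb_writes", 0) > 0,
    "remote_printer": lambda e: e.get("printer_location") not in (None, e.get("office")),
    "badge_mismatch": lambda e: e.get("login_building") != e.get("badged_building"),
}

def fired_triggers(event):
    """Return the names of all triggers that fire on this activity record."""
    return {name for name, check in TRIGGERS.items() if check(event)}

def should_alert(event, min_triggers=3):
    """Alert only when several independent triggers fire, reducing false alarms."""
    return len(fired_triggers(event)) >= min_triggers

event = {
    "usb_writes": 2,
    "printer_location": "Bldg 4",
    "office": "Bldg 1",
    "login_building": "Bldg 1",
    "badged_building": "Bldg 2",
}
print(should_alert(event))  # True: all three triggers fire
```

Requiring several triggers trades a little sensitivity for far fewer alerts sent "up to human eyes," which is exactly the tradeoff the research targets.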
For engineers, mitigating insider threats often means fixing the damage done by an insider attack. For behavioral scientists, however, it also means addressing the factors within an organization that can increase or decrease the chance of an insider attack.
"Disgruntlement and ego often play a role in motivating insider attacks," says Caputo. "To mitigate disgruntlement, organizations can provide employees with avenues to vent concerns and frustration. To mitigate ego, organizations can implement employee recognition programs that offer more public praise."
Greed, which leads employees to sell organizational secrets, can also be a motivating factor in insider attacks. "There are few mitigation factors likely to deter someone who wants to become rich without earning it," says Caputo. "But organizations can certainly address such issues as grievances stemming from perceived inequities in compensation."
Accounting for Human Nature
Insider threat detection programs should include their own behavioral scientists (or have access to behavioral scientists within other programs) to consult on insider threat issues.
"Technology always in some way involves human beings," Caputo says. "So you can't tackle a technological challenge without taking into account human nature. And the experts in human nature are behavioral scientists."
As part of one MITRE sponsor task, Caputo organized and led a working group of behavioral scientists from across the intelligence community. The group wrote a concept of operations detailing ways that insider-threat detection programs could integrate behavioral scientists.
Although MITRE is best known as a technology company, taking the lead in offering behavioral science guidance to sponsors seems natural to Caputo. "We are known for our broad range of subject-matter expertise. So when MITRE identifies an expertise that would help our sponsors address their challenges—especially in an area as complex as cybersecurity—we work hard to make it available."
—by Christopher Lockheardt