Here we go again. Earlier this month the Pentagon announced a new effort to build a system aimed at allowing it to scan billions of communications in order to detect "anomalies" in people's behavior that will predict who is about to snap and turn into a homicidal maniac — or, perhaps, leak damaging documents to a reporter.
Citing the case of Maj. Nidal Hasan, the Army psychiatrist charged with killing 13 people in Fort Hood, Texas, the Pentagon's Defense Advanced Research Projects Agency (DARPA) wants to try to identify, before they happen, "malevolent actions" by insiders within the military. (See coverage by Wired, CNN, or Government Security News.)
The new project is called ADAMS, for Anomaly Detection at Multiple Scales, and anyone who remembers the battles over the Bush Administration's "Total Information Awareness" (TIA) program may be experiencing a major flashback right about now. TIA, also a DARPA project, was based on a vision of pulling together as much information as possible about as many people as possible into an "ultra-large-scale" database, making that information available to government officials, and sorting through it to try to identify terrorists. Eventually shut down by Congress, it was probably the closest thing to a truly comprehensive monitor-everyone "Big Brother" program that has ever been seriously contemplated in the United States. And many of the problems with TIA are equally present with this ADAMS project.
For one thing, the idea is naïve and misguided, and it won't work. Statistical data mining has proven useful in certain areas, such as detecting credit card fraud, where there is a large body of examples from which patterns can be identified. But as experts have said, data mining is poor at predicting highly unusual events, precisely because no such body of examples exists. In fact, with some things there are no patterns at all. As my colleague Mike German often points out — and he used to work undercover on anti-terrorism cases for the FBI — empirical studies show that there is no such thing as a reliable profile that will predict violent behavior. Incidents in which people turn into homicidal maniacs and begin shooting up their offices are extremely rare, and each one has unique origins in the individual psychology, circumstances and life history of the perpetrator. It would probably take a treatment of the length and sensitivity of War and Peace to truly get at the factors that cause an individual to turn into a Maj. Hasan or any other workplace shooter.
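The experts' point about rare events can be made concrete with a little arithmetic. Here is a minimal sketch of the base-rate problem; the numbers are illustrative assumptions, not DARPA figures, and the "detector" is far more accurate than anything real-world data mining could plausibly achieve:

```python
# Illustration of the base-rate problem: even an implausibly accurate
# anomaly detector drowns in false positives when the event is rare.
# All numbers below are assumptions for the sake of the example.

population = 1_000_000      # people whose communications are scanned
true_threats = 5            # actual would-be perpetrators (assumed)
sensitivity = 0.99          # P(flagged | real threat) -- assumed
false_positive_rate = 0.01  # P(flagged | innocent person) -- assumed

flagged_threats = true_threats * sensitivity
flagged_innocents = (population - true_threats) * false_positive_rate

# Of everyone the system flags, what fraction is actually a threat?
precision = flagged_threats / (flagged_threats + flagged_innocents)

print(f"Total people flagged: {flagged_threats + flagged_innocents:,.0f}")
print(f"Chance a flagged person is a real threat: {precision:.3%}")
```

Under these generous assumptions, roughly ten thousand innocent people get flagged for every handful of real threats, so the odds that any given flagged person is actually dangerous are a fraction of one percent. That is the arithmetic behind the claim that such systems cannot work.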
The notion that a computer program can identify such a person — and before they act, by reading their e-mail? That's just silly. In its announcement DARPA writes:

Each time we see an incident like a soldier in good mental health becoming homicidal or suicidal or an innocent insider becoming malicious we wonder why we didn't see it coming…. We generally don't have a good understanding of normal versus anomalous behaviors and how these manifest themselves in the data.

The fact that "we generally don't have a good understanding" of "normal versus anomalous behaviors" — I think that is how an engineer expresses what is also known as "the human condition." Contrast the boundless variety, complexity and depth of human experience with the boyishly naïve language DARPA uses to define what it is trying to identify:

We define insider threat as malevolent (or possibly inadvertent) actions by an already trusted person in a secure environment with access to sensitive information and information systems and sources. The focus is on malevolent insiders that started out as "good guys." The specific goal of ADAMS is to detect anomalous behaviors before or shortly after they turn.

So the military wants a computer program that will detect "good guys" who "turn" — as if tragedies such as the Maj. Hasan case result from the simplistic twists of a bad Saturday morning cartoon. And they want it not only to do this, but also to detect accidental bad behavior? They're not after a computer program; they want a fortune teller.
On the other hand, considering that the Pentagon has also spent millions trying to learn how to stop the hearts of goats using mind control and a lot of other nutty things, perhaps none of this is truly surprising. But it does have more serious implications for Americans' privacy than attempts to harness psychic and occult powers by military intelligence units (as described in The Men Who Stare at Goats — a work of investigative journalism not to be confused with the George Clooney movie of the same name, which was only loosely based on the book).
What might those broader privacy implications be?
To begin with, it will of course represent a massive invasion of privacy, and violate the centuries-old Anglo-American principle that we don't spy on everyone just in case someone does something bad. It is true that people who join the military have reduced rights in some respects, and employers have certain rights to monitor their employees' work performance. This, however, goes way beyond even the ridiculous amount of IT monitoring and control that takes place in many American offices today.
Consider the privacy invasion that will be involved just in trying to create and test this system. ADAMS algorithm developers, according to page 6 of DARPA's proposal, will be provided with "massive amounts of data on an unprecedented scale." And while DARPA is "open to possible restrictions on the accessibility of collected data" in recognition of "the sensitivity of data collected from…live systems," they declare, "the fewer and milder restrictions on data access, the better." What does it say that our soldiers sign up to defend our democracy only to have the government rifle through their personal records in this way?
Second, this kind of monitoring could expand to many other places beyond military bases. DARPA itself says "technology developed for ADAMS will have applicability in many domains," and they are only focusing on the problem of insider threats "to make sure that the work is well grounded." This system may just waste a lot of money and time and collapse of its own stupidity like the goat-staring program — but despite repeated discrediting by experts, the use of data mining to identify "bad guys" has been embraced by elements of our security establishment. In the worst case, this system could become one of those useless yet prevalent things (as ID checks often are) that security people within a lot of organizations (including the private sector) feel they must impose out of due diligence, in order to protect themselves in the event that some incredibly rare attack takes place. And more of our privacy will disappear.
Third, if actually deployed, this system will inevitably expand in function. Note that DARPA defines the threat as insiders with "access to sensitive information and information systems and sources." This suggests the military could be using the Maj. Hasan tragedy to sell this system — but is really thinking more in terms of Wikileaks and other cases where the public learns things the military does not want them to. But even if the military truly believes that this system will be effective in identifying, through e-mail analysis, people who are about to snap, its uselessness for that function combined with the breadth of the spying involved will inevitably lead to its being used for all kinds of other purposes.
For example, a wide range of behavior by members of the military can result in court-martial, including adultery and any behavior deemed "conduct unbecoming an officer and a gentleman." (Funnily enough, the act of "opening and reading a letter of another without authority" is cited as an example of unbecoming conduct in the official U.S. "Manual for Courts-Martial." With this system they will have their authority no doubt — but still, how ironic that preserved within military tradition is this echo of once-firm ethical standards, which for example led Secretary of State Henry Stimson to terminate a spy program in 1929 with the declaration that "gentlemen do not read other people's mail.")
Some company or companies out there will have a feast on the $35 million DARPA is doling out for this project, that's for sure — but it's not money well spent. It's disappointing that our military would focus its resources on bringing into being this kind of Orwellian (yet in all likelihood useless) tool, when our country is facing so many other needs.
Perhaps what we really need is a computer program that will spot kooky, naïve and un-American initiatives within the Pentagon before anyone starts wasting millions of dollars on them.