Big Brother with Big Data

In a development that threatens to haunt the Obama presidency and smother democracy itself, a new surveillance program employing “Big Data” has been exposed by a 29-year-old whistleblower, Edward Snowden, hidden for now in Hong Kong. The New York Times has editorialized that President Barack Obama “has lost all credibility on this issue.” Obama himself says he “welcomes” a conversation on the crisis, just as he previously called on Congress to “rein in” his own administration, both signs that the tiger may be riding the president rather than the other way around. 

Aside from Senators Mark Udall (D-CO) and Ron Wyden (D-OR), Congress is being humiliated by a broad failure to discharge its oversight authority. The public, perhaps stunned by a cascade of crises, registers mixed feelings in the immediate polling. 

While much more will be uncovered in the days ahead, it is important to underscore what is new about this latest crisis as the concerned public attempts to grapple with it. 

It is a crisis of democracy in the digital age. Ninety percent of the data in the world today has been created in the past two years, and the volume is expected to double every two years, according to IBM. Instead of being drowned by this tyranny of information, corporate and state agencies have found ways to harness it. It may be equivalent in historic significance to the creation of the atomic bomb – an information bomb in the hands of a narrow elite. 

At this point, the public relies only on whistleblowers like Snowden, Julian Assange, Bradley Manning, and John Kiriakou – to cite some of the most well known – to reveal the existence of these classified Big Brother systems. Of course, a rational system under democratic control would be far superior to individual whistleblowing, but there is no such system on the horizon. The Congressional intelligence committees, originally mandated as a result of CIA scandals revealed in the 1970s, have outlived their viability. For example, the Senate leadership cannot even release a 6,000-page internal report on the administration’s targeted killing program because the responsible executive branch has deemed it classified. 

The current program has a creeping history. In Vietnam, there was the US computer-driven obsession with the “body count” to measure progress. Years later, Senator Daniel Patrick Moynihan noted that classified information was increasing, not decreasing, even after the Cold War ended in the collapse of the Soviet bloc. According to Garry Wills, in two years of the Clinton administration (1995-96) there was a 62 percent jump in classified material – to six million documents. After the 9/11 attacks, the government classified 11.3 million, 14.2 million and 15.6 million in the following three years. 

The Authorization for Use of Military Force and the Patriot Act, both passed after 9/11, are used to authorize everything from warrantless wiretapping to secret drone killings and now the new “Big Data” escalation. When the Bush administration floated its idea of a Total Information Awareness program in 2003, it was declared suspended after significant outcry. But like a mushroom in the night, the program just keeps growing. The rise of digital companies such as Facebook amplified the possibilities, even when those companies tried to resist. 

Left to itself, the “Big Data” system upends centuries of struggle for the rule of law, checks and balances, bureaucratic accountability and the privacy of individuals before state and corporation. The crisis cries out for congressional hearings at the very least, yet a complicit Congress is unlikely to act. 

Quite by chance, Foreign Affairs, the forum of the national security establishment, published a revealing cover article on “Big Data” just this month. While expressing alarm at the implications, the Foreign Affairs essay assumes that the program is a logical outcome of technological advancement rather than political failure. 

In its essence, the “Big Data” system replaces the concept of causality with correlation, mining new mountains of data to predict patterns in general. “A bit of inaccuracy can be tolerated,” the authors say, and “the obsession with accuracy and precision is in some ways an artifact of an information-constrained environment.” We are entering the world of “datafication” where, in statistical terminology, “n = all.” 
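The statistical pitfall in substituting correlation for causality can be made concrete: mine enough variables against a target and a sizable correlation will appear by pure chance, with no causal link behind it. A minimal sketch (all numbers are illustrative, and the “predictors” here are deliberately pure noise):

```python
import random
import statistics

random.seed(0)

def corr(x, y):
    """Pearson correlation of two equal-length lists."""
    mx, my = statistics.mean(x), statistics.mean(y)
    sx, sy = statistics.pstdev(x), statistics.pstdev(y)
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (len(x) * sx * sy)

n = 100        # observations per variable
trials = 1000  # candidate "predictor" variables, all independent noise

# A target series with no relationship to any predictor.
target = [random.gauss(0, 1) for _ in range(n)]

# Mining: test every noise variable and keep the strongest correlation found.
best = max(
    abs(corr(target, [random.gauss(0, 1) for _ in range(n)]))
    for _ in range(trials)
)
print(f"strongest spurious correlation found: {best:.2f}")
```

Even though every predictor is random noise, the best correlation dredged from a thousand of them is far from zero – a data-mining artifact, not a discovery, which is precisely why “tolerating a bit of inaccuracy” at scale is not a harmless trade.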

In practical terms, this means racial profiling with a scientific twist. Or “signature strikes” to target victims in a certain age group on a street. Or adopting “predictive policing” using secret databases lacking meaningful mechanisms of appeal. Using Big Data to spy on consumer and voter behavior is already rampant. The Obama campaign combined its vaunted grassroots volunteer effort with the most sophisticated Big Data techniques in the history of campaigns to identify, classify and motivate voters. 

But Obama himself should know that as a candidate he was an exception to the statistical rules. As a constitutional lawyer, he should understand that individual variance is at the heart of the social order, and the right to be judged as an individual is sacrosanct. Not everything is quantifiable. How could there be inventions if that were so? The Foreign Affairs writers make the stunning assertion that “if Henry Ford had queried big-data algorithms to discover what his customers wanted, they would have come back with ‘a faster horse.’”

It is never too late to retreat from what Barbara Tuchman called “the march of folly.” Whatever responsibility he bears, Obama is smart to call for “conversations” about drones, targeted killing, and Big Data. The public, judiciary, and media should take him up on the offer before the crisis spirals down. 
