Bigger Than You Can Imagine

Paranoia and privacy law

by Patrick McMenamin

Illustration by Julie Kwon

published November 14, 2014


Big Data. Admittedly, it’s a great phrase, encapsulating every bigger-better progress narrative in a tight trisyllabic punch. As a term, it doesn’t mean all that much—merely any collection of data large enough to challenge previous data-processing capacities. But its near-blind invocation in media distances us from it, makes us nod our heads, awestruck at anyone’s ability to parse through it all and blithely toss it back to us.

The truth is, though, we’ve always swum in seas of data. Walk down the aisles of any modern supermarket and watch as the generic boxed brands array into shifting apparitions of shape and color. Or even in the woods, try really hard to process everything at once. Of course we can’t take in everything; this is obvious stuff. The question then becomes how one can extrapolate from the sensory data one is given—how one can approach the whole.

Discussions around big data amplify a drama already taking place at an individual level—how to make one’s trivial individual perceptions meaningful in the context of others. The rise of the Internet of Things, the touted next wave of the Internet economy, plays this drama out at a nearly parodic level. Referring to the connection of daily devices to the Internet, the Internet of Things takes technologies that are largely kitsch—a fridge device that reminds you when you’re low on eggs, an umbrella that glows blue when it’s raining outside, a jacket that hugs you when someone likes your Facebook post—as the basis of an entirely new economy. Total immersion in the Internet lets every possible need not only be met, but predicted. Cisco predicts $14.4 trillion in profits by 2022 from the Internet of Things.

For Cisco, the real value of the Internet of Things lies not in the individual benefits offered by the products, but in the data it produces—data from every nook and cranny of human life. Collected, analyzed, and put to use, this data becomes the fabric uniting all of human life—tying your egg-eating habits with your workplace productivity and the season’s climate patterns.

Imagine what could be done with the instant knowledge of everything’s connections. Whether the dawn of an increasingly oppressive surveillance state or of techno-communism, big data becomes the very texture of connectedness. It’s the self-serious thinker’s dream: uniting knowledge of everything that has ever happened with a complete presence in the moment itself. The collapse of critical reflection into the technological instant.

But don’t forget that this data is made, that it all emerges from individual experiences. If big data can link everything and everyone, then who threads together the fabric? Who holds on to the sheet ends?


“Privacy-sensitive elements may be only latent in the data, made visible only by analytics (including those not yet invented), or by fusion with other data sources (including those not yet known)” –President’s Council of Advisors on Science and Technology, Big Data and Privacy: A Technological Perspective

This past May, the President’s Council of Advisors on Science and Technology released a report about the possibilities of regulating privacy in Big Data. The report, "Big Data and Privacy: A Technological Perspective," separates Big Data’s process into three categories: its collection from individual devices, its analysis, and its use by corporations or the government. This process describes a cycle, where corporations separate data from individuals, analyze it, and return it to them in the form of new products and services.

The PCAST report concludes that the government should only protect privacy in the use of this data. In part, this emerges from a pragmatic concern: individuals already give away so much of the data they produce. In glossed-over privacy agreements and unthought-of filler data—the GPS data produced between uses of Maps—one has already forfeited any claim to data production. There’s a reason this data is often referred to as “data exhaust”: it feels like a byproduct to us—who cares if someone wants to know about my egg consumption? But just as the invisible exhaust of a car becomes meaningful as soon as it combines with others and is trapped in the atmosphere, so too does data exhaust—added and analyzed—become something meaningful and useful. And the uses emerging from this combination seem increasingly endless as data collection extends further and further into everyday life.

By the same logic that ties economic creativity to unregulated markets, the PCAST report opts not to regulate data analysis on the grounds that the new data processes necessary for big data can only emerge independently. The report doesn’t even see analysis as a type of use. It occurs in the passive voice, even as corporations and the government both collect and use the data. The data appears to organize itself, relying on its own connections—fusions “not yet known” to anyone collecting or using it—to create its own “analytic models.” In reality, however, this analysis occurs within the corporations who gather and use the data. And this analysis ties directly into its uses, allowing corporations to invoke the First Amendment to challenge even the government’s regulation of use: they can claim use regulation violates their freedom of expression, their right to express the meanings found in data in new products, services, and advertising schemes. A similar strategy of regulating use in the Fair Credit Reporting Act of 1970—a law to protect the privacy of people’s credit histories—has done little to curb the near endless applications of these histories.

Yet, it still remains incredibly hard to care about this data. The separation of analysis from collection and use makes it hard for any one data-creator to see the ultimate power of her creation. Funneled almost exclusively through corporations, this data merely seems to build into better Amazon suggestions or more responsive customer service—the products returned back to us in use. But what about the uses hidden as analyses? What about increased government surveillance, targeted policing, more firmly controlled labor practices? And what if we were to seriously consider big data as the connective fabric of everything?


“It means this War was never political at all, the politics was all theatre, all just to keep the people distracted...secretly, it was being dictated instead by the means of a conspiracy between human beings and techniques, by something that needed the energy-burst of war” –Thomas Pynchon, Gravity’s Rainbow

Thomas Pynchon’s 1973 novel Gravity’s Rainbow finds access points to the data of everything through an extensive depiction of paranoia. The novel’s characters desperately try to connect the sensory data of WWII’s ruins and make meaning out of their absurd individual experience. A wartime psychological agency notices that the locations of Lt. Tyrone Slothrop’s sexual exploits predict V-2 rocket strikes in London, a leader of a separatist group of African rocket technicians begins to treat the rocket’s form as the “holy Text” of the war, a Pavlovian psychologist grapples with human conditioning sent back from the future. Paranoia becomes the only way for Pynchon’s characters to even start to glimpse meaning behind these wartime experiences, each character’s distinctive paranoia revealing unique constructions of data. These paranoias are validated in their ability to be put to use—to facilitate survival, skirt wartime powers, find others.

Yet, Pynchon’s paranoia does not follow directly from its use. Rather, it extrapolates from the distinctive character of one’s experience. As the novel’s chief paranoiac, Slothrop fills out the novel’s descriptions of the V-2 rocket with his sexually bent theories of power. But even his paranoia remains incomplete: there is no moment of reckoning or absolute insight. One comes to understand that the novel’s ellipses make this just one reading of the war’s data, that even in such a long book, a single paranoia cannot be fully carried through in description. Emerging from individual experience, this paranoia opens boundless horizons. At the same time, paranoia connects the characters as a shared way of viewing the world; it projects their personal experience outward. Paranoia becomes the entry point to the full data of humanity, the carrier of personal identity into collective space.

These paranoiacs come to see the War's ruins as being "in perfect working order," the War becoming merely a pretext for the self-organizing creativity of technology. While corporations have a fairly obvious interest in separating analysis from use, it’s harder to imagine why a rocket-worshipping separatist group would make this the basis of its resistance. Pynchon challenges us to understand paranoia as a creative process—projecting one’s own desires and needs into technologies and external data. The characters of Gravity’s Rainbow see their own experience as trivial and absurd—on the level of a Facebook hugging jacket—yet understand that this experience contributes to collective meaning. This contribution becomes their access point, opening up the possibility of their action within collective experience.

So who’s paranoid about big data? By holding analysis at a distance, it would appear that the government is. Within the PCAST report, this makes sense: only corporations and the government are really considered as possible data-collectors or users. For them, paranoia becomes a way to justify increased collection and access to data. Their desire for increased access expands outward under the guise of natural, self-regulating creativity within technology itself. At the same time, by emphasizing the ownership of private data, the PCAST report deflects anyone else’s claims to analysis or access. And we already don’t really care about ownership. Privacy protection assumes an interest in the trivial data being collected, even as that data’s real power only emerges in its analysis and use. This helps to explain why, while it’s hard to be anything more than reluctantly amused to discover that GE listens to your lightbulbs, it would be uncomfortable to find your electricity being limited by your light-switching habits. Yet privacy protection makes this link harder to see. This plays out with incredibly high stakes too: think of the predictions in policy, corporate strategy, and military action. This data comes back to constitute who you are in the world.

Data’s collection, analysis, and uses have real effects on how one lives. But this only happens once that data comes together, once we see the connections. Paranoia becomes a way to not only access that meaning, but to represent one’s own stake in it. In Gravity’s Rainbow, paranoia’s access becomes the vehicle for collective resistance, understanding, and even romantic love. Pynchon’s characters think of data’s meaning as a process of movement—the traces of the V-2 rocket’s parabola—and not as stable possessions.

“Getting feedback, making connections, reducing the error, trying to learn the real function...zeroing in on what incalculable plot?” Gravity’s Rainbow shows that the connections and meanings contained within any kind of data remain “incalculable” to the individual: one cannot grasp the entirety of any experience in isolation. Rather, in carrying the specific paranoid links of our experience into the shared space where these links congregate, we start to glimpse the movements that shape our experience. And in these movements, the possibilities of our action.