In Black Mirror’s episode “The Entire History of You,” citizens document their entire lives using a seemingly ubiquitous memory implant called the Grain. The implant, paired with an eye lens, automatically records first-person video and audio, and it provides a clever user interface for navigating digital memories. “Re-dos,” as replays are called, let users sift through a lifetime of content and locate one very particular moment of footage in just a few blinks. From Grain scans at security checkpoints to re-do watching at the dinner table, the first- and second-order effects of this technology are vivid and, viewed from a distance, deeply contentious.
The episode’s plot, which centers on a suspected cheating scandal, dramatizes both the macro- and micro-level ethical concerns of living in a surveillance society.
Mass surveillance, the monitoring of large groups of citizens, is a deeply controversial topic. A subtle upside of the Grain, at least in theory, is the continual strengthening of the truth. With always-on recording, no individual is incentivized to lie, because everyone knows that, in due time, they will be caught and tried against documented footage. Aware of this monitoring, citizens cannot hide from or escape the truth. Through an absolutist or utilitarian lens, this is fair and good for society. In this quasi-utopia, bad people are punished for doing bad things and good people are rewarded for doing good things. The simplicity of “the truth” empowers the individual, restoring agency such that people feel in complete control of their realities. Surveillance, in essence, becomes a great equalizing force.
While romantically appealing, this grand vision of fairness never quite comes to fruition in practice, because it rests on several faulty assumptions. “Big Brother worlds” assume complete rationality from every participating player: societies regulated by 1s and 0s, by machine-readable instructions that dictate right and wrong. The real world is far messier and more abstract. Humans, uniquely, have egos. Humans make superficially unreasonable choices all the time. Humans mess up. Should every individual fear failure? Should small mistakes draw blanket punishments? Economic theory tells us that this kind of friction, this kind of fear, lowers societal productivity: people become too afraid to start risky projects, experiment with fledgling ideas, or learn new things. Mass surveillance disincentivizes risk-taking, a critical ingredient of any high-output society.
Another major ethical problem depicted throughout the episode is that the Grain’s “absolute truth” is not so absolute after all. A user can edit or archive his or her own footage. Proponents of total surveillance would find extreme fault with this ephemerality: what is the point of monitoring citizens if any user can simply delete past events as though they never happened? While privacy advocates would defend these features, it is also important to recognize just how easy they make it to manipulate Grain content. The clipping tool is a recipe for disaster; stripping the context from a specific situation makes it easy to frame and blackmail others.
The sheer volume of content the Grain produces presents another world-altering moral quandary. High-definition memories are extremely addictive. Whether reliving time spent with a loved one or analyzing their performance in a job interview, Grain users spend a significant portion of their days watching and rewatching archival history. So readily accessible, the past effectively becomes a living part of the present. This is reality TV at scale, a dangerous rabbit hole of time spent glorifying ego and feeding self-obsession. Are we really living if our minds are fixated on the past? Still, one hidden benefit of reflecting on past memories is a heightened level of self-awareness: citizens come to better understand their own biases, strengths, and flaws.
Above, I have outlined two core themes from the story, surveillance and presence, both of which manifest throughout modern society in a variety of ways.
Transparent monitoring as a vehicle for truth-seeking is, in my mind, a valiant effort. I certainly believe in agency and equal opportunity, and in many ways the truth that surveillance reveals surfaces those qualities. Problems arise, though, when citizens begin to ask (a) who owns whose data, and (b) what all of this data is really being used for. In 2018, we have seen a serious backlash against major American tech companies (Facebook, Google, etc.) whose primary revenue model relies on the widespread collection of consumer data. This type of tracking is far more heavily anonymized than the scenes depicted in Black Mirror, but it still raises the question: should we be comfortable entrusting large corporations with our most intimate data, from personal information to browser history to communication details? While it is convenient to ‘Check out with Apple Pay’ or ‘Log in with Facebook,’ we must keep asking ourselves whether we are comfortable with the potential long-term implications of those decisions. Which third parties should we trust? This dilemma is only magnified in countries like China, where a social credit system is used to govern the masses.
A solution may come in the form of market disruption: we are already beginning to see waves of support for decentralized systems (Bitcoin, Blockstack, etc.) that, at least in theory, empower individuals to own and retain their information. I also believe in distributed surveillance systems, like Citizen, that help keep cities safe and citizens aware of the dangers around them.
The other subject implicitly discussed throughout the episode is presence and technological addiction. In the past twenty years, we have watched software eat the world. While I remain very bullish on tech, I worry that tech addiction is a real problem, one that contributes to widespread depression and loneliness. We already spend hours on end glued to our smartphone screens; what happens when technology like the Grain is actually embedded in our biology? I think the solution to this “disease” is the re-establishment of fun, community-based networks. I believe we can find positive-sum ways to leverage technology that amplify human potential and help us all live better, happier lives.
Also published on Medium.