by Sandra Zanetti
Before the invention of smartphones and computers, the spyglass, telephone, and radio revolutionized the way humans could gather information. Spying and surveillance are as old as civilization itself, but new technologies have continually extended the capabilities of espionage. Before the industrial revolution, spying was typically limited to one person or a small group relying on cohorts to gather information on those they considered enemies.1
The organized structure of surveillance that we know today is a product of globally dominant sociopolitical viewpoints and changes in the technological landscape. After the rise of global terrorism in the early 21st century, people all over the world exchanged their privacy for the promise of safety; as governments took action to prevent terrorist attacks, normalization of mass surveillance increased.
So, why would the collection of our data be a threat if “you have nothing to hide”? We all have something to worry about when huge amounts of aggregated data about us exist. Data mining is how most successful internet companies make their money and wield their influence.2 Your data can be collected and analyzed to profile not only you, but also the people in your social sphere. This information can identify those who are politically involved, have complicated immigration statuses, or practice a religion. Everything you click on, buy, search, or text helps build your digital fingerprint, which reveals far more to companies and governments than you might think.
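To see why scattered, innocuous data points add up to identification, consider a toy Python sketch. All the numbers below are illustrative assumptions, not real statistics: the point is only that each attribute narrows the pool a little, and a handful of them multiplied together can single out one person in hundreds of millions.

```python
# Toy illustration (hypothetical numbers) of how a "fingerprint" built
# from innocuous attributes can single a person out: each observed
# attribute divides the candidate pool, and a few of them multiply fast.

population = 330_000_000  # roughly the US

# Assumed share of people matching each observed attribute.
attributes = {
    "zip code":           1 / 40_000,  # ~40k ZIP-sized areas
    "birth date":         1 / 365,
    "phone model":        1 / 50,
    "browser + settings": 1 / 1_000,
}

remaining = population
for name, fraction in attributes.items():
    remaining *= fraction
    print(f"after {name}: ~{remaining:,.2f} candidates left")

# With these illustrative numbers, fewer than one candidate remains:
# the combination of attributes is effectively unique.
assert remaining < 1
```

Real attributes are not fully independent, so the arithmetic overstates the effect somewhat, but the underlying dynamic is why a profile assembled from clicks, purchases, and searches needs no name attached to point at exactly one person.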
Something as innocuous as attending a rally or going to a place of worship can be added to your digital fingerprint and then used to show you ads for something a friend mentioned in a message last week, or potentially even lead to unwarranted arrests. I can understand people who feel more secure because of mass surveillance precautions, but the collection of private data must be transparent, and since it is not, we must consider whether governments and corporations are collecting our data to protect us or to oppress us. Throughout history, it has been remarkably easy to get people to give up massive amounts of private data for the sake of convenience. Upon its mass production, the first things telephone companies built into their networks were surveillance features. Today, the devices we carry daily give governments, corporations, hackers, police, or even friends and family who decide to “play spy” huge amounts of information about you.3 Even if you agree with the government wiretapping phones and operating CCTV, those practices are just one layer of the ways you’re being watched. I’m not sure how it became normal that the default settings of apps like Snapchat allow others to see your exact location at any given moment. Some people don’t know or care about these features; others use them as a way to feel “seen” as they’re being watched.
Two dominant psychological phenomena have emerged from the culture of surveillance. The first is a dismissive numbness to being watched; the other is the desire to be “watched” as a participant in society. These thought patterns change people’s behavior, and in turn the world becomes a sort of strategy game with active players “performing” in reality. Social media plays a big role in the performative realm of surveillance, where people “curate” a version of their persona to display online. The behavioral effects of a surveillance society erode education and political discourse because they lead us to self-censor: we think twice before we go somewhere or search for something online. Bruce Schneier, of the cybersecurity program at the Kennedy School’s Belfer Center for Science and International Affairs, writes, “[The Chinese Government] wants people to self-censor, because it knows it can’t stop everybody. The idea is that if you don’t know where the line is, and the penalty for crossing it is severe, you will stay far away from it. Basic human conditioning. If your goal is to control a population,” Schneier says, “mass surveillance is awesome.”4
The US government insists that PRISM, a mass surveillance program whose scope was illuminated by Edward Snowden’s classified leaks, is vital to national security. The data mining program works by pulling private emails, video conferences, file transfers, and cloud data directly from private company servers into NSA databases. But there isn’t much evidence of mass surveillance actually stopping crime. In a 2013 government oversight hearing,5 General Keith B. Alexander, then director of the NSA, acknowledged that only “one or perhaps two” cases had been foiled with help from the NSA’s vast phone records database.
The same dismal statistics exist for the TSA’s SPOT program, which tasks security screeners with detecting “strange” emotions or behavior in airport security lines.6 Upon review of the program, the Government Accountability Office found that “not a single terrorist was caught with this method,”7 yet it is still used in airports today. SPOT has been heavily critiqued for disproportionately targeting Middle Eastern, Black, Latino, and nonbinary travelers, most likely because the protocols security personnel use are based on individualized, problematic, Western-centric understandings of the body and its behavior. “They just pull aside anyone who they don’t like the way they look — if they are black and have expensive clothes or jewelry, or if they are Hispanic,” said an anonymous officer from Logan International Airport.8
When people are at the airport, they are “performing” in security theater, much like the way they “perform” the best version of themselves online. People act as “Western” as possible: they dress differently, tell their children to speak only English, and conform to gender stereotypes, all to avoid standing out to security for fear they’ll be detained on the basis of their biology.
Since the body is thought of as a container of information, the surveillance of the body grows even more dangerous as biometric technology advances. As Russia, China, the United States, and other countries invest in technologies that track people’s body and eye movements to predict their choices, there is the possibility that every aspect of life will be microscopically observed, and this technology is deeply rooted in racial profiling.
With the emergence of the COVID-19 crisis this year, we’re seeing facial recognition, ankle bracelets, mandatory phone apps, cameras that look for fevers, sweating, and discoloration, and other geolocation tracking methods used worldwide to track people who may have been exposed to the virus. An “exposure notification” tool recently appeared in the settings of both Android phones and iPhones as part of operating system updates.9 Poland is even making quarantined citizens use a selfie app to prove they’re staying inside. Similarly, Israel released an app that sends COVID information to authorities, who are able to track people down.10
A recent survey by ORB International found that 86% of the UK population would “sacrifice their human rights to help prevent the disease.” The developers at Apple and Google claim that the technology built to track the virus will not be used to store individual data,11 but the fact of the matter is that they could if they wanted to. Once this data exists, it can easily be abused or compromised in the future. That’s why it’s imperative to know whether our data is encrypted and how long it is retained.
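For context, the Apple/Google privacy claim rests on a decentralized design. The Python sketch below is a simplified illustration, not the real protocol (it uses HMAC in place of the actual AES-based key derivation, and the function names are invented), of how phones can detect contact without any central registry of who met whom:

```python
import hashlib
import hmac
import os

# Simplified sketch of decentralized exposure notification: each phone
# keeps a secret daily key and broadcasts short-lived pseudonyms
# derived from it. (Illustrative only; the real Apple/Google protocol
# uses AES and HKDF with different parameters.)

def daily_key() -> bytes:
    """Random secret generated on-device each day; never uploaded
    unless the user tests positive and consents."""
    return os.urandom(16)

def rolling_id(key: bytes, interval: int) -> bytes:
    """Derive a pseudonym for one short time interval. Nearby phones
    record these IDs, not names, numbers, or locations."""
    return hmac.new(key, interval.to_bytes(4, "big"), hashlib.sha256).digest()[:16]

# Phone A broadcasts pseudonyms over Bluetooth during the day...
key_a = daily_key()
broadcast = [rolling_id(key_a, i) for i in range(3)]

# ...and phone B simply remembers every pseudonym it has heard nearby.
heard_by_b = set(broadcast)

# If A later tests positive, only key_a is published. B re-derives A's
# pseudonyms locally and checks for a match; no central server ever
# learns who met whom.
matches = [i for i in range(3) if rolling_id(key_a, i) in heard_by_b]
assert matches == [0, 1, 2]
```

The design choice matters: matching happens on the listener’s phone, so “not storing individual data” is a property of the architecture rather than a policy promise. But, as the essay notes, the companies control that architecture and could change it.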
Today, the United States government is pressuring companies to build backdoor access into communications networks. The problem with backdoors is that if someone “you trust” can go through them, you never know who else will eventually get through them too. In 2009, the surveillance systems Google and Microsoft used to respond to lawful surveillance requests from police were compromised by the Chinese government, which wanted to figure out which of its own agents the US government was monitoring. Another backdoor backfire came when Vodafone Greece’s surveillance system was hacked by an unknown party and the Prime Minister of Greece and members of the Greek Cabinet were wiretapped. While much of the data on your phone is encrypted by default, security flaws are discovered all the time.12
This information sits in easily searchable databases and can easily be abused, as has been the case in most failed surveillance states.
Edward Snowden calls the UK “the most extreme surveillance in the history of western democracy.” In 2015, a writer from the UK interviewed for PEN’s International Survey of Writers stated, “I believe that most UK citizens are now regularly under levels of surveillance that make the Stasi seem amateurish.”13 The millions of files the Stasi collected in the former GDR were gathered by a vast network of spies present in every nook and cranny of society, in order to maintain control over its people. The former East German state used a psychological warfare technique called Zersetzung to secretly destroy people’s self-confidence so they would censor themselves. Its other goal was to predict what people were thinking before they could act against the surveillance state. The Stasi even went so far as to record people in their bedrooms and to capture and store the scent of anyone who said anything negative about the state. The data mined in the US today is collected with a fraction of the effort the Stasi operators expended.14 The reach of the Stasi archive is nauseating, but it is actually nowhere near as large as the digital data the NSA collects from everyday Americans: according to a report by NPR, the NSA captures a billion times more data than the Stasi did.15 Take a look at the interactive map made by OpenDataCity.
Most abuses of people in surveillance states manifest because of someone’s political views, race, nation of origin, or sexual orientation. For example, the American civil rights activist Martin Luther King Jr. was labeled “The Most Dangerous Man in America” by the FBI under its Racial Matters Program, which focused on individuals and organizations involved in racial politics.16 And even before that, the 18th-century lantern laws in the United States warranted the arrest of any person of color found outside after dark without a lantern or the supervision of a white person.17
Unfortunately, these atrocious violations of human rights still stain the fabric of society today.
With the rise of the Black Lives Matter movement fighting the ill treatment of people of color by the police and justice system in the United States, we saw police use cell-site simulators, drones, social media monitoring, and even facial recognition technology to identify, profile, and arrest demonstrators, journalists covering the protests, and people on the street who had nothing to do with the protests but were caught out minutes after curfew.18 In an interview with Popular Mechanics, Dave Maass, senior investigative researcher for the Electronic Frontier Foundation (EFF), says, “Police departments are doing less outreach, less legwork, and fewer investigations. Instead, they’re relying on untested technology that can result in more abuse and corruption. Not to mention, it’s a waste of taxpayer money.”19 Surveillance today has taken an interesting turn as regular people gain more access to camera-equipped technology. Several videos from protests in the US show police using irresponsible force, abusing tear gas, rubber bullets, and batons in inappropriate situations, and a similar dynamic is playing out in the brutal police response to the demonstrations for democratic reform in Hong Kong.20 But unfortunately, this access often leads to arrests when people’s identities are revealed to authorities after facial recognition software scans through photos and videos posted online, searchable by hashtag. This technology can easily turn something like a speeding ticket into a potentially dangerous situation, especially if you happen to look just like someone with an intense criminal record.
Research has shown that these systems are even less accurate at recognizing the faces of people of color, and they come with the risk of being deployed disproportionately in marginalized communities.21 Many civil liberties groups are speaking out against the abuse of this technology, both because of the lack of transparency from government officials and companies and because it sounds like something out of a RoboCop dystopia.
The circumstances surrounding mass surveillance need to be meticulously researched beyond the blanket statements of corporations, media, and government officials. As technologies become more accessible, collecting data on massive numbers of people has become cheaper and more organized. Many people living contemporary lives seem to believe we are all forever subject to being tracked, hacked, and data-mined without a choice. While there are a few steps you can take to protect yourself, like using a VPN or a secure browser, I highly recommend speaking out and advocating for your right to privacy.
SURVEILLANCE – Field guide // zine by Sandra Zanetti
A5 zine, printed on glossy paper
1 Keith Laidler, Surveillance Unlimited: How We’ve Become the Most Watched People on Earth.
5 “NSA Prism program slides,” The Guardian, November 1, 2013.
7 GAO, “Aviation Security,” 2013.
17 Simone Browne, Dark Matters: On the Surveillance of Blackness.