When insurers calculate their premiums according to the vital data of their customers, this impacts their customers' lifestyles. Governments and intelligence agencies use similar methods, claiming to increase our security. Stock market prices change by the minute, and it's no longer humans calculating them. It's algorithms that decide where to invest, with unforeseeable consequences. In medicine, humans are reduced to the data of their vital signs.
The question of total surveillance and control has slowly arrived in mainstream discourse (thanks, Edward Snowden!) and has sparked debates about power relations, decentralization and self-empowerment. But beyond these debates, the digitalization of the world raises many more fundamental ethical questions that affect basically everyone. It's about the future of humanity and the future of our planet.
Which decisions should we as a society, or even as humanity at large, delegate to algorithms? Which areas of our lives do we want to reduce to measurements, quantifications and calculations? Which areas should we consciously keep apart from such developments? And how, and by whom, can such decisions be made? Can humanity even limit the application of its technologies, as Ivan Illich once suggested? Or is this an unstoppable momentum we have to adapt to? Will we lose the autonomy promised by the Age of Enlightenment in this digital age? And who stands to profit from it?
How big is Big Data?
Currently we produce 2.5 exabytes (2.5 billion gigabytes) of new information on a daily basis, and that number is projected to grow to a total of 44 zettabytes (44 billion terabytes) in the next three to four years. That is a lot of information, and it's growing faster and faster. "Data is the oil of the 21st century", some say. Large corporate information silos accumulate data the way oil companies extract fossil fuels. But unlike with oil, there won't be a "peak data" moment.
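The unit conversions behind these figures can be checked in a few lines of Python. This is just a sanity-check sketch, assuming decimal (SI) prefixes, i.e. 1 exabyte = 10^18 bytes and 1 zettabyte = 10^21 bytes:

```python
# Decimal (SI) prefixes: giga = 10**9, tera = 10**12, exa = 10**18, zetta = 10**21
GB, TB = 10**9, 10**12
EB, ZB = 10**18, 10**21

daily_bytes = 2.5 * EB        # ~2.5 exabytes of new data per day
projected_bytes = 44 * ZB     # projected global data volume

print(daily_bytes / GB)       # 2.5 billion gigabytes per day
print(projected_bytes / TB)   # 44 billion terabytes in total
```

So 2.5 exabytes really is 2.5 billion gigabytes, and 44 zettabytes really is 44 billion terabytes; the two figures in the text use consistent units.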
Much of this data is personal. People share it in exchange for "free" services like social networking, search engines, video platforms and email. Their trust in large corporations is mainly expressed by consenting to terms of service, which are rarely even read, much less understood.
And it's a hacker's paradise: large amounts of data on centralized servers; outdated software that can easily be exploited; more and more devices and infrastructure connected to the Internet, with "always on" as a fatal paradigm. Software is cracked, systems are hacked, data is leaking: not just to Wikileaks, but to a myriad of players, from small hacker groups to large intelligence services. No system connected to the Internet is safe, not even governments, critical infrastructure or the companies servicing the surveillance-industrial complex.
Is it possible to escape this enormous accumulation of personal data? Is encryption on a large scale a way to protect the human right to privacy? Or should we disconnect from the status quo, literally?
Meanwhile, trendy buzzwords have recently polluted the political discourse: "Fake News", "Post-Truth" and the especially creative "Alternative Facts". All of them denote a distorted reality, and they leave most people with a sense of confusion. These newly coined terms seem symptomatic of the massive changes underway in the world of journalism and new media. The gatekeepers lose control, and a wide array of diverse publishers reshape the perception of the masses. Be it Breitbart News or Infowars, lone-wolf reporters, intelligence operatives or even the press secretary of the reality-TV president Trump: professional debunkers are in high season. But it's not just fringe websites that spread falsehoods and distorted facts; it's the mainstream media too. Being (ab)used for propagandistic purposes is nothing new in their world. "Perception Management" is, after all, a term coined by the Pentagon.
Now there's talk of "fake news task forces", automatic filtering, censorship and new media regulations. But what is the actual role of algorithms? Do they merely reinforce filter bubbles? Do we, as a society, need to regulate more? Should algorithms be disclosed, listed like the ingredients on an energy drink? And can rules and regulations, censorship and filtering ever combat a problem that is ultimately social and educational?
Algorithms are ubiquitous, yet they exist in parallel to our visible world, in bits and bytes, zeros and ones, appearing to us on screens while their function remains concealed from us. They live in machines, execute orders and are inherently political. Abstract labels for technologies like the Internet or the cloud make them incomprehensible. But in fact, these invisible technologies do take up space in the physical world: data centers spread across the globe, underwater fiber-optic cables connecting continents.
Artistic practices have the power to render complex systems visible and communicate them in an understandable way. Abstract technologies thus shift from beyond comprehension into a debatable sphere, which can lead to critical engagement. Artists become activists, and the act of rendering visible becomes a counteraction against hegemony. According to James Bridle, technologies are not preexisting: they tell us stories about ourselves, revealing social demands and structures of power.
Adam Harvey's artistic response to surveillance algorithms takes exactly this counteractive approach. His work orbits around techniques, often manifested in fashion and make-up, that camouflage the face to trick facial recognition algorithms so they are unable to surveil, track or identify the individual. What unites these works is the proposition that if we don't understand how the technologies around us work, we are more likely to become their victims than to use them for our own good.
And art can help lead the way.
At the Elevate Festival in Graz, Austria, from March 1st to March 5th, many of these questions will be addressed and debated. Music and contemporary art will play an integral role throughout the festival, where numerous artists will host talks, participate in panel discussions and present groundbreaking installations.