
The history of malware is the history of inventing multiple forms of attack and defence, of borders and breaches, of evolutionary programmes, artificial life and system crashes (Parikka 2016). It is also an invention of different forms of artificiality that vary in scale from individual computers to entire infrastructures, with much in between. Malware forms such as computer viruses and worms are kinds of speculative computing with a long lineage of ideas about networking, connection, security and contagion. They are speculative software in the manner that Matthew Fuller defined as investigating the possibilities of programming: “Software as science fiction, as mutant epistemology” (Fuller 2003, 30). As an art of the artificial, computer viruses have been likened to artificial life, but this artificiality also includes a parallel trajectory. Malware is about trickery in the same fundamental sense in which Vilém Flusser described art and design, suggesting that the word ‘artifice’ shares its roots with ‘trickster’ (Flusser 1999, 18; see also Singleton 2015).

Malware is a bag of tricks for the designer - after infection things don’t look the same, scales are distorted, interfaces are taken over, maps are redrawn, routes are rerouted, connections are slowed down to a snail's pace, much is stolen, and things are twisted to the perpetrator’s advantage. Of course, much of this could be said to pertain to any operation of power. Perhaps, in short, malware is the truth about software.

Malware probes a space of significant mutation that has effects across networked environments and challenges our knowledge of software. What counts as malicious and what counts as accidental? How can software define damage in an age when normalised social media activity already seems to be a form of self-harm?

Inventing a particular form of computer accident implicitly means inventing an infrastructural accident, demonstrated in geopolitical terms. From the mass sabotage of the Ukrainian grid to the viral targeting of Iranian nuclear facilities, we are faced with a multitude of questions. What does contemporary geopolitics tell us about the history of malware, and what does the trickery of malware mean for design? How do artistic experiments with viruses and malware relate to the theme of operational arts, to use a term defined by Jimena Canales?

Theory and Experiments: Jump Conditions

What was science fiction in the 1970s became a real security threat by the late 1980s. Novels such as John Brunner’s The Shockwave Rider and David Gerrold’s When HARLIE Was One introduced alternative imaginaries of data networks in the mid-’70s, articulating the slow transition towards a new sense of computer security on a vast scale involving AI agents and automated system processes. Viral contagion, linked to a variety of themes in the contemporary politics of data and networks, was soon being discussed in computer science too, some ten years before the cyberpunk literature of the 1980s.

Fred Cohen’s 1984 paper “Computer Viruses – Theory and Experiments” is the first focused account of computer viruses as a threat to networked systems, implying a plethora of issues about transmission and communication, access protocols and control. It outlines the stakes of computer contagion within architectures of connectivity, suggesting that the issue is how to balance communication and security while acknowledging that no such architecture can be completely safe. In the paper’s more formalised terms, this means finding a solution between complete connectionism (and its infectious dangers) and complete isolationism, through the management of computer traffic.

Cohen’s example presented one version of the key algorithmic condition, the ‘if / then’ pairing that has become an instrumental way of understanding contemporary culture and subjectivity: if a certain condition is met, then it triggers a payload. What Cohen’s early paper describes as the basic function of viral malware behaviour is also a generic way of understanding algorithmic culture (Bucher 2018).
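In schematic terms, the ‘if / then’ pairing at the heart of Cohen’s account can be sketched as a conditional trigger. The following Python fragment is an illustrative, harmless stand-in (not Cohen’s own pseudocode); its hypothetical trigger borrows the Friday the 13th date logic associated with the Jerusalem virus discussed below:

```python
from datetime import date

def trigger_condition(today: date) -> bool:
    # Hypothetical trigger: any Friday the 13th
    # (weekday() == 4 means Friday).
    return today.day == 13 and today.weekday() == 4

def viral_routine(today: date) -> str:
    # Cohen-style 'if / then' pairing: if the condition is met,
    # then the payload runs; otherwise the code lies dormant,
    # free to keep copying itself unnoticed.
    if trigger_condition(today):
        return "payload executed"  # stand-in for the destructive routine
    return "dormant"
```

The point is not the specific date check but the structure: a dormant routine that senses its environment and executes only when a condition holds.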

This is also the history of obeying commands. Beyond mere mechanical instructions that are followed blindly, conditional pairing introduces what Friedrich Kittler might call ‘non-trivial machine subjects’ (Winthrop-Young 2011, 136) that sense their environment and include sensorial input in their real-time decision making. The technological equivalent is the self-guiding cruise missile, constantly aware of its geographical environment and of the path towards its target; the many complex forms of autonomous vehicles today are a testament to this sort of AI at a larger scale. If such machines resemble cruise missiles in some sense, the missile in turn becomes a model for malware and for autonomous systems, with ‘if / then’ capacities responding to system environments and their weaknesses.

Hence, following Nick Dyer-Witheford and Svitlana Matviyenko (2019), one can argue that malware becomes a particular form of cyberwar, while questions of non-military security (Operations Other Than War, see Canales 2014) become embedded in the multiplied scales of complexity since Cohen’s simple model. Place that model into the escalated ecology of technological possibilities: botnets and zombie computer networks, infrastructural data capture both in hardware and across digital platforms, multiple forms of executability through weapon systems, and the infrastructural feedback loop that ensures the ‘if / then’ system can affect, for example, the functioning of a power grid.

One starts to see the 40-year history of computer viruses as a slow transition from science fictional autonomous, non-human agents to full-scale distributed systems capable of affecting large planetary events. This is the general accident of malware as the new normal.

A Generalised Accident

Cohen’s early work was one of the key reference points in computer virus literature, but of course not the only one. While real world instances of malware incidents were relatively rare until the early 1990s, the discussion of viruses (and then later worms, Trojans and others) became part of the wider context of security concerns in the cybernetic military sense too. Who controls which systems? Which feedback loops and command structures define the contested metaphorical, epistemological and contemporary infrastructural space?

One can also read the narrative of the symbolic politics of contested space through viruses. An apt example would be the Friday the 13th virus (also known as Jerusalem), detected in Israel in 1987 and triggered on 13 May 1988, a date said to refer to the last day of pre-Israeli Palestine, 13 May 1948, marking the 40th anniversary as its geopolitical message. There is a clear case to discuss viruses and malware as part of the history of civic resistance and disturbance, perhaps in certain ways as an extension of tactical media, but also to articulate a different sense of territory that is not just symbolically significant.

As an insight into geopolitical disturbances, malware can function as an interruption of logistics. It problematises – and reverses – certain control and command structures that define system access and operability. This is where a cross-reading of contemporary assessments of logistics becomes a useful way to see malware as programmable preparation for tactical interruption.

Malware fits nicely into the lineage of events and practices that Ned Rossiter names ‘logistical nightmares’ – they range from unruly workers to software glitches, from traffic gridlocks to inventory blowouts. Operational chains of transport and communication are vulnerable to hiccoughs at multiple scales, where the contagious virus can bog down system processes or engage in a massive denial of service, becoming a form of intervention into the service economy itself (on denials of service, see Parikka 2015). The virus, for all its baggage as a symbol of cyberpunk fantasies, becomes one way of modelling spatial protocols of interruption (compare with Easterling 2016, 258) and as such is useful in evaluating the ground-level impact of digital networks, as well as the behaviour of all sorts of distributed agents that roam this logistical space.

Aptly, Benjamin Bratton (n.d.) inverted Paul Virilio’s famous dictum that every technology invents its specific accident, suggesting that it also works the other way around: every accident is the invention of a new technology. This implies a cascading line of possibilities in which accidents are not merely events bound to one catastrophic failure or malfunction, but probes into alternative ways of organising the material world around seeming failures, glitches, hiccoughs, and mutations. While speculative software mutates epistemologies, it mobilises accidents that are ontogenetically mutant: alternative worlds emerge that reform our understanding of space, infrastructure, logistics, and power. To quote Keller Easterling (2017), “Culture’s spectacular failures together with the underexploited powers of medium inspire alternative ways to register the design imagination.”

Programming People

From control and command, we shift to control, command and trickery. The executable routines of malware and viruses work in several ways, including social engineering that programmes people. Two earlier, well-known cases illustrate the point. The first is the Michelangelo virus of 1992. One of the most feared of the early malware attacks, it had already spread widely before its malicious payload was ever triggered. To use a term from Richard Grusin (2010), the virus was constantly ‘premediated’, a term describing the cultural logic that ensures that the future arrives ahead of its time, as an affective reality in the present. This is not just part of the 21st century mediascape after 9/11 (which is Grusin’s focus and framing), but a concept already embedded in some of the security discourses of earlier decades.

Out of five million predicted infections, Michelangelo actually affected 5,000 to 10,000 machines. Nevertheless, its effects were already felt in users’ defensive security preparations: operating procedures, anti-virus sales, daily office habits, security training courses, and other systems that trained and fine-tuned the awareness of users sitting in front of computers. This is the wider context of malware as programming on a societal level, cultivating a sensibility for software and security.

The second case, the infamous ILOVEYOU email worm of 2000, exploited user interaction. Luring the user into opening an email attachment, the worm is part of the history of clickbait: tricking the drowsy morning user, half-conscious from lack of sleep, boredom, or the sheer amount of email lulling them into a zombie-esque state (see Sampson 2012), into mistaking algorithmic bait for a declaration of love. As such, the worm’s trickery incorporates human affects into the malware script as the clinching part of the trigger. It also proved that, beyond simple scripts, programming people is a particularly effective part of the automation of digital content, in ways that exceed the usual characteristics of what we think of as automated malware.

Indeed, if social engineering – a key skill for spies, intelligence agencies, and consumer marketing – is also the operative principle of viruses and worms, we should remind ourselves of what the Critical Engineering Working Group’s manifesto (2011) said about the powers of technics: engineering encompasses the engineering of people as much as of machines, our notion of a system incorporates both, and any account of processes of influence must address this deeper level of manipulation and effect. It also responds to what YoHa identify as the cunning skill of evil media: how “the directive aspects of technical objects unfold into the objects’ yearning for completion by the people, objects, and worlds that use them” (Yokokoji and Harwood 2016, 64).

Operational Arts

As speculative software, malware maps what programming can do across multiple scales. While it operates in system-specific ways, its impact cascades across infrastructures and people, systems and their spatial effects. Malware also runs out from the confines of computer science and the anti-virus industry into the operations of art, which offers one frame of reference for this cultural practice. The first exhibition to feature viruses and malware was I Love You, shown in Frankfurt in 2002 and curated by Franziska Nori. The exhibition suggested that viral programming is a discursive field inherently significant for network culture in the way it refashions our sense of security and self, and a form of cultural practice somewhere between art and tactical media.

Eva and Franco Mattes’ Biennale.py virus for the 2001 Venice Biennale was a significant and cheeky example of software viruses meeting the contemporary art world. Since then, the phenomenon has grown to such an extent that there is money to be made, as artworks such as the recent The Persistence of Chaos by the Chinese artist Guo O Dong have demonstrated. The work features a laptop carrying six infamous pieces of malware, from ILOVEYOU to MyDoom, including BlackEnergy, which was used in the cyberattacks against the Ukrainian grid in 2015. Besides the seemingly witty idea, resulting in an online auction of said computer, the project is also a marketing campaign for Deep Instinct, a company promising deep neural network security solutions.

Picking up on these various contexts of art, security, and the financial and corporate sectors (see Steyerl 2017), I want to propose that malware can also be understood as operational art (Canales 2014). I am adapting this term from the writer and science historian Jimena Canales, though to a different context; her text combines an analysis of Trevor Paglen’s photographs of military and intelligence infrastructures with insights into the extended field of operational art as exercised by military and intelligence agencies. This marks the art of, and in, fields of logistics and operations as they infiltrate everyday non-military life. As Canales highlights, this relates to the self-description of Operations Other Than War (but of significance to the military) as operational art. Instead of merely reproducing masculine narratives about the centrality of war, Canales points to the porous border between operations of war and Operations Other Than War. This map of operations works through Paglen’s photographic work, as well as Harun Farocki’s cinematic analysis of operational images, likewise pointing to a regime of automated, non-human imaging which itself changes our understanding of, and practices with, other images too. What if the same is true of malware: that the existence and operation of malware changes our relationship with, and understanding of, any other software as well?

I propose that malware techniques, codes and practices (especially when bordering on art and experimental practice) are forms of operational art, in the same way that Operations of Cyberwar and Defence are becoming indistinguishable from Operations Other Than War, or Operations Other Than Security. Malware offers an insight into the penetration of security and finance throughout society; programmable routines become the core of the issue itself. Across the field of speculative, experimental ideas and programming, from the operational intelligence of environments to the engineering of people, malware is a heuristic that helps us understand societal change, and as such much more than a genre of programming or one specific security issue. Malware is the art of tricksters, who turn situations into artifice: into programmable sets of possibilities, inventions of accidents as technological innovations, and inventions for turning the world to one’s advantage.

 

Jussi Parikka

Jussi Parikka is a Professor at Winchester School of Art (University of Southampton) and visiting fellow at FAMU, Prague. He is the author of Digital Contagions: A Media Archaeology of Computer Viruses (Peter Lang, 2016, 2nd edition) and co-editor with Tony Sampson of The Spam Book: On Viruses, Porn, and Other Anomalous Objects from the Dark Side of Digital Culture (Hampton Press, 2009).

 

Acknowledgements

This work relates to the project supported by the Czech Science Foundation (19-26865X) "Operational Images and Visual Culture: Media Archaeological Investigations" at FAMU and Academy of Performing Arts, Prague. Thank you also to Elise Hunchuck for her expertise and feedback on the text.

 

Bibliography

Bratton, Benjamin (n.d.) “The Cloud, the State, and the Stack: Metahaven in Conversation with Benjamin Bratton”. Online.

Bucher, Taina (2018) If…Then. Algorithmic Power and Politics. Oxford: Oxford University Press.

Canales, Jimena (2014) “Operational Art” in Visibility Machines: Harun Farocki and Trevor Paglen. Maryland: University of Maryland, 37-54.

Cohen, Fred (1984) “Computer Viruses – Theory and Experiments”.

Critical Engineering Manifesto (2011).

Dyer-Witheford, Nick and Matviyenko, Svitlana (2019) Cyberwar and Revolution. Minneapolis: University of Minnesota Press.

Easterling, Keller (2016) “Things That Shouldn’t Always Work” in Across and Beyond – a transmediale Reader on Post-digital Practices, Concepts, and Institutions, eds. Ryan Bishop, Kristoffer Gansing, Jussi Parikka and Elvia Wilk. Berlin: Sternberg, 253-261.

Easterling, Keller (2017) Medium Design. Moscow: Strelka.

Flusser, Vilém (1999) The Shape of Things. A Philosophy of Design. London: Reaktion.

Fuller, Matthew (2003) Behind the Blip. Essays on the Culture of Software. New York: Autonomedia.

Grusin, Richard (2010) Premediation. Basingstoke: Palgrave.

Parikka, Jussi (2015) “Denials of Service” in There is no Software, There Are Just Services, eds. Irina Kaldrack and Martina Leeker. Lüneburg: Meson Press, 103-111.

Parikka, Jussi (2016) Digital Contagions. A Media Archaeology of Computer Viruses and Worms, 2nd revised edition. New York: Peter Lang.

Sampson, Tony (2012) Virality. Minneapolis: University of Minnesota Press.

Singleton, Benedict (2015) “(Notes Towards) Speculative Design.” in Shifter Magazine (September 2015).

Steyerl, Hito (2017) Duty Free Art. London: Verso.

Winthrop-Young, Geoffrey (2011) Kittler and the Media. Cambridge: Polity.

Yokokoji, Matsuko and Harwood, Graham (2016) “Evil Media Distribution Center” in Across and Beyond – a transmediale Reader on Post-digital Practices, Concepts, and Institutions, eds. Ryan Bishop, Kristoffer Gansing, Jussi Parikka and Elvia Wilk. Berlin: Sternberg, 63-75.

 

Malware: Symptoms of Viral Infection was made possible thanks to the generous support of:

Bas van de Poel, Marina Otero Verzier
Astin le Clercq, Bas van de Poel
Astin le Clercq
Tomorrow Bureau, Bas van de Poel
Vera van de Seyp, Marc Vermeeren, Bas van de Poel
Randall MacDonald