Matthew Fuller on Wed, 12 Dec 2007 17:27:27 +0100 (CET)



<nettime> Jussi Parikka interview


Jussi Parikka is the author of the book 'Digital Contagions: A Media 
Archaeology of Computer Viruses', published by Peter Lang earlier this 
year.  The book is a speculative meditation on the nature of viruses and 
their part in contemporary technocultures.  This interview was carried out 
by email in November and December 2007.



Matthew Fuller:  How do you figure 'the body' or the biopolitical in your 
discussion of viruses?  Clearly it would be possible to simply fall into 
the trap of equating computer viruses with biological ones, to mistake the 
metaphor for the thing named.  On the other hand it is possible to trace 
the ways in which the term has been used to mark a cross-over between 
categories, one grounded in an understanding of kinds of behaviours not 
delimited by material instantiation, for instance a certain dynamic of 
proliferation, that makes the term meaningful.  What are the stakes in 
following this through?


Jussi Parikka:  Following a metaphorical line of thought from the 
beginning would have been the easy way out, writing an analysis of the 
metaphorics and representations of viruses in popular media. Indeed, that 
was the way much of virus discourse was approached, especially in the 
1990s: analysing the translations and linguistic passages between diseases 
of bodies and diseases of networks. Naturally language has been an 
essential part of the creation of the so-called viral discourse, but I am 
keen to insist on at least two things: 1) language and metaphorics should 
not be seen as primarily or solely signifying systems but as part of wider 
material assemblages, and 2) the biopolitics of computer systems is about 
many other things besides language as well (two related issues, of 
course).

So firstly, following Deleuze and Guattari, language works as order-words, 
which is quite evident in the case of software. Whereas it would be 
interesting to approach software itself as an order-word (where execution 
is a defining part of the event of computer systems), the linguistic acts 
that frame, stabilise and valorise software can be understood as acts of 
power and knowledge that try to give consistency to the contested 
questions of "what is proper software?", "what is illegal software?", 
"what kind of software and network events are allowed, and by whom?" Here, 
as you note, it is also a question of cross-overs between categories, very 
tactical cross-overs indeed, of translating and smuggling elements from a 
foreign realm into, for example, technological networks.  Here "virality" 
can perhaps be used as a term that flags this virulence of trespassing 
categories, something I wanted to integrate intimately into the 
methodology of Digital Contagions.

What is troubling with metaphoric accounts of cultural reality, for 
example of technology, is that they reintroduce a dualist ontology: things 
in themselves (which should be left untouched by the cultural analyst) on 
one side, and the linguistic representations we have of them, supposedly 
the terrain of cultural studies, on the other. Naturally, this 
reintroduces the age-old hylomorphic scheme of matter as passive, waiting 
for a cultural studies scholar to breathe life into it. So in other words, 
I would characterize Digital Contagions as not being interested in 
language per se, but in how language cuts through, intervenes in, frames 
and engages messy assemblages that are made neither purely of material 
"things" or "processes", nor purely of symbolic actions, significations 
and valorizations.

Hence, the question of the biopolitics of network bodies, the biopolitics 
of viruses and other software. I try to think this through via the 
Deleuzian framework of allowing bodies to be of various kinds and scales: 
from bodies of humans to bodies of software, networks, etc. Michel 
Foucault and people drawing on his work, like Jonathan Crary and Giorgio 
Agamben, have of course paved the way towards understanding the crucial 
mission of modern politics as being not that of human beings and their 
linguistic acts (their social life as rational, communicating beings) but 
as having to do with "bare life", the life beyond or in a way "before" 
human beings as metaphor-using communicators. The birth of modern media 
culture is one of tapping into the intensive animal reservoirs of the 
human being: for Foucault this referred to the biological features of the 
human being (as a species); for Crary, to the new physiological 
experiments tapping into the human being as a fleshy, animal body. 
Braidotti has recently wanted to emphasize the animality of this layer by 
referring to the concept of "zoe" instead of "bios".

What I wanted to do was to continue this line of thought into 
technological systems and the biopolitics of software, where the question 
is not reducible to what people say or think about software, networks and 
digital technologies, but concerns how the biopolitics of digital culture 
is interested not (only) in controlling human minds but in the intensive 
life of software, for example - taking material assemblages as its object, 
in a way. Thus, this calls for an ethology of software: looking at objects 
and processes as affects capable of forging relations, making connections, 
interactions and exchanges.



MF: In writing about the cultural aspects of software there is a real 
imperative towards technical accuracy.  Firstly, because if this is not 
achieved, dialogue with those primarily concerned with technical aspects 
becomes quite difficult.  Secondly, there is a kind of rigour required 
here which is likely to produce new ideas rather than act as a blockage. 
How have you handled this in Digital Contagions, and how do you see this 
question developing?

JP: This is a question, or an agenda, that I learned to appreciate through 
German media theory, first via reading Friedrich Kittler, then Wolfgang 
Ernst among others. It also relates to what I just wrote about trying to 
think beyond the metaphorics of media culture and to understand the more 
accurate expressions, techniques and ways of articulation that a medium 
might use beyond the human representations of it. So technical accuracy is 
a question of ontology (an often banned word in cultural studies), but as 
you suggest, it has the potential to act as a vector beyond the confines 
of disciplinary boundaries. Now I do not consider myself an expert 
concerning the technical characteristics of computer viruses, but in 
relation to the biopolitics question I see that a meticulous interest in 
this field is of crucial significance.

What recent years of approaches to networks, software and computer systems 
have achieved is a growing understanding of the questions of immanence of 
technology and power. Instead of bracketing the materiality of technology 
in the cultural studies agenda of ideology, much of the research done has 
succeeded in demonstrating how technologies in their very materiality 
channel and refashion power relations. They are not merely second-order 
phenomena of "social" struggles, in the sense of the "social" being 
something removed from the material. An understanding of the technologies 
at hand is a key prerequisite for an understanding of what kind of new 
modulations of reality we are dealing with. But I would not perhaps too 
swiftly call this an aid to communication or dialogue, because that 
supposes that the concepts, or the "understanding of technologies at 
hand", are transparently stable objects. Instead, this material level is 
itself very much contested, and what is crucial to me is not only an 
approach that takes into account what kind of technologies we are dealing 
with and tries to find the truth of e.g. software there, but an approach 
which discusses this in terms of a materiality that is continuously 
processual, not pinned down to a certain essence, whether technological or 
social. Instead, we are continuously dealing with processes that are 
translational, in the process of being defined, and that cut across 
platforms. Not every computer scientist or anti-virus researcher is happy 
with what I write about viruses; quite the contrary, I have encountered 
arguments that I do not understand the technical reality of what I am 
talking about and that taking into account e.g. alternative voices in 
fiction just leads my analyses astray.  Again, in such statements we find 
the desire to pin down the truth of computer viruses to a certain 
technical knowledge, cut off from the translations and processes in which 
this weird, overdetermined object is articulated. So in addition to 
valorising technical accuracy, I would like to insist more widely on the 
materiality of the phenomenon at hand, a materiality that is irreducible 
to "agreed on" technical characteristics, a materiality that takes into 
account the various levels of relations and definitions of networks and 
software. Rigour is a good word, as it connotes something different from 
"technical accuracy": it acknowledges that one can be attuned to the 
materiality of the networks at hand without taking the stance that "first 
you have to sort your facts out, then you can make your interpretations of 
those facts." If we could do that, we would already have a fixed framework 
for those interpretations.



MF:  Your period of study of computer viruses ends in 1995.  Could you say 
something about why you chose this period as significant, and what aspects 
of viruses you would like to have covered in the subsequent period?

JP: Yes, the period my study covers runs approximately from the early 
computer era after World War II to the emergence of the "popular 
Internet." In a way it is of course stupid to stop there, just when the 
Internet was becoming an everyday reality instead of a mere discursive 
promise of a networked future, a promise proposed on various platforms 
from professional computer journals to popular culture. But it is also 
because of this seeming paradox that the earlier period is interesting. 
For example, the security discourse around viruses emerged at the end of 
the 1980s, and many of the techniques, tactics, and framings we use to 
make sense of and control malware were not so evident at first. Focusing 
on the earlier period gives one access to the actual genealogical 
emergence of the phenomena and a truly historical take on the forces that 
gave consistency to the viral and other forms of malware. Here one sees 
the recurring tropes emerging, like the curious insistence in computer 
security discourse on moving from technical issues to social ones. So 
continuously, from the 1960s on, you have the idea that "it's the human 
being that is the problem, not the computer or the program" being 
articulated, as with the idea, repeated since the 1980s, that "there is no 
good virus". Or then the continuous doom-laden adverts and discourses 
warning of "data loss", present at least since the early 1980s, before 
viruses: "data loss disasters" striking databases and personal computers 
for various reasons, from natural phenomena like lightning to 
intentionally malicious crime, all of which in a way "paved the way" for 
viruses to fit into the already established fear of data loss as a key 
danger of digital society.

Also, in terms of programs, much of the interesting stuff was done already 
in the 1950s and 1960s, like the Darwin program or the early rabbit batch 
jobs on mainframes. One of those, from 1966, involved a RUNCOM command 
script repeating itself continuously, which would then constipate the 
system (as David Ferbrache suggests in his book 'A Pathology of Computer 
Viruses').  Or take how Kevin Driscoll attributed the emergence of viruses 
not to a specific program but to a single short instruction, MOVE (Program 
Counter) --> Program Counter + 1, where the "virus" is less a program 
entity than an instruction that is continuously on the move to the next 
memory location.  Besides being curious examples of an "archaeology of the 
computer virus", such processes should be taken as compelling issues that 
force us to think digital culture in a historically tuned field.
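
[To make the single-instruction "virus" concrete, here is a minimal sketch 
in Python - my own toy illustration, not historical code - of an 
instruction that copies itself one memory cell ahead as execution 
advances. Core War's famous "imp" (MOV 0, 1 in Redcode) works on the same 
principle.]

    # Toy memory: a small circular array of cells, almost all of them inert.
    MEMORY_SIZE = 16
    MOVE, NOOP = "MOVE", "NOOP"

    memory = [NOOP] * MEMORY_SIZE
    memory[0] = MOVE               # the single self-propagating instruction
    pc = 0                         # program counter

    for step in range(8):
        if memory[pc] == MOVE:
            # Copy the instruction into the cell just ahead of execution.
            memory[(pc + 1) % MEMORY_SIZE] = MOVE
        pc = (pc + 1) % MEMORY_SIZE
        print(step, "".join("M" if cell == MOVE else "." for cell in memory))

Run it and the trail of M's spreads cell by cell: there is no program 
"body" to point at, only an instruction perpetually relocating itself.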

This choice to focus on the pre-1995 period is in accordance with my 
belief that historical and temporal perspectives can bring forth novel 
rewirings and short-circuitings for present discussions and practices. 
Hence, Digital Contagions analyzes the media archaeology of this specific 
computer accident as a symptom of a more abstract cultural diagram. The 
digital virus is not solely an internal computer problem but a trace of 
cultural trends: of consumer capitalism, of digitality and networking as 
the central cultural platforms of the late twentieth century, and of the 
media ecology and the so-called biological diagram of the computer, in 
which the biological sciences are actively interfaced with computer 
science, often with a special emphasis on bottom-up emergence. Again, we 
are moving well beyond the narrower take on recent years of "actual" 
viruses, and focusing on the archaeological transcrossings of the 
phenomena. Despite the often-stated idea that cultural studies, in its 
broad sense, is an approach with historical perspectives at its core, most 
of this is done in a very vague fashion, neglecting e.g. historical 
examples or reducing them to curiosities. Another way to consider 
historical perspectives is to contrast them with the affirmative 
perspective of becomings, which repeats a certain Deleuzian dualism: 
history as the regime of the State Archive and becomings as ahistorical 
creations. Instead of repeating this dualism, I wanted to approach the 
possibility of media archaeology as a nomadic cultural analysis, where 
"history" is not a marker of "already beens" but a potentiality that can 
be rewired into new assemblages of the future. Historically tuned cultural 
analysis cannot be reduced to the status of repeating the sources, but can 
be seen as a summoning of events, as Foucault put it.

Of course, this does not mean that focusing on recent years would not 
provide fresh perspectives. But there are people already working on this, 
like Tony Sampson from the University of East London, who is finishing a 
book on cultural theory and viruses.  I myself would definitely have 
refined my take in relation to e.g. botnets, written a few more words on 
net art viruses (which I am doing for the forthcoming Spam Book) and 
covered the phenomena of terrorism more carefully.

MF:  With viruses aimed at mobile phones running Symbian, such as Cabir 
and Cardtrap, the latter of which also crosses over to Windows machines, 
the platforms for viruses are becoming more diverse.  But with events such 
as the attacks on Estonian networks and the apparent existence of very 
large scale botnets, the broader category of 'malware' is itself becoming 
more infrastructural, more built into the internet.  How does the figure 
of the virus work in this wider context?

JP: For sure, the notion of the "virus" or the "viral" is in danger of 
becoming a floating signifier, a notion used for anything related to 
malware or, in contrast, for anything "cool" and "rebellious". This 
relates to the earlier question concerning technical specificity, which 
can be seen as one way of getting oneself out of the swamp of 
metaphoricity and vagueness and of looking into how certain types of 
software function on the material level. My point was in general that 
malware has from early on been infrastructural to the Internet and network 
societies; this has been evident in computer security texts since the 
1960s. The shift from protecting computers from human beings to protecting 
them from malicious software started around the 1970s, and the notion of 
the incidental nature of the viral with respect to networks feeds nicely 
into this as well. This is why I used the notion of the "universal viral 
machine" from Fred Cohen, the computer virus research pioneer: to 
underline that in the age of networked computers, viruses in Turing 
machines can be thought of as potentially semi-autonomous processes, a 
'"Universal Viral Machine" which can evolve any "computable number".' 
Cohen describes in his early work from the 1980s (his PhD thesis came out 
in 1986) a weird world of computer processes without human interventions. 
There is not much mention of "intentions" or "social constructions" of 
computers, but instead anonymous processes, Turing machines, evolutionary 
sets and also e.g. "Universal Protection Machines" aimed at combating the 
Viral Machines by maintaining subject-object matrices, sequences to be 
interpreted, the rights of subjects to objects, the scheduling of 
processes, etc.
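
[The flavour of Cohen's definition can be gestured at with a toy sketch - 
my own loose illustration, not Cohen's actual formalism: a viral set is a 
set of symbol sequences, each of which, when interpreted by the machine, 
writes a possibly "evolved" member of the same set elsewhere on the tape.]

    import random

    # A toy "viral set": each member is a source string which, when
    # interpreted, appends a possibly evolved member of the set to a
    # shared "tape". (TEMPLATE and make are invented names for this sketch.)
    TEMPLATE = "tape.append(make(counter={n}))"

    def make(counter):
        # "Evolution" is modelled minimally: a small change in an embedded
        # constant distinguishes each descendant from its parent.
        return TEMPLATE.format(n=counter + random.choice([1, 2]))

    tape = [make(0)]
    for _ in range(4):
        exec(tape[-1])     # interpreting a member produces a new member
    print(tape)

Each string on the tape is a different sequence, yet all belong to the 
same set by virtue of what they do when run - roughly the sense in which 
Cohen's viral sets are defined by behaviour rather than by any fixed 
signature.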

But we should not be blinded into thinking that, because of the underlying 
Turing sequences, the processes are not system-specific and material. 
Botnets are not the same as early 1990s viruses, nor is the 1988 Morris 
worm the same thing as current network worms that can spread across the 
globe in a matter of hours. Several of the early viruses went "extinct" 
because of technological obsolescence, their means of proliferation, e.g. 
floppy disks, becoming obsolete. Much of the talk surrounding the new 
viruses suggests, at least implicitly, that viruses and their programmers 
are continuously finding new platforms and almost universal ways of 
propagation, for example via Bluetooth on mobile phones. However, even 
though I am not an expert on this issue, I understand that, for example, 
the Cabir worm relies more on the "kindness of the user" than on a system 
vulnerability, as the recipient has to accept the particular data package 
before the worm spreads. Cardtrap, despite its malicious payload, does not 
even seem to work on all Windows machines: the phone memory card might 
actually be carrying the Trojan, but the autorun file did not, at least 
according to F-Secure's information, work on Windows XP SP2 or Windows 
2000. So much more than demonstrating the universality of the viral in the 
sense of cross-platform spreading (which in a way is true as well), this 
also points to the metastability of programs and their environments and to 
how easily "things just don't work", so to speak. This is the reason why 
Mark Ludwig flagged already in the 1990s that true evolution in software 
environments - at least in everyday environments like Windows - is quite a 
far-fetched dream (or fear), as the operating systems and software are 
just too unstable to allow for a random mutation that would work.
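
[Ludwig's point about the fragility of code under random mutation is easy 
to demonstrate for oneself. A minimal sketch, my own illustration with 
results varying per run: flip one character of a trivial working program 
at random, many times over, and count how many mutants still parse and 
still behave correctly.]

    import random
    import string

    SOURCE = "def add(a, b):\n    return a + b\n"

    def mutate(src):
        # Flip one randomly chosen character to a random printable one.
        i = random.randrange(len(src))
        return src[:i] + random.choice(string.printable) + src[i + 1:]

    trials, survivors = 10000, 0
    for _ in range(trials):
        env = {}
        try:
            exec(mutate(SOURCE), env)      # does it still parse and define?
            if env["add"](2, 3) == 5:      # does it still behave correctly?
                survivors += 1
        except Exception:
            pass                           # most mutants simply crash

    print(f"{survivors}/{trials} mutants still work")

Typically only a small fraction of mutants survive, and virtually all of 
those are neutral changes (a swapped whitespace character, say): random 
mutation overwhelmingly yields crashes, not evolution.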

As for botnets, it's the zombie side of them that is interesting. Eugene 
Thacker has been digging into the zombie world of contemporary 
biopolitics, looking at contagion and transmission through this figure of 
the undead, the life on the border of zoe and bios. Again, I would use the 
idea of the botnet to illustrate how power operates (also) on the level of 
the ahuman and technical, before or between the human social bind. 
Capturing computers in a zombie network is not reducible to a work of 
ideology, or, as in the case of the attacks against the sites of the 
Estonian government and other public bodies, to a work of international 
politics (even if that dimension was also touched upon, as the diplomatic 
relations between Russia and Estonia were involved), but belongs to a 
whole other layer of politics, working at the level of infections, 
software and networks. A lot of the analysis surrounding the attacks saw 
this from the viewpoint of the international relations of two governmental 
bodies, but more interesting are the sub-governmental forces in action, 
and also the sub-social forces that were harnessed as part of 
international politics.



MF: One of the things that is interesting about viruses and other related 
kinds of software is their approach to computers and networks as a set of 
experimental zones.  Towards the end of your book you mention Stefan 
Helmreich's call for a 'playful science', showing how Artificial Life can 
correspond to this.  At the same time, viruses seem to have a slightly 
different form of playfulness to them.  If we can adopt the language of 
probability for a moment, we could say that Alife generally (aside from 
interesting work done in evolutionary hardware, or in aspects of 
CrystalPunk work) tends to remain within well-defined boundaries, that of 
the model for instance.  Whilst it has the capacity to offer a 
'theoretical' playfulness, it is limited to a particular scale of 
activity.  Viruses, on the other hand, offer a fully 'experimental', that 
is, more multi-dimensional and unpredictable, way of inhabiting and 
shaping the networks.  They set in play conjunctions that are not simply 
within the domain of the software per se.  The focus on malware tends 
rather to limit this.  Your book calls for a more playful approach; where 
do you see the most useful historical resources for such playfulness? 
Which unexplored viral domains are most potentially interesting?

JP:  In a more straightforward vein, one could see my book as a 
Foucauldian mapping of how the notion and powers of viral sets became 
territorialized and captured under the notion of malware, which acted not 
only as a repressive mechanism but produced a huge amount of books, 
advice, security instructions, manoeuvres, software, etc. But tracking 
this playfulness takes the issue a bit further. This actually relates to 
the question you asked earlier about why I stopped my analysis in 1995. It 
is just that the much more surprising stuff is found earlier, when trying 
to follow the related strands of viral programming and the birth of 
network paradigms in computer labs.  I was fascinated to hear from the 
early pioneers, like Doug McIlroy, Vic Vyssotsky and Ken Thompson, of 
their early experiences with computer ecologies of self-perpetuating 
programs. In a way, the obvious connection of these early experiments was 
with the Cold War and security discourses, but I would say that much of 
the work done was not reducible to that functionality; it also worked on 
another level, of fascination with the expressions of these programs. The 
simple game called Darwin, for example, in which programs tried to 
out-populate the game ecology by "killing" other programs and spreading 
their own code, is an interesting case. It was popularized later by A. K. 
Dewdney in Scientific American and is now known as Core War.

But what, for example, Mark Ludwig flagged in his "black books of computer 
viruses" is that alife viruses are more or less dysfunctional. Due to the 
fundamental instability of most computer systems, even small changes in 
code most likely cause only system crashes, not evolution. Hence, one has 
to deal with very limited scales, as you mention, and, more interestingly, 
speculate on the possibilities of, for example, evolving programs. It is a 
bit the same as with artificial life art, where the genetically grown 
forms are indeed interesting and have much to contribute as an idea, but 
beyond a certain amount of forms "grown", it starts to get repetitious 
(without a difference). Another problem in the whole artificial life virus 
discussion was the rigid way of dealing with the issue: to come up with 
minimum qualifying definitions for an entity to be living (definitions 
adopted mostly from the observation of biological entities) and then 
compare these to computer viruses. Not a very interesting way to approach 
the issue - even though alife research has aspired to move away from this 
model-thinking towards a simulacrum approach, as Claus Emmeche suggested 
some time ago.

In any case, instead of merely following such paths, I wanted to propose a 
Spinozian, ethological way of approaching "life" not as a substance, not 
as a form, but as an intensive life of affects, of interactions and 
relations, where the life of technical bits is not removed from the life 
of other scales, other assemblages. So life is not a metaphor adopted from 
biology, nor is biology a model used to imitate the intensive code life of 
programs; life becomes movement, interaction and affects. This is the idea 
of playfulness as well: that the "ecologies" of media are not prefixed, 
stable, natura naturata kinds of mechanics in the service of capitalism, 
but also active virtual ecologies of natura naturans, of creation, probing 
and experimenting. To put it into Foucauldian vocabulary: let's leave it 
to the police to decide whether the stuff really is alive.

Often the more interesting "living" experiments are the earlier, less 
researched ones. What would also definitely need much more research are 
the wonderful early computer ecologies of, for example, Nils Barricelli, 
Oliver Selfridge, and Beatrice and Sydney Rome, all of whom were already 
developing in the 1950s systems relevant to the topic of experimental 
sciences of computational life.  Even if not touching on viruses per se, 
they speculated in their work on how to make ecological and evolutionary 
models work on a computational platform and on how to make that kind of 
computation useful. Now, if Cohen tried to figure out the usefulness of 
viral machines in the 1980s, these people were speculating on this stuff 
30 years earlier! Barricelli, for example, did not want his work to be 
seen under the representational paradigm of computers modelled on life, 
but underlined that the symbiogenesis in his computers was really there, 
as simulation. In other words, the simulation did not offer information on 
biological parasites and ecologies, but was an end in itself, offering a 
computer system that could work in terms of interdependencies, 
connectedness and symbiotic relations. Just as interesting are, for 
example, Oliver Selfridge's Pandemonium experiments with semi-autonomous 
code demons that "evolve", at least in a restricted way.  Computation was 
understood there as a statistical mesh, a parallel processing based on the 
connected sum of "shrieks" that every data demon of the system 
communicated to the others.  This also exhibited a system of distributed 
intelligence, as Manuel DeLanda noted earlier, where such projects were 
seen as part of the genealogy of passing control from the human to 
distributed systems. In such a system, ideally, control "floats" from one 
demon to another, each of which can take up various functions and enter 
into flexible, changing relations based on the global characteristics of 
the system, which continuously feeds into the local relations of the 
demons. What is of course funny is the curious correspondence between such 
computer system characteristics and post-Fordist notions of e.g. work 
skills, branded as they are by the need for flexibility, adaptation to 
change, fluid communication...
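
[The shrieking logic is simple enough to sketch in a few lines - my toy 
reconstruction, not Selfridge's 1958 system, with all weights invented for 
illustration: feature demons report how strongly they detect their 
feature, cognitive demons sum the weighted shrieks of the features they 
listen to, and a decision demon simply picks the loudest.]

    # A minimal Pandemonium-style sketch. The input "image" is already
    # reduced to feature-demon shrieks; all numbers here are made up.
    INPUT_SHRIEKS = {"vertical_bar": 0.9, "horizontal_bar": 0.8, "curve": 0.1}

    # Each cognitive demon weights the feature demons it listens to.
    COGNITIVE_DEMONS = {
        "T": {"vertical_bar": 1.0, "horizontal_bar": 1.0},
        "O": {"curve": 2.0},
        "L": {"vertical_bar": 1.0, "horizontal_bar": 0.5},
    }

    def loudness(weights, shrieks):
        # A demon's shout is the weighted sum of the shrieks it hears.
        return sum(w * shrieks.get(f, 0.0) for f, w in weights.items())

    shouts = {d: loudness(w, INPUT_SHRIEKS) for d, w in COGNITIVE_DEMONS.items()}
    decision = max(shouts, key=shouts.get)    # the decision demon listens
    print(shouts, "->", decision)

Control here already "floats": no demon is in charge, and which demon 
carries the decision depends entirely on the momentary sum of local 
shrieks.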

Another theme is the experimental aesthetics of (technological) failure 
that characterises modernity. There is a whole history of things breaking 
down, of course, and art has been one key practice of modernity in which 
the failures of systems of technology, organisation and control have been 
catalyzed and experimented upon. This is Paul Virilio's famous notion of 
technical modernity: that accidents are incidental to its functioning. The 
accident of any system is a future horizon, a virtuality that might not 
ever actualize but is still there in reality - often expressed only in 
statistics, worst-case scenarios and the like, or else in accidents 
simulated by media artists. How much of the early avantgarde "media art" 
was based on exactly these impossible machines on the edge of breaking 
down, a Dadaist notion of technological modernity. One wonderful example 
would be Georges Perec's 1960s radio play La Machine, in which a computer 
is programmed to dissect and recompose, in variations, Goethe's poem "The 
Wanderer's Night Song". As Florian Cramer writes in his Words Made Flesh, 
Perec's imaginary variation computer crashes and the input data turns into 
a program, working like a self-perpetuating email virus. I do not know 
whether I would agree with Cramer's conclusion that this testifies to the 
superiority of semantics in resisting syntactical programming, but I agree 
that it is an interesting experiment in the aesthetics of failure, the 
aesthetics of accidents. So perhaps playfulness, in general, means trying 
to think beyond the most obvious choices, to think beyond the security 
discourse (which is a highly interesting topic of course) towards 
experimental takes on viruses and accidents.


MF:  Looking at art viruses, such as Biennale.py or those of Tommaso Tozzi 
in the 1980s, there is clearly a further set of parallel imaginaries going 
on here.  With tens of thousands of viruses in the wild, can you imagine 
or identify a particular strain working with a particular pattern of art 
methodologies?

JP:  The art viruses, especially the Biennale.py project, fit nicely into 
this genealogy of the aesthetics of accidents, in their task of creating 
an iconography of malicious code. I think one of the fundamental successes 
of the project was to question the ontology of software and the 
distributed nature of the coded environment. On what level do the 
micropolitics of software function? That was an implicit key question of 
the project, which seemed to refuse a simple answer by distributing the 
code on t-shirts but also on expensive CD-ROMs, etc. - while at the same 
time insisting on the harmless, invisible nature of the execution of the 
code. But beyond the way it was framed as part of art (as part of the 
Venice Biennale), what are the singular points to focus on?

I think Jaromil put it very poetically in the I Love You exhibition 
catalogue when referring to digital viruses as a form of making (digital) 
language stutter, in the manner Rimbaud and Verlaine made French stutter 
as part of an earlier challenge to transparent ways of seeing language. 
There is a threshold where code turns against itself and into a political 
gesture, or as Jaromil wrote: "In that chaos, viruses are spontaneous 
compositions which are like lyrical poems in causing imperfections in 
machines "made to work" and in representing the rebellion of our digital 
serfs."

From existing viruses in the wild, one could perhaps extract certain 
methodological principles. Many of them relate to finding the threshold 
just on the border of working and not-working: a virus that completely 
destroys the system is of relatively little use; much more interesting are 
the ones that are able to infiltrate the system and still keep it working 
(in a moderated form). That is, to find the threshold, the minimum level 
of a system before it flips into a crash. In a way, this could of course 
be continued to the point of going over the threshold, of letting go of 
the control structures and seeing what comes up - of exposing oneself to 
the viral algorithms, as Joseph Nechvatal does with his viral paintings, 
which demonstrate how viral noise is not antithetical to the ordered 
creations of art - the virus itself can be turned into an emerging 
exploration of patterns in painting or in music. Here, variation becomes 
primary, and the planned lines and sounds are exposed to continuous slight 
variations of an algorithmic kind. The general methodological clue with 
viruses would be: take any banal repetitious action without an inherent 
meaning, and repeat the action or habit to the point where it starts to 
change, a point where pure repetition produces difference from itself. 
This again can be seen as tracking the smallest differences and thresholds 
emerging in any systematic action and/or habit.
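
[As a toy rendering of that clue - my illustration, not a method from the 
book: repeat the most banal action imaginable, a step of fixed length, 
while letting a tiny algorithmic variation accumulate, and watch the 
repetition drift away from itself.]

    import random

    # Repetition that produces difference: the "same" unit step, taken
    # over and over, with a slight viral variation accumulating in it.
    y, drift = 0.0, 0.0
    for step in range(20):
        drift += random.uniform(-0.05, 0.05)   # a small algorithmic mutation
        y += 1.0 + drift                       # the repeated, never-identical step
        print(f"step {step:2d}: y = {y:6.2f} (drift {drift:+.3f})")

For the first few iterations the output is indistinguishable from plain 
counting; after enough repetitions the accumulated drift makes each run 
its own divergent line - roughly the pattern that the viral-variation idea 
stages on the level of the image or the sound.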

Another interesting theme is how the algorithmic logic of viruses feeds 
far beyond the realm of computer code and takes advantage of the presumed 
sociability of human relations. Take the I Love You virus, a simple 
exercise in unfulfilled desire perhaps, feeding on the wish to get a 
confirmation of love from someone. Or, in another form, the gambler virus 
of the early 1990s, which forced the user to play for the contents of the 
hard drive: answer incorrectly, and you lose. This played with a certain 
mythology of a "demon in the machine", of the possessed computer, which 
was a theme of Jodi's early work of course (I think Alessandro Ludovico 
referred to their projects as insurrecting a certain alien presence in the 
computer, which is a nice way to put it). The virus examples mark the 
passing point or interfacing of the human being, but rather than just 
focusing on the idea of the human being as an emotional, fallible 
creature, it is more interesting to see the viruses, for example I Love 
You and other attachment viruses, as using the habits of the user to their 
advantage - as tapping into the presumed bodily habits whereby the meaning 
of an attachment is to be opened, etc.
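
[That social-habit logic can be caricatured in a toy propagation model - 
entirely my own sketch with invented parameters, not data on any actual 
outbreak: the only "vulnerability" in the model is the habitual 
probability that a recipient opens an attachment apparently sent by a 
known contact.]

    import random

    ADDRESS_BOOK = 40   # contacts each infected user mails (assumed)

    def simulate(population=10000, p_open=0.1, rounds=6):
        # p_open stands in for the bodily habit: the chance a recipient
        # opens the attachment. There is no "exploit" anywhere else.
        infected = {0}
        for _ in range(rounds):
            newly = set()
            for user in infected:
                contacts = random.sample(range(population), ADDRESS_BOOK)
                newly.update(c for c in contacts if random.random() < p_open)
            infected |= newly
        return len(infected)

    for p in (0.02, 0.05, 0.1, 0.2):
        print(f"p_open={p:.2f}: {simulate(p_open=p)} infected after 6 rounds")

Nothing in the code models software flaws at all; varying the single habit 
parameter is enough to move a run from a dud to a saturating epidemic, 
which is the sense in which the attachment virus computes with users' 
habits rather than with system vulnerabilities.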

Or then, one can simply track the parasitic movement and logic of the 
virus itself, as a way of exposing the dynamic logic of the net. Recently, 
the Google Will Eat Itself project took this parasitical logic of the Net 
to a new level by creating a paranoid-parasitical machine which draws 
money from Google to be used against itself. In a way, perhaps this could 
be connected to the methodological ideal of "becoming imperceptible" and 
to a move beyond identity politics. As argued by several Deleuzian 
writers, the becoming imperceptible of art is a much needed contrapuntal 
movement against the hegemony of representational analysis and identity 
thought, where often only the already recognized becomes an object of 
interest. How to come up with an action, an experimentation, that relies 
on the very notion of imperceptibility? An issue related to surveillance 
for sure, but perhaps also to art. In this context, Bertini's Vi-Con is 
related to the notion of invisibility: "Yazna and ++ are two viruses in 
love.  They search for each other on the net, running through connected 
computers.  Apart from other viruses, their passages won't cause any 
damage to your computer [...]. Theirs is a soft passage, invisible, and 
extremely fragile."


#  distributed via <nettime>: no commercial use without permission
#  <nettime>  is a moderated mailing list for net criticism,
#  collaborative text filtering and cultural politics of the nets
#  more info: http://mail.kein.org/mailman/listinfo/nettime-l
#  archive: http://www.nettime.org contact: nettime@kein.org