Florian Cramer on Thu, 29 Jan 2009 12:05:01 +0100 (CET)



Re: <nettime> Digital Humanities Manifesto


Hello Marianne:

> Why do you think it is fruitful to define digital as any discrete
> entity? I agree that anything built up from discrete entities can be
> translated into digital material by assigning numbers to these
> entities, but countable in itself does not make something computable
> (by computers).

Thanks for the good points - but I still have doubts that a
differentiation of "countable" and "computable" holds. To my
knowledge, it doesn't hold in computer science. Since Alan Turing's
"On Computable Numbers" (1936), a computable number is defined as a
real number whose digits can be worked out by a finite procedure, and
these numbers form a countable set. But I would argue that this is
not really a question of computer-scientific correctness, since much
computer science terminology - "interpretation", "operational
semantics", "information ontology" - is problematic if not flawed.

The crux lies in the definition of "computer". If we colloquially
define digital computers as the kind of electronic computing devices
we encounter as PCs, laptops, embedded controllers, game consoles
etc., then there is indeed an obvious difference between a discrete,
countable symbol such as a letter on a piece of paper and the binary
code for an ASCII character in electronic RAM. But you can actually
use paper as a computer, for example in the way Tristan Tzara did
when he cut out the individual words of a newspaper article, shuffled
them and turned them into a Dadaist poem [describing the algorithm in
a manifesto along the way], or the way Brion Gysin and William S.
Burroughs did in their vertical-column cut-ups of texts [with their
text "Cut-ups self-explained" spelling out the algorithm]. Turing
himself used the term "paper machines" for computers because in his
time paper was the medium on which algorithms were executed.
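Just to underline that Tzara's recipe is an algorithm in the strict
sense, here is a rough transcription into a few lines of Python (my
own sketch, of course - scissors and bag replaced by split() and
shuffle()):

    import random

    def tzara_cut_up(article):
        words = article.split()    # cut the article into its single words
        random.shuffle(words)      # put them in a bag and shake gently
        return " ".join(words)     # copy them out in the order they come

    print(tzara_cut_up("take a newspaper take some scissors choose an article"))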

All of this is consistent with a computer-science definition of
"computer" that also includes non-electronic and analog devices. The
humanities and media studies (for example, scholars like Bernhard
Dotzler and Sybille Krämer) have worked with this general model of
computing as well. I would argue that it is relevant not just from
a historicist point of view, but also in order to take into account
the cultural diversity of computing. There is a whole school of
research on (so-called) ethnocomputing that follows Ron Eglash's work
on non-Western practices involving non-electronic computing devices.
And there are artistic projects like Wilfried Hou Je Bek's
psychogeographical "dotwalk" computer which call into question the
same common-sense equation of computing with electronic circuits. Or
what about the distribution of the PGP source code in book form in
order to circumvent U.S. crypto export regulations, or the
distribution of DeCSS on t-shirts and in limericks describing the
algorithm, among other carriers?

If we accept these cases and differentiations, both in terms of the
abstract concept of computing and in terms of concrete cultural
practices, then the question of "analog" versus "digital" is not one
of particular material carriers, a.k.a. media.

Florian

-- 
blog:     http://en.pleintekst.nl
homepage: http://cramer.pleintekst.nl:70
          gopher://cramer.pleintekst.nl





#  distributed via <nettime>: no commercial use without permission
#  <nettime>  is a moderated mailing list for net criticism,
#  collaborative text filtering and cultural politics of the nets
#  more info: http://mail.kein.org/mailman/listinfo/nettime-l
#  archive: http://www.nettime.org contact: nettime@kein.org