Jon Lebkowsky on Fri, 18 Apr 1997 15:42:03 +0200 (MET DST)



<nettime> Omissions


Mark Stahlman and I swapped email yesterday, all of which was copied to
nettime but did not come across the list, evidently lost in the xs4all war
zone...so here are those messages, in sequence:

At 09:20 AM 4/17/97 -0700, via RadioMail wrote:

>Pit (Jon, et al):
>
>Thanks for posting Hayles' fascinating strategic essay, which I take to be
>at least partly a response to my recent call for strategic thinking within
>the nettime community.  As is often the case, what is most interesting is
>what is not said (implicit) as much as what is said (explicit).  What are
>Hayles' constraints?  And, what are the strategic implications of these
>constraints?

Mark, I'm not even going to attempt to clarify your total misinterpretation
of Hayles' paper. Suffice to say that her use of the word 'constraint,'
unlike your use of the term, is not vernacular, but is specialized and
context-specific.

>Jon recently described me as a technophobe and contrasted my apocalyptic
>vision (which consists of my merely repeating others' descriptions) with
>his own -- which he termed optimistic and life-affirming.  My only
>question for him is, "What sort of life are you affirming -- is it human?"
>Based on what I've written and my 25+ years in the technology business, it
>does seem inaccurate to call me a technophobe.  I wonder if Jon would care
>to look closely again at my "abstractions" to understand what I have been
>saying.

Perhaps you could clarify?  There is much air in the cake; I'm looking for
the chewy brownie.  As for 'what sort of life' I'm affirming...I don't have
it sorted.  *8-)

If you are not separating technology from humanity, and if you are not
asserting  a fear of technology, then what, precisely, are you saying?
"25+ years in the technology business" tells me neither that you are
technophobic nor technophilic; my comment was a response to your last
post, and to your various rants here on the WELL, where your position seemed
to be that technology is external, and that it is inherently destructive.  If
your position is, rather, that technology is value-neutral, and that the
question of "the human use of human beings" is in fact human, rather than
technological, we may be closer to agreement.  

>I specifically indicated that technology will/could be turned *against* the
>enemies of humanity and that I intend to do just this.  Nowhere have I
>renounced technology.  I am, in fact, the one who is calling for its
>strategic use in the war to arrest the abolishing of the human race.  I
>merely noted that technology is being used by humanity's enemies.  I would
>have thought that to be non-controversial.  If one were interested in
>defending humanity, that is.

Who or what is abolishing the human race? How will you use technology in
this war 'against humanity's enemies'?  This sounds rather like a holy war
to me; perhaps it is being fought as much within you as within the human race?

>We humans are at war and, therefore, we need a strategy.  Optimism is no
>shield against this assault on our humanity.  We must understand what is at
>stake and how the various forces are aligned.  If we constrain ourselves to
>renouncing the existence of truth and meaning (as so many have done,
>including Hayles), then we have already lost that war and doomed humanity
>to extinction.  

Again, you've misread Hayles.  To say that truth is not precisely knowable
is not to say that there is no truth.

>The technology of genetic and psychological manipulation
>will ensure that humanity will cease to exist and that it will be replaced
>by an engineered android race of Borg-like "post-humans."  We cannot simply
>affirm life if we wish to remain human.  We must understand what it is that
>makes human life human if we wish it to remain human, after all.  We are,
>perhaps, the world's most endangered species.

You have rather more faith in the predictive abilities of science fiction
than I do.

>To be human is simply to understand that reality is not constrained.  Truth
>exists and knowing this is what ultimately makes us human.  Grasping this
>truth arms us to defend our species.  Constraining oneself by ignoring it
>actively invites our demise.

I think what you call 'truth' is what Hayles calls 'flux,' and it is
inevitably constrained, for humans, by human perceptual limitation. But
again, I think you use the same words as she does, without the same meaning.

Borg-Boy Jon

***

Date: Thu, 17 Apr 1997 14:26:46 -0700 (PDT)
To: Jon Lebkowsky <jonl@onr.com>
From: Mark Stahlman (via RadioMail) <stahlman@radiomail.net>
CC: nettime-l@desk.nl
Subject: Re: <nettime> Is Reality Constrained?

Jon:

Thanks for the response.  Perhaps I have misread Hayles, but given that
none of what I read in the posted essay (or her other writings) seemed
particularly novel or particularly difficult, I sorta doubt it.  She is
adopting an epistemology which has hundreds of similar expressions -- it's
what she calls poststructural critical analysis and, while I'm sure that
she's a valued participant in that world, she hardly invented the approach.
 

The whole idea that we only "know" through our senses is just another form
of some famous old exercises familiar to all first year philosophy
students.  She mistakes this "knowledge" for science and, as a result,
seems to me to be deeply confused about just about everything else.  I know
it's not ultimately her fault, but she's responsible for those she has
chosen to follow.

The question is whether she or those who adhere to this approach understand
the implications of their own analysis.  Does Hayles understand Hayles?

I did a little surfing to her UCLA site and picked this gem out of her 1993
essay, "Virtual Bodies and Flickering Signifiers":

"I understand 'human' and 'posthuman' to be historically specific
constructions that emerge from different configurations of embodiment,
technology and culture."

This is the issue.  Is "human" the species -- not "human" the word --
socially constructed?  She doesn't say.  Perhaps, she doesn't think it's an
interesting question.  Perhaps, she is constrained from figuring out the
answer.  Maybe she thinks it's out there in the unknowable flux.  

You are also unsure of what is human and what is not.  That is the problem.
And, it's the central problem that affects us all.  Without an answer to
the question "What is humanity?", we are in very real danger of abolishing
ourselves.

I'm convinced that humanity is not "socially constructed."  Furthermore,
I'm convinced that those who express the belief that humanity can be
plastically altered ("socially constructed") are themselves hideously
dangerous people.  These are utopians of the worst kind who have set out to
reshape humanity to suit their own purposes.  Naturally, these purposes are
always explained as improving the world by perfecting humanity.  This
explanation is intended for fools.  Sadly, it often works.

Technology is *not* neutral.  It has embedded and often misunderstood
purposes but purposes nonetheless.  A gun is a technology which is designed
for killing.  While it might be used to target shoot or sit on the wall,
it's still a killing tool.

I don't fear technology; I fear technology's designers' purposes.  If
people who set out to permanently alter humanity design technology for the
purpose of abolishing humanity, then this could indeed be the result.  This
is the world she is talking about and, since she seems to be going along
with the game, it's the world she's contributing to shaping.

I intend to design technology which will resist this process of alteration
for the simple reason that we are approaching an irreversible event. 
Humanity could be abolished.  That's all.  I hope this is now a little
clearer and thanks for the exchange.

Mark Stahlman
New Media Associates
New York City
newmedia@mcimail.com


***
At 02:26 PM 4/17/97 -0700, via RadioMail wrote:

>You are also unsure of what is human and what is not.  That is the problem.
> And, it's the central problem that affects us all.  Without an answer to
>the question "What is humanity?", we are in very real danger of abolishing
>ourselves.

Mark, why do you think that I'm unsure? I don't feel particularly confused
about "what is human," do you?

>
>I'm convinced that humanity is not "socially constructed."  Furthermore,
>I'm convinced that those who express the belief that humanity can be
>plastically altered ("socially constructed") are themselves hideously
>dangerous people.  These are utopians of the worst kind who have set out to
>reshape humanity to suit their own purposes.  Naturally, these purposes are
>always explained as improving the world by perfecting humanity.  This
>explanation is intended for fools.  Sadly, it often works.

Humanity may not be socially constructed, but human reality is inherently
socially constructed; then again, my meaning differs from yours if you think
'socially constructed' means 'plastically altered.'  We're talking about
interpretation, not alteration.  You seem to hold the view that
phenomenological reality is clearly knowable, and that our "knowledge" is in
no way constrained (that word again, in another sense) by perceptual or
experiential limitations.  This is absurd.  And I'm not sure who these
utopians are that you're referring to, but my thrust is neither dystopian
nor utopian.  It's more a rejection of each shaggy apocalypse
story...which is not to say that the stability of the physical world or of
humanity is in any way assured.  I just don't see the same certainty of
collapse that you're seeing, and I'm not sure I would interpret 'collapse'
in quite the same way: i.e., 'the end of the world *as we know it*' is not
necessarily the end of the world, if you know what I mean.

>Technology is *not* neutral.  It has embedded and often misunderstood
>purposes but purposes nonetheless.  A gun is a technology which is designed
>for killing.  While it might be used to target shoot or sit on the wall,
>it's still a killing tool.

Technology as a general concept is neutral. Specific technologies may not
be neutral, as they are defined by their uses. Big difference, I think.

>I don't fear technology; I fear technology's designers' purposes.  If
>people who set out to permanently alter humanity design technology for the
>purpose of abolishing humanity, then this could indeed be the result.  This
>is the world she is talking about and, since she seems to be going along
>with the game, it's the world she's contributing to shaping.

Is this the world she is talking about? I'm sorry, but I don't get that at
all. I think she's talking about acknowledgment of human limitation, not
alteration of human design.

>I intend to design technology which will resist this process of alteration
>for the simple reason that we are approaching an irreversible event. 
>Humanity could be abolished.  That's all.  I hope this is now a little
>clearer and thanks for the exchange.

What technology are you designing, then?



=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
Jon Lebkowsky     *     jonl@onr.com     *     www.well.com/~jonl
* Austin conference, Electric Minds  (www.minds.com)
* Vice President, EFF-Austin (www.eff-austin.org)
=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
---
#  distributed via nettime-l : no commercial use without permission
#  <nettime> is a closed moderated mailinglist for net criticism,
#  collaborative text filtering and cultural politics of the nets
#  more info: majordomo@icf.de and "info nettime" in the msg body
#  URL: http://www.desk.nl/~nettime/  contact: nettime-owner@icf.de