carlo von lynX on Fri, 19 Jan 2018 14:16:59 +0100 (CET)



Re: <nettime> social media critique: next steps?


On Wed, Jan 17, 2018 at 12:45:35PM +0100, Felix Stalder wrote:
> Therefore, I would phrase the dilemma differently. The struggle is
> whether an oligarchy controls the mass of people through machines, or
> whether the mass of people use the machines to articulate and enact
> their collective will.
> 
> In many ways, machines -- deep-learning, big data -- are god. The seat
> of knowledge on a scale that mere mortals cannot comprehend, and the
> source of action that, for all its arbitrary surface appearance, can
> always claim an underlying justification that remains hidden to all but
> a few.

Yes, god! Not literally, but I like the concept of "god" here:
it makes clear that this is far more power and knowledge than
any individual, corporation or state could handle without
sliding into a totalitarianism more dystopian than anything
humanity has ever experienced.

We can and must declare this sort of god illegal, just as we
tamed the god-like power to destroy the planet by atomic
warfare. Only this time we can do better, because we can
require industry to design technology so that it cannot be
used to collect *social big data*, the kind that by its very
nature threatens whatever is left of democracy. We can prevent
the weapons from being assembled in the first place.

If building a panopticon around human society is made
impossible, by mandating technology that defends against it,
then there are no more oligarchies that control *the*
machines, the ones with *the* big data. It's a bit late to
get started with this, given that enough big data to
blackmail every individual on the planet has already been
collected in Bluffdale, but better late than never.

I'll pick another mail from the unlike-us list, where a similar
conversation among some of my favorite thinkers has been 
unfolding, starting from Geert's cross-post:

On Wed, Jan 17, 2018 at 05:39:59PM -0800, Doug Schuler wrote:
> ALL of the following must happen at the same time and we must link them and
> (at least begin to) institutionalize them (i.e. make them more reliably
> operational). The needs include the following (which I would call
> *patterns*):
>
>    - Critique the system(s)

We have plenty of that.

>    - Communicate the critique(s)

Even Italian state television ran a news special on how
technology is breaking democracy, so it's out there.

>    - Design and build alternatives

Some of us started in 2001 and still aren't done. We are
working on this, but really replacing the entire stack, from
IP up to cloud technology, so that it actually works for all
billions of us and isn't just a cosmetic or hard-to-use
patchwork, is a hell of a challenge. Still, I am convinced
industry could whip up what humanity needs in only three
years if there were a legislative requirement for them to do
so (as a condition for being allowed to sell us devices, for
example).

>    - Use the alternatives

Even if anything up to the actual challenge were ready for
everyday use, it could not compete commercially. You can't
beat the surveillance economy. Too many people do not care
about the societal consequences of getting things gratis;
they only see the gratis for themselves. Thus the market
can't fix it: the market mechanism does not model societal
problems.

>    - Work with activists
>    - Work with policy makers
>    - Work with people
>    - Work together (better)
>    - Perform ongoing “meta cognition” about our approach.

On Wed, Jan 17, 2018 at 06:03:07PM -0800, Erin Glass wrote:
> Adding a bullet point to Doug's great list (if I may):
>    - Get the alternatives in the classroom

If the alternative were the new standard, all devices would
start doing things in a constitutionally viable way without
users needing to learn much that is different at all. When I
say they can have their cake and eat it too, I really mean
it. Even apps that monitor a person's health condition can be
built in ways that aren't harmful. One well-crafted law is
enough, but we can't leave it to the politicians to write it.
nettime could be a good place to crowdsource it.

On Wed, Jan 17, 2018 at 10:35:54PM +0000, Jonathan Marshall wrote:
> There is also a sense in which no one is in control. Technologies and actions always have the possibility of unintended effects, unexpected consequences and so on. It is also likely that hard attempts at control will eventually be undermined by the disorders that the attempts generate.

Tech has certainly reached a point where a single individual
has a hard time understanding the complete stack, from the
circuitry up to what shows up on the screen or goes across
the net, but there would still be ways to isolate and
partition layers if anybody cared to. The current unregulated
market produces no incentive in this regard. Let me give an
example of something that could help, but that probably
wouldn't get established without legislative intervention:

Imagine if you first needed to know the public key of a
machine, and that machine needed to acknowledge your public
key as a legitimate one, before you could talk to it at all.
Would it matter much if behind such a crypto "firewall" there
were a WinNT machine running a supermarket's cash register?
An attack like WannaCry simply wouldn't get far if it first
had to authenticate cryptographically, rather than with a
preset password of admin1234.

If machines are too old to run such a "firewall", they need
a hardware condom between them and the network. I make no
distinction between networks, since we know that sooner or
later there will always be insecure devices on a company
network.

This is a simplification; I'm not offering a solution in 15
lines. My point is that we don't know how drastically we
might be able to reduce attack surfaces, because we as a
society have never actually tried taking appropriate measures.

The Internet runs on industry standards whose priorities are
minimum voluntary effort and maximum commercial gain. You
can't expect to get very far with that where societal
priorities are concerned. Even where technical solutions
exist, there will always be enough old broken hardware ready
to be subdued as long as there are no obligations towards
society. Broken hardware these days is a threat to democracy:
not only can you use p0wned devices to DDoS targets, you can
run mass manipulation campaigns from all of those IP
addresses -- troll armies and so on.

The way our governments treat this as a race over who will
field the most powerful troll armies is idiotic. Cyber
warfare is a race to the bottom, a losing game for the human
race. We need legislation that imposes cyber peace through
technical standards. And it is feasible. We can have our cake
and eat it too.


-- 
  E-mail is public! Talk to me in private using encryption:
         http://loupsycedyglgamf.onion/LynX/
          irc://loupsycedyglgamf.onion:67/lynX
         https://psyced.org:34443/LynX/
#  distributed via <nettime>: no commercial use without permission
#  <nettime>  is a moderated mailing list for net criticism,
#  collaborative text filtering and cultural politics of the nets
#  more info: http://mx.kein.org/mailman/listinfo/nettime-l
#  archive: http://www.nettime.org contact: nettime@kein.org
#  @nettime_bot tweets mail w/ sender unless #ANON is in Subject: