Patrice Riemens on Thu, 10 Oct 2013 21:40:11 +0200 (CEST)



<nettime> Pascal Zachary: Rules for the Digital Panopticon (IEEE)


original to:
http://spectrum.ieee.org/computing/software/rules-for-the-digital-panopticon


Rules for the Digital Panopticon

The technologies of persistent surveillance can protect us only if certain
boundaries are respected

By G. Pascal Zachary
(Posted 20 Sep 2013)

For centuries, we humans have lacked the all-knowing, all-seeing
mechanisms to credibly predict and prevent bad actions by others. Now
these very powers of preemption are perhaps within our grasp, thanks
to a confluence of technologies.

In the foreseeable future, governments, and perhaps some for-profit
corporations and civil-society groups, will design, construct, and
deploy surveillance systems that aim to predict and prevent bad
actions, and to identify, track, and neutralize people who commit them.

And when contemplating these systems, let's broadly agree that we
should prevent the slaughter of children at school and the abduction,
rape, and imprisonment of women. And let's also agree that we should
thwart lethal attacks against lawful government.

Of late, the U.S. government gets most of the attention in this
arena, and for good reason. The National Security Agency, through its
vast capacity to track virtually every phone call, e-mail, and text
message, promises new forms of preemption through a system security
experts call persistent surveillance.

The Boston Marathon bombing, in April, reinforced the impression
that guaranteed prevention against unwanted harm is elusive, if not
impossible. Yet the mere chance of stopping the next mass shooting
or terror attack persuades many people of the benefits of creating
a high-tech version of the omniscient surveillance construct that,
in 1787, the British philosopher Jeremy Bentham conceived as a
panopticon: a prison with a central viewing station for watching all
the inmates at once.

Some activists complain about the potential of such a system to
violate basic freedoms, including the right to privacy. But others
will be seduced by the lure of techno fixes. For example, how could
anyone object to a digital net that protects a school from abusive
predators?

Ad hoc surveillance will inevitably proliferate. Dropcam and other
cheap surveillance programs, already popular among the tech-savvy,
will spread widely. DIY and vigilante panopticons will complicate
matters. Imagine someone like George Zimmerman, the Florida
neighborhood watchman, equipped not with a gun but with a digital
surveillance net, allowing him to track pretty much anything on his
smartphone.

With data multiplying exponentially and technology inexorably
advancing, the question is not whether all-encompassing
surveillance systems will be deployed. The question is how, when, and
how many.

In the absence of settled laws and norms, the role of engineers looms
large. They will shoulder much of the burden of designing systems in
ways that limit the damage to innocents while maximizing the pressures
brought to bear on bad guys.

But where do the responsibilities of engineers begin and end?

It is too early to answer conclusively, but engineers would do well to
keep a few fundamental principles in mind:

    Keep humans in the loop, but insist they follow the "rules of the
road." Compiling and analyzing data can be done by machines. But it
would be best to design these surveillance systems so that a human
reviews and ponders the data before any irreversible action is
taken. If citizens want to spy on one another, as they inevitably
will, impose binding rules on how they do so. (A rough sketch of this
constraint appears after this list.)

    Design self-correcting systems that eject tainted or wrong
information fast and inexpensively. Create a professional ethos
and explicit standards of behavior for engineers, code writers,
and designers who contribute significantly to the creation of
panopticon-like systems.

    Delete the old stuff routinely. Systems should mainly contain
real-time data. They should not become archives tracing the lives of
innocents.
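
Taken together, the first and third principles translate directly into
design constraints. The Python sketch below is purely illustrative and
assumes nothing about any actual deployment: automated analysis may
flag records, only a named human reviewer can mark a flagged record as
actionable, and anything older than a retention window is purged. All
names and thresholds are hypothetical.

    from dataclasses import dataclass
    from datetime import datetime, timedelta, timezone
    from typing import List, Optional

    # Illustrative retention window; a real policy would be set by law or norm.
    RETENTION_WINDOW = timedelta(days=30)

    @dataclass
    class Observation:
        subject_id: str
        observed_at: datetime
        flagged_by_machine: bool = False   # set by automated analysis
        reviewed_by: Optional[str] = None  # name of the human reviewer, if any
        approved_for_action: bool = False  # only the human-review path sets this

    class SurveillanceStore:
        def __init__(self) -> None:
            self._records: List[Observation] = []

        def ingest(self, obs: Observation) -> None:
            # Machines may compile and flag data freely.
            self._records.append(obs)

        def human_review(self, obs: Observation, reviewer: str, approve: bool) -> None:
            # Any irreversible action requires a named human reviewer.
            obs.reviewed_by = reviewer
            obs.approved_for_action = approve

        def actionable(self) -> List[Observation]:
            # Only records a human has reviewed and approved are actionable.
            return [o for o in self._records
                    if o.flagged_by_machine and o.reviewed_by and o.approved_for_action]

        def purge_stale(self, now: Optional[datetime] = None) -> int:
            # "Delete the old stuff routinely": keep the store close to real time.
            now = now or datetime.now(timezone.utc)
            kept = [o for o in self._records if now - o.observed_at <= RETENTION_WINDOW]
            removed = len(self._records) - len(kept)
            self._records = kept
            return removed

In this arrangement the automated stage can only flag; the bit that
authorizes action lives solely on the human-review path, and
purge_stale() would run on a schedule so the store never becomes an
archive tracing the lives of innocents.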

Engineers acting responsibly are no guarantee that panopticons will
not come to control us. But they can be part of getting this brave new
world right.


About the Author

G. Pascal Zachary is the author of Endless Frontier: Vannevar Bush,
Engineer of the American Century (Free Press, 1997). He teaches at
Arizona State University.




#  distributed via <nettime>: no commercial use without permission
#  <nettime>  is a moderated mailing list for net criticism,
#  collaborative text filtering and cultural politics of the nets
#  more info: http://mx.kein.org/mailman/listinfo/nettime-l
#  archive: http://www.nettime.org contact: nettime@kein.org