nettime's_roving_reporter on Thu, 18 Nov 1999 16:18:04 +0100 (CET)



<nettime> The real issues of cryptography: Ross Anderson Interview


[http://www.newscientist.com/ns/19991106/confidenti.html]

Confidentially yours

Everyone's doing it. Banks, shops, governments, even the British Civil
Service--they're all trying to put services online. Unfortunately,
e-commerce and e-government are nothing without e-trust. How will you know
who you're really dealing with when you buy that holiday or fill in that
form online? At Cambridge University, Ross Anderson and his team are trying
to create the ultimate instruments of online confidence in the shape of
software tools that encrypt information so strongly it can be read only by
people who hold the right decoding keys. But, as Anderson tells Ehsan
Masood, we'll only get the e-world we want if governments regulate
encryption wisely.

You say you are not a typical cyberlibertarian. How do you define this
term? And why don't you see yourself as one?

Cyberlibertarians tend to see the Internet as leading to the abolition of
governments. Their idea is that given the advent of anonymous e-mail,
digital cash and so on, the state will no longer be able to support itself
by raising revenue through taxation. I don't think this is likely or
desirable. Think what England was like when the government didn't really
exist: anyone with any wealth or property had to design their house to
withstand infantry-strength assault. That's not efficient. National
governments and policemen will survive the electronic revolution because of
the efficiencies they create.

Because your team is a world leader in research into computer security,
you've been chosen as one of the finalists to design a new encryption
standard for the US. Do you think you'll win?

The Data Encryption Standard currently used by US banks and other
organisations is no longer secure against attack. The US government
invited the crypto community to develop what they're calling the Advanced
Encryption Standard. Along with Eli Biham from the Technion in Israel and
Lars Knudsen from the University of Bergen in Norway, I've invented a
cipher that's been selected for the final. People are sceptical about
whether the US government will pick a non-American winner, but I think
there's a general perception that our system is the most secure of the
finalists. It's not the fastest. But then we designed ours on the
assumption that it needs to keep stuff secret for the next hundred years--
despite advances in technology and in the mathematics of cryptography.
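
To make the basic property concrete, here is a minimal sketch in Python of
symmetric encryption, using the third-party "cryptography" package rather
than the finalist cipher itself (which the interview does not describe):
only someone holding the right key can recover the plaintext.

    from cryptography.fernet import Fernet, InvalidToken

    key = Fernet.generate_key()   # the secret shared by sender and receiver
    cipher = Fernet(key)

    token = cipher.encrypt(b"meet at the usual place")  # looks like random bytes

    # The right key recovers the message ...
    assert Fernet(key).decrypt(token) == b"meet at the usual place"

    # ... but any other key fails outright.
    try:
        Fernet(Fernet.generate_key()).decrypt(token)
    except InvalidToken:
        print("wrong key: decryption fails")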

Politicians and others have expressed alarm at the prospect of criminals
using encryption to keep their e-mails secret and evade detection. You say
this concern is bogus--why?

It's based on a misconception of what law enforcement operations are like.
At present, the police have little interest in intercepting phone
conversations and reading the transcripts. It's costly and tedious and rarely
justified except in serious, high-budget investigations. Instead, what the
police are mostly after is traffic logs--information on who called or
e-mailed whom, at what time and for how long. Criminals understand this and
try to make their communications as unobtrusive as possible. In Britain,
the main threat to police intelligence gathering comes from the prepaid
mobile phone, not e-mail encryption, because you can buy one without giving
out your name and address. In other words, the users of these devices can't
be traced, so they're ideal for running an operation such as drug dealing.
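
The value of traffic logs is easy to illustrate. The toy Python sketch
below, using entirely invented records, shows how much a list of who
contacted whom, when and for how long reveals without reading a single
message.

    from collections import Counter
    from datetime import datetime

    call_log = [
        # (caller, callee, start time, duration in seconds) -- invented data
        ("A", "B", datetime(1999, 11, 1, 22, 15), 40),
        ("A", "C", datetime(1999, 11, 1, 22, 18), 35),
        ("B", "C", datetime(1999, 11, 2, 22, 20), 50),
        ("A", "B", datetime(1999, 11, 3, 22, 14), 30),
    ]

    # Who talks to whom most often, and at what hours?
    pair_counts = Counter((a, b) for a, b, _, _ in call_log)
    late_calls = [c for c in call_log if c[2].hour >= 22]

    print(pair_counts.most_common(3))
    print(len(late_calls), "of", len(call_log), "calls were made late at night")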

But what about terrorists?

The idea that information technology has revolutionised the way terrorists
run their operations has been exaggerated. Take the IRA. We know from a
recent court case that this organisation still writes down the orders for
its active service units on pieces of cigarette paper, which they wrap in
cling film so the courier can carry the orders through customs in his
mouth. That's how real terrorists keep information to themselves.

But isn't the point that the terrorists will soon be abandoning the cling
film and using cleverly encrypted e-mails instead?

Encryption alone won't help them. If I were to hold a three-hour encrypted
conversation with someone in the Medellín drug cartel, it would be a dead
giveaway. In routine monitoring, GCHQ (Britain's signals intelligence
service) would pick up the fact that there was encrypted traffic and would
instantly mark down my phone as being suspect. Quite possibly the police
would then send in the burglars to put microphones in all over my house. In
circumstances like this, encryption does not increase your security. It
immediately and rapidly decreases it. You are mad to use encryption if you
are a villain.
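
One simple way to see why encrypted traffic stands out, purely as an
illustration and not a claim about how GCHQ actually works, is that
ciphertext looks like random bytes and so has near-maximal entropy. A
small Python sketch:

    import math
    import os
    from collections import Counter

    def bits_per_byte(data: bytes) -> float:
        # Shannon entropy of the byte distribution, in bits per byte
        counts = Counter(data)
        total = len(data)
        return -sum(n / total * math.log2(n / total) for n in counts.values())

    plaintext = b"the quick brown fox jumps over the lazy dog " * 50
    ciphertext = os.urandom(len(plaintext))  # stand-in for an encrypted payload

    print("plaintext: ", round(bits_per_byte(plaintext), 2), "bits/byte")
    print("ciphertext:", round(bits_per_byte(ciphertext), 2), "bits/byte")  # close to 8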

You're a well-known critic of the suggestion that governments should have
access to the keys needed to decode people's encrypted messages and
duplicate their electronic signatures. Why are you so opposed to such
"third-party" access schemes?

It's a flawed idea designed for a world which no longer exists, where the
main users of encryption were the military, the intelligence services and
the diplomatic corps. Now encryption is widespread and used largely in
systems whose security is in the interests of law enforcement, such as cash
machines and burglar alarms. The authorities are going to have terrible
difficulty in trying to draft any third-party legislation that blesses
encryption applications that are good for law enforcement and curses those
that aren't. The argument that criminals will use encryption is bogus. The
argument that governments need to hold keys to unlock people's encrypted
files to solve crime is bogus. The real issues are more complex and much
nastier.

So what sorts of issue should we be worried about?

Here's one example. Britain's Civil Service is adopting an e-mail security
protocol called "cloud cover". In this scheme, departmental security
officers will get copies of the electronic keys that are used not just to
decrypt messages, but also to create the digital signatures on them. This
will enable ministers to plausibly deny any responsibility for information
that leaks from their department. They can simply claim that the message
was forged--and by the very official whose job it was to stop leaks.

But as far as freedom of information goes, this scheme is a disaster. If in
a decade's time you are awarded access to an embarrassing government
document, the officials of the day could use the keys they hold to
substitute a forgery and you'd never be able to tell. Even if they gave you
a genuine document, you couldn't be sure it hadn't been forged. This is the
sort of horrible complexity that third-party key schemes bring into real
systems.
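
The forgery problem follows from a basic property of digital signatures:
anyone who holds the private signing key can produce signatures that are
indistinguishable from the owner's. The Python sketch below uses the
third-party "cryptography" package as a generic illustration; it is not
the "cloud cover" protocol itself.

    from cryptography.hazmat.primitives.asymmetric.ed25519 import (
        Ed25519PrivateKey,
    )

    private_key = Ed25519PrivateKey.generate()   # the minister's signing key
    public_key = private_key.public_key()

    genuine_msg = b"minute approved by the minister"
    forged_msg = b"minute the minister never wrote"

    genuine_sig = private_key.sign(genuine_msg)

    # A security officer holding a copy of the same private key can sign
    # anything, and the result is indistinguishable from the minister's own.
    forged_sig = private_key.sign(forged_msg)

    public_key.verify(genuine_sig, genuine_msg)   # passes
    public_key.verify(forged_sig, forged_msg)     # also passes

Once the key has been copied, no later check of the signature can say
which message the minister actually signed.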

Something else to worry about is the digital election, in which people vote
for political candidates electronically via a polling system made secure by
encryption. The British government seems keen on this idea but the
potential for fraud is unbelievable. If we get a national election network
in Britain, then under current policy GCHQ would be charged with securing
it. Would you be comfortable with a system where the outcome of the
election was controlled by the spooks? This already happens in Russia. Do
we want it here?

What about information warfare?

Information warfare is not new, and terrorists aren't the only culprits.
Governments have been doing it for decades. Look at GCHQ. They listen to
people's telephone conversations, hack into their computer systems, jam
their radar. This is information warfare.

What's the worst possible outcome of a cyber attack?

Suppose a Western power were to hack into Iran's national grid. There could
be an electricity blackout for, say, three days. Several hundred people
might die, such as those on dialysis machines. Deliberately targeting
civilians is a war crime, but Iran is in no position to put the head of a
Western state on trial, so retaliation might be expected instead.
Unfortunately, this view is not shared in places like Britain's
Ministry of Defence. Those engaged in information warfare tend to view such
attacks on other countries as being a zero-cost way of conducting warfare.

If information warfare is so easy, doesn't it make countries like Britain
or the US more vulnerable to attack?

Absolutely--we have more critical information technology and therefore more
to lose than countries such as Iran or Serbia. But the phrase "information
warfare" is also a marketing exercise by the intelligence community, who've
talked it up to justify increased budgets, and redefined it to include
threats to infrastructure and even spin-doctoring. This has a certain
appeal to the current generation of politicians.

Could information warfare ever replace conventional wars?

I don't think so. Take the recent NATO action in the former Yugoslavia. A team
in Serbia attacked the NATO website by overloading it with requests for
information. They sent so many requests that the NATO Web server couldn't
provide information to anyone else. NATO's response was to bomb the Serbian
satellite link and thus reduce the Serbian bandwidth to a level that NATO
could cope with. So information warfare is more likely to feed into
conventional warfare than replace it.
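
The attack comes down to simple capacity arithmetic: once the flood of
bogus requests exceeds what the server or its link can handle, legitimate
users get almost nothing. A toy model in Python, with invented figures:

    def served_fraction(capacity, legit, attack):
        """Fraction of legitimate requests served, assuming the server takes
        whatever arrives first and attack traffic mixes evenly with the rest."""
        offered = legit + attack
        if offered <= capacity:
            return 1.0
        return capacity / offered  # legitimate traffic only gets its share

    print(served_fraction(capacity=1000, legit=200, attack=0))      # 1.0
    print(served_fraction(capacity=1000, legit=200, attack=50000))  # about 0.02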

Would you refuse admission to a research student from a country that the
British authorities regard as "sensitive"?

The Foreign Office would like universities to vet students from certain
countries who apply for high-technology courses. The leading research
universities and the Committee of Vice-Chancellors and Principals view this
as unacceptable. We haven't turned people down because they are backed by a
government with which some people might not agree. If there's going to be
vetting, it has to be done during the visa process.

Intelligence agencies have long tried to prevent encryption technologies
from spreading. Why are you so opposed to export controls?

The UK government is currently encouraging the European Commission to
introduce regulation designed to compel member states to license the export
of encryption software because of the perceived threat to security.

But encryption software has been available for years on the Net: the stable
door is already open. If this became law, it could make it more difficult
to sell software to Belgium than to sell electroshock rods to Indonesia.

Worse, because the proposed regulation is not limited to encryption but
affects everything the Ministry of Defence considers "high-tech", it could
prevent researchers from sharing a wide range of other software, information
and training. We'd have to keep track of what we taught
to whom: we might have to get personal export licences to teach most of our
foreign students. It's true that the US government also tried to control
the export of technologies and information related to encryption. But
American university researchers have freedom of speech rights which UK
academics don't have, so the rules are not so intolerable for them.

From New Scientist, 6 November 1999


#  distributed via <nettime>: no commercial use without permission
#  <nettime> is a moderated mailing list for net criticism,
#  collaborative text filtering and cultural politics of the nets
#  more info: majordomo@bbs.thing.net and "info nettime-l" in the msg body
#  archive: http://www.nettime.org contact: nettime@bbs.thing.net