
<nettime> [excerpted] RISKS DIGEST 19.84


[*heavily* edited for redistribution on nettime. "<...>" marks the spot.-T]

RISKS-LIST: Risks-Forum Digest  Tuesday 7 July 1998  Volume 19 : Issue 84

   FORUM ON RISKS TO THE PUBLIC IN COMPUTERS AND RELATED SYSTEMS (comp.risks)
   ACM Committee on Computers and Public Policy, Peter G. Neumann, moderator

***** See last item for further information, disclaimers, caveats, etc. *****
This issue is archived at http://catless.ncl.ac.uk/Risks/19.84.html 

  Contents:
 <... -9 lines ...>
FIPS-Flop: Reuters on failed NIST key-recovery effort (Alan Davidson)
NSA declassifies encryption code (Edupage)
 <... -1 line ...>
Defining the line between hacking and web surfing... (Eli Goldberg)
Y2K problem worries CIA (Edupage)
 <... -1 line ...>
Galaxy IV muzak withdrawal (Philip Edmonds)
No manual switching for railroads; result, famine (Doneel Edelson)
 <... -1 line ...>
Re:  More on @#$%& Software (Michael A. Nelson)
 <... -1 line ...>
Abridged info on RISKS (comp.risks)

----------------------------------------------------------------------

 <... -133 lines ...>

Date: Fri, 26 Jun 1998 14:46:39 -0500
From: Alan Davidson <abd@cdt.org>
Subject: FIPS-Flop: Reuters on failed NIST key-recovery effort

The 22-member U.S. Government Technical Advisory Committee to Develop a
Federal Information Processing Standard for the Federal Key Management
Infrastructure (TACDFIPSFKMI) has failed in a two-year effort to design a
federal computer security system that includes "back doors," a feature that
would enable snooping by law enforcement agencies.  Addressing Commerce
Secretary William Daley, the panel wrote that it "encountered some
significant technical problems that, without resolution, prevent the
development of a useful FIPS. ... Because the focus of this work is
security, we feel that it is critically important that we produce a document
that is complete, coherent, and comprehensive in addressing the many facets
of this complex security technology. The attached document does not satisfy
these criteria."

The failure casts further doubt on the Clinton
administration policy -- required for government agencies and strongly
encouraged for the private sector -- of including such back doors in
computer encryption technology used to protect computer data and
communications, according to outside experts.  But administration officials
said the panel, which is set to expire in July, simply needed more time.
[Source: U.S. effort on encryption "backdoors" ends in failure, By Aaron
Pressman, Reuters, 25 Jun 1998, PGN Stark Abstracting]

[Pressman's article also included this quote from Alan Davidson: "The
administration keeps spending taxpayer money to pursue a ...  strategy
that's wrong-headed and won't protect security.  Its own advisory committee
can't answer basic questions about how to make it work for the government,
yet they continue to push for its adoption by everyone, worldwide."  PGN]

Alan Davidson, Staff Counsel, Center for Democracy and Technology, 1634 Eye
St. NW, Suite 1100, Washington, DC 20006  202.637.9800  <abd@cdt.org>

------------------------------

Date: Thu, 25 Jun 1998 16:22:04 -0400
From: Edupage Editors <educom@educom.unc.edu>
Subject: NSA declassifies encryption code 

The National Security Agency has declassified its Skipjack encryption
algorithm, which uses an 80-bit key, and its 1,024-bit Key Exchange
Algorithm (KEA), and made both publicly available.  "This declassification
is an essential part
of the Department of Defense's efforts to work with commercial industry in
developing reasonably priced computer-protection products," says the
Pentagon.  "This declassification decision will enable industry to develop
software- and smart card-based security products, which are interoperable
with Fortezza."  The Skipjack algorithm is used in the Fortezza PC smart
card, which controls access to computers in the Defense Message System and
other DoD applications.  (*EE Times*, 24 Jun 1998; Edupage, 25 June 1998)

------------------------------

 <... -17 lines ...>

Date: Wed, 1 Jul 1998 19:49:14 -0700
From: Eli Goldberg <eli@prometheus-music.com>
Subject: Defining the line between hacking and web surfing...

I've recently been faced with a very curious intellectual dilemma: at what
point does the web browsing we do potentially --- and unknowingly --- cross
the line into illegal hacking?

RISKS has explored this topic before (e.g., alternate uses of robots.txt
files, such as finding interesting material like
http://www.cnn.com/webmaster_logs/).
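
For readers who missed that discussion: a robots.txt file asks well-behaved
search-engine crawlers to skip the listed paths, but in doing so it
publishes, to anyone who requests it, a list of exactly the directories the
site would rather not advertise. A hypothetical example:

    # http://www.example.com/robots.txt  (hypothetical contents)
    User-agent: *
    Disallow: /webmaster_logs/
    Disallow: /cgi-bin/orders/

Any visitor can fetch the file directly and read the Disallow lines as a
map of where to look.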

Here are two recent encounters that have left me rather perplexed:

Case #1: In recent months, a lot of AFS directories (AFS is a network file 
system popularized by CMU in the 1980s) have begun to appear as publicly 
viewable HTTP directories, without their owners' knowledge. (In many cases 
the owners have since graduated or moved to staff positions, leaving 
countless long-forgotten files and e-mail archives in their home 
directories.)

On two occasions in the past month (one at MIT, and one at CMU), 
I've performed ordinary web searches using ordinary search engines, and 
ended up finding private documents belonging to friends, with personal 
and confidential information. 

In each case I alerted the friend immediately; the permissions were changed 
and the offending material removed. (Now, getting the summaries of hundreds 
of pages out of a dozen search engines will be another matter. ;)
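
The standard tool for that job is the robots META tag, which the major
engines honor --- but only when they next recrawl each page, which can take
weeks or months. A hypothetical snippet, placed in the HEAD of each
affected page:

    <META NAME="ROBOTS" CONTENT="NOINDEX">

Until the recrawl happens, the cached summaries remain searchable.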

Could a tenuous argument perhaps be constructed that an individual reading 
these private documents --- after realizing that they were not meant to be 
publicly posted --- was hacking?


Case #2: A *lot* of webmasters omit index.html files from critical 
directories, or forget to configure their servers to refuse directory 
listings for directories that lack one. This leaves any casual web surfer 
trivially able to browse the actual directory tree of the site --- 
including its CGI directory --- and any associated private data files.
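
For reference, the server-side fix is usually one line of configuration.
Assuming Apache (the most common server of the day; others have
equivalents), removing the Indexes option stops the server from
manufacturing a listing for a directory that has no index.html:

    # httpd.conf or a per-directory .htaccess file (paths hypothetical)
    <Directory "/home/httpd/html">
        Options -Indexes
    </Directory>

With Indexes off, a request for a bare directory draws a "403 Forbidden"
rather than a browsable file listing.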

I've encountered this twice tonight --- once while attempting to 
post a housing vacancy to a local university's housing list (the system was 
down, and I was curious why ;), and a second time while browsing the web 
site of a music publisher whose works I have enjoyed in the past.

In the latter case, I immediately stumbled upon full archives of 
the company's (unprotected) customer orders, web logs, and associated 
data --- information that I believe any company should reasonably 
consider private. Ouch!

Let's say I went ahead and read those files. 

Say I was curious about the company's customers' buying habits, and 
had no malicious or criminal intent. Would I be breaking the law?

On one hand, the webmaster *probably* didn't intend for the 
information to be public. Does a difference truly exist between 
exploiting known configuration errors in web sites, and exploiting known 
configuration errors in networked UNIX systems to access information not 
meant to be public?

On the other hand, it doesn't matter what they intend. They *have* 
made it public, and they've just placed it on a server where any bozo 
with a web browser can get to it just by typing a regular URL; how could 
one be breaking the law by viewing what they've already placed in a 
public area for viewing? 

(Certainly, I never signed an agreement to limit my use of the web site to
merely clicking on links, and have every right to type whatever I'd like
into the URL field!)

Now, let's say a competitor to the company in question happened to stumble
upon the same URL and data. What, then?

------------------------------

Date: Thu, 25 Jun 1998 16:22:04 -0400
From: Edupage Editors <educom@educom.unc.edu>
Subject: Y2K problem worries CIA 

Central Intelligence Agency director George Tenet is warning that the Year
2000 computer bug (which arises when programs cannot correctly interpret
dates past 1999) "provides all kinds of opportunities for someone with
hostile intent" to gain information or plant viruses.  "We are building an
information infrastructure, the most complex the world has ever known, on an
insecure foundation."  (*USA Today*, 25 Jun 1998; Edupage, 25 June 1998)
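
The arithmetic behind that parenthetical, as a minimal hypothetical sketch
in C++ (names and values invented for illustration):

    // The classic two-digit-year bug: 1998 is stored as 98, 2000 as 0.
    #include <iostream>

    int main() {
        int opened = 98;              // account opened in 1998
        int today  = 0;               // the year 2000
        int age    = today - opened;  // yields -98, not 2
        std::cout << "Account age: " << age << " years" << std::endl;
        return 0;
    }

Every comparison or interval built on such a field silently inverts at the
century boundary.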

------------------------------

 <... -20 lines ...>

Date: 	Wed, 24 Jun 1998 10:45:57 -0400
From: Philip Edmonds <pedmonds@cs.toronto.edu>
Subject: Galaxy IV muzak withdrawal

The loss of transmission from the Galaxy IV satellite last month made some
customers realize how much they depended on Muzak. In Lafayette, Indiana,
one upset and elderly Burger King customer told the manager: "Now I just
have to sit here and hear myself think."  [Source: *The Globe and Mail*, 24
Jun 1998]

Phil Edmonds

------------------------------

Date: Wed, 1 Jul 1998 15:48:38 -0500
From: "Edelson, Doneel" <doneeledelson@aciins.com>
Subject: No manual switching for railroads; result, famine

This is an excerpt from the CSIS Y2K conference.  

Gary North's Y2K Links and Forums   Summary and Comments
Category: 	Shipping_and_Transportation		
Date: 	1998-06-24 19:34:02		
Subject: 	No Manual Switching for Railroads; Result, Famine
Link:	http://www.csis.org/html/y2ktran.html#simpson		

At a June 2 conference on Y2K sponsored by the Center for Strategic and
International Studies, Alan Simpson confirmed what I have been saying for
over a year: the trains will go down. He said that the railroads have
abandoned manual controls.  "Going back to the rail system, they've taken
out manual points. I talked to some of the major rail companies a few days
back and said, 'Go to manual.' And they said, 'All our manual points are in
the warehouse up in New York State waiting to be disposed of. We cannot
switch manually anymore. We have taken out manual reversion systems on most
of our key communication, power, and switching systems.'"

Conclusion:

* * * * * * * * * *

And a few weeks ago he started looking at this, and it was Bruce Webster
here who mentioned about, in one of his presentations, the could-be famine
in the United States in 2000. And like most of you here I thought rubbish,
rubbish, until we started looking at the infrastructure and started the
wildfire scenarios on what if.  And looking at New York and California, I
walk into a supermarket and I get lettuce, fresh vegetables, any day of the
year. Seven days ago they were in a field in California. Now they're in a
supermarket just outside New York.  We know the switches on the railroads
are faulty. We know because of mergers, even today, many of the major
corporations in the railroad business don't know where the railway stop is.
When you move this way through, come 2000 you could have a scenario -- and
when you look at this, it's the Soviet Union in the '80s -- where there's
plentiful supply of food in the fields, but you can't get it from the
fields to the towns to feed the population. This is not a way-out, whacko
scenario. This is for real.

------------------------------

 <... -20 lines ...>

Date: Wed, 01 Jul 1998  8:46 -0500
From: MNELSON@arinc.com
Subject: Re:  More on @#$%& Software (Agre, RISKS 19.83)
     
Mr. Agre's observation raises a disturbing issue regarding the languages of
choice in computer science courses.  I have grown increasingly concerned
that a sizable number of colleges and universities have chosen C or C++
based almost solely on prevalence in the marketplace.  While I agree that
both are excellent, capable languages, they
must be evaluated in the context of their original application.  C (and by
extension C++) was designed as a system software programming tool, to be
used by experienced programmers to develop operating system and related
software; it allows the programmer wide latitude and great flexibility,
assuming in almost every case that the programmer knows what he/she is
doing.  On the other hand, this laissez-faire approach can lead to
extraordinary, and sometimes destructive, program behavior under other
circumstances, particularly those involving inexperienced programmers.
Framed another way, it's a matter of choosing the right tool for the job at
hand.  C/C++ is probably the right tool if you're developing system
software; I maintain that it is not always the case for general application
software development.
     
Having taught both Pascal and C++ for over 10 years, I've seen the situation
Mr. Agre describes almost every semester: student programs inadvertently
walking off the end of an array.  With Pascal, the language's run-time bounds
checking caught this every time; the lack of such checks in C++ has been
the source of countless hours spent debugging baffling program behavior.
Time and again, I have cautioned my students to treat C++ as an exquisitely
capable power tool with few, if any, safety features: in the right hands it
can do wonders, allowing a craftsman to fashion a work of art, knowing all
the while that a minor slip-up could cost him a finger, arm, or leg.  While
languages such as Pascal and Ada have taken a beating over the years from
various constituencies within the programming community for a wide variety
of alleged sins, in the final analysis the safeguards built into these
languages offer project managers significant added value: they help avoid
this type of common logic error in program designs.
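
To make the failure mode concrete, here is a minimal hypothetical sketch of
the off-by-one error in question.  C++ compiles and runs it without
complaint:

    // Hypothetical example of "walking off the end of an array."
    // Valid indices for scores are 0..9, but the final iteration
    // writes scores[10], silently clobbering adjacent memory.
    #include <iostream>

    int main() {
        int scores[10];
        for (int i = 0; i <= 10; i++) {   // bug: should be i < 10
            scores[i] = 0;
        }
        std::cout << "Finished with no diagnostic at all." << std::endl;
        return 0;
    }

The same array declared in Pascal as array[0..9] of integer, compiled with
range checking enabled, halts the program at the first out-of-range store.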

Michael A. Nelson, ARINC, Incorporated

------------------------------

 <... -80 lines ...>

Date: 31 Mar 1998 (LAST-MODIFIED)
From: RISKS-request@csl.sri.com
Subject: Abridged info on RISKS (comp.risks)

 The RISKS Forum is a MODERATED digest.  Its Usenet equivalent is comp.risks.
=> SUBSCRIPTIONS: PLEASE read RISKS as a newsgroup (comp.risks or equivalent) 
 if possible and convenient for you.  Alternatively, via majordomo, 
 SEND DIRECT E-MAIL REQUESTS to <risks-request@csl.sri.com> with one-line, 
   SUBSCRIBE (or UNSUBSCRIBE) [with net address if different from FROM:] or
   INFO     [for unabridged version of RISKS information]
 .MIL users should contact <risks-request@pica.army.mil> (Dennis Rears).
 .UK users should contact <Lindsay.Marshall@newcastle.ac.uk>.
=> The INFO file (submissions, default disclaimers, archive sites, 
 copyright policy, PRIVACY digests, etc.) is also obtainable from
 http://www.CSL.sri.com/risksinfo.html  ftp://www.CSL.sri.com/pub/risks.info
 The full info file will appear now and then in future issues.  *** All 
 contributors are assumed to have read the full info file for guidelines. ***
=> SUBMISSIONS: to risks@CSL.sri.com with meaningful SUBJECT: line.
=> ARCHIVES are available: ftp://ftp.sri.com/risks or
 ftp ftp.sri.com<CR>login anonymous<CR>[YourNetAddress]<CR>cd risks
   [volume-summary issues are in risks-*.00]
   [back volumes have their own subdirectories, e.g., "cd 18" for volume 18]
 or http://catless.ncl.ac.uk/Risks/VL.IS.html      [i.e., VoLume, ISsue].
 The ftp.sri.com site risks directory also contains the most recent 
 PostScript copy of PGN's comprehensive historical summary of one liners:
   get illustrative.PS

------------------------------

End of RISKS-FORUM Digest 19.84 [excerpted for redistribution on nettime]
************************
---
#  distributed via nettime-l : no commercial use without permission
#  <nettime> is a closed moderated mailinglist for net criticism,
#  collaborative text filtering and cultural politics of the nets
#  more info: majordomo@desk.nl and "info nettime-l" in the msg body
#  URL: http://www.desk.nl/~nettime/  contact: nettime-owner@desk.nl