Copyright © 1995-2004 UKUUG Ltd


From the Net

Tapping, Tapping On My Network Door

(Matt Blaze and Steven M. Bellovin)

Readers of this column are familiar with the risks of illegal monitoring of Internet traffic. Less familiar, but perhaps just as serious, are the risks introduced when law enforcement taps that same traffic legally.

Ironically, as insecure as the Internet may be in general, monitoring a particular user's traffic as part of a legal wiretap isn't so simple, with failure modes that can be surprisingly serious. Packets from one user are quickly mixed in with those of others; even the closest thing the Internet has to a telephone number (the "IP address") often changes from one session to the next and is generally not authenticated. An Internet wiretap by its nature involves complex software that must reliably capture and reassemble the suspect's packets from a stream shared with many other users. Sometimes an Internet Service Provider (ISP) is able to provide a properly filtered traffic stream; more often, there is no mechanism available to separate out the targeted packets.
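The filtering problem can be made concrete with a minimal sketch. Everything here is hypothetical (the packet records, field names, and addresses are invented, and real monitors would read from a capture library rather than a Python list), but it shows the shape of an IP-based filter on a shared stream, and why an unauthenticated, reassigned address silently changes whose traffic is collected:

```python
# Hypothetical packets seen on a shared ISP segment. In practice these
# would come from a packet-capture library; plain dicts keep the sketch
# self-contained.
packets = [
    {"src": "10.0.0.5", "payload": "target session 1"},
    {"src": "10.0.0.9", "payload": "unrelated user"},
    {"src": "10.0.0.5", "payload": "target session 2"},
]

def filter_by_ip(packets, ip):
    """Naive monitor: keep every packet whose source address matches."""
    return [p["payload"] for p in packets if p["src"] == ip]

# While the target holds 10.0.0.5, the filter appears correct.
print(filter_by_ip(packets, "10.0.0.5"))
# But if the ISP reassigns 10.0.0.5 to another subscriber mid-capture,
# the very same filter now collects an innocent user's traffic, and
# nothing in the packets themselves reveals the change.
```

The point of the sketch is that the filter's correctness depends entirely on out-of-band knowledge (who holds which address, and when), which is exactly the information a shared network link does not carry.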

Enter Carnivore. If an ISP can't provide exactly the traffic covered by some court order, the FBI offers its own packet sniffer, a PC running special software designed especially for wiretap interception. The Carnivore computer (so named, according to press reports, for its ability to "get to the meat" of the traffic) is connected to the ISP's network segment expected to carry the target's traffic. A dial-up link allows FBI agents to control and configure the system remotely.

Needless to say, any wiretapping system (whether supplied by an ISP or the FBI) relied upon to extract legal evidence from a shared, public network link must be audited for correctness and must employ strong safeguards against failure and abuse. The stringent requirements for accuracy and operational robustness provide especially fertile ground for many familiar risks.

First, there is the problem of extracting exactly (no more and no less) the intended traffic. Standard network monitoring techniques provide only an approximation of what was actually sent or received by any particular computer. For wiretaps, the results could be quite misleading. If a single packet is dropped, repeated, or miscategorized (common occurrences in practice), an intercepted message could be dramatically misinterpreted. Nor is it always clear "who said what." Dynamic IP addresses make it necessary to capture and interpret accurately not only user traffic, but also the messages that identify the address currently in use by the target. Furthermore, it is frequently possible for a third party to alter, forge, or misroute packets before they reach the monitoring point; this usually cannot be detected by the monitor. Correctly reconstructing higher-level transactions, such as electronic mail, adds still more problems.
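The effect of a single lost packet on a reconstructed message can also be sketched in a few lines. The segment format and the message text are invented for illustration; the naive reassembly below (concatenate in sequence order, ignore gaps) stands in for what a monitor does when it has no way to recover a segment that was dropped on its own link:

```python
def reassemble(segments):
    """Concatenate captured (sequence, text) segments in order,
    silently ignoring any gaps -- as a naive monitor might."""
    return "".join(text for _, text in sorted(segments))

# Hypothetical three-segment message as actually sent.
full = [(1, "do "), (2, "not "), (3, "reply")]
# The same message as captured, with segment 2 lost at the monitor.
lossy = [(1, "do "), (3, "reply")]

print(reassemble(full))   # what the suspect actually sent
print(reassemble(lossy))  # what the intercept records
```

One dropped segment turns "do not reply" into "do reply": the recorded "evidence" is well-formed, plausible, and wrong, with nothing in the capture to flag the loss.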

The general-purpose nature of Carnivore entails its own risks. ISPs vary greatly in their architecture and configuration; a new component that works correctly in one might fail badly (silently or destructively) in another. Carnivore's remote control features are of special concern, given the potential for damage should a criminal gain control of an installed system. ISPs are understandably reluctant to allow such devices to be installed deep within their infrastructures.

Complicating matters further are the various kinds of authorized wiretaps, with different legal standards for each. Because Carnivore is a general-purpose "black box," an ISP (or a court) cannot independently verify that any particular installation has been configured to collect only the traffic for which it is legally authorized.

Internet wiretaps raise many difficult questions, both legal and technical. The legal issues are being debated in Congress, in the courts, and in the press. The technical issues include the familiar (and tough) problems of software correctness, complex system robustness, user interfaces, audit, accountability, and security.

Unfortunately, there's no systematic way to be sure that any system as complex and sensitive as Carnivore works as it is supposed to. A first step, the best our community has yet found for this situation, is to subject the source code and system details to wide scrutiny. Focused reviews by outside experts should be part of this process, as should opening the code to the public. While the details of particular wiretaps may properly be kept secret, there's no reason for the wiretapping mechanism itself to be concealed. The observation that sunshine is the best disinfectant applies at least as well to software as it does to government.

Even if we could guarantee the correctness of software, difficult systems issues still remain. Software alone cannot ensure that the reviewed code is what is actually used, that filters and configuration files match court orders, that evidence is not tampered with, and so on.

Ultimately, it comes down to trust --- of those who operate and control the system and of the software itself. Trusting a law enforcement agent to be honest and faithful to duty in a free society is one thing. Trusting complex, black-box software to be correct and operationally faithful to specifications, however, is quite another.

Matt Blaze and Steven M. Bellovin are researchers at AT&T Labs in Florham Park, NJ.


The Lent Lectures 1998

Part I of a series on "The Growth of Computing"
Presented by Professor Arthur Staplefood of Goodnose University

Introduction

People often walk up to me in the street and ask me "Professor, what does it feel like to be at the amputating edge of information technology?" and I say to them, "Expensive". After a short discussion on whether cost can be an emotional state I amend this answer to, "It feels like running on a treadmill." They then thank me, ask me for an autograph, ask me for another autograph for their friend Susan and head off for a pizza.

After a while I thought this would be a good subject for a lecture.

My Computer

I used to be able to boast about my computer, but it's now on the cusp of obsolescence. I bought it a fortnight ago, and although I can still manage to write the odd letter on it I can only dream of running the latest software.

Last week I upgraded its processor from five parallel 333 MHz Pentium II chips to ten, which has allowed me to run my office software at an almost reasonable pace. Installing it, however, filled my original 15 GB hard disk, and so I had to buy a 150 GB replacement. This is roughly the size of a sofa and lives in the attic, where it is connected to my desk by a satellite link. I was a little irritated when, the day after, they brought out a 300 GB disk for half the price that fits in my pocket with room for the change. However, this isn't nearly as irritating as my 96-speed CD-ROM drive, which apparently produces enough lift to keep a Jumbo Jet in the air. This would come as no surprise to anyone who has heard it running. My operating system consists of a few fragments of Neap OS 98 beta code, which boots one time in ten and can almost run my DOS accounts program without crashing. I do any important work on my trusty 286.

But I don't want to make you think that keeping up to date with computers is all frustration. Weighing down the other side of the scales are peripherals, and lots of them. My new keyboard, for example, allows me to type over five times faster than normal by using an intuitive combination of keys, foot pedals and nose sensors. Almost as good is my new mouse, which works by rolling the back of one's hand on the underside of a fixed tracker ball. It sounds cumbersome but is magnificent once you get used to it and I confidently expect it to conquer the laptop market within the month.

My monitor is of the new Active Polymer design which hangs on the wall and is less than 0.01mm thick. As such I have to keep the windows closed for fear of it blowing away, but its image is crystal clear and runs for a century on those funny little batteries that you get in watches.

I also have an excellent voice recognition package which responds to my every command, and it took only three years repeating "la plume de ma tante" into the microphone to do it. Admittedly it only responds in French, but I confidently expect version 2 to put that right.

My favourite peripheral of all, though, is my printscanfaxcoffeemachinecopier. It really is absolutely fantastic! The things you can do with it! Wow! When you look back at computer history you realise how lucky we are to be able to own printscanfaxcoffeemachinecopiers ...

Computer History

On the OHP now you can see the entire history of the home computer squeezed onto a single transparency. The arrows show the movement of computer users (or "suckers", as they are known in the industry) from one make to another as technology progresses.

[On the OHP...]

For simplicity I've left out many interesting but short-lived computers. Further information will be given in our companion TV series Computer Family Trees, to be broadcast on BBC 2 on Wednesdays.

Staplefood's Law

It seems that everyone who comments on the computer industry has an eponymous law describing its growth and I don't want to be left out. Staplefood's Law is as follows:

The average computer's processing power doubles in the week after you have bought a computer.

I have collected extensive empirical evidence to prove this law, which will be published in my book Staplefood's Law (£6.99, The NeapNet Press) next month.

Future Trends

Most IT analysts would agree that there are only two important trends in the computer industry:

  1. How fast your computer goes.
  2. How good your games are.

What they don't realise, however, is that both of these remain constant. Take the first trend, for instance. Now, we all know that:

Hardware Speed + Software Speed = Computer Speed

But take a look at the graph on the OHP plotting these speeds against time:

Graph 1

As you can see, the total computer speed remains constant because software slows down at exactly the same rate as hardware speeds up. I wouldn't like to suggest that Microsoft and Intel have in any way struck a secret, mutually lucrative deal to fix these speeds (we might lose our funding, after all), but can it really be due only to the laziness of computer programmers?

And now the second trend. Here's the misleadingly simple equation:

Playability + Game Environment = Game Quality

And on the OHP now there's the graph of this:

Graph 2

The decline in inspiration for games since the days of Space Invaders equals the rise in quality of the gaming environment, thus maintaining constant quality.

That concludes Part I of the 1998 Lent Lectures. In Part II we shall be looking in depth at the effect of curtains on the growth of computing. Thank you all for listening.

[Applause]

Copyright © Tom Keal http://www.freer-close.demon.co.uk/neapnet/index.htm


Tel: 01763 273 475
Fax: 01763 273 255

UKUUG Secretariat
PO BOX 37
Buntingford
Herts
SG9 9UQ