
Newsletter Section 9

From the Net





Trust Me

(Steve Talbott)

For $150 or so you can now buy your own PC-based lie detector – a voice analyzer more accurate, according to its inventors, than the traditional polygraph. The product, we're told, will help credit card companies deal with one of their biggest headaches – the person who runs up a big bill and then claims the card was stolen. A Tel Aviv employer is planning to screen five hundred job applicants with the device. Then there are travelers' checkpoints and airports.

A microphone worn on the officer's shirt would pick up the traveler's voice for analysis on a tiny computer attached to the officer's belt, with results being relayed to the officer by a discreet earphone.

Incredibly, the product is called “Truster – A Personal Truth Verifier”. Made by an Israeli firm, Makh-Shevet, and based on work by the Israeli military, it adapts to your phone, allowing you to monitor all your callers (doubtless helping you build more trusting relationships). Since the voice is analyzed without being recorded, existing laws against recording probably don't apply.

Of course, the device is stirring up a lot of controversy, legal and otherwise. As with the taping of White House interns, however, it's not at all clear how much the legality will matter so far as private use is concerned. In any case, Makh-Shevet's CEO, Segal, has the usual, bullet-proof justification for subjecting society to whatever his engineers manage to devise:

“This is the computer. This is the society that we've decided to live with. The technology is here. It's up to everyone to decide how to use it. I use it as a decision-support tool, not as a decision tool.”

Segal is, I'm sure, intelligent and upstanding, but the clichéd words he has let slip here are those of a blind fool. One wonders why, in this age of supposed informational efficiency, his filters and bots haven't supplied him with the facts most directly relevant to his responsibilities – simple truths available to any first-year student of the history of technology. In particular: the minute you and I pick up his invention with the intent to use it – and before we make any decision at all about how to use it – crucial decisions have already been made. “Default” decisions, you might say, which, if they are not absolutely binding on us (and they are not), nevertheless become social forces at large, with a highly predictable character.

As I'm sure many others have been pointing out, merely to decide to monitor your conversational partners in this way is already to enter into an altogether different relationship with them. And that underlying difference in quality is likely to transform society far more than any particular decisions you make about “good” and “bad” uses.

The notion that you can gain a basis for trust by using this instrument comes as close to comic farce as anything I've seen in the world of high-tech gadgets. It also provides another instance of the “Fundamental Deceit of Technology” (see NETFUTURE No. 38, No. 40, and No. 48). That's because the more we improve our analyses of such externalities of speech as “microtremors”, and the more we therefore rely on them, the less practiced we will become at hearing and understanding the speaking self behind the sound waves. And the only enduring basis for trust lies in this inner, intimate, delicate wedding of hearing and response – the meeting of persons. Truster is not exactly the most natural broker of such meetings.

By the way, none of the reports I've seen so far has mentioned the obvious: Truster can be used not only as a putative lie detector, but also as a reliable biofeedback device. Employing it, we can learn to project the physical sound features that Truster presumptuously correlates with such things as “confusion”, “excitement”, “exaggeration”, “sarcasm”, and “falsehood”. Before now, of course, the general public had no convenient access to such training tools. (Will governments insist on keeping the more sophisticated algorithms out of circulation, and will some new outfit called Pretty Good Trust release the algorithms to the world?)

In any case, now we can look forward to yet another escalating technological arms race, just like the ones between privacy seekers and snoopers, between free speechers and filter-wielding censors, and between security providers and security breachers.

And as the unresolvable escalation proceeds through ever new generations of software (keeping the high-tech companies well fed), we will in all likelihood fail to notice the crucial fact: by having shifted the search for trust onto technical ground, we will have subverted still further the deeply social and humane consciousness upon which all trust finally depends.

How should we respond to devices like Truster? I don't have any good answer. Given the current social realities, the arms race is not about to disappear, regardless of anything you and I do. But there is this: nothing ever prevents us from remaining outside the arena of combat and cultivating that saner, communal ground upon which victory in the battle for trust can ultimately be won.

This article first appeared in NETFUTURE, Issue No. 66 (24 February 1998) and is reproduced with permission. NETFUTURE is an on-line newsletter devoted to Technology and Human Responsibility.

To subscribe, send an e-mail message to listserv@infoserv.nlc-bnc.ca with the following in the body of the message:

subscribe netfuture yourfirstname yourlastname

Alternatively, the newsletter is archived at the following web site: http://www.oreilly.com/people/staff/stevet/netfuture/
    


