The idea has seized our imaginations with all the force of a logical necessity. In fact, you could almost say that the idea is the idea of logical necessity -- the necessity of embedding little bits of silicon logic in everything around us. What was once the feverish dream of spooks and spies -- to plant a "bug" in every object -- has been enlarged and reshaped into the millennial dream of ubiquitous computing. In this new dream, of course, the idea of a bug in every object carries various unpleasant overtones. But there are also overtones in the larger and better-promoted notion of ubiquitous computing, despite the fact that our ears are not yet attuned to them.
I suppose Bill Gates' networked house is the reigning emblem of ubiquitous computing. When the door knows who is entering the room and communicates this information to the multimedia system, the background music and the images on the walls can be adjusted to suit the visitor's tastes. When the car and garage talk to each other, the garage door can open automatically whenever the car approaches.
Once your mind gets to playing with such scenarios -- and there are plenty of people of good will at places like the MIT Media Lab and Xerox PARC who are playing very seriously with them -- the unlimited possibilities crowd in upon you, spawning visions of a future where all things stand ready to serve our omnipotence. Refrigerators that tell the grocery shopper what is in short supply, shopping carts that communicate with products on the shelves, toilets that assay their clients' health, clothes that network us, kitchen shelves that make omelets, smart cards that record all our medical data, cars that know where they're going -- clearly we can proceed down this road as far and fast as we wish.
And why shouldn't we move quickly? Why shouldn't we welcome innovation and technical progress without hesitation? I have done enough computer programming to recognize the inwardly compelling force of the knowledge that I can give myself crisp new capabilities. It is hard to prefer not having a particular capability, whatever it might be, over having it.
Moreover, I'm convinced that to say "we should not have technical capability X" is a dead-end argument. It's the kind of argument that makes the proponents of ubiquitous computing conclude, with some justification, that you are simply against progress. You can only finally assess a tool in its context of use, so that to pronounce the tool intrinsically undesirable would require an assessment of every currently possible or conceivable context. You just can't do it -- and if you try, you underestimate the fertile, unpredictable winds of human creativity.
But this cuts both ways. You also cannot pronounce a tool desirable (or worth the investment of substantial resources) apart from a context of desirability. Things are desirable only insofar as a matrix of needs, capacities, yearnings, practical constraints, and wise judgments confirms them. This leads me to my first complaint against the ubiquitous technology pushers.
When we are asked to accept or reject a particular bit of technology -- and, more broadly, when we are asked to embrace or condemn ubiquitous computing as a defining feature of the coming century -- we should flatly refuse the invitation. Technologies as such are the wrong kinds of things to embrace or condemn. To focus our judgments on them is to mistake what is empty for something of value.
Take, for example, the questions we face in the classroom. They are educational questions. They have to do, in the first place, with the nature, destiny, and capacities of the child. Such questions are always deeply contextual. They arise from a consideration of this child in this family in this community, against the backdrop of this culture and this physical environment.
It's one thing if, deeply immersed in this educational context, pursuing the child's education, we come up against a gap, a shortfall, a felt need, and if, casting about for a solution, we conclude: The computer might offer the best way to fulfill this particular need. But it's quite another thing to begin by assuming that the computer is important for education and then to ask the backward and destructive question, "How can we use the computer in the classroom?" This is to deprive our inquiry of its educational focus and to invite the reduction of educational questions to merely technical ones -- a type of reduction that is the reigning temptation of our age. It leads us, for example, to reconceive learning as information transfer -- fact shoveling.
Spurred by this backward thinking, we've felt compelled to spend billions of dollars wiring schools, retraining (or dismissing) teachers, hiring support staff, buying and updating software, rewriting job descriptions, and designing a new curriculum. Then Secretary of Education Richard Riley comes along after the fact and says, Oh, by the way,
We have a great responsibility .... We must show that [all this expenditure] really makes a difference in the classroom. (Education Week on the Web, May 14, 1997, via Edupage)
The same concerns arise in the workplace. Why do we work? Surely it is, in the first place, in order to discover and carry out our human vocations and to achieve something of value for society. What shape this productive effort might take -- and what tools might be embraced healthily -- can follow only from the most profound assessment of the needs and capacities of both the individual and society.
Yet such assessment is increasingly forgotten as social "progress" and vocational decisions are handed to us by automatic, technology-driven processes. It is no accident that we see today a growing consensus among entrepreneurs that all considerations of human value should be jettisoned from the business enterprise as such. Seek first the Kingdom of Profitability, we are advised -- that is, seek what can be perfectly calculated by a machine -- and all else will somehow be added to you.
Here again is the reduction of real questions to one-dimensional, abstract, decontextualized, technical ones. The availability of the precise, computational techniques of accounting has encouraged us toward a crazy reversal, whereby the healthy discipline of profitability no longer serves us in work that we independently choose as worthy and fulfilling, but rather we choose our work according to its profitability. It is always easier to make our choices according to rules that can be clear-cut, precise, and automatic -- the kind of rules that can be embedded in ubiquitous silicon -- than to ask what sort of human beings we want to become. We can answer the latter question only through our own struggling and suffering -- that is, only by embedding ourselves in real-world contexts.
So my first complaint is this: the most visible pronouncements in favor of ubiquitous computing take the form of huge investments in places like the MIT Media Lab where the whole aim is to pursue new technologies out of context, as if they were inherently desirable. This mistaking of mere technical capacity for what really matters is the one thing guaranteed to make the new inventions undesirable.
The healthy way to proceed would be to concern ourselves with this or that activity in its fullest context -- and then, in the midst of the activity, ask ourselves how its meaning might be deepened, its purpose more satisfyingly fulfilled. Only in that meditation can we begin to sense which technologies might be introduced in appropriate ways and which would be harmful.
If the researchers at the Media Lab pursue their work via such immersion in problem contexts -- that is, by exploring significant questions as a basis for seeking answers -- they've done a miserable job of communicating the fact to the rest of us. What we actually receive from them (via the news media) is a steady stream of exclamations about the wonders of this or that technical capability. Typical, so far as I can tell, is the fact, reported in the New York Times, that one of the Media Lab staffers most concerned to render kitchen appliances intelligent is "a bachelor who rarely uses his kitchen". Is it such people who will point us toward the realization of the kitchen's highest and most humane potentials?
A technology-focused consciousness -- and you could fairly say that our society is becoming obsessively technology-focused -- is a consciousness always verging upon emptiness. It is a consciousness whose problems are purely formal or technical, with precisely definable solutions. They can be precisely defined because they lack context; they have no significance of their own.
Now, it needs adding that no technology perfectly achieves this "ideal" of emptiness and self-containment. As I have pointed out before, a complex device like the computer evolves historically and has numerous tendencies of ours, numerous habits and contexts of use, built into it. This is why you can never say that such devices are neutral in their implications for society.
And, of course, a technology's non-neutrality is what enables us to assess it: does it fit into and serve this particular context or not? So when I speak of "technology as such", I am to some degree falsifying things. But the point is that this is the very falsification the ubiquitous technology pushers are encouraging through their strongly decontextualized celebration of ... technology as such.
This makes a certain self-deception easy, whereby new technical capacities are much too quickly assumed to represent the answers to problems. And it diverts massive social resources into the production of technologies that, because they will be injected into real contexts with alien force, will certainly prove socially destructive.
That, in fact, will be the argument of the next installment. More widely, the remaining parts of this essay will deal with issues such as these:
Steve Talbott is editor of NetFuture, a freely distributed newsletter dealing with technology and human responsibility.
Part 2 will be published in the June newsletter.