Copyright © 1995-2004 UKUUG Ltd

UKUUG



Newsletter Section 2

Reviews




Deeper: A Two-year Odyssey in Cyberspace

John Seabrook
Faber and Faber £12.99
ISBN 0-571-17533-3, 688pp

(Reviewed by Lindsay Marshall)

For this issue I thought that I would review something a little different from the usual techie stuff that we normally cover. Writing books about the Internet is all the rage at the moment – there are all kinds, from guidebooks right through to erotica, and sadly the general standard is pretty dire. John Seabrook is at least a “proper” writer, being on the staff of the New Yorker magazine, so he does know how to put together something that is readable.

In this book Seabrook describes his experiences of going from being a newbie to a seasoned net user with all that that entails. Many of you reading this will have been around the net so long that you won't even remember what it was like being a newbie – in fact some of you probably never were newbies in the strict sense. The difference between Seabrook's experience and ours is mainly that he does not come from a technical background. He isn't a programmer, nor is he particularly interested in the technerotic aspects of shiny new hardware and software. The computer was just a tool for him (and as I typed this sentence my machine hung so I had to recover this document from scratch...) and only later did he get more involved in the cultural aspects of networks.

This is a good book and you should read it, not only because it is well written and fun to read, but because it describes a different kind of world from the one we are used to. But unfortunately this is also a problem. First off, he describes an interview he did with Bill Gates – and three times (count 'em!) whilst typing this sentence the word processor produced by that man's company hung and I had to reboot. And Seabrook seems a little puzzled as to why everyone hates BG! (By the way, I am sure BG is a perfectly nice man when you meet him, but that doesn't mean you have to approve of him!) I wonder if this lack of understanding is due to Seabrook's non-technical approach. He just doesn't get why so many people don't like Microsoft and its products. Or perhaps the MS haters are like people with Linn hi-fi systems, perfectionists; after all, most people get on fine with their all-in-one music centres. But even if I can't quite come to terms with this rather rosy image of King DOS, the background material here is fascinating – Seabrook's mother's reaction to all this, and the e-mail he gets from people who read his article. (His mother sounds like a good person – I wonder what her e-mail address is?)

Moving on from his Gates interview he starts to get more sucked into the interactivity – he communicates the buzz of e-mail very well – the constant checking to see what's new and the worry that no reply has come. He gets himself flamed, but chickens out of flaming back, so misses out on the catharsis that this (and third-degree burns) can bring. Then he joins the WELL, and this is the part of the book that I like the least.

The WELL has always struck me as an alien organisation, atypical of most other net-based activities and certainly atypical of what has happened in European networking circles. Rheingold has written a lot about the sense of Virtual Community he found on the WELL and Seabrook does the same, though he came along later, when the WELL was undergoing major changes. The WELL sounds terribly pompous with its talk of “perns” and its historical archives – don't get me wrong, I am all for keeping history, but this all sounds terribly precious! Even here he manages to avoid flaming people, even when they are pulling him to bits over another article he wrote. What self control!

And then comes the Web. And here he really does capture the experience (though he is unnecessarily rude about Lynx, which is a tool I use quite often!). “Surfing” is a pretty tedious occupation and doing a successful web demo to sceptics is hard. His discussion of one-to-many versus many-to-many is stimulating, and interestingly he praises the Web for being a pull medium rather than push. I wonder what he thinks about all the new push systems that people are trying to sell at the moment? (Side note: did you know that Pointcast is a HUGE bandwidth hog?)

Go out and get hold of a copy of the book and read it for yourself. I enjoyed it, you might not, but you will certainly learn some good stuff from it. This is one that is definitely going on the reading list for the Social Impact of Computing course I am putting together at the moment.

Lindsay Marshall is a lecturer in the Department of Computing Science at the University of Newcastle upon Tyne. He has some UNIX DECtapes and a paper tape machine in his office.



OpenNT - UNIX for Microsoft Windows NT

(Martin Houston)

I am responding to the article by Ian Nandhra, “OpenNT - UNIX for Microsoft Windows NT”, in the April 1997 news@UK.

Softway Systems were kind enough to send me OpenNT some time ago now. I do not have any quarrel with the technology as such, but I am quite unsettled by the apparent ease of use with which Win32 programs can be integrated into the OpenNT environment.

I can clearly see the scenario where unscrupulous companies will bid for Government work that carries an “Open Systems” only stipulation. Instead of just using OpenNT as a delivery platform for Open software, there will be a temptation to slip in some native NT components at various vital points, thus stopping the proposed solution as a whole from being open.

OpenNT could be a dangerous “Trojan horse”, making a mockery of the tendering rules that were intended to stop official bodies being “locked in” to one particular supplier or technology path.

OpenNT does have some big advantages if a system has got to be both Microsoft and Open at the same time, but it is a technology ripe for abuse. Buyers should be aware of this and be prepared to stipulate that no components available only on NT are permitted to be part of an OpenNT-based solution. The two worlds can share a machine, but any more fraternising is dangerous to the whole concept of Open Systems!

Martin Houston is 35 years old, married with a daughter and four cats! He has been a UNIX systems programmer and system administrator for 15 years, and has been organiser of the UKUUG Linux SIG since 1994.



Linux in a Nutshell

Jessica Perry Hekman
O'Reilly & Associates, Inc.
ISBN 1-56592-167-4
424pp
(Reviewed by Jon Lacey)

Another excellent text from O'Reilly.

Those who already own “UNIX in a Nutshell” get a very similar book, but one that focuses on Linux in particular and covers the tools produced by the FSF (Free Software Foundation) for the GNU (GNU's Not UNIX) project. These tools are popular on other UNIX systems, not just Linux, so prove useful for all.

The book covers a lot of the same ground as the UNIX version, with chapters on pattern matching, the Emacs editor, the vi editor, the ex and sed editors, along with gawk, the GNU version of the standard UNIX awk. These chapters include all the keystrokes and movement commands needed to use the basic editors.
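To give a flavour of the sort of thing those gawk pages cover, here is a one-liner of my own (not an example from the book) that totals a numeric field per user; gawk is the GNU implementation, but plain awk behaves identically here:

```shell
# Sum the second field per user with an associative array,
# then sort, since awk's 'for (u in totals)' order is unspecified.
printf 'alice 3\nbob 5\nalice 2\n' |
  awk '{ totals[$1] += $2 } END { for (u in totals) print u, totals[u] }' |
  sort
# prints:
#   alice 5
#   bob 5
```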

New chapters include: Programming, which details commonly used programming commands such as ar, as, bison, gcc and g++ for creating programs, and ctags, etags, gdb and make for maintaining them. System Administration Overview gives a very brief overview of administering a Linux system and talks of IP addresses and configuring TCP/IP. System Administration Commands is an eighty-page section that, like the earlier Linux System Commands section, covers all the available administration commands in an alphabetical listing, giving syntax, available options, and some examples of use.

There is a good chapter on bash (the Bourne Again SHell), said to be the shell most used on Linux boxes, and also a chapter on csh (the C SHell) and its enhanced relative tcsh.

Overall, this is another little gem from the O'Reilly stable, and the only fault I can pick with it is that there are not enough examples in the commands section; the syntax and options cover every possible case, but a few more examples would aid understanding for a novice Linux user. This is a small price to pay given the amount of material contained within.

Jon Lacey is an Open University student who in his spare time (not much of that, let me tell you) is trying to teach himself how to use the *NIX operating system so he can get a decent job in the big bad world of computing.

The Case of the Killer Robot

Richard G Epstein
Published by John Wiley & Sons, Inc., 1997, ISBN 0-471-13823-1
(Reviewed by Mick Farmer)

This unique book consists of a book within a book. The Inner Book, aka The Killer Robot Papers, contains twenty-nine stories/papers/essays about the professional, ethical, and social aspects of computing. The author explores these areas by concentrating on a fictional industrial accident where a robot arm malfunctions, killing the operator (“The Case of the Killer Robot”).

These stories are presented by the fictitious author Pam Pulitzer, a reporter with the Silicon Valley Sentinel-Observer. All the aspects of computing mentioned above are woven into the history of the case as it moves from a new generation of robots being delivered to the widget maker, through the grisly accident itself, an investigation into criminal negligence in designing the robot, and a programmer being indicted for manslaughter by the district attorney, to the problems surrounding the development of the robot's software. Such topics as how the project was modelled, how existing software was reused, and how the software was reviewed are brought into the thread.

The whole episode is made completely believable by weaving real aspects, such as real individuals, real organisations, and the ACM Code of Ethics, into the plot! Only the last two “print media” stories, concerning the future of computing and a Native American initiation ceremony on the top of a mountain overnight, lack, in my view, the plausibility to be true. However, I also enjoyed reading the “broadcast media” stories, especially the one concerning an expert system in medical diagnosis that fails to recognise certain symptoms, with fatal repercussions. Far too close to home for me!

As the author points out in the (fictitious) introduction, these stories concern a wide range of issues in the field of software engineering (the software life cycle, CASE tools, user interfaces, programming paradigms, software maintenance, to name but a few) together with many of the social implications of computing (intellectual property and software theft, hacking, applications of AI, honesty and trust, injuries due to work with computers, to name but a few more). So I'm more than happy that everything that should concern people developing critical (whatever that means) systems is covered in this book.

Following these stories, there are three appendices, what the (real) author refers to as The Outer Book containing The Inner Book. Appendix A is a list of real people and real institutions that appear in The Killer Robot Papers. Such notables as Fred Brooks, Bill Gates, and Clifford Stoll appear here alongside the ACM. Appendix B consists of endnotes and references. The endnotes are extremely valuable for providing background factual information about the stories. The references are thorough. Appendix C consists of a collection of discussion questions based on the stories.

This book certainly introduces the social and ethical aspects of computing in an innovative and stimulating way and, yes, it's fun to read. This book will certainly go into the reading list for the Social Implications of Computing course that we teach. I'll leave the last word to Ronald Anderson, Chair of the ACM Committee on Computer Ethics: “It is a milestone in the history of Computing”.



CGI Programming on the World Wide Web

Shishir Gundavaram
O'Reilly & Associates, Inc.

(Reviewed by Virantha Mendis)

As WWW sites on the Net increase at an exponential rate, it has become vital to attract new visitors and to keep existing visitors re-visiting a site. This is necessary for a site's survival, as many sites have financial backers. To keep a site attractive there are many techniques available to its maintainer, including the use of forms, graphics, gateways to existing legacy database systems, animation and so on.

This book by Shishir provides an excellent source of information for the adventurous Web developer. At the heart of any WWW server that provides a two-way communication channel between the server and the net surfer are CGI scripts. These allow user responses to be captured for further processing by the server. This is reflected in the book by the inclusion of seven chapters dealing with issues related to CGI, from the very basics of CGI programming to creating stand-alone applications that can be used on a WWW server, such as guestbooks and a calendar manager.

The only snag with these examples is that all of them are written in Perl. OK, there are many Perl programmers in the world, but there are many non-Perl programmers as well. The chapters on CGI programming conclude by looking at how to use imagemaps to enhance the appearance of the server.
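The underlying mechanism is, happily, language-neutral: the server passes the request to the script through environment variables such as QUERY_STRING, and the script writes a Content-type header, a blank line, and then the body to standard output. As a minimal sketch of that shape (my own illustration, not an example from the book), here is a CGI script in plain shell:

```shell
#!/bin/sh
# Minimal CGI sketch (illustrative only).
# The web server sets QUERY_STRING from the part of the URL after '?'.
cgi_response() {
  echo "Content-type: text/html"
  echo ""                                  # blank line ends the headers
  echo "<html><body>"
  echo "<p>You sent: ${QUERY_STRING:-nothing}</p>"
  echo "</body></html>"
}
cgi_response
```

A real script would of course decode and validate the query string rather than echoing it back raw, which is exactly the sort of territory the book's Perl examples cover.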

There are a few more chapters that deal with issues like presenting information from relational database systems on the Web and producing various gateways that can present information in a more friendly manner. For example, converting the UNIX man pages into HTML so the pages can be viewed in a browser. The chapter on debugging lists many common errors that can creep into CGI scripts and how to solve them.

Overall, the book is well written and presented, which is now becoming the standard for O'Reilly books. As mentioned before, the only criticism is the exclusive use of Perl to demonstrate the CGI programming tips and techniques. The book is primarily aimed at people who want to provide many services on their WWW server, as opposed to server administrators; this is highlighted by the exclusion of security issues related to CGI programming.

Virantha is currently working for Bioscience Information Technology Services, Biotechnology & Biological Sciences Research Council, as a UNIX technical support analyst.



Scripting Languages: Automating the Web

World Wide Web Journal Volume 2, Issue 2
ISSN 1085-2301
(Reviewed by Mick Farmer)

I've been looking for people willing to review this publication since its inception, without success (hint, hint :-). Therefore, I decided to review this issue because the theme of the current issue of news@UK is the WWW.

This is the first time that I've actually read an issue of this journal from cover to cover and I must admit that I was pleasantly surprised by the content. The editorial set the scene by explaining how scripting languages and extensible behaviour will enable us to build powerful and versatile information systems. It also described the World Wide Web Consortium's (W3C) role not as supporting specific vertical technologies (e.g. Java vs ActiveX), but as stepping back and concentrating on enabling hooks, such as the <OBJECT> tag, for any piece of mobile code.

Following this was a relatively short “interview” with Larry Wall and Tom Christiansen about Perl, its culture and community. There wasn't anything here that I hadn't already read about in the Perl newsgroups or from the major Perl web sites, especially Tom's. However, for someone new to Perl, it was a good introduction into the mind-set of the Perl community, putting strong emphasis on the way individuals around the world are helping each other, contributing code, in the open spirit that we remember from the early UNIX days.

The Work in Progress section contained two short pieces about Philipp Hoschka's work on real-time multimedia and Jim Whitehead's (team-)work on collaborative authoring. Both were, to my mind, too short and lacking in detail.

The W3C reports were far better. First, Dave Raggett's report presented the HTML extensions to support client-side scripting of HTML documents and of objects embedded within HTML documents. This was followed by Vincent Quint and Irene Vatton's Introduction to Amaya – certainly compulsory reading for anyone seriously wanting to use the version of Amaya that's on the accompanying cover CD! The final report was by Philipp Hoschka on his group's work “Towards Synchronised Multimedia on the Web”. This covered the same ground as his earlier article.

The technical papers that followed were excellent and well worth reading if you're interested in the battle of the scripting languages. The first, by Paul Lomax, described VBScript, a subset of Visual Basic for Applications.

The second, by Nick Heinle, described JavaScript, a simple scripting language which, in reality, is not related to Java beyond a superficially similar syntax.

The third, by Guido van Rossum, described Python, and how it has been used for everything from CGI scripts and HTTP servers, through HTML generators, to a complete browser with support for applets. Interestingly, I discovered that the language is named after the famous British comedy troupe!

The fourth, by M. Hostetter, D. Krantz, C. Seed, C. Terman, and S. Ward, described Curl, a “gentle slope language” for the Web. Described by the authors as a new language for creating Web documents with any sort of content, it reminded me of TeX because expressions are enclosed in curly braces in the form {operator ...}. The language can be used as an HTML replacement for the presentation of formatted text, with capabilities that range from those of scripting languages to compiled, strongly-typed, object-oriented system programming.

The fifth, by Lincoln Stein, was originally published in the Perl Journal and describes CGI.pm, a Perl module that hides the messy details of the CGI interface. If you've ever tried to maintain state information with CGI scripts then this paper tells you how to do it easily!

The sixth, by Ron Petrusha, described Win-CGI, a possible CGI for Windows. The author maintains that Win-CGI remains a viable technology for enhancing and extending Web servers.

The seventh, by Clinton Wong, describes LWP, the library modules for WWW access in Perl. The LWP library is available at all CPAN archives and is written for Perl programmers who want to write Web robots and other automatic tasks using HTTP. This report mainly describes the syntax of the LWP modules with numerous examples of their use.

The eighth, by Shishir Gundavaram, describes how to write Web gateways in Perl. This is illustrated by creating a Perl program that uses CGI to display Usenet News. This program uses the CGI_Lite module for handling CGI information and data, and the Easy_Sockets module for communicating with servers. An interesting application.

The final report, by Z. Peter Lazar and Peter Holfelder, is entitled “Web Database Connectivity with Scripting Languages” and describes different approaches to interfacing databases and the Web. They compare traditional CGI programming with Perl, Netscape LiveWire 1.0 applications based on JavaScript, and NeXT WebObjects 3.0 applications based on WebScript. They decline to name a clear winner, but, in my opinion, they prefer the more open approach provided by Perl!

So, there it is. Lots of useful stuff that I might not have otherwise found out about.

Tel: 01763 273 475
Fax: 01763 273 255

UKUUG Secretariat
PO BOX 37
Buntingford
Herts
SG9 9UQ