(a.k.a. IBM Secure Mailer, previously known as VMailer) At the time of writing it is at Beta version 19981230.
make and a C compiler.
Makefile. A new user and group need to be added to the system. Once you have finished building Postfix, your final tasks depend on how you wish to use Postfix. There are three choices.
Comprehensive instructions are given for each choice and take only a few minutes to complete.
/etc/postfix/main.cf, the mandatory edits are documented within the installation instructions and involve setting the installation directory and the local addresses. Other options are well documented within the file itself.
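As a rough illustration (a sketch only: these parameter names are taken from later Postfix documentation, and the beta snapshot's main.cf may differ in detail), the local-address edits look something like this:

```
# /etc/postfix/main.cf -- illustrative fragment, not a complete file
myhostname = mail.example.com    # fully-qualified name of this machine
mydomain = example.com           # the local domain
mydestination = $myhostname, localhost.$mydomain, $mydomain
```

The installation directory is set with a similar one-line parameter in the same file.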
Postfix is virtually drop-in compatible with Sendmail.
Unlike qmail, no changing of the
.forward files is required.
Postfix is designed to be secure and uses no set-uid or set-gid programs (other than that indicated above). It generally distrusts everything, including itself.
Postfix is claimed to be faster than all these, and easier to configure. It is certainly much easier to configure than Sendmail, the only alternative with which I have experience.
Another advantage that people may appreciate is the absence of DJB (Dan Bernstein, the author of Qmail), whose involvement can put people off Qmail.
Postfix was simplicity itself to install on my system (FreeBSD 3.0-Release). Replacing Sendmail was a simple task, and configuration was a joy (compared to Sendmail, at least).
Window Maker is a window manager for the X Window System designed to emulate the look and feel of part of the NEXTSTEP® GUI. It's designed to be relatively fast and small, feature-rich, and easy to configure and use.
Open source. Free to download and redistribute under the terms of the GPL.
You need a UNIX system with the X Window System (X11), a
C compiler and
make to install
You can optionally have KDE or GNOME installed, as Window Maker has been designed to be compatible with these desktop environments.
Compiling and installation have been made simple: you can simply run the supplied INSTALL script, which asks you a few questions and then compiles and installs automatically. If you have a non-standard installation, the README files will explain what to do.
Once installed, each user who wants Window Maker as their default window manager can run the wmaker.inst script, which sets up their account accordingly.
Window Maker features a 'clip' and a 'dock', both of which contain icons for launching applications; the clip contains icons specific to each workspace, while the dock contains the icons that appear on all workspaces. To change between workspaces you click on the arrows of the clip. The clip is collapsible, meaning you can hide the icons attached to it (so only the clip is visible), either by double-clicking the clip or by setting it to auto-hide.
Adding an application to the clip or the dock is easy. All you have to do is start up the application (e.g. from an xterm window or from the menu mentioned below) and then drag the icon for the application to either the dock or the clip. To remove an application drag the application's icon away from the dock or the clip (this only works if the application is closed).
As with many window managers you can bring up a menu of options by clicking on the background (a.k.a. the root window) with the right mouse button. With Window Maker you can also move the menu and position it on the desktop. The menu will then remain on the desktop until you close it by clicking the cross on the title bar.
As with many X window managers the appearance can be changed by using themes. A few themes are already supplied and are accessible from the menu under 'Appearance'. You can download more themes for Window Maker and other window managers from http://themes.org/. Themes generally alter the background pattern/image and the colours used by the menus and title bars.
There are many other window managers available each with a different look and feel. Most common window managers are open source and will compile on most popular UNIX systems.
The INSTALL file included with the source code lists the platforms it is known to run on.
README files and manual pages. The documentation is clear and easy to understand. There is also a User's Guide, available in PostScript or HTML, which can be downloaded separately from the Window Maker web site.
A quick note about KDE and GNOME: these are separate projects to provide a consistent desktop environment for the X Window System, each with its own set of libraries and applications sharing a common look and feel and supporting features such as drag and drop between compatible applications. KDE comes with its own window manager (kwm), while GNOME doesn't have a standard window manager. Window Maker has been designed to be compliant with both GNOME and KDE (where it replaces kwm). This means you can make Window Maker the default window manager for both. More information is contained in the appropriate README files.
(Reviewed by David Hallowell)
WordPerfect is a word processing program with a graphical user interface.
Commercial. Free for personal, non-commercial use if you register online, available on a 90-day trial if you're not using it for personal use. More info at http://linux.corel.com/
You need a Linux system with kernel 2.0.18 or greater. I recommend at least 16MB of RAM.
When WordPerfect was launched, Corel had an agreement with CNET that initially WordPerfect would only be available for download from the download.com website. Unfortunately the files were named incorrectly, which caused problems with the installation.
Fortunately you can now download WordPerfect from other sites, such as the Linuxberg mirrors listed below, where the file is actually named correctly and all you have to do is untar it and run the Runme script to start the installation.
You're unlikely to run into problems unless you download from
CNET's download.com which still hasn't fixed the problem
with incorrect naming.
As with most word processors there are many options you can configure, all available from interactive menus. Configuration is fairly straightforward, as context-sensitive help is available for most dialogue boxes by pressing F1 on the keyboard or clicking the Help button in the dialogue box.
Download locations are listed at http://linux.corel.com/. I recommend downloading from a UK mirror of Linuxberg such as http://hensa.linuxberg.com/. It's also available on the cover disk of the March 1999 issue of PC Plus. Linux.corel.com will also list the vendors that sell the commercial edition of Corel WordPerfect for those who are not using it for personal use or who want to own the full product.
ApplixWare and StarOffice are both office suites and so include other features such as a spreadsheet. WordPerfect is just a word processor, although Corel will be releasing an entire office suite for Linux in the future. If you use spreadsheets or other office applications such as presentation software you'll need one of the packages listed below.
This version just runs on Linux, although WordPerfect is available for Windows, Macs and some other UNIX variants.
I tested WordPerfect on Slackware 3.5 and Red Hat 5.2 as well as with the new 2.2 series of kernels.
The default language is US English although you can download support for other languages including UK English.
Corel WordPerfect can read and write to the following file types:
There's an extensive on-line help system as well as some README files. If you require further documentation you'll have to buy the commercial version.
Performance was better than StarOffice and similar to ApplixWare.
Support will be available from Corel if you pay for WordPerfect. If you've downloaded it for free you won't get support although you probably won't need it.
WordPerfect has all the features you expect from a word processor including the more recent innovations such as the automatic spelling checker, grammar checker, and the auto correction of common errors. A useful feature of WordPerfect is the shadow cursor which enables you to position the cursor on a blank area of the screen. The interface will be familiar to WordPerfect users on other platforms and intuitive enough for users of other word processors to quickly learn.
The only disadvantage is that the downloadable version isn't feature complete. The Equation editor is missing and so is the drawing capability. If you require these you'll have to buy the commercial version. The commercial version also comes with extra clip art and fonts. For most people's needs however the freely downloadable version has all the features you'll need.
Its reliability has proved excellent and you've got nothing to lose by trying it.
(Reviewed by Mick Farmer)
It's not often one gets an opportunity to review a book that's been written especially to be read (scanned) by a computer so, I thought, let's see what it's all about.
In a nutshell (no pun intended), the Electronic Frontier Foundation (EFF) want to demonstrate that the Data Encryption Standard (DES), as supported by the US Government, is not secure at all. To support their claim, the EFF built a custom-designed machine, for just $200,000, that cracks DES "in a week".
As many of you know, the US Government restricts the export of cryptography (munitions once again), so organisations like the EFF are not allowed to publish their material on the WWW or make it available via FTP. However, this restriction doesn't apply to publishing on paper, i.e. in a book, so that is what the EFF have done.
The book consists of twelve chapters. The first chapter is a general overview of DES, putting it into context (remember that this book is primarily for an American audience). Chapters 2 and 3 outline the low-level chip design together with block diagrams, timing diagrams, and signal diagrams. The EFF DES cracker consists of thousands of search units. Each search unit takes a key and two 64-bit blocks of ciphertext; it decrypts a block of ciphertext with the key and checks if the resulting block of plaintext is "interesting". If not, it adds one to the key and repeats the process. If the first decryption is "interesting", the same key is used to decrypt the second block of ciphertext. If both are interesting, the search unit tells the controlling software; otherwise the whole process is repeated. The definition of "interesting" depends on what you're looking for.
Twenty-four search units fit inside a chip, and 64 chips are mounted on an industry-standard VMEbus circuit board. Twelve circuit boards are mounted in a Sun-4/470 chassis, and two chassis are connected by ribbon cables to the generic PC that runs the software. This means they can search 91,160,000,000 keys/second, or half the key space in 4.5 days.
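The search-unit algorithm and the throughput arithmetic above can be sketched in a few lines of Python. This is a software illustration only: `des_decrypt` is a hypothetical stand-in for the DES decryption pipeline in the custom chips, and the "interesting" test shown (all bytes printable ASCII) is just one possible choice.

```python
def interesting(block: bytes) -> bool:
    # One possible definition of "interesting": every byte of the
    # candidate plaintext is printable ASCII.
    return all(0x20 <= b < 0x7F for b in block)

def search_unit(des_decrypt, ct1: bytes, ct2: bytes, start_key: int, n_keys: int):
    # Walk n_keys consecutive keys; a key is reported only if BOTH
    # ciphertext blocks decrypt to something interesting, just as a
    # hardware search unit reports to the controlling software.
    hits = []
    for key in range(start_key, start_key + n_keys):
        if interesting(des_decrypt(key, ct1)) and interesting(des_decrypt(key, ct2)):
            hits.append(key)
    return hits

# Throughput arithmetic from the figures quoted above:
units = 24 * 64 * 12 * 2           # units/chip x chips/board x boards/chassis x chassis
keys_per_second = 91_160_000_000   # aggregate rate quoted in the review
half_keyspace = 2 ** 55            # half of DES's 2^56 possible keys
days = half_keyspace / keys_per_second / 86_400   # roughly 4.5 days
```

With 36,864 search units, each unit only needs to test a few million keys per second for the aggregate figure to hold.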
Chapters 4 through 7 contain the source code for both the
hardware and software.
You can either use the Optical Character Recognition (OCR) tools from Pretty Good Privacy, Inc. (PGP) directly, or use bootstrap programs to upload the data manually.
Chapter 8 contains the hardware board schematics, together with the Sun-4 backplane modifications that were necessary for their original machine.
Chapters 9 through 11 are mainly reprints of existing articles related to cracking DES with existing hardware and software together with extrapolations for future hardware configurations.
Finally, Chapter 12 gives brief biographies of the principal authors of this book.
This review may give the impression that the book is very dry. Far from it: there are interesting snippets about encryption in general and about how the US Government, verging on the paranoid, views the security of information. The fact that this book can be used to build a DES-cracking machine (legally) outside of the US is just wonderful.
Mick Farmer is currently cultivating a CPU farm of abandoned PCs to help with the DES and RC5 cracking contests organised by Distributed.Net. Contributions welcome.
(Reviewed by Jim Webber)
So one day the book review guy comes into my office and hurls something at me. He is clearly disgusted to have held this projectile for even a short amount of time. Abuse about the choice of my operating system and application suite follows, before he departs as brusquely as he first appeared. With some trepidation I turn to the object and examine it carefully. First reaction: urggh. It is a book about programming Microsoft Word. Pause for a moment. O'Reilly, good pedigree. Pause again to double check second reaction. Second reaction confirmed. This might well be of use. Better get reading.
So I start reading, and my first thoughts are: well, OK, this is all well and good, but why on Earth would I want to program a word processor anyway? Two minutes later my colleague blurts out the news that he's managed to get Word to spew out some LaTeX something-or-other using VBA macros. Question answered: you want to do more with it than it does in its standard configuration. So then I begin to get cynical: doesn't Emacs do all this kind of stuff too? Does this mean I have to learn Lisp? Oh dear, no wonder it was physically thrown at me.
My fears were soon allayed, however. The introductory sections give a real flavour of VBA. After a relatively short period, I found that I could understand how it all worked - not something that I could ever say of Lisp, that's for sure. One hundred and nine relatively skippable pages and I was ready for learning about "Object Models" and other Visual Basic disasters. So, bolstered by my seemingly rapid learning, I moved swiftly on into the thick of the book: actually using VBA to do allegedly useful "stuff."
Fortunately Learning Word Programming does not linger over buzzwords. Instead it approaches the situation in a very pragmatic and down-to-earth way. A few pages after the object model buzzword and we're getting into what objects Word will let you at and what you can do to them. Nice.
These terse, yet surprisingly thorough, tutorials are accompanied by useful pieces of source code to get you going; indeed, by reading the source code alone you'd pretty much be able to pick it all up. Though the examples sometimes seem contrived, they always get the point across fairly effectively, and thankfully there are no "idiot-proof" step-by-step examples to boost further rainforest depletion. You get the feeling this was written by a literate programmer, for (not necessarily literate) programmers.
As expected of a publication from O'Reilly, presentation is of a high standard. Graphics are used sparingly, but effectively and the text layout is pleasing. Overall structure of the book is good, and amenable to dipping-in as well as cover-to-cover reading. At twenty pounds, minus the obligatory small change, it's not at the cheaper end of the market like the Nutshell manuals, but unlike the Nutshell series it's not purely a work of reference. The extra fiver or so is well worth it when you consider for your money you're getting a Nutshell++.
This is not one I'd recommend for the average Word user, but it must come highly recommended for Windows sysops and anyone interested in tweaking Word. The skills that it instils would be of immense value to anyone running a corporate or campus Windows PC infrastructure. I know from experience that even simple jobs, such as the ability to bring up a print dialog rather than dumping jobs to a printer immediately, would save time, money and temper. Definitely one for the bookshelf.
(Reviewed by Jim Webber)
Linux, in combination with a whole host of other like-minded free computing initiatives, really does seem to have caught the imagination of the media over the past few months. Linux and its associated open source software movement have been touted as anything from a major threat to world economic stability through to a universal panacea. However, anyone with half a mind engaged on the subject can see Linux for what it is: an operating system, and a quite reasonable OS at that.
It is good to see that, amongst the widespread politically oriented attention that Linux is receiving, there are still some people out there actually programming Linux rather than talking about the philosophy behind programming it. Linux Application Development is a no-nonsense book, dour in appearance, whose thread is not diverted by petty political feuds or images of cute, fat seabirds, and it is precisely aimed at those people who want to program. Whilst it pays due homage to the Linux Elders and the contribution that the open source movement is alleged to be making, the aim of the book is not lost beneath this refreshingly scant introduction.
Once the acknowledgements are made, the book addresses the discipline subject-by-subject, starting from documentation and working through the development tools and environments, libraries and system calls, right up to system programming covering processes, networking, console programming, file handling and suchlike. The final section in the book is devoted to development libraries and covers such useful topics as dynamic loading and querying system databases.
Each section in the book is concise and accurate without being terse, and the authors are to be congratulated on their excellent use of down-to-earth language. The focus of each section is clear, making the book eminently suitable for dipping into as a reference. Furthermore, the coding examples provided are relevant, and focus on the precise problem at hand without getting carried away. They are the kind of examples which you wish you could see when manual pages have left you in some doubt as to the best way to pursue a problem, giving enough detail to be useful whilst not going so far as to cloud one's objective.
Overall, the book can only be a blessing for the Linux development community. With the growing popularity of Linux, books like this, which teach the craft of UNIX programming well, are going to become valuable artefacts. What more can I say? If I were to develop solely for Linux, this is a book that I would want to read cover to cover, and still have sitting on my desktop throughout. If you're serious about Linux development, this is a book that you really shouldn't be without.
(Reviewed by Jim Webber)
Why anyone, apart from shit-for-brains money-grabbing tosspots (and of course the majority of corporate IS managers), would actually want MCSE accreditation is beyond me. Whilst the overall concept of accredited exams is certainly noble, the implementation falls far below expectations; in my opinion, MCSE represents little more than the McDonaldsisation of computing. As such, it was with some trepidation that I received a copy of MCSE Electives in a Nutshell, reassuring myself that the Nutshell books are generally of a high pedigree, and bloody good value for money to boot. My only hope was that O'Reilly had produced a good piece of literature on what is ostensibly a crock of shit.
The first thing that strikes you about the book, in common with other Nutshell titles, is the layout. No wasteful wide margins or annoying icons to provide distraction. No, the Nutshell books are always well presented typographically, and the structure of the books is usually spot-on. MCSE Electives is no different. Content aside, it is very well laid out and the overall structure of the book is logical and mostly impeccable. Use of illustrations is minimal and highly effective; no room for spurious pages of how Windows NT looks or how to delete a file with Explorer here. As with other Nutshell titles, MCSE Electives has handy thumb-markers on the edge of each page, so that, combined with the excellent chapter structure, locating a pertinent section is straightforward.
Having been suitably impressed with the presentation, content was next on the agenda. Now, never having sat any MCSE qualifications before now, I didn't really know what to expect. Needless to say I wasn't particularly shocked to find that MCSE exams seem to be aimed at simpletons. The approach is akin to cookbook programming. First do A; then B and so on, not really trying to emphasise understanding of concepts, but highlighting tasks and problems and giving step by step solutions. Having said that, I know several examples of the same attitude, and products thereof, running UNIX networks for major British companies, so it's not symptomatic of just MCSE.
So, putting myself in the place of an MCSE student, I ploughed in. To my surprise, the material inside really wasn't that bad. Certainly most of it was fairly rudimentary, and completely geared towards cramming for a bogus exam, but nonetheless the information was of a reasonable quality. Indeed, I would be tempted to give the book to NT administrators in place of, for example, NT in a Nutshell because it covers far more ground, far more concisely.
The first chapter covers some TCP/IP basics, and how to set up TCP/IP under NT. Whilst this is interesting enough, there were some inaccuracies. The author claims that TCP/IP was developed to allow platform-independent networking between UNIX, NT, Netware and so on. OK, so this is true insofar as this book is concerned, but as an absolute truth it's clearly missing the goal somewhat. Still, discrepancies aside, anyone who knows a little about TCP/IP will find the chapter an enjoyable and concise refresher. As for anyone studying towards MCSE, well, just don't let them anywhere near a network.
The second chapter covers IIS4 (Internet Information Server) administration. For those not in the know about Microsoft technology, IIS in its various versions provides "Internet Services" such as an FTP daemon, a web server and so forth. Now, when I set up IIS on my machine I found it pretty straightforward and as such didn't even turn to any on-line documentation, most probably because all I needed was a straightforward web/ftp server. If I were going to set up something more intricate, like a transaction or certificate server, this book would probably suffice, as the information, though curt, is the proverbial "good stuff."
Chapter three is a no-brainer: Internet Explorer 4 administration. OK, so this is a revision guide for halfwits, but this is just ridiculous. It is the one chapter in the book which I thoroughly disdain. Again, high-quality information, but surely even for MCSE candidates this must be taken as known. Enough said.
The fourth chapter is almost as detestable as the third, in that it is merely how to install and set up Microsoft's poxy, sorry, proxy server. The only reason this chapter is slightly better than the previous one is that far fewer people have ever bothered to install and run the proxy server than IE4. Still, what's there is good; I sympathise with you if you ever find yourself needing to look at it.
The fifth and final chapter covers Exchange server. Now, although I'm familiar with what Exchange does, I've never actually looked into it. Whether it's bias on my part or an actual lack of inanity on the part of this chapter (installation instructions excepted), it's actually rather interesting. As with the rest of the book the information is concise and accurate, and covers a great deal of the functionality offered by Exchange server. This is a chapter that I could quite easily use as a quick reference in preference to a whole volume on Exchange server.
To summarise, I can honestly say that if I were unlucky enough to be involved with MCSE, then this is certainly a book that I'd look into as a revision guide. As it is now, it will sit on my shelf, and on those rare occasions that I can't work out how to do something under IIS, or all my knowledge of TCP/IP escapes me, it'll come down in preference to full texts on those subjects. For those cramming for the MCSE, you have my sympathy, and my advice: buy it, it's cheap and it will help. The "study guides" preface to each chapter and the constant exam tips, whilst damned annoying when you use the book as a quick reference, will undoubtedly provide your typical MCSE mindset with everything their hearts could desire.
(Reviewed by Jim Webber)
Integrating Microsoft's Windows NT and Novell's Netware operating systems is a hot topic in corporate-class information systems today. With Microsoft's NT server software still failing to deliver the goods on the server side, and ever increasing support requirements on the NT client side, Netware is still seen as a crucial part of an IS infrastructure. Furthermore, the legacy of earlier versions of Novell's products often means that a complete switch to pure Windows NT networking, even in small networking environments, has not been possible. The challenge for IS staff is blending the two potentially competing technologies into the best possible mix to support their user base, whilst contending with two differing approaches to networking from Novell and Microsoft, from the very protocols used upwards.
Here is where Managing Windows NT/Netware Integration seeks to capitalise. There is certainly now, more than ever before, the demand for vast networks of workstations to meet the needs of a large body of users, such as that found on a university campus, where it may be the case that workstations are not the property of any one single user. Presently, such a task is beyond the remit of a pure NT4 network from a management point of view. A common workaround for this has been for the computing infrastructure to be composed of NT technology at the client end, and Netware at the server. To this end, the book covers the whole spectrum of NT-Netware integration from setting up client side hardware, through to managing server security, covering key technologies such as NDS along the way. In addition, case studies of common operations are given which explain in great detail some of the tasks that an administrator would expect to perform such as installing network software and maintaining user accounts and environments.
In addition to the everyday administration tasks likely to be encountered, some more technical details, such as performance tuning of clients and alleviating network congestion, are supplied. Unfortunately some of this detail is somewhat superfluous to anyone with half an idea about computing in general. In particular, supplying Intel's Icomp ratings of the X86 processor family highlighted the utter banality of some of the information in the book.
All in all the book is pleasant enough to read and is reasonably laid out in the ubiquitous soft-cover-computing-book-with-icons-in-the-margin style. The author's use of language is plain without being dull and there is a considerable quantity of sometimes useful graphical material to support the text. For a newcomer to NT-Netware integration, it is a useful starting point, covering a variety of common tasks and problems. Indeed, it would not be such an unreasonable book to read cover-to-cover for those truly new to the subject area. As a reference book for experienced users, it fares a little less well, as would be expected of a book with figures as useless as "Typical Desktop Shortcuts," though the later chapters on security may well prove to be an eye-opener. Overall, it must be rated as a worthwhile buy for staff new to the subject area, though for old hands the money may well be better spent on literature covering one of the sides to be integrated in much greater detail.
Jim Webber James.Webber@ncl.ac.uk
(Reviewed by Ken Tucker)
This is the third edition of this book and covers Bind 4.9 and Bind
8.1.2 although passing references are made to Bind 4.8.3 where
required. For those who do not know what DNS is, it's the Domain Name
System -- the Internet's means of mapping host names to Internet
addresses. As such it is a rather important part of the
Internet. Without it programs such as
ftp and web browsers
would not be very easy to use. It's a lot easier to remember an
address with a name rather than a lot of numbers separated by dots;
this is where DNS comes in. BIND, on the other hand, is the Berkeley Internet Name Domain, the most popular implementation of DNS in use.
Chapters 1 and 2 discuss Domain Name System theory. Chapters 3 through 6 help you to decide whether to set up your own domain, then describe how you go about it. The middle chapters, 7, 8, 9 and 10 describe how to maintain your domain, how to configure hosts to use your name server, how to plan for growth of the domain and how to create subdomains. The last chapters, 11 through 15, deal with troubleshooting tools and problems and programming with the resolver libraries. The book concludes with 6 appendices which illustrate the DNS message format, how to compile BIND, domain registration forms (if you need to apply for a domain name) and, useful for postmasters, a list of top level domains (which include all the two letter country codes).
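The resolver-programming chapters work with the C resolver library; purely as an illustration of what a stub-resolver call does, the same name-to-address lookup is exposed by Python's standard socket module:

```python
import socket

def lookup(name: str) -> str:
    # Ask the system's stub resolver to map a host name to an IPv4
    # address in dotted-quad form; behind the scenes this consults
    # /etc/hosts and/or the name servers in /etc/resolv.conf.
    return socket.gethostbyname(name)

# e.g. lookup("localhost") gives "127.0.0.1" on a standard hosts file
```

This is the convenience that DNS provides: the program remembers the name, and the resolver supplies the numbers.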
Topics covered by the book include
The book is written in an easy-to-read style and explains the mystery of Internet addressing in an easy-to-follow manner. However, it is not a book to be read cover-to-cover like a novel. There is a lot of detail presented here and anyone using it to understand their systems or even to configure them would be wise to follow the suggested paths through the book that the authors thoughtfully provide in the preface. Different chapters are suggested for reading based on your organisational role, e.g. there are different chapters suggested for sysadmins setting up their first domain, experienced admins, programmers and, of course, postmasters.
As a postmaster myself I was directed to 4 chapters that were relevant to my work. Reading the rest gave me a fuller understanding of how the DNS works, even though I have to admit I often got lost in all the fine detail. I would need to find a not-too-comfortable armchair for a couple of hours to get to grips with some of the more technical content of this book.
I also have to say that I found the first sentence on page 51 confusing. There is an example of an address lookup and the sentence states "There are subdomains called aa, adelphi, al, allegany and many others." But none of these names actually appear in the example as subdomains; presumably they would be there if the complete list of subdomains had been listed. The example apparently shows a truncated list, but it would be nice if they were shown since they were mentioned.
As is usually the case these days, you are often pointed to resources on the Web for supplementary information, such as the Internet Software Consortium and its work on BIND. The book is also peppered with references to the relevant RFCs and in some cases you are advised to have these to hand when reading certain passages of the book.
It's interesting to note that the book's binding has been strengthened. The publishers must believe that this book is going to get a lot of use!
Ken Tucker is a UNIX and VMS system administrator for Cardiff University and jealously guards his AlphaVMS workstation, the only one left in the Computer Services department.
(Reviewed by Ken Tucker)
This book is ideal for those starting out writing Web pages and is also useful for experienced Web authors who wish to keep abreast of the emerging standards. I see this book as a bridge between the current standard (and the extensions used by the different browsers) and the new HTML 4 standard. It describes the common tags in use but reminds you constantly which ones are deprecated in the HTML 4 standard; this is the main emphasis of the book. However, there are standards and standards! Current browsers should conform to the HTML 3 standard, but they are not consistent in which tags each supports or their extensions. Will HTML 4 be a true standard? We will have to wait and see what new versions of the popular browsers support. The book is full of common-sense suggestions for writing style, and reading through it you get a feel for good document structure and layout.
The book consists of 15 chapters. The first two chapters introduce HTML and the World Wide Web from scratch. Chapters 3 to 8 build on this by introducing the basic tags which affect text, images, rules and hyperlinks. The meatier stuff for those of us already familiar with creating our own HTML documents is in chapters 9 to 13, where topics such as Cascading Style Sheets, Frames, Tables, Forms, etc. are discussed. It also has 6 appendices. The first defines HTML grammar, i.e. what tags you can use and the rules for using them, in the form of an alphabetically sorted lookup table. The second is an HTML tag quick reference which defines what each tag actually does and which browsers support it. The third is a quick reference to the Cascading Style Sheet properties; the authors acknowledge that the support of style sheets is likely to change faster than the book can be reprinted, so they suggest you look at the O'Reilly web site for more up-to-date information. The World Wide Web Consortium's recommended specification for cascading style sheets can be found at http://www.w3.org/pub/WWW/TR/REC-CSS1. The remaining appendices define the HTML 4 SGML Document Type Definition, character entities, i.e. how to specify non-standard characters in HTML, and finally the colour names and values used to specify colour attributes of document elements. There is also a detachable Quick Reference card inside the back cover.
Let's start with what I consider to be little niggles before moving on to the book's good points. Page 28 defines # as a pound sign, which is fine if you are American but is confusing to us Brits. On pages 445 and 446 the noembed tag is defined as having no end tag, but one is used in the example, which is inconsistent. Page 451 has confusing keyboard events: it states that only three keyboard events are currently supported by HTML 4, but onKeyRelease should be onKeyUp, which is the event that happens when a key is released and is the one described in the text.
With the negative points out of the way, what about the rest? The book does a comprehensive job of covering all the commonly used tags including those that are deprecated in HTML 4 (i.e. those that are likely to be withdrawn at subsequent releases of the more common browsers as they start to support HTML 4). It gives tips on style and how to include tags that may be redundant at present but could be very productive in future with the possible advent of automated content related document searching. For example, with correctly used tags, you may in future be able to obtain a complete list of bibliographic references to add as a footnote to your web pages.
It also has little tips and tricks scattered throughout its pages, such as maximising network efficiency when displaying graphical content. A graphics file one pixel wide and one pixel high can produce coloured lines where the browser won't support them via the <HR> tag: you load a minimal-size graphics file and let the browser take care of the stretching.
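In Perl terms (many pages of that era were CGI-generated), the trick amounts to emitting an img tag that stretches a tiny image; this is a sketch only, and the file name dot.gif and the dimensions are made up for illustration:

```perl
# Build an <img> tag that stretches a hypothetical 1x1 coloured GIF
# into a rule of the requested width and height; the browser does the
# scaling, so only a tiny file crosses the network.
sub coloured_rule {
    my ($src, $width, $height) = @_;
    return qq{<img src="$src" width="$width" height="$height" alt="">};
}

print coloured_rule("dot.gif", 400, 2), "\n";
```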
The overall message of the book seems to be that style sheets should be used to design your web pages. Many tags are deprecated in favour of style sheets, which should make for greater consistency and more easily maintained pages, especially if you have large document collections. A change to the style sheet will affect all documents using that style sheet; you don't have to amend each one individually (with a chance of missing one) to change the look.
A nice touch is the simulation of hyperlinks in the chapter sections which point you to the relevant chapter and section where the particular topic is discussed in more detail. Reading the book gives a feeling that while concentrating on giving the nuts and bolts of the tags it also suggests good page design and sparks little ideas as you read it. Now all I need is the time to put some of it into practice -- my web pages haven't been updated for years.
Ken Tucker is a UNIX and VMS system administrator for Cardiff University and jealously guards his AlphaVMS workstation, the only one left in the Computer Services department.
(Reviewed by Adrian Cummings)
This is a pocket-sized pocket reference about Oracle-supplied PL/SQL packages from the master of large books on PL/SQL, Steven Feuerstein. As you would expect from the title this is a companion to Oracle Built-in Packages from O'Reilly and not a more portable replacement.
The pocket reference consists of descriptions of the syntax for the built-in packages and functions from the major DBMS_% and UTL_% packages. It also covers the RESTRICT_REFERENCES pragmas and the associated constants and exceptions.
It is not a user guide to PL/SQL nor is it a complete set of the built-in packages as it is restricted to the more commonly used areas. To my mind this is a bit of a shame as you may find yourself creating your own procedure simply because you were unaware of the supplied one. Where appropriate, the minimum release of Oracle required is indicated.
Curiously this reference book doesn't have an index. Despite a couple of minor quibbles this book should prove a useful addition to the PL/SQL programmer's pocket.
Adrian Cummings can be found searching for a source of larger pockets for his growing collection of pocket references.
(Reviewed by Adrian Cummings)
Oracle Security covers protecting your database using only the tools supplied by Oracle or provided by the operating system, without using any third party products. Not that you'd think of doing that, would you?
The book comprises three main parts, the first covering the purpose and protection of the Oracle system files as viewed by the operating system. We then move on to the protecting of the internal database objects such as the data dictionary. Finally, this is then extended to the mechanisms that your applications may use to protect the data.
There is also a comprehensive discussion of default roles and user accounts, profiles and passwords. This section has some interesting things to say about some of the default settings.
The authors next cover the security-related aspects of developing a database security plan and application. This includes what, when and how to audit. This, too, is a complete discussion of designing security into your application from the outset.
The last part covers extras such as the Oracle Enterprise Manager (OEM) and Oracle Security Server (OSS). Security of web servers is covered, but not in much detail. This is somewhat strange given the emphasis that Oracle places on the Oracle Applications Server (pronounced web server) and the trend of increasing "web awareness" of the RDBMS (e.g. the forthcoming Oracle8i).
At various points throughout the book key points are highlighted by either an owl or a turkey as appropriate -- just in case you weren't quite sure, the significance of the owl and the turkey is also explained. As an aside, the cover of this book is not for the squeamish, bearing, as it does, one of the trademark O'Reilly animal engravings -- in this case a tarantula. This may cause some concern amongst one's colleagues (at least that is what I have found, as the review copy is banned from the office).
Despite the potential this has to be a dry subject, Oracle Security is written in a clear, readable style. I was particularly taken with the metaphor comparing Oracle with a fridge -- I may never think of Oracle in the same light again. This should be a welcome addition to any collection of Oracle manuals, although I wouldn't be in the least bit surprised to find a second edition of this work appearing in due course to cover areas that are missing. I'm particularly thinking of the current industry fetish of having to place everything "on the Web".
Adrian Cummings does his best to avoid appearing on the Web.
(Reviewed by Huw Gulliver)
Tom Christiansen, a noted Perl instructor and co-author of Learning Perl and Programming Perl, has teamed up with Nathan Torkington, co-maintainer of the Perl FAQ, to produce this book. Split into 20 chapters it covers all the key Perl topics -- strings, numbers, dates and times, arrays, hashes, file access, directories, subroutines, references and records through to Internet services and Web programming. This is achieved by providing a mix of the simple how-to-do-this-or-that solution with more in-depth mini-tutorials where appropriate. The book has its roots in chapters 5 and 6 of Programming Perl which provided examples of everyday uses of Perl. This new book also serves as an adjunct to Learning Perl and Programming Perl.
Each chapter covers a particular topic and is divided into sections termed 'recipes' with an overview of the chapter at the beginning. Each recipe is designed to be a self contained unit which presents a particular problem, its solution or solutions, a discussion of the problem and the solution(s) and references to other related information. The format has been designed so that the book can be read sequentially or random access à la traditional cookbook. This has largely been achieved.
Although I normally try to read the books I review from cover to cover, in this case I have dipped into it at random, since I believe this is how the book will be used by most readers. This is also what the authors expect. The book has found a place on my desk alongside my copy of Programming Perl, proving a useful source for everything from simple things (like putting commas in numbers) to a starting point, if not the solution, for more complicated problems. It has also spent some time on my bedside cabinet and on my coffee table, where it has been dipped into as the mood takes. Part of what makes it so easy to pick up and put down is the cookbook format; the other part is the well put-together Table of Contents, which makes it easy to find what you are looking for.
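That commas-in-numbers problem, for instance, has a well-known short solution in Perl; the following is a common idiom rather than necessarily the Cookbook's exact recipe:

```perl
# Insert commas every three digits, working from the right by
# reversing the string first so the regex can scan left to right.
sub commify {
    my $n = reverse shift;
    $n =~ s/(\d{3})(?=\d)/$1,/g;
    return scalar reverse $n;
}

print commify(1234567), "\n";   # prints 1,234,567
```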
Although I'm sure most of the information in this book could be found from other sources -- such as FAQs, the Perl documentation, newsgroups or Web sites, even other books -- it is the way in which it has been brought together and presented that makes this book so useful. The only disappointment was scanning through the errata for the first printing on the O'Reilly Web site; whilst most errors are of simple wording, there appear to be a fair number of typos, like spaces where there should not be spaces, which for the experienced Perl programmer is just an annoyance but may cause the novice Perl programmer a few headaches. To be fair, this book is no worse than any other in this respect.
This aside, I like the book very much and it has found a permanent place on my desk ready for when I next need help or inspiration.
Huw Gulliver is a member of the network support team and UNIX administrator in Information Services at Cardiff University.
(Reviewed by Chris Cook)
This tiny tome is the distilled essence of the Perl language, up to and including release 5.005. As with other pocket references, reading this book is not the way to learn Perl but it is still a useful way for the novice user to acquire a feel for the language. A read through this whets your appetite by exposing you to more obscure features of the language than you are likely to encounter unless you read Programming Perl from cover to cover.
It is less than seventy pages long, and small pages at that, but it makes for very slow reading if you happen to be anywhere near a computer. Several times per page you find yourself breaking off and "just exploring" another feature. This can be intensely frustrating as, by definition, there is insufficient space in such a small book to justify full enough explanations of everything covered. The result is that you can end up spending disproportional amounts of time trying to make a fragment of a script work. It probably helps to have a copy of Programming Perl to hand; unfortunately I didn't.
My curiosity, piqued by the reference to subroutine prototypes, led to another twenty-minute diversion at the keyboard attempting to discover what they were and then how to make them work. I must confess, this was only resolved once I had read the relevant man pages. (I have transgressed the unwritten rule of computing -- "my son, you must never read the instructions until you've tried pressing all of the keys first" -- and will duly hand in my system administrator's badge and gun.)
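For anyone tempted down the same diversion: a prototype simply constrains how calls to a subroutine are parsed at compile time. A minimal sketch (the subroutine name is my own invention):

```perl
use strict;
use warnings;

# The ($$) prototype tells the compiler that max2 takes exactly two
# scalar arguments; a call like max2(1, 2, 3) is rejected at compile
# time with "Too many arguments".
sub max2 ($$) {
    my ($x, $y) = @_;
    return $x > $y ? $x : $y;
}

print max2(3, 7), "\n";   # prints 7
```

Note that prototypes are only checked when the subroutine is declared before the call and invoked without a leading `&`.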
The book contains all the sections you would expect it to, from command line flags and environment variables to language syntax and function definitions. Many of the function definitions do contain small examples of how they are used but it is fair to say that if it is examples you're after, your best bet is O'Reilly's Perl Cookbook.
The book makes passing reference to a few of the newer additions to the core language: multithread support and the Perl compiler. In the case of threads, the passing reference is so fleeting that only those already well educated in thread programming would recognise it; this is probably a good thing.
For quick reference purposes, perhaps the most useful sections for those of us not steeped in the language are the ones covering the special variables. All those one- or two-character system variables do my head in when I am trying to decipher a piece of code (and that includes my own). In addition to describing the variables, hashes and arrays, this section gives the more memorable and certainly more legible alternative names provided by the standard module "English". The other invaluable section for the fledgling programmer spreading their wings is the listing of all of the standard modules and their descriptions.
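The idea, for anyone unfamiliar with it: the standard English module gives the line-noise variables readable aliases, as a small sketch shows (only aliases documented in the perlvar man page are used; the aliases refer to the very same variables, not copies):

```perl
use English;

# $PROGRAM_NAME is the readable alias for $0 (the script's name),
# and $OSNAME is the alias for $^O (the operating system name).
print "Running $PROGRAM_NAME on $OSNAME\n";
```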
The book does lack a little detail in some areas. It could benefit from the inclusion of section numbers referring to the more lengthy expositions within Programming Perl. Generally this is another fine, useful book in O'Reilly's range of pocket and desktop references.
Chris Cook is a System Administrator for BAeDSL in Dorset.
(Reviewed by Raza Rizvi)
This is surely an ambitious title!
As an ISP, this is a subject close to my own heart. We frequently have requests from users asking how they can prevent their mailboxes filling with unsolicited mail. This 'pig' book covers the complete territory of spam from who the spammers are to how to stop spam from reaching your email or USENET account, both from the viewpoint of the individual end user and the ISP.
Chapter 1 kicks off with the background and the reasons why spam is sent by certain individuals, with a good description of why spam is such a bad thing. It makes clear that although spam is cheap for the sender, it is expensive for recipients, administrators, innocent parties at the company or ISP of the spammer, society, and the Internet as a whole -- and it gives a couple of examples of spam attacks.
Chapter 2 goes back to Jon Postel's (now alas deceased) RFC706 on Junk Mail to show that the current spam problem has been a problem since the early days of the Internet. It chronicles the history of spam from the early eighties onward, including descriptions of the spam kings Cantor, Siegel, Slaton, and Sanford Wallace. Also well documented with the rise of the spam kings is the concurrent ascendance of the spam cancellers.
The basic technology concepts of message transfer in mail and USENET News are covered well enough for a layman to grasp. This leads into how a user might protect themselves from being a target for spam, primarily by describing how to protect your email address from being harvested by others. Of course this will only work for new addresses and therefore techniques for automatically filtering messages for all the most popular mail packages are described in detail. Given that a user might want to know where the mail came from, a good deal of time is spent in showing how one might use the clues available in the message itself or other Internet facilities, such as nslookup, to suss this out. The chapter finishes with a guide on how to respond to such junk mail. This can be a provocative area, but great care seems to have been given about how to approach this. I would have liked to have seen the option of doing nothing, since it is possible that a reply to spam confirms that the address used in the spam was valid, and therefore a good target for future spam!
Having covered email, the authors turn their attention to USENET News, covering broadly the same territory of filtering, handling cancel messages from others (NoCeM), tracing spam, complaining about spam, and finally cancelling spam yourself.
And now the real meat -- Chapter 7 deals with ISP spam issues. This is a very good synopsis of the available tips, tricks, fixes, tools, and techniques for fighting spam. Since many readers will not be ISPs, I will not cover this chapter in detail except to say that the only obvious omission seemed to be the (very recently defunct) ORBS anti-spam system.
To complete the book, the legal and legislative approaches are covered, from Acceptable Use Policies through to Federal bills (sorry no UK or EU detail here, unfortunately).
Appendix A provides a collection of URLs for pertinent RFCs, tools, informational sites, and Appendix B has a complete timeline of the life of Cyber Promotions Inc. (the company set up by Sanford Wallace to push out hundreds of thousands of spam emails every day).
In summary, this is a good read for any ISP, or ISP support team. I doubt whether many end users would be so enraged by spam as to actually pay for a book solely on the topic, but that doesn't stop this from being a timely and complete summary.
Raza Rizvi is technical support manager at REDNET, a very busy network integration company and medium sized ISP. He is still recruiting networking and administration staff at http://www.red.net.
(Reviewed by Andrew Cormack)
Web sites are like gardens: if they are just left to grow the result is very soon an impenetrable jungle. A little design, planning and maintenance are needed to keep the weeds at bay and the paths clear. Last year Rosenfeld and Morville's Information Architecture explained the scientific approach to web design; now there are sufficient well-kept web sites for Jennifer Fleming to present an educational guided tour.
The aim is made clear early on: to satisfy visitors. Some sites exist only for their author's pleasure but most must earn their keep. Readers are unlikely to stay at a site, and even less likely to return, unless it gives them what they want. But every site will have a different group of users so its design must address their particular needs. The obvious way to learn what users want is to ask them but Internet folk are notoriously resistant to surveys and often give misleading answers. A more profitable approach is to imagine typical users, considering their requirements and the concerns they might have when using the site. The book uses a number of these examples, showing how they can not only identify problems with a site design but also suggest new features which users would appreciate.
Every web site is different but few are unique. Whatever the purpose of your site it is likely that someone else has addressed the same problems. The book includes interviews with many professional designers discussing their ways of approaching a new task. Concrete examples are always useful and need not be taken only from other web sites. Print, CD-ROM and other media can provide ideas and even wider analogies can be drawn: many web shops have borrowed the familiar shopping basket.
User requirements can be reduced to three groups. The first are common to all web sites and any other unfamiliar situation: "Where am I?", "What can I do here?", "How do I leave?". These can be addressed by a set of standard tests presented in the first chapter. Other requirements are common to a type of site -- "Does this shop sell books?", "How do I pay?" -- or even more specific "What is the ISBN of...?". The second half of the book considers how existing sites address these latter questions, concentrating in turn on the themes of shopping, on-line communities, entertainment, education, corporate identity, and information. Using live examples involves a small risk that the site will change and spoil the printed description. I had wondered whether the CD might include snapshots of the sites as described but it only contains hyperlinks to them and the extensive bibliography.
Web sites are never finished and the author provides a useful chapter on how development should proceed. She also admits that this ideal is rarely found in practice! The description refers to a web consultant but the tasks and political problems will be equally familiar to internal teams. An appendix includes instructions for implementing common features including the clearest explanation of frames I have found. I would only add that it is impolite, or even illegal, to load someone else's page into your frames.
Web Navigation has a similar subject to Information Architecture, but takes a very different approach. There are none of the detailed instructions and procedures found in the earlier book, but many more examples and ideas as inspiration. Either book will provide a good basis for designing web sites; ideally read both. If you have to choose, web sites based on Rosenfeld and Morville are likely to be logical and easy to use but those following Fleming will probably be more fun.
Andrew Cormack was postmaster and webmaster in Information Systems Support at the University of Wales, Cardiff, but has now moved on to head up JANET's CERT team.
(Reviewed by Andrew Cormack)
The web is made of many parts -- authors, files, servers, networks, browsers, readers -- each of which can affect its ability to transfer information. Web Performance Tuning looks at every component of the system and suggests ways to improve its performance. For the impatient, the first chapter has a number of quick fixes for performance problems but it would be a tremendous waste only to read this far.
Proper tuning involves setting targets and measuring performance, and should begin even before buying hardware or writing code. Capacity planning can be viewed as preventative tuning. This is serious systems engineering, but so well presented that it should be accessible to anyone. The concepts of latency, throughput and utilisation are clearly explained along with the factors likely to restrict them. Load sharing and mirroring may be solutions, but these have their own problems. Performance measurement tends to involve standard benchmarks though the author takes an extremely cynical view of most published figures (an appendix includes Netscape's instructions for improving benchmark results). Finally there are examples of small, medium and large server configurations. "Large" means Netscape and Alta Vista and should probably carry a health warning!
The second half of the book looks in detail at each component of the web system. Starting at the client end there are explanations of how browsers render pages, how the various caches can affect performance, and what should be the priorities in choosing a computer for web browsing. Even the user can be "tuned" by learning the keyboard shortcuts for common actions. The hints for improving performance are always supported by an explanation of the underlying system so that an informed decision can be made on the likely impact of any change. Although most of the detailed instructions assume a Microsoft operating system the author does observe that the most effective "tuning" could be to replace this by UNIX.
The chapters on networking cover everything from finding an appropriate Internet Service Provider to ensuring your network card has the right UART chip. Even in these obscure but significant corners the book is a model of clarity, greatly helped by a stream of well-chosen analogies. The description of the Internet backbone ignores everything outside North America, though elsewhere there are comments on the difficulties of getting connected in other parts of the world. There is even a section on tuning TCP/IP parameters in the UNIX (and especially Solaris) kernel; the recommended further reading (Stevens and Tanenbaum) is quite correct but might be rather heavy going for some readers. I was delighted to see web caches getting a favourable mention. The suggestion that commercial sites should pre-load their pages into the caches of large ISPs is a welcome antidote to the usual obsession with "accurate" hit counts.
Only UNIX is considered stable enough for a production web server. The various monitoring tools are described but interpreting the results and tuning the operating system are referred to Mike Loukides' classic System Performance Tuning. This book does include descriptions of UNIX processes and files to highlight the main performance constraints on the web server program. There are detailed suggestions for helping both Apache and Netscape web servers with brief summaries of other programs. The content served by a web site can have a considerable effect on its speed. Here there are the usual recommendations to keep files as small as possible, but also hints to help the browser to start displaying the page sooner. The penalties of using CGI scripts are explained: mod_perl and fastcgi are mentioned as possible improvements with URLs for further information. Finally there are some general points for the design of Java programs and web databases. Three standard web pages on tuning are included as appendices. As well as Netscape's document on their own server these cover Apache (by Dean Gaudet) and Solaris (by Jens-S Vöckler).
Tim Berners-Lee has taken to referring to the World Wide Wait: anyone reading Web Performance Tuning will learn why the system is slow and what they can do about it. They should also pick up a lot of sound software and systems engineering practice, as well as more arcane facts such as the eye-brain bandwidth. Few books manage to be both informative and entertaining: this is one of them.
Andrew Cormack was postmaster and webmaster in Information Systems Support at the University of Wales, Cardiff, but has now moved on to head up JANET's CERT team.
(Reviewed by Daf Tregear)
This section of the bookshop is almost as crowded with offerings as the Java section. Contrast the lot of the authors of Java books who struggle to keep up with this rapidly changing language with that of the authors concerned with the books which teach us how to write for the World Wide Web. The latter spend much of their time describing features which many browsers don't fully support yet. But we have to prepare for a time when they do.
The "horse's mouth" when it comes to books on the web is obviously the employees of the World Wide Web Consortium, who must be the authoritative source. Dave Raggett is a lead architect of the HTML standard, and Raggett on HTML 4, Second Edition (by Dave Raggett, Jenny Lam, Ian Alexander and Michael Kmiec, Addison-Wesley, 1998, 437 pages, £24.99, ISBN 0-201-17805-2) is a follow-on from the classic HTML 3 produced by the same bunch two years previously. It retains clearly laid-out text spiced up with zany humour and cartoons and loaded with clear examples; it has the feel of the classic Unix System Administration Handbook by Evi Nemeth & friends. Addison-Wesley have cut down somewhat on the colour pages inserted (including, thankfully, those of WWW Consortium members lounging around swimming pools) without affecting the clarity of the message. This is the book I choose to read rather than just use as a reference.
I suppose you could almost class Cascading Style Sheets: Designing for the Web by fellow W3C members Håkon Wium Lie and Bert Bos (Addison-Wesley, 1997, 279 pages, £21.95, ISBN 0-201-41998-X) as a "companion volume". We are now told to put no indication of style management in our HTML markup directives, but to put all indications of our preferences for style in our cascading style sheets. CSS are now an essential part of The Way Ahead (well, that's the current plan). Those of us with large sites to maintain are already making use of them, despite the very patchy support at the browser end. In fact, it's easy to get carried away and subject the poor user to a variety of dazzling colours and font effects at the drop of a hat, simply because one can. The principles behind CSS aren't difficult and there is much to be gleaned from the web (I found both http://www.w3.org/Style/CSS/ and http://www.htmlhelp.com/tools/csscheck/ very helpful) and from the comp.infosystems.www.authoring.stylesheets newsgroup. However there are some tricky little details that can confuse, and this book works well both as a tutorial and a reference. You still need to check out the web to find out which browsers support, or implement correctly, which features of CSS -- a target moving too fast to make it onto paper.
The last offering I consider is HTML for Fun and Profit, Third Edition (Mary E.S. Morris and John E. Simpson, Sun Microsystems Press/Prentice Hall, January 1998, £31.99, ISBN 0-13-079672-7). This aims for breadth rather than depth, using the familiar layout for this series of books from this imprint. Lots of white space is seen as necessary, and screen shots (indispensable for showing the effects of markup) have to be kept to a minimum to prevent the book from becoming even bulkier (it's 378 pages). To compensate, many examples of different markups have been put on the accompanying CD-ROM, with a corresponding rise in price for the reader. Fun and profit for whom, I wonder? I can see this strategy would be helpful to the person setting out to master HTML at home using a dial-up connection and wanting to practise first offline before spending time and money plodding around looking at the sites suggested in the good appendix. This book has been out a year, so is outdated in some respects already, but it attempts to mention future developments as well as document the dirty tricks that surfers will notice being used to overcome current browser limitations. Unlike the other books mentioned above it also pays some attention to the fact that the user may decide to use commercial software to write HTML, and reviews some of the offerings current at the time of going to press.
Tel: 01763 273 475
Fax: 01763 273 255
PO BOX 37