The newsletter of the UK Unix Users Group
The Linux 2003 event, held in Edinburgh from 31st July to 3rd August, was well attended and a great success.
Pictures and write-ups from the event can be found on the event web site, and later in this Newsletter you will find a summary of the event by Alasdair Kergon.
For those of you who were unable to attend please find enclosed a copy of the conference CD. This is a membership benefit which entitles you to receive all UKUUG CDs.
We are now looking at venues for Linux 2004!
The AGM this year will be held on Thursday 25th September at UCL (and not at the IOE as previously announced). You should have received the Agenda etc. in a separate mailing a few weeks ago. All details are on the web-site also. The AGM at 6.15 p.m. will be followed by a talk by Dave Raggett.
Looking further ahead to February 2004, we are working on details for the Winter conference, which will be held in Bournemouth. The venue has been chosen as it provides good low-cost facilities, and the surrounding area has numerous B&B options at varying prices. For the last two years the event has been held in London, and in 2001 it was held in Newcastle. Bournemouth also has good rail and bus links. Please put 25th and 26th February in your diary now. The event call for papers is enclosed.
The next Newsletter will be the December issue and copy date is: 21st November.
After the UKUUG AGM on Thursday September 25th, there will be a talk by Dave Raggett:
Emerging standards for voice and multimodal interaction on the Web
Dr Dave Raggett, W3C/Canon http://www.w3.org/People/Raggett
This talk will present W3C's work on developing standards for expanding the means by which people interact with the Web. W3C's Speech Interface Framework is aimed at the 2 billion phones worldwide, allowing people to use spoken commands and key pads for input and to listen to prerecorded speech, synthetic speech and music for output. W3C's work in this area is bringing Web technology to call centers.
Not content with that, we are also working on giving people the choice of which modes or which combination of modes they wish to use for interaction. Today, we are used to clicking on links, and dragging the scroll bar to browse through documents. With multimodal, you get the ability to use spoken commands in place of clicking, or typing, or writing with an electronic pen. These modes can be used as alternatives or used together, for instance when you combine speech with pen gestures like tapping, circling or drawing arrows.
If you are in a quiet restaurant or a noisy room then speech isn't such a good idea, but when you want to keep your hands and eyes free, then speech is really useful. We use our eyes and hands to interact with the Web, and our ears and mouth to interact via the phone. Bringing these together has a lot of appeal. Work on the W3C Multimodal Interaction Framework is still at an early stage.
Tuesday 11 November, 2003; 10:30
In association with Apple, UKUUG is proud to be organising an "Apple Technology briefing - Mac OS X and the Power of UNIX", aimed at all UNIX users, particularly those interested in its core technologies and the associated development environments.
The provisional programme includes main speaker Simon Patience, a UNIX developer for 20+ years and head of Core OS at Apple. He is responsible for the BSD core of Mac OS X and will be talking in detail about the UNIX parts of OS X.
Ken Tabb from the University of Hertfordshire will speak on High Performance Clustering of OS X, including uses and demonstrations of clustering OS X nodes using standard clustering APIs, and examples of vectorising code for the AltiVec engine.
Stuart McRobert will also present a session entitled "Apple Pie: A new recipe".
There will also be at least one other speaker, talking about Perl/Ruby/Python/Shell scripting on OS X. Further details on the website.
This event is free to attend, but you must register in advance as places are strictly limited. You can register at http://www.ukuug.org/events/apple03
Further details and the latest programme are also available on the UKUUG website at the above URL.
Linux.Conf.Au 2004 opens registrations!
Registrations for Linux.Conf.Au, Australia's national Linux and open-source conference, to be held in Adelaide, Australia on January 12-17, 2004, are now open.
Now is the time for YOUR fun to begin. Right now, head off to http://lca2004.linux.org.au/register/ where you can sign up and pay for your attendance at next year's LCA.
But it's not just conference registration that you can stitch up: you can organise dormitory accommodation on-line, arrange for your partner to join our Partner's Programme, sign up for miniconfs, and, for our next extra special secret surprise, you can sit LPI exams as well!
The Linux Professional Institute http://lpi.org is one of the leading Linux certifications on offer, and on the 2 days preceding the conference (overlapping the miniconfs), you can sit up to four of the LPI exams at a greatly reduced cost! Another bargaining chip to use when explaining to your boss why they should pay for you to come to LCA2004.
But don't leave it too long! When we opened registrations at midnight on September 1, people started registering straight away. Honest! That was before we announced anything publicly. We were surprised too! :-)
Register early to secure your spot - by doing so you help us put on an even better conference! If you want to see our more formal announcement, please check out our Media Centre at http://lca2004.linux.org.au/mediacentre for our press releases.
Stay tuned for more announcements soon,
Your friendly Linux.Conf.Au 2004 Organising Team...
The NordU conference brings together researchers, practitioners, system administrators, system programmers, developers and others interested in the latest advances in operating systems, Open/Free software, Linux, BSD, Solaris, security and interoperability.
If you are working in any practical aspect of System Administration, Security, Operating Systems, Open Source/Free UNIX, Languages or Interoperability, the programme committee would like to encourage you to submit a paper. Submissions are due on September 22, 2003.
For five years now the NordU conference has been the leading (if not the only) conference and forum for system administrators and UNIX professionals to meet, learn, and exchange ideas on every aspect of computer and network management in the Nordic area.
As UNIX and Free Software become more and more widely adopted in corporations and academia, we see a need to stay on top of current trends. Attacks on our systems from hackers over the Internet are also more frequent now.
The Conference will last for 5 days. Three days of tutorials will be followed by two days of technical conference sessions including refereed papers, invited talks, works-in-progress, and panel discussions.
Further details and a call for papers can be found here: http://www.nordu.org/NordU2004/
http://www.bsdgroups.org.uk/manchester The Manchester BSD Group meets in the Lass O'Gowrie, Manchester, on the first Tuesday of each month. "The Lass" is on Charles Street, just off Oxford Road near the BBC. We are a group of people with a wide range of backgrounds and skill levels who discuss all sorts of stuff (sometimes even related to BSD). We welcome everybody, particularly users of BSD or those with an interest in related matters. The fact we meet in a pub gives a clue to the social nature of the meeting. If you need directions or would like more information, there is a mailing list and information at the above URL.
We look forward to meeting you.
"A life in space" looks at Prof. Pat Stakem's life work and goes into his interest in GNU/Linux.
2nd October: 18:30 for 19:00 - room 6620 Adsetts Centre, Sheffield Hallam University Central Campus.
Tickets are free: please contact
Pat Brunskill, Sheffield Hallam University
Further information will be available from mid-September at www.sheflug.co.uk/meetings.html
Having been a member of the UKUUG for some time I feel I have to comment on both the Newsletter and the group itself.
The UKUUG newsletter, when compared to similar publications, appears to be a bit 'thin', with the previous edition providing a perfect example. Whilst I recognise the work done by O'Reilly Publications within the Open Source community, they appear to be taking up more and more of the newsletter, and perhaps sixteen book reviews in one edition is a little over the top? Let's put it another way: what would be left if the book reviews were taken out? For nearly a quarter of a century I have been a member of the Radio Society of Great Britain, an organisation which represents the UK amateur radio community; its journal (published monthly) has advertising, published papers, monthly reports on various technical matters and a wealth of other technical material. Given the academic background of much of the Linux community I fail to see why we cannot match this standard and have at least one or two published papers in each edition. The way things are going, I get more technical information from Slashdot.
The UKUUG itself appears to be an interesting group of people, but I have the feeling that a number of opportunities are being missed. Certainly the conferences twice a year are well worth the money and effort; indeed, this is the main reason why I continue to be a member of the group, but I am sure that more can be done. Despite existing efforts by both UKUUG and local Linux User Groups, the penetration of Linux into the installed user base is painfully small, and I am sure that the UKUUG could represent the Open Source community in this regard quite well. Where, for example, are the comments in the UK press regarding the SCO affair and how it could affect both the Open Source community and IBM? Where are the replies to the FUD issued by Redmond on a regular basis? If we don't speak up then nobody will know that we are there.
I also think that the UKUUG is missing out on certain commercial opportunities, many of which might add to the coffers of the group. For the last two years we have had no t-shirts to mark the Summer conference of the group, and whilst the shirts in Manchester were provided as part of the package, I would have gladly PAID for a garment to mark the Bristol and Edinburgh events. Yes, discounts on books are available (back to O'Reilly again), but maybe the group could also offer discounted prices on hardware. A couple of examples might help. Firstly, there are various commercial single-board computers available to the OEM market, some of which have dual network interfaces and would make ideal firewalls. Purchased in single quantities these boards are not cheap; however, if the group were to make a bulk purchase, I am sure that they would find customers waiting within the group. Such a device was shown at the Bristol conference, yet I never heard anything about it again. Secondly, there is the problem of Linux on laptops. Ask any manufacturer of laptops for either a machine with Linux installed or a blank hard disk and you will still be forced to deal with the beast of Redmond. I am sure that the UKUUG has enough purchasing power to obtain a laptop from a major manufacturer without the OS and then sell it on at a profit.
Finally there must of course be the good news. Thank you to Jane, Alasdair and the rest of the crew for the smooth organisation of the various conferences, especially the recent event in Edinburgh which in my case was worth flying 1800 miles each way. Well done to all of you and thank you for your efforts, but it would be appreciated if the UKUUG could expand a little in the directions indicated.
Peter M Gant
Budapest, Hungary

Several of the points which Peter makes in his letter are well-taken and should in my view act as starting points for discussion about how UKUUG could and should raise its profile and involve members more fully in a variety of activities, of which writing quality technical articles for the newsletter is just one example. Anyone with suggestions for articles or offers of contributions is invited to get in touch. -- Roger Whittaker (Council)
After many letters and some meetings I am pleased to announce that we have a standard for Open Source software in the libraries of Edinburgh. I shall post the standard, and then the standard with an explanation, in the following e-mails. Please could you forward these e-mails to anyone who is interested. I would especially like the recommendations posted to all other Linux user groups in the UK. This could be useful to help against the vote for software patents in the European Parliament, which will be on 1st September.
The recommendations have been accepted by Bill Wallace head of Libraries and Information Services for Edinburgh Libraries. I see no reason why this should not be accepted by most other libraries throughout the world. The Library community is very similar to the open source community, rather than one large institution it is based on individuals, groups, districts and countries. An International standard cannot be enforced but recommendations and guidelines will be more acceptable. I believe that these recommendations will be of benefit to Libraries, vendors and the open source community.
I have recently managed to get an article into Library and Information Update, a magazine that goes out to all the librarians in the UK, explaining that I will be personally donating 550 CDs containing OpenOffice 1.1. I will be donating these CDs so that every library in Scotland can have a copy of the software to lend out like a book. I will be donating the CDs to Rhona Brankin MSP in her capacity as Chair of the Scottish Library and Information Council (the advisory body to the Secretary of State and Scottish Ministers on library and information matters). I have advised her that I am doing this and I am waiting for confirmation of acceptance. I have the backing of Patrick Harvie MSP and Bill Wallace, so along with the recommendations I see no reason why they should not be accepted.
Thanks go to all those who have helped me with this; special thanks to Patrick Harvie, Bill Wallace and those at the Newington, Portobello and Muirhouse libraries.
These notes are taken with permission from Sandro's Weblog at http://www.zzoss.com/weblog/
I wrote quite a few reports in my Weblog about the UKUUG 2003 conference in Edinburgh. Some people might ask, and I asked myself: "Why does he give away his knowledge?" After all, travelling to conferences is a fair amount of fun, but also a fair amount of work.
The reason to go to conferences is meeting and talking to people, doing "human networking". If you're a speaker, conferences are the platform for your project, letting others understand what you are trying to achieve. Conferences don't pay off quickly in terms of new customer deals; instead, they often pay off only in terms of "knowledge exchange". So it might seem a good idea to keep to yourself all the good contacts you made at the conference, and all the good talks you have heard.
Writing reports about every session you've attended is even more work, especially if the aim of the reports is to let other people assess the importance of a talk in terms of "did I learn something new?". Usually this is done within companies, but I do it in public, for the Weblogging community.
Why? Because I am a saint? Definitely not! I do it on purpose, and I want to "earn" something. Weblogging is about selling, brokering and buying knowledge. The whole Internet is. By providing precious information to others, I hope to raise people's awareness of me, and thus my share in the knowledge market. Being known to be knowledgeable can in fact pay off in real cash, or at least in Blogshares :)
Weblogs perfectly fit into the mechanisms of knowledge markets. They are a vehicle for selling, brokering, and buying knowledge. They offer the ability to individuals to invent themselves as a product in the knowledge market: to show what they know, how they deal with information, on what their decisions are based, which actions they take, which results those actions bring.
Saints? No. Egomaniacs? Maybe. Rather, clients as well as servers of the knowledge market :)
Raddle - Andrew Findlay
In his lightning talk at the end of UKUUG 2003, Andrew Findlay announced the very young Raddle project, which provides software to test network management systems.
Dasher - Matthew Garrett
There's not a lot to write about the Dasher project that Matthew presented; instead, you have to see it! Check out the Dasher project homepage. An animated GIF there demonstrates how Dasher works and how it provides a completely different way of writing text. It's so cool that the audience applauded after Matthew's first demonstration of the software during his talk :)
Automated Website Synthesis - Siu-Wai Leung
Siu-Wai Leung's talk Automated Website Synthesis was quite an eye-opener for me at UKUUG 2003:
Basically, Siu-Wai presented a proof of concept of how to apply Artificial Intelligence (AI) and Semantic Web technologies to transform annotated content into a Website. Read it again! His talk was not just about how to present annotated content on a Website; it was about how to automatically create a Website from annotated content! In fact, what he talked about was ontology-based web content development and design.
He demonstrated the use of such an approach for aviation accident reporting. Among the tools he used is the HEIDI ontology (HEIDI is an abbreviation of "Harmonisation of European Incident Definition"). In addition, after any aviation accident a sophisticated pilot report form has to be filled in, and data such as the altitude and speed of the airplane is recorded automatically. Based on this data, Siu-Wai was able to create a graph showing the altitude of the plane, or a tree of the events of the accident.
The Website prototyping was done using a Simple Website Interface Model (SWIM) and the Web Modelling Language (WebML). More on the underlying technology can be found in Siu-Wai's paper (PDF).
At the end of his talk, Siu-Wai summarised that the ontology-based mapping of information, Website, and perception is an AI problem (that's his mapping hypothesis). He presented some ideas of how this could be achieved in the future. I very much liked his idea of using genetic programming to mutate the design of single Webpages or complete Websites to find the most suitable format to present the information to the user: an intelligent way of adaptive and personalised C/KMS.
That was a cool talk ;) The aviation accident reporting demo seems like a perfect prototype to exemplify the possible advantages of AI and the Semantic Web. Also, the idea of genetic mutation of information visualisation acknowledges the reality that every individual filters information differently, and offers greater flexibility in personalised content presentation than the theme-based CMSs currently provide.
I met Jan Kiszka at UKUUG and he pointed me to the following two projects:
Information seems to be available in German only. The project is about collaborative training software and is headed by the ZDT of the University of Hannover.
Learning Lab Lower Saxony (L3S)
The L3S is part of the Wallenberg Global Learning Network (WGLN) which is coordinated by the Stanford Center for Innovations in Learning (SCIL).
Central research areas of the L3S are: Educational Technology and Collaborative Learning, Digital Media and Semantic Web, Innovations in Learning, eLearning Curricula and Content.
The work of L3S includes research, consultancy and technology transfer as well as infrastructure and support in the field of innovative teaching and learning technologies. Thereby L3S aims at the permanent introduction and use of these new technologies into education.
Jan had a nice IBM R40 Notebook with him, maybe that's my next one :)
Perforce - Tony Smith
Looking for an alternative to CVS (because CVS has its quirks), I attended Tony's talk on Source Control and Configuration Management using Perforce.
The problem Perforce aims to solve is this: developers want to focus on developing, not reporting, but managers want to know what's going on.
Some features of Perforce:
- client/server architecture
The Perforce company has about 60 employees and 2400 customers (e.g. HP). The Perforce software is closed source, but a special licence for OSS projects is available.
Power Shell Usage: Bash Tips & Tricks - Simon Myers
Simon's session Power Shell Usage: Bash Tips & Tricks was simply wonderful with many goodies telling you how to gain a higher productivity working with the shell. Read the corresponding paper to go into details.
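To give a flavour of the genre, here are a few common Bash productivity idioms of the kind such a talk covers (these particular examples are well-known Bash features, not necessarily the ones Simon presented):

```shell
#!/bin/bash
# "$_" expands to the last argument of the previous command:
mkdir -p /tmp/bashdemo && cd "$_"

# Brace expansion: back up a file without retyping its name.
touch config.txt
cp config.txt{,.bak}      # expands to: cp config.txt config.txt.bak

# Parameter expansion instead of basename(1) and dirname(1):
f=/var/log/syslog
echo "${f##*/}"           # prints: syslog
echo "${f%/*}"            # prints: /var/log
```

Each of these saves a retype or a subprocess; the paper collects many more in the same spirit.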
Most of the papers of the UKUUG conference are now available online. Access them via the programme for example: click on a title and find the paper linked at the abstract page.
KDE Development - David Faure
David's talk KDE Development dealt with new features in KDE 3.2.
He introduced the audience to application scripting via DCOP, which allows for inter-process communication between KDE applications. KDcop is a useful GUI to inspect running applications and processes. DCOP has bindings to many languages (also PHP?). By default it works on the local machine only, but it can be enabled to work over the network.
KTrader handles "service types" and associates applications and mimetypes, taking into account the user profile. I thought to myself that we should think of a trader within CONESYS that deals with DDO packages and associates those packages with a certain repository.
KParts loads application components via calls in the manner of KParts::ComponentFactory::createPartInstanceFromQuery[...], a new and much shorter approach compared to older KDE versions.
With KDialog, e.g. shell scripts can ask for user input:

kdialog --yesno "Run this script?" || exit 1
KOffice is currently in transition to an XML-based native format as defined by the OASIS office committee. This process (still in its early stages) will provide simple interoperability between OpenOffice and KOffice. I noticed that KOffice has a flow-chart application called Kivio which I will try out at home :)
Some notes on Konqueror: it will allow for sidebar modules à la Mozilla, and inline spell-checking. Also, it will be possible to integrate applications like Umbrello to display UML diagrams.
The next KDE meeting will be the first one open to the public; up to now, the meetings were only for KDE developers. From my point of view, this is a very important and correct step: when developing a FOSS desktop system, users should be included at meetings too. The often-perceived dichotomy between developers and users must be bridged, as was also discussed at this year's OSCOM conference concerning Content Management Systems.
Free and Open Source Software in the Health Service - Anand Ramkissoon
I was interested in learning about an area of software engineering that I have not dealt with yet, hence I attended Anand Ramkissoon's talk Free and Open Source Software in the Health Service.
Anand outlined the history of medical lab software:
In the 1980s, various in-house systems were built, which were
- well specified
Then, in the 1990s, commercial systems descended from the in-house systems became available. Their characteristics:
- mainly multi specialism
Since 2000 we have seen the death of in-house systems: Y2K compliance killed off the last of them. A retreat by the medical labs from strategic involvement in specification can also be observed, which in Anand's opinion is "a huge mistake". Another characteristic of the current situation is commercial lock-in, mainly due to proprietary data formats. Here the medical labs lack negotiating power, because the main companies simply refuse to port old data, even though the labs require it and even where there are no technical constraints. Hence the buyers have to take it on trust, because they have no influence over the development process.
Anand's alarming general statement is that "the quality of software currently used in UK labs is poor, absolutely poor". The reasons: there is virtually no competition, with only three different software systems in the UK and not many more globally. Why are there not more vendors, he asked himself? His answer: there is no balance of power in the market, and the software is hard to specify as it requires detailed domain knowledge.
The reality is that people at health services invest an awful lot of time cleaning up after the software. This is paired with the counter-productive philosophy in the higher and middle management of health services that investing in new technologies is only possible when employees are laid off.
In summary, the health services in the UK, maybe even world-wide, and especially the medical labs, are currently in a phase of vendor lock-in (which sounds familiar when one thinks of the general history of FOSS). Consequently, Anand started the project "Ganesh" to find a common standard for data interoperability, as well as to develop an Open Source reference implementation.
The aims of the project "Ganesh":
- portability of databases, extracts and records
Not Fired for Buying Linux? - Andrew Nicholson
Throughout his session, Andrew did not work with slides displayed to the audience. His talk was mainly a kaleidoscope of good ideas and criticism.
He started with some questions addressed to the audience, the funniest one being: "Who wears a dress at work?"
Andrew continued analysing who is actually looking at the adoption of FOSS. It's mainly the media, he said, that presents case studies, articles, interviews, etc.; but, you usually don't hear about people who did not go for it.
Moving on to discuss the term "computer users", he doubted the usefulness of this term. He brought up an analogy: "Although managers talk a lot, we don't call them talkers; although politicians shake a lot of hands, we don't call them shakers - but computer users are called computer users because they use the computer."
Decisions on migrating to Linux, Andrew said, sustain the myth of rational decision-making, which is a masculine approach to it. In general, Andrew is a constructivist thinker: he says that a decision is made first, and afterwards we piece the evidence together to make it defensible.
Software is created socially, in discourses, speech, text and works with concepts, networks of concepts, theories.
His MBA research is based on some migration examples:
- the City of Nottingham that moved to a SuSE email system
Having a closer look at the actors in the migration narratives, he enjoyed interviews with employees of the West Yorkshire Police, who produced "TV clichés" like "taxpayers' money" and "I am a responsible policeman".
One more interesting point when looking at the migration stories is that on the one hand, FOSS is presented as something new and different, on the other hand, its similarity is stressed (e.g. between MS Word and OpenOffice).
Andrew advised the audience to consider that "Linux helps companies save money" might not be a good argument, because a manager's power is bound to his department's budget: the more money he (can) spend, the more power he has.
At the end of Andrew's superb talk, I asked myself: what is the essence of his statements? Is there nothing new under the sun, even with FOSS, or does it essentially make a difference?
Linux at IBM
Richard J Moore from the IBM Linux Technology Centre, talked about Linux at IBM:
Richard started off with a survey that asked "Based on what you have seen or heard so far with Linux, how would you rate Linux on the following aspects?". The results (most important on top):
He labelled Linux Kernel 2.6 "a major step in the maturity of Linux".
Summarising IBM's strategy, these are the important points:
- enabling linux hardware, software and services
Concerning the workload consolidation, IBM sees the following value propositions concerning Linux:
- reduce cost
Yes, Linux is also used inside IBM to "eliminate OS/2 and Windows servers" (!). Linux runs on 1100+ xSeries servers and on zSeries in the IBM intranet, handling email filtering, web serving, etc.
The Linux Technology Centre department, where Richard works as RAS architect, focuses on areas such as kernel scalability, POSIX threading and PCI hot-plug. Their work is not architecture-specific. The department employs 250+ engineers. More information can be found on the LTC Website.
Richard stressed at the end of his talk: "IBM does the utmost to be a good community player". The subsequent discussion with the panel largely covered patent issues. Giving up IBM software patents would involve a "painfully expensive process", Richard said. Jon "Maddog" Hall asked him why IBM does not simply issue a statement saying that they will not use their patents against any FOSS project. Of course, Richard answered that this is not something he could decide, and added that "IBM is a big company that takes long to change its culture".
Another question from the audience addressed why IBM does not offer Linux support for PCs or notebooks. Jon helped Richard answer, saying that the IBM QA team would have a lot of work to make sure Linux runs on their hardware, even if they chose only some PCs or notebooks. Nevertheless, Jon predicted that "the more Linux goes to the desktop, the more it will be supported; it's simply a business decision". Richard agreed, saying that significant further investment is needed to make Linux widely adoptable for the desktop, but that currently it is still too hard to get a proper return on that investment.
Inspired by Jon "Maddog" Hall's statement at this year's UKUUG conference that the network is built into Linux I asked myself: why is the network not built into any open source CMS as it is with Linux? Why is it so hard to connect them? Why are they still monolithic blocks of content management?
Obviously, developers of OSS CMSs have not yet learned the lesson that Linux teaches them: make networking a commodity! Yes, of course, we all do Web services now, with SOAP or XML-RPC. Yes, there are RSS feeds, trackbacks, pingbacks. Good! A good start, especially in the Weblog community. But the quest for interoperability is likewise just at its beginning. Can we learn from the times when networking was being built into Linux? Maybe it's worth looking back at the discussions that evolved in the *nix community.
I will keep an eye on that in the realm of the CONESYS project.
Extreme Linux Programming - A Continuum - Jon "Maddog" Hall
Some notes on Jon "Maddog" Hall's session Extreme Linux Programming - A Continuum.
It was new to me when Jon told us that the Titanic movie was rendered on 160 Alpha processors running Linux. The final rendering took about a year. The producers saved $500,000 compared to proprietary solutions, which Jon commented on with: "So the world's most expensive movie was half a million dollars cheaper".
Jon gave examples of how Linux is used for supercomputing: finding quarks (physics), adaptive control against earthquakes, simulating meteorites crashing into New York, and mammogram (breast cancer) analysis.
In these cases, Jon said, Linux helps with its cost efficiency: often people say "We know how to solve the problem, but we cannot afford to solve it", until they see the cost benefits of using Linux for supercomputing.
Jon drew the following chronological line of past and future, showing "Where does Linux belong?":
- Beowulfs 1994/1995
And the nice thing is, he added, that all of it is based on one set of APIs.
Linux is just perfect for supercomputing, he said, because "the networking is built in" and "parallelism screams at you". With Linux you have parallelism even on single-CPU machines, where it cuts down on I/O wait time and keeps memory and cache "warmer".
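The point about parallelism cutting down on I/O wait can be sketched in a few lines of shell (this is my own illustration, not Jon's): two simulated one-second I/O waits, run concurrently, finish in roughly the time of one.

```shell
#!/bin/bash
# Two "I/O-bound" tasks run in parallel; sleep stands in for a
# blocking I/O operation.  Timing uses GNU date's nanosecond format.

start=$(date +%s%N)

sleep 1 &      # first task, backgrounded
sleep 1 &      # second task, overlapping the first
wait           # block until both have finished

end=$(date +%s%N)
elapsed_ms=$(( (end - start) / 1000000 ))
echo "two 1-second waits finished in ${elapsed_ms} ms"
```

Run serially, the two waits would take about two seconds; overlapped, the script reports roughly 1000 ms, which is the whole argument for overlapping I/O waits in miniature.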
The investment protection that Linux offers to supercomputing implementations is based on the standard operating system, standard architectures, and standard programming techniques inherent to Linux.
Oh, and I learned a new acronym: RAS = Reliability/Availability/Scalability.
Published by O'Reilly and Associates
reviewed by John Collins
This book covers all aspects of Samba, the software suite for interfacing with Windows networks. Samba lets a Unix or Linux host appear as an emulated Windows server, providing file and printer shares. The client programs allow transfer of files to and from Windows hosts with appropriate shares set up. There is a bewildering array of options and facilities to fit almost any environment and any desired level of security.
The chapters of the book introduce Samba, giving an overview of its history and antecedents, describe installation on Unix servers, configuring NT domains, Unix client access, the all-important configuration file, name resolution and browsing, advanced disk sharing concepts, users and security, printing, some miscellaneous information and troubleshooting. Seven appendices give example configurations, a quick reference to configuration options, summary of daemons and commands, how to download the latest source version, compilation configure options, running Samba on Mac OS X and the GNU free documentation licence.
I think this book covers the whole subject extremely well. Particularly important to me, as it has caused a lot of grief in the past, was the whole thorny subject of encrypted and unencrypted passwords and different versions of Windows and options in the Samba configuration file. This is covered comprehensively, and the various options explained well. Some of the more bizarre options, e.g. certain aspects of printing, are well-explained also.
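For readers who have not met the configuration file, a minimal sketch of the kind of smb.conf the book walks through might look like this (share names, paths and users are hypothetical; the option names are those of the Samba 2.x era the book covers). The "encrypt passwords" setting is the one that, as noted above, causes so much grief with different Windows versions:

```ini
# Hypothetical minimal smb.conf sketch
[global]
    workgroup = WORKGROUP
    security = user
    encrypt passwords = yes
    smb passwd file = /etc/samba/smbpasswd

[shared]
    path = /srv/samba/shared
    read only = no
    valid users = alice bob
```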
The book gives specific examples of how to do things both at the Windows end and the Unix or Linux end with copious illustrations and with links and web references to follow up various points. The reference appendices are helpful and the index is pretty good. I didn't feel that the authors ever strayed off-topic either.
I think that O'Reilly books can be a bit variable in quality, and I know that the animal (in this case a bird, an African ground hornbill) theme on the fronts of each irritates some, but this is definitely one of the best I have come across and I am sure anyone who needs to do any work with Samba will find it indispensable.
Published by O'Reilly and Associates
reviewed by John Collins
This is a pocket reference for MySQL, the popular open source database management software. MySQL is an almost complete implementation of ANSI SQL (with a commitment to filling the gaps in the foreseeable future).
I think this is more or less exactly what you would want in a pocket reference. It assumes the user knows what he or she is doing and just lists the things you'd want to look up in a nice logical order.
It briefly discusses installation and setup, then lists data types, SQL keywords in alphabetical order, operators, functions and table types.
The descriptions of the SQL keywords are very clear and helpful.
My only quibble is that there is no description of any of the various APIs. In most cases you have to build up an SQL statement as a string anyway, but I missed the commonly-used C and Perl APIs, and doubtless other readers would also want descriptions of the PHP and Python APIs.
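The "build up an SQL statement as a string" pattern the APIs share can be sketched as follows. This uses the standard library's sqlite3 module purely as a stand-in for a MySQL driver such as MySQLdb, which follows the same DB-API shape (MySQLdb uses %s placeholders where sqlite3 uses ?); the table and data are invented for illustration.

```python
# DB-API sketch: assemble an SQL statement as a string and hand it to
# the driver.  sqlite3 stands in here for a MySQL driver like MySQLdb.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE books (title TEXT, pages INTEGER)")

# Placeholders keep user-supplied values out of the statement text.
cur.execute("INSERT INTO books VALUES (?, ?)",
            ("MySQL Pocket Reference", 96))
cur.execute("SELECT title FROM books WHERE pages < ?", (100,))
title = cur.fetchone()[0]
print(title)
conn.close()
```

Whatever driver you use, the reference's keyword listings are what you reach for when composing those statement strings.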
Apart from that gripe, however, I'm sure that it is a handy reference for generating SQL statements.
Published by O'Reilly and Associates
reviewed by Sam Smith
In the five years since the first edition of Mastering Regular Expressions was released, there has been an expansion in the languages and tools people use. Languages such as VB.NET, C#, PHP, Java, Ruby and Python, if they existed at all then, were nowhere near as widely used as they are today. The update to the book, consisting of roughly 200 new pages and a thorough revision of the rest, reflects the fact that times have changed.
Regular expressions are built into the above six languages, and Mastering Regular Expressions covers them all reasonably well. This includes comparisons of RE constructs which are available in each language, and examples of tasks where one is better than another. The book also covers well how and why regular expression engines actually work, in addition to how to optimise your expressions for maximum speed or effect. Perhaps more importantly, it also covers how to avoid "optimising" your expression into something wrong. The book deals with its topic material well and thoroughly. It is not a tutorial on how to write regular expressions in a single language, but a guide on the overarching aims and supporting theories of regular expressions, and their use; while also including a large number of useful examples.
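To give a flavour of the optimisation material, here is a small Python sketch of one technique the book presents, "unrolling the loop" for matching double-quoted strings with escapes. The naive and unrolled patterns match the same strings, but the unrolled form never re-examines a character, so it behaves better on engines that backtrack. The sample text is my own.

```python
# Two equivalent regexes for a double-quoted string with \" escapes.
# The second is the "unrolled loop" form discussed in the book.
import re

naive    = re.compile(r'"(\\.|[^"\\])*"')
unrolled = re.compile(r'"[^"\\]*(?:\\.[^"\\]*)*"')

text = '"a string with an \\" escaped quote"'
assert naive.match(text).group() == unrolled.match(text).group()
print(unrolled.match(text).group())
```

The book explains why the alternation in the naive form invites needless backtracking, and how the unrolled structure removes it.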
I should also mention Perl at some point in this review. While MRE has the colour scheme from O'Reilly which makes it look like a Perl book, Perl is, in large sections of the text, treated just as another language under discussion. Where there are features of regular expressions that are found only in Perl, these are covered, as are features found only in one of the other languages being discussed.
Overall, the book is a useful addition to the bookshelf if you sometimes find yourself having to mangle some text in a less than entirely straightforward manner. It doesn't matter whether you have to break out the man page for the syntax of grep(1) or can type out multi-line expressions straight, there is still something in here for you.
Published by O'Reilly and Associates
reviewed by Mick Farmer
I really enjoyed this book. Some years ago, when I first migrated from Sun to Linux at work, my early Red Hat system was hacked (I think this is called "rooted" these days) by the trinoo crowd and was used as part of a Denial of Service (DoS) attack on some remote hosts. Since that episode I have been security conscious, so I thought this book would check my defences, so to speak.
The book consists of nine chapters, each covering a particular aspect of Linux security or a particular tool. Red Hat 8.0 and SuSE 8.0 are the target distributions, but much of the information is generic.
The first chapter is concerned with Tripwire, an open source integrity checker that stores a snapshot of your (unhacked) files and uses it to check periodically for any discrepancies. I followed the instructions given and soon had basic integrity checking working - something I had always meant to do but had never got around to! This chapter also contains many recipes for different levels of paranoid integrity checking, together with sections on verifying RPM-installed files and what to do if you can't use Tripwire.
The second chapter covers firewalls using iptables and ipchains. I use iptables as my first line of defence and was pleased to discover that my personal rules for blocking access were covered in this chapter. Basically I only allow incoming Secure Shell (SSH) connections from the static IP address of my home ADSL router! Various sections covered the general housekeeping chores necessary to maintain a firewall on a single machine.
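A ruleset implementing roughly my policy might look like the sketch below. This is my own illustration, not a recipe from the book; 203.0.113.7 is a placeholder documentation address standing in for the router's static IP, and the syntax is that of the iptables of this era.

```shell
# Hypothetical sketch: default-deny inbound, allow loopback and
# established traffic, and accept SSH only from one static address.
iptables -P INPUT DROP
iptables -A INPUT -i lo -j ACCEPT
iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
iptables -A INPUT -p tcp -s 203.0.113.7 --dport 22 -j ACCEPT
```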
Chapter 3 covers network access control, i.e. incoming connections. Some preliminary sections on network interfaces are followed by recipes for enabling/disabling services via xinetd or inetd. Once again I noted that my own paranoid view of disabling all unnecessary services was dealt with, as were topics such as restricting access by user, host, time of day, etc.
Chapter 4 covers authentication techniques, primarily Pluggable Authentication Modules (PAM), Secure Sockets Layer (SSL), and Kerberos. SSH has its own chapter. This chapter focuses on basic setup and maintenance. I confess to not bothering to use SSL certificates or Kerberos.
Chapter 5 is about authorization control. It contains a number of recipes for configuring sudo so that a user can run commands as another (usually root) user. Since I'm the only user of my current Red Hat 8.0 system I don't use sudo. Perhaps I should!
Chapter 6 is about protecting outgoing connections, primarily using SSH and its relatives. Many recipes cover different flavours of public-key authentication. I followed these instructions and even went as far as using ssh-agent to allow authentication without typing a password or passphrase. All good stuff.
Chapter 7 is concerned with securing data, i.e. files. It covers everything you need to know about employing the Gnu Privacy Guard (GnuPG) which is an open source replacement for Phil Zimmerman's PGP. The recipes work fine and you can read my digital signature at the end of this review.
Chapter 8 is about protecting e-mail. Recipes show how to encrypt mail using a number of popular mailers and how to secure mail access using SSL and SSH. At work my mail is handled separately, so this chapter was not tested.
Chapter 9 covers testing and monitoring and, as such, is something of a mixed bag. I tried using John the Ripper to check my password strength, but it was still running after 48 hours so I gave up. There are the usual recipes for checking suspicious accounts, finding setuid (and setgid) programs, and securing device special files - much of which can be handled by Tripwire. I used chkrootkit to test for possible rootkits, worms, and trojans (none found). I used nmap to test for open ports (none). I used tcpdump and ethereal to watch my network traffic. There were a number of recipes on using snort as a packet sniffer which I skipped over. Once again I was pleased to see that I was already using the recipes to maintain and rotate log files. This chapter finishes with information on how to recover from a hack and how to file an incident report.
I think my machine is more secure than before I read this book. The advice is good and pitched, for me, at the right level. References were up-to-date as far as I could see. I would certainly recommend this book to anyone wanting to secure a Linux system, or to test its existing security.
Council Chairman; Events; Newsletter
07973 231 870
PO Box 37
01763 273 475
01763 273 255