Copyright © 1995-2004 UKUUG Ltd



Newsletter Section 5

Soap Box

World Domination

(Richard Morin)

Most of us are inundated with announcements for talks, seminars, conferences and trade shows. My own case may be a bit extreme, due to my proximity to Silicon Valley and my efforts as a trade journalist. Of course, many of these events aren't all that enticing. Some are of questionable quality; some are related to areas of marginal interest. A large remainder, however, looks worthwhile and interesting, but (alas) unreachable due to location, cost or time constraints. In short, we all hear about far more events than we could possibly attend; the trick is to make the best choices.

One of my favourites, the USENIX Conference, has been given a miss for the last couple of years. The usual excuses applied, but I still regretted missing it. Consequently, I am pleased that I was able to attend the 1997 USENIX Annual Technical and USELINUX Conference in Anaheim, CA, in January.

USENIX isn't a gigantic organization. The Annual Technical Conference and the LISA (Large Installation Systems Administration) Conference both attracted about 1,500 attendees (up 25% on the last two years). The topic-specific workshops (electronic commerce, object-oriented programming, etc) are designed to be much smaller. The high quality of the talks more than makes up for any lack of size. The Birds of a Feather meetings, Guru-Is-In sessions and informal conversations are also valuable sources of information.

At the conference, I heard James Gosling (Sun's lead engineer and key architect for Java) and Rob Pike (a principal developer of Lucent Technologies' Inferno) give somewhat disparate views on network-oriented language design. Bell Labs' Bill Cheswick gave a delightful talk on “Stupid Web Tricks”, sharing some Internet security holes and performance issues that may show up in the mass media one of these days, along with some fun things to do – and avoid doing – with Web browsers. UNIX notables such as Ken Arnold, Keith Bostic, Andrew Hume, Mike Karels, Sam Leffler, Kirk McKusick, Dennis Ritchie, Henry Spencer and Larry Wall sat in the audience for these talks (and chatted in the corridors during the breaks). Pretty heady stuff for a techno-groupie like me.

If you are interested in the future of open computing systems, you owe it to yourself to join USENIX, get copies of its proceedings and newsletter, and attend the occasional conference or workshop. For more information, contact USENIX via email to office@usenix.org or via the Web at http://www.usenix.org (tell them Rich sent you :-).

As a small, but relevant, digression, I would like to acquaint you with some of my own peculiar views on the history of UNIX. For the purposes of discussion, I will divide this into three overlapping epochs. The first epoch (research) began in the late '60s and ran into the late '70s. During the first part of this period, UNIX was restricted to the confines of Bell Laboratories in Murray Hill, NJ. Over time, it spread to other laboratories and educational institutions.

As I understand it, this period was characterised by a spirit of free and open co-operation. Members of the UNIX community passed ideas (and source code) back and forth at the drop of an email message, and life was good.

The second epoch (deployment) began in the early '80s. Sun Microsystems, for instance, made its debut at the Boston USENIX Conference in 1982. Although the Sun booth was the highlight of the show, it was a far cry from the company's present-day extravaganzas. Instead, it was a small booth featuring a prototype workstation. Sun co-founder Bill Joy and company scurried back and forth, trying to get everything working, pass out photocopied literature and answer questions.

At that trade show, talks were given by many of the notables mentioned above. Most of these had to do with ways of improving kernel features or performance. Detailed questions were commonly answered with an admonition to “read the source code”. By and large, this advice was given with helpful intent. Any hint of a patronizing attitude lay in the RTFM-like use of the source code as a screening process: If a newbie wasn't willing to study the code, he obviously wasn't worth much of the hacker's time.

As a nonaffiliated UNIX user, however, I had no access to the fabled source code, so the advice was more annoying than useful. Even if the hacker was sympathetic to my situation, there was little he could do about it, so useful interchange tended to die off at that point.

With the growth of the commercial UNIX industry, access to source code became even less common. Many commercial vendors considered their additions and modifications to be proprietary, so even the folks from universities and research laboratories started to feel “out of the loop”. Generally, most UNIX systems were being sold with binary-only licenses.

These changes had a predictable effect on the flow of communication. Lacking access to the source code, new users turned to books and magazines. Without freedom to exchange source code, many programmers became isolated from their peers at other companies. The USENIX conferences helped, of course, but far too many talks concerned developments for which the source code was “not presently available”.

Recently, however, I have started seeing signs of a third epoch (freeware). In this epoch, we have returned to the free and open sharing of ideas and technology, but, this time, everybody gets to play! There are still a number of commercial products, to be sure, but there are also entire operating systems and suites of tools freely available for inspection, modification and even (gasp!) productive use.

Some of this use has to be hidden from corporate executives and lawyers (“Freeware? On our computers?”), but it is definitely taking place. With increasing frequency, programmers are able to present convincing arguments for freeware-based solutions. If your company needs to install a few thousand inventory-control systems, then commercial UNIX licenses could add millions of dollars to the cost. If you can propose a way to save this money and, at the same time, have complete control of the source code, you might just get your company to buy in.

Although early versions of freeware operating systems were limited to Intel-based hardware, Linux and some others are now available for Apple, DEC, Sun and other platforms. In fact, some companies are finding out that the highly efficient Linux kernel has a way of rejuvenating their older machines.

The growth of the freeware community is not lost on the USENIX Association. At this year's conference, a parallel set of tracks was set up for Linux-related topics. This allowed Linux and UNIX devotees to hear about each other's activities and exchange ideas and opinions. I hope that the experiment is repeated in some form at future events.

The Linux community is developing its own notables, many of whom were in attendance. This gave me an opportunity to chat with Jon “maddog” Hall, Phil Hughes, Adam Richter, Ted T'so and many other Linux enthusiasts. All of us, of course, attended Linus Torvalds' talk on the future of Linux.

Torvalds' goal for Linux is very simple: World Domination. Although he presents this objective with a smile, he is quite serious about building up the system until it can be a viable alternative to Windows NT, etc. He spoke about issues like binary compatibility modes, technical support and user interfaces. Linux may have started as a late-night hack, but Torvalds has his eyes on long-term possibilities.

I find the social phenomenon of Linux far more interesting than the exact kernel technology being used. A few years ago, Bill Joy said something like, “I don't know what UNIX will look like in 10 years, but it will be called System V.” I suspect a similar situation is developing in the freeware community.

The Linux kernel is solid, efficient and increasingly capable. The Linux community is doing the hard work that is needed to meet the assorted POSIX standards, as well as implementing new developments such as IPv6 (often long before they are put into commercial systems).

The GNU Project's General Public License allows Linux to import university-licensed and public-domain code, so Linux tends to absorb useful code from the rest of the freeware community. What's more, Linux has a large, enthusiastic and very co-operative band of developers, technical writers and users.

Most of the technology in Linux, however, lies above the kernel. The command set is largely drawn from the BSD, GNU and X11 communities. If I were to place these same tools on the Hurd or a BSD-based kernel, you would be hard-pressed to detect any differences. Apple's Mach-based MkLinux system is, in fact, indistinguishable from a vanilla Linux system, save that it runs on a Power Macintosh.

So even if your computer in 2005 is called a “Linux” system, it may not run anything like today's Linux kernel. On the other hand, as long as it works well and remains true to the spirit of free and open interchange, I don't imagine you'll be very concerned. And, by retaining that spirit, it will continue to attract and inspire the kinds of programmers that created (and continue to develop) the UNIX system.

Richard Morin operates Prime Time Freeware, which publishes mixed-media (book/CD-ROM) freeware collections. He also consults and writes on UNIX-related topics.

Linux and The GNU Project

(Richard M Stallman)

Many computer users run a GNU system every day, without realizing it. Through a peculiar turn of events, the version of GNU that is widely used today is more often known as “Linux”, and many users are not aware of the extent of its connection with the GNU Project.

There really is a Linux; it is a kernel, and these people are using it. But you can't use a kernel by itself; a kernel is useful only as part of a whole system. The system in which Linux is typically used is a modified variant of the GNU system - a Linux-based GNU system.

Many users are not fully aware of the distinction between the kernel, which is Linux, and the whole system, which they also call “Linux”. The ambiguous use of the name doesn't contribute to understanding.

Programmers generally know that Linux is a kernel. But since they have generally heard the whole system called “Linux” as well, they often assume that there must be a good reason for this. For example, some believe that once Linus Torvalds finished writing the kernel, his friends looked around for other available free software and just happened to find everything necessary to make a UNIX-like system.

What they found was no accident - it was the GNU system. The available free software added up to a complete system because the GNU Project had been working since 1984 to make one. The GNU Manifesto set forth the goal of developing a free UNIX-like system, and after more than a decade of work, we have one.

Most free software projects have the goal of developing a particular program for a particular job. For example, Linus Torvalds set out to write a UNIX-like kernel (Linux); Donald Knuth set out to write a text formatter (TeX); Bob Scheifler set out to develop a window system (X). It's natural to measure the contribution of this kind of project by specific programs that came from the project.

If the GNU Project were a project of this kind, and its contribution were measured in this way, what conclusions would follow? One CD-ROM vendor counted how much of their “Linux distribution” was GNU software. They found that GNU software was the largest single contingent, around 28% of the total source code, and this included some of the essential major components without which there could be no system. Linux itself was about 3%. So if you wanted to pick a name for the system based on credit for the programs in the system, that name would be “GNU”.

But choosing the name in this way would overlook a fundamental distinction. The GNU Project was not a project to develop specific software. It was not a project to develop a C compiler, although we developed one. It was not a project to develop a text editor, although we did so. The GNU Project's aim was to develop a complete free UNIX-like system. And that is what we have done, more or less, with many people's help.

Many people have made major contributions to the free software in the system, and they all deserve credit. But the reason it is a system – and not just a collection of useful programs – is because the GNU Project set out to make it one. We chose what programs to write based on what was needed to get a complete free system. We wrote essential but unexciting major components, such as the assembler and the C library, because you can't have a complete free system without them.

By the early '90s, when Linux was ready, we had put together the whole system, aside from the kernel (and we were working on a kernel, the GNU Hurd, which runs on Mach). So it was possible to put Linux, a free kernel, together with the GNU system, which had everything but the kernel, to make a complete free system.

Putting them together sounds simple, but it was not a trivial job. The GNU C library had to be changed substantially. And integrating a complete system as a distribution that would work “out of the box” was a big job, too. It required addressing the issue of how to install and boot the system – a problem we had not tackled, because we hadn't yet reached that point. The people who developed the various system distributions made a substantial contribution. Seen in perspective, their contribution was to combine Linux and the GNU system to produce a Linux-based modified GNU system.

Aside from GNU, one other project has independently produced a free UNIX-like operating system. This system is known as BSD, and it was developed at UC Berkeley. The BSD developers were inspired by the example of the GNU Project, and occasionally encouraged by GNU activists, but their actual work had little overlap with GNU. BSD systems today use some GNU software, just as GNU systems use some BSD software; but taken as wholes, they are two different systems which evolved separately. A free operating system that exists today is most likely either a GNU or a BSD system.

We use Linux-based GNU systems in the GNU Project today, and we encourage you to use them too. But please don't confuse people by using the name “Linux” ambiguously. Linux is the kernel, one of the essential major components of a system. The system is GNU.

©1997 Richard M Stallman

Tel: 01763 273 475
Fax: 01763 273 255

UKUUG Secretariat