CHAPTER 2

Investigating Linux's Principles and Philosophy

You can frequently select a product or technology on purely pragmatic grounds—what OS works well for a given task, which software suite is the least expensive, and so on. Sometimes, though, understanding the principles and philosophy that underlie a technology can be useful and might even guide your choice. This is true of Linux: the open source model, which I introduced in Chapter 1, “Selecting an Operating System,” has implications that affect how Linux works. Furthermore, some people in the Linux world are quite passionate about these principles. Whether or not you agree with them, understanding their point of view can help you appreciate the Linux culture that you'll find in the workplace, online, at conferences, and so on.

This chapter covers these issues, beginning with information on Linux's origins and development over time up to the present. I then describe open source principles and how they can affect the way an open source OS works in the real world. Finally, I describe some of the roles in which Linux can work—as an embedded OS, as a desktop or laptop OS, and as a server OS.

  • Linux through the ages
  • Using open source software
  • Understanding OS roles

Linux through the Ages

Although Linux's birth date of 1991 is recent by most historical standards, in the computer world 20 years is an eternity. Nonetheless, the software and culture of the early 1990s, and even before then, have left quite a legacy in today's software world. After all, what we use today is built atop the foundation created in the past. Thus, looking at how Linux originated will help you to understand Linux as it exists today.

Understanding Linux's Origins

Computers today can be classified in much the same way as was done in 1991, although some details have changed. A notable addition is embedded computers, as in cell phones.

In 1991, as today, computers were classified by their sizes and capabilities. Computers could belong to any of a handful of categories, ranging from desktop personal computers (PCs) to supercomputers. The PC marketplace of 1991 was dominated by x86-based computers that are the direct ancestors of today's PCs; however, other types of PCs were available, such as Macintoshes. Such computers generally used different CPUs and ran their own custom OSs.

In 1991, most PCs ran Microsoft's Disk Operating System (MS-DOS, PC-DOS, or DOS). DOS was extremely limited by today's standards; it was a single-tasking OS that didn't take full advantage of the memory or CPUs available at the time. The versions of Microsoft Windows available in 1991 ran on top of DOS. Although Windows helped work around some of DOS's limitations, it didn't fundamentally fix any of them. These early versions of Windows employed cooperative multitasking, for instance, in which programs could voluntarily give up CPU time to other processes. The DOS kernel could not wrest control from a program that hogged CPU time.
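The cooperative model described above is easy to demonstrate with a toy scheduler. The following Python sketch is hypothetical illustration, not code from any real OS: tasks are generators that voluntarily yield control, just as programs did under early Windows. Notice that nothing in the scheduler can stop a task that refuses to yield—a task written as an endless loop with no `yield` would monopolize the machine, which is exactly the weakness that a preemptive kernel eliminates.

```python
def polite(name, steps):
    """A well-behaved task: does a little work, then yields the CPU."""
    for i in range(steps):
        yield f"{name} step {i}"

def cooperative_schedule(tasks, max_turns=100):
    """Round-robin scheduler: each task runs until it *chooses* to yield.
    Nothing here can force a running task to give up control."""
    log = []
    queue = list(tasks)
    turns = 0
    while queue and turns < max_turns:
        task = queue.pop(0)
        try:
            log.append(next(task))  # task runs until its next voluntary yield
            queue.append(task)      # then goes to the back of the queue
        except StopIteration:
            pass                    # task finished; drop it from the queue
        turns += 1
    return log

log = cooperative_schedule([polite("A", 2), polite("B", 2)])
print(log)  # the two tasks interleave: A, B, A, B
```

A preemptive kernel, by contrast, uses a hardware timer interrupt to reclaim the CPU on its own schedule, so even a misbehaving program cannot starve the others.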

Unix was not the only multi-user, multitasking OS in 1991. Others, such as Virtual Memory System (VMS), were available. Unix is most relevant to Linux's history, though.

Above the PC level, Unix was a common OS in 1991. Compared to DOS and the Windows of that time, Unix was a sophisticated OS. Unix supported multiple accounts and provided true preemptive multitasking, in which the kernel could schedule CPU time for programs, even if the programs didn't voluntarily give up control. These features were practical necessities for many servers and for multiuser computers such as minicomputers and mainframes.

As time has progressed, the capabilities of each class of computer have grown. By most measures, today's PCs have the power of the minicomputers or even the mainframes of 1991. The OSs used on the PCs of 1991 don't scale well to more powerful hardware, and today's PCs are now powerful enough to run the more sophisticated OSs of 1991. For this reason, DOS and its small-computer contemporaries have been largely abandoned in favor of Unix and other alternatives.

Today's versions of Windows are not derived from DOS. Instead, they use a new kernel that shares many design features with VMS.

In 1991, Linus Torvalds was a student at the University of Helsinki, studying computer science. He was interested in learning about both Unix and the capabilities of the new x86 computer he'd just purchased. Torvalds began the program that would become the Linux kernel as a low-level terminal emulator—a program to connect to his university's larger computers. As his program grew, he began adding features that turned his terminal program into something that could be better described as an OS kernel. Eventually, he began writing with the goal of creating a Unix-compatible kernel—that is, a kernel that could run the wide range of Unix software that was available at the time.

Unix's history, in turn, stretched back two more decades, to its origin at AT&T in 1969. Because AT&T was at that time a telephone monopoly in the United States, it was legally forbidden from selling software. Therefore, when its employees created Unix, AT&T basically gave the OS away. Universities were particularly enthusiastic about adopting Unix, and some began modifying it, since AT&T made the source code available. Thus, Unix had a two-decade history of open software development. Most Unix programs were distributed as source code, since Unix ran on a wide variety of hardware platforms—binary programs made for one machine would seldom run on a different machine.

Early on, Linux began to tap into this reservoir of available software. As noted in Chapter 1, early Linux developers were particularly keen on the GNU's Not Unix (GNU) project's software, so Linux quickly accumulated a collection of GNU utilities. Much of this software had been written with workstations and more powerful computers in mind, but because computer hardware kept improving, it ran fine on the x86 PCs of the early 1990s.

Linux quickly acquired a devoted following of developers who saw its potential to bring workstation-class software to the PC. These people worked to improve the Linux kernel, to make the necessary changes in existing Unix programs so that they would work on Linux, and to write Linux-specific support programs. By the mid-1990s, several Linux distributions existed, including some that survive today. (Slackware was released in 1993, and Red Hat in 1995, for example.)

The 386BSD OS was a competing Unix-like OS in the early 1990s. Today it has forked into several related OSs: FreeBSD, NetBSD, OpenBSD, DragonFly BSD, and PC-BSD.

THE MICROKERNEL DEBATE

Linux is an example of a monolithic kernel, which is a kernel that does everything a kernel is supposed to do in one big process. In 1991, a competing kernel design, known as a microkernel, was all the rage. Microkernels are much smaller than monolithic kernels; they move as many tasks as they can into non-kernel processes and then manage the communications between processes.

Soon after Linux's release, Linus Torvalds engaged in a public debate with Andrew Tanenbaum, the creator of the Minix OS that Torvalds used as an early development platform for Linux. Minix uses a microkernel design, and Tanenbaum considered Linux's monolithic design to be backward.

As a practical matter for an end user, either design works. Linux and the BSD-derived kernels use monolithic designs, whereas Minix and the GNU Hurd are examples of microkernels; modern versions of Windows use a hybrid design that borrows from both approaches. Some people still get worked up over this distinction, though.
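The structural difference between the two designs can be sketched in a few lines. In this hypothetical Python sketch (a toy model, not real kernel code), the monolithic kernel implements a filesystem service as an internal function call, whereas the microkernel merely routes a message to a separate filesystem server and relays the reply:

```python
# Monolithic style: services are implemented inside the kernel itself,
# so a request is just a function call within one big process.
class MonolithicKernel:
    def read_file(self, path):
        return f"data from {path}"    # filesystem code lives in the kernel

# Microkernel style: the kernel only passes messages; the filesystem is
# a separate server process (simulated here as a plain object).
class FileServer:
    def handle(self, msg):
        return f"data from {msg['path']}"

class Microkernel:
    def __init__(self):
        self.servers = {"fs": FileServer()}   # registry of non-kernel servers

    def send(self, server, msg):
        # The kernel's only job: route the message and return the reply.
        return self.servers[server].handle(msg)

mono = MonolithicKernel().read_file("/etc/passwd")
micro = Microkernel().send("fs", {"path": "/etc/passwd"})
print(mono == micro)  # same result; the designs differ in structure, not outcome
```

The user sees the same result either way; the debate is about where the code lives and how failures and communication overhead are contained.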

Seeing Today's Linux World

By the mid-1990s, the most important features of Linux as it exists today had been established. Changes since then have included:

Improvements in the kernel The Linux kernel has seen massive changes since 1991, when it lacked many of the features we rely on today. Improvements include the addition of networking features, innumerable hardware drivers, support for power management features, and support for many non-x86 CPUs.

Improvements in support tools Just as work has progressed on the Linux kernel, improvements have also been made to the support programs on which it relies—the compilers, shells, GUIs, and so on.

Creation of new support tools New support tools have emerged over the years. These range from simple and small utilities to big desktop environments. In fact, some of these tools, such as modern desktop environments, are far more obvious to the end user than is the kernel itself.

Creation of new distributions As noted earlier, Slackware dates to 1993 and Red Hat (the predecessor to Red Hat Enterprise Linux, CentOS, and Fedora) originated in 1995. Other distributions have emerged in the intervening years, and some have been quite important. The Android OS used on smart phones and tablets, for instance, is becoming influential in the early 2010s.

Linux's roots remain very much in the open source software of the 1980s and 1990s. Although a typical desktop or embedded OS user is likely to perceive the OS through the lens of the GUI, much of what happens under the surface happens because of the Linux kernel and open source tools, many of which have existed for decades.

Using Open Source Software

The philosophies that underlie much software development for Linux are different from those that drive most software development for Windows. These differing philosophies affect how you obtain the software, what you can do with it, and how it changes over time. Thus, I describe these principles in this section. I also describe how Linux functions as a sort of “magnet,” integrating software from many sources in one place.

Understanding Basic Open Source Principles

Broadly speaking, software can be described as coming in several different forms, each with different expectations about payment, redistribution, and users' rights. The number of categories varies depending on the depth of analysis and the prejudices of the person doing the categorization, but as a starting point, four categories will do:

Commercial software Individuals or companies develop commercial software with the intent to sell it for a profit. Developers generally keep the source code for commercial software secret, which means that users can't normally make changes to the software except to alter configuration settings the software supports. In the past, commercial software was sold in stores or by mail order, but today it's often sold via downloads from the Internet. Redistributing commercial software is generally illegal. Microsoft Windows and Microsoft Office are both common examples of commercial software.

Shareware software From a legal perspective, shareware software is similar to commercial software in that it's copyrighted and the author asks for payment. The difference is that shareware is distributed on the Internet or in other ways and “sold” on an honor system—if you use the software beyond a trial period, you're expected to pay the author. Shareware was common in 1991 and is still available today, but it's much rarer.

Freeware should not be confused with free software, which is closely related to open source software. Chapter 3, “Understanding Software Licensing,” describes free software in more detail.

Freeware Freeware, like shareware, is available for free. Unlike shareware authors, though, the authors of freeware don't ask for payment. Sometimes, freeware is a stripped-down version of a more complete shareware or commercial program. Other times, the authors make it available for free to promote another product. Examples include Windows drivers for many hardware devices or the Adobe Reader program for reading Portable Document Format (PDF) files. As with commercial and shareware programs, freeware generally comes without source code.

Open source software Open source software is defined by a set of ten principles, available at http://www.opensource.org/docs/osd. The most important of these principles are the right of the user to redistribute the program, the availability of source code, and the right of the user to make and distribute changed versions of the program. These principles mean that users can alter open source programs to suit their own needs, even in ways or for purposes the original author doesn't support.

Chapter 3 covers specific open source licenses in greater detail.

Variants within each of these categories exist, as well as hybrids that don't quite fit in any category. For instance, the Open Source Initiative maintains a list of licenses it has approved as fulfilling its criteria (http://www.opensource.org/licenses); however, developers sometimes release software using obscure licenses or using licenses that impose conditions that run afoul of one of the more obscure Open Source Initiative rules. Such software is technically not open source, but it might be closer to open source than to another category.

The basic idea behind open source software is that software developed in a transparent manner is likely to be superior to software developed in a closed manner. This superiority (and arguments against it) comes in several ways:

This principle is sometimes referred to as “Linus's Law,” which was stated by Eric S. Raymond in “The Cathedral and the Bazaar”: “Given enough eyeballs, all bugs are shallow.”

Better code Exposing source code to the community at large means that it can be reviewed, judged, and improved upon by any interested party. Otherwise obscure bugs might be found and squashed when they might linger and cause problems in a closed-source product. On the other hand, the validity of this claim is not well supported by research, and smaller projects might not gain much in the way of interest from other programmers, so they might not benefit from outside code review.

More flexibility By providing users with the source code, an open source project gives users the ability to customize the software for their own needs. If users submit changes back to the maintainer, or release them as a new branch of the project, then everybody can benefit from such changes. Of course, critics would argue that this flexibility is only a benefit to those with the necessary skill and time to make such changes, or to those with the money to hire somebody to do it.

Lower cost Although the open source definition does not forbid sale of software, the redistribution requirements mean that open source software ends up being available free of charge. On the other hand, if you want support you may need to purchase a support contract, which can reduce or eliminate the cost benefits.

Lack of vendor lock-in The developers of some proprietary products, particularly very popular ones, can make life difficult for competing products by using proprietary file formats or standards and by not supporting more open standards. Open source tools are less subject to such problems, since they can be modified to support open standards even if they initially don't. As a practical matter, though, even proprietary file formats and protocols are usually reverse-engineered eventually, so vendor lock-in tends to be a temporary problem rather than a permanent one.

Of course, within the Linux community the general consensus is that each of these factors is a real point in favor of Linux, and of open source software generally; the downsides noted are generally regarded as minor compared to the advantages. In the end, you'll need to make up your own mind on these matters after using different types of software.

Linux as a Software Integrator

Soon after Unix was created, it fragmented into a set of loosely affiliated OSs. These OSs were incompatible at the binary level but more or less compatible at the source code level. This is still true today. You can take the same program and compile it for FreeBSD, OS X, and Linux, and it will work the same on all three platforms—but the compiled binaries made for one platform won't run on the others.

There are exceptions to this rule, though. Some programs rely on features that are available on just some Unix-like OSs. Others have quirks that make it impossible to compile them on some OSs. If a program falls into disuse, it may become unusable on newer OSs because it relies on compiler or OS features that have changed. Such problems tend to be ironed out over time, but they do crop up periodically.

Because of Linux's popularity, most open source Unix programs compile and work fine on Linux. Commercial programs for Linux also exist, although most of these are obscure or specialized. In any event, Linux has become an OS that most open source Unix programs must support. This effect is so strong that many projects now target Linux as the primary platform.

Understanding OS Roles

Computers fill many roles in the world, and as computers have become more common and less expensive, those roles have multiplied. Linux can serve as the OS for most of these roles, each of which draws on its own subset of support utilities. Some of these roles also require tweaking the kernel itself. I briefly describe three of these roles: embedded computers, desktop and laptop computers, and server computers.

Understanding Embedded Computers

As noted in Chapter 1, embedded computers are specialized devices that fulfill a specific purpose. Examples include:

Apple, Microsoft, and other vendors provide their own OSs for cell phones.

Cell phones Modern cell phones use computers with OSs that range from simple to complex. Linux powers some of these cell phones, usually in the form of Android.

e-book readers These devices, like cell phones, are specialized computers and so use an OS to power them. For many current e-book readers, that OS is Linux—either a custom Linux version or Android.

The MythTV package (http://www.mythtv.org) can turn an ordinary PC into a Linux-based DVR, although you'll need a TV tuner and other specific hardware to make it work.

DVRs Digital video recorders (DVRs), which record TV shows for later viewing, are computers with specialized software. Some of these, including the popular TiVo models, run Linux.

Car computers Automobiles have included computers for years. These have mostly been tucked out of the way to monitor and control the engine; however, modern cars increasingly come with computers that users more readily identify as being computers. They manage global positioning system (GPS) navigation systems, control the radio, and even provide Internet access.

Appliances Televisions, refrigerators, and other appliances are increasingly using computers to monitor energy use and for other purposes.

You might think of tablet computers as falling in this category as well, although they can more closely resemble desktop or laptop computers. The distinction is mainly one of how much control the user has over the OS: embedded devices are designed to be used, but not maintained, by end users. The system administration tasks described in this book are done at the factory or through much simpler and more specialized user interfaces.

Understanding Desktop and Laptop Computers

Desktop computers are similar to another class of computer, known as workstations. Workstations tend to be more powerful and specialized, and they often run Unix or Linux.

Linux began life on a desktop computer, and although Linux doesn't come close to dominating that market, desktop computers are a good way to begin learning about Linux. Laptop computers are similar to desktop computers from a system administration perspective; both types of computers are often used by a small number of people for productivity tasks, such as word processing, Web browsing, and managing digital photos. For brevity, I'll use the term desktop to refer to both types of computer from here on.

Linux software for such tasks is widely available and is quite good, although some people prefer commercial counterparts, such as Microsoft Office or Adobe Photoshop, that aren't available for Linux. This preference for a few specific commercial products is part of why Microsoft Windows continues to dominate the desktop market. Some people have speculated that the open source development model doesn't lend itself to the creation of popular GUI applications because software developers tend to be too technically oriented to fully appreciate the needs of less technically capable users. Without an explicit way to require developers to fulfill these needs, which for-profit companies create, open source software projects lag behind their commercial counterparts in usability. This view is not universally held, though, and at worst, open source projects lag behind their commercial counterparts just a bit.

Specific software that's required on most Linux-based desktop computers includes:

  • The X Window System GUI (X for short)
  • A popular desktop environment, such as GNOME, KDE, Xfce, or Unity
  • A Web browser, such as Mozilla Firefox
  • An email client, such as Mozilla Thunderbird or Evolution
  • A graphics editor, such as the GIMP
  • An office suite, such as OpenOffice.org or the similar LibreOffice

Additional requirements vary depending on the user's needs. For instance, one user might need multimedia editing tools, whereas another might need scientific data analysis software.

Linux distributions such as Fedora and Ubuntu typically install these popular desktop tools by default, or as a group by selecting a single install-time option. These distributions are also designed for relatively easy maintenance, so that users with only modest skill can install the OS and keep it running over time.

Understanding Server Computers

Server computers can be almost identical to desktop computers in terms of their hardware, although servers sometimes require bigger hard disks or better network connections, depending on how they're used. Many popular network server programs were written for Unix or Linux first, making these platforms the best choice for running them. Examples include:

  • Web servers, such as Apache
  • Email servers, such as sendmail and Postfix
  • Databases, such as MySQL
  • File servers, such as the Network File System (NFS) or Samba
  • Print servers, such as the Common Unix Printing System (CUPS) or Samba
  • Domain Name System (DNS) servers, such as the Berkeley Internet Name Domain (BIND)
  • Dynamic Host Configuration Protocol (DHCP) servers, such as the Internet Software Consortium's (ISC's) dhcpd
  • Time servers, such as the Network Time Protocol (NTP) daemon
  • Remote login servers, such as Secure Shell (SSH) or Virtual Network Computing (VNC)

Remote login servers enable users to run desktop-style programs on a computer remotely. Therefore, they're sometimes found even on desktop systems.

In a large organization, each of these services may have a distinct associated server computer. It's possible, though, for one computer to run many of these server programs simultaneously.

Most of these servers do not require a GUI, so server computers can do without X, desktop environments, or the typical desktop programs you'll find on a desktop computer. One of Linux's advantages over Windows is that you can run the computer without these elements, and even uninstall them completely. Doing so means that the GUI won't be needlessly consuming system resources such as RAM. Furthermore, if an item such as X isn't running, any security bugs it might harbor become unimportant. Some distributions, such as Debian, Arch, and Gentoo, eschew GUI configuration utilities. This makes these distributions unfriendly to new users, but the reliance on text-mode configuration tools is not a problem to experienced administrators of server computers.

The people who maintain large server computers are generally technically quite proficient and can often contribute directly to the open source server projects they use. This close association between users and programmers can help keep server projects on the cutting edge of what's required in the real world.

Note that the distinction between desktop and server computers is not absolute; a computer can run a mixture of both types of software. For instance, you might configure desktop computers in an office environment to run file server software. This configuration enables users to more easily share their work with others in the office. In a home or small office setting, running other servers on desktop computers can obviate the need to buy specialized hardware to fulfill those roles.

THE ESSENTIALS AND BEYOND

Linux's development history is tied to that of Unix and to open source development generally. Open source software is provided with source code and with the right to modify and redistribute the source code. This guarantees your ability to use the software even in ways the original author did not anticipate or support, provided you have the knowledge and time to alter it, or the resources to hire somebody else to do so. These open source principles have led to a great deal of popular software, particularly in the server arena; however, open source developers have been less able to capture the general public's excitement with applications designed for desktop computers.

SUGGESTED EXERCISES

  • Read the Features Web page of FreeBSD (http://www.freebsd.org/features.html), a competitor to Linux. How would you say it differs from Linux?
  • Research the features of two or three open source programs that interest you, such as Apache, LibreOffice, and Mozilla Firefox. Do the feature lists seem complete? Are there features missing that are present in commercial counterparts?

REVIEW QUESTIONS

  1. What type of multitasking does Linux use?
    A. Preemptive
    B. Multi-user
    C. Co-operative
    D. Single-tasking
    E. Single-user
  2. Which of the following is a characteristic of all open source software?
    A. The software cannot be sold for a profit; it must be distributed free of charge.
    B. It must be distributed with both source code and binaries.
    C. Users are permitted to redistribute altered versions of the original software.
    D. The software was originally written at a college or university.
    E. The software must be written in an interpreted language that requires no compilation.
  3. Which of the following programs is most likely to be installed and regularly used on a desktop computer that runs Linux?
    A. Apache
    B. Postfix
    C. Android
    D. Evolution
    E. BIND
  4. True or false: VMS was a common OS on x86 PCs at the time Linux was created.
  5. True or false: Some DVRs run Linux.
  6. True or false: A Linux computer being used as a server generally does not require X.
  7. Linux uses a _________ kernel design, as contrasted with a microkernel design.
  8. A type of software that's distributed for free but that requires payment on the “honor system” if a person uses it is called _________.
  9. A _________ computer is likely to run a word processor and Web browser.