A Linux system is only as good as the software you install on it. The Linux kernel by itself is pretty boring; you need applications such as web servers, database servers, browsers, and word processing tools to actually do anything useful with your Linux system. This chapter addresses the role of software on your Linux system and how you get and manage it. First we discuss just how software is created in the age of open source and how you retrieve and compile software code. Next, we explore the ways Linux makes things easier for us by bundling pre-built software packages to make installation and removal of applications a breeze.
The “source” part of the open-source world refers to the availability of the actual programming code used to create applications. While commercial applications hide their source code from prying eyes, open-source projects make their program code openly available for anyone to peruse and modify if needed. Most applications in the Linux environment are distributed as open-source projects, so you’re free to download, modify, compile, and run those applications on your Linux system.
While this may sound complicated, it really isn’t. The following sections walk through the process of downloading, extracting, compiling, and running open-source application code on your Linux system.
Once developers are ready to release their open-source applications to the world, they publish them on the Internet. Developers for most open-source packages use a website to host their code and documentation, and many even provide user forums that allow customers to discuss issues and possible improvements.
While you can use a graphical browser to connect to a website and download source code, that’s not always available, especially in Linux server environments. Linux provides a couple of command-line tools to help us download source code files directly from the command line.
The wget application is a command-line tool from the GNU Project that allows you to retrieve files from remote servers using FTP, FTPS, HTTP, or HTTPS. You specify the protocol, server name, and file to download using a standard URL format:

wget http://remotehost/filename

where remotehost is the full hostname of the server hosting the files, and filename is the name of the source code file you wish to retrieve, including any required folder path.
The wget application supports lots of command-line options to help you customize the connection and download. These especially come in handy if you write scripts to automatically download files. Check out the manual pages for the wget application to see what all you can do.
Yet another solution is the cURL application. It does the same thing as wget but supports many more protocols, such as DICT, FILE, Gopher, IMAP, LDAP, POP3, RTSP, SCP, SFTP, SMTP, and TFTP. It too uses the standard URL format for you to specify the protocol, server name, and file to download.
One nice feature of cURL is its ability to work with the secure HTTPS protocol. It will warn you if the remote website is using a self-signed certificate or if the certificate is signed by an untrusted certificate authority (CA).
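Because cURL supports the FILE protocol, you can even experiment with its URL syntax without a network connection. A minimal sketch, assuming the curl package is installed (the file path here is just an example):

```shell
# Create a small sample file, then retrieve it using curl's file:// protocol.
# The URL syntax is the same one used for remote HTTP or FTP downloads.
echo "sample data" > /tmp/sample.txt
curl -s file:///tmp/sample.txt
# -s silences the progress meter; -o name would save the output to a file
```

The same command with an http:// or https:// URL downloads from a remote server instead.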
A relatively recent advancement in software distribution is GitHub (https://github.com). It provides a centralized location on the Internet for projects that use the git version control system (see Chapter 27). The code for many open-source projects is now posted in GitHub, even if there is already a dedicated website for the project. You can use both wget and cURL to download project code from GitHub.
Distributing the source code for applications can be a bit tricky. Source code projects often comprise many different files: source code files, header files, library files, and documentation files. Trying to distribute a large batch of files for a project can be a challenge. Linux provides a somewhat odd solution for that.
The tar program was originally developed for archiving files and folders to tape drives for backups (the tar name originally stood for tape archiver). These days it also comes in handy for bundling project files to distribute on the Internet.
The tar
command allows you to specify multiple files, or even multiple folders of files, to bundle together into a single output file. You can then transfer the entire project bundle as a single file and extract the files and folders on a remote system. It’s so versatile in what it can do that there is a long list of command-line options available, which can become somewhat imposing.
For most bundling operations, three basic option groups are commonly used for the tar
command:
-cvf: Create a new tar file
-tvf: Display the contents of a tar file
-xvf: Extract the contents of a tar file

To create a new tar archive file, specify the output file name and then the list of files and folders to bundle, as shown in the example in Listing 13.1.
Listing 13.1: Using the tar
command to bundle files
$ tar -cvf test.tar test1.txt test2.txt test3.txt
test1.txt
test2.txt
test3.txt
$ ls -al
total 32
drwxr-xr-x 2 rich rich 4096 Dec 5 08:33 .
drwxr-xr-x 19 rich rich 4096 Dec 5 08:28 ..
-rw-r--r-- 1 rich rich 795 Dec 5 08:19 test1.txt
-rw-r--r-- 1 rich rich 1020 Dec 5 08:19 test2.txt
-rw-r--r-- 1 rich rich 2280 Dec 5 08:20 test3.txt
-rw-r--r-- 1 rich rich 10240 Dec 5 08:33 test.tar
$
In Listing 13.1, test.tar
is the name of the archive file to create. For the input files and folders, you can use wildcard characters to specify the names, or even redirect a listing of files to the tar
command, making it very versatile in scripts. One of the advantages of bundling folders with tar
is that it preserves the folder structure of your environment, including file and folder ownership, making it easier to extract the files and re-create the original environment.
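The file-list redirection mentioned above can be sketched with the find command and GNU tar's -T option, which reads the list of files to archive from standard input (the directory and file names here are hypothetical):

```shell
# Build a list of .txt files with find, then pipe it to tar.
# -T - tells GNU tar to read the file names to archive from standard input.
mkdir -p /tmp/tardemo && cd /tmp/tardemo
echo one > a.txt
echo two > b.txt
find . -name '*.txt' | tar -cvf project.tar -T -
tar -tf project.tar   # list the archive contents to confirm
```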
While not required, it’s become somewhat of a de facto standard in Linux to use a .tar file name extension to identify a tar archive file. This is commonly called a tarball in Linux circles.
If you need to see what’s in a tar archive file, use the -tvf
option group:
$ tar -tvf test.tar
-rw-r--r-- rich/rich 795 2018-12-05 08:19 test1.txt
-rw-r--r-- rich/rich 1020 2018-12-05 08:19 test2.txt
-rw-r--r-- rich/rich 2280 2018-12-05 08:20 test3.txt
$
Notice that both the file ownerships and the file permissions are retained within the tar archive file. When you extract the files onto another system, they'll be owned by whatever account matches the numeric user ID recorded for the original files.
Extracting the files and folders from a tar file is just a matter of using the -xvf
option group:
$ tar -xvf test.tar
test1.txt
test2.txt
test3.txt
$ ls -al
total 32
drwxr-xr-x 2 rich rich 4096 Dec 5 08:38 .
drwxr-xr-x 20 rich rich 4096 Dec 5 08:38 ..
-rw-r--r-- 1 rich rich 795 Dec 5 08:19 test1.txt
-rw-r--r-- 1 rich rich 1020 Dec 5 08:19 test2.txt
-rw-r--r-- 1 rich rich 2280 Dec 5 08:20 test3.txt
-rw-r--r-- 1 rich rich 10240 Dec 5 08:38 test.tar
$
While the tar archive method makes bundling files for distribution easy, it does tend to create a very large file, which can be awkward to handle. Linux developers usually compress the final tar archive file using some type of file compression utility.
In Linux there is a plethora of ways to create compressed files. Table 13.1 lists the most popular methods you’ll run into.
Table 13.1 Linux compression methods
Method | File name extension | Description
bzip2 | .bz2 | Improvement to the gzip method that reduces file sizes
compress | .Z | The original Unix compress utility
gzip | .gz | Fast compression method that produces moderate-sized files
xz | .xz | Creates smaller compressed files, but can be very slow
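As a small illustration of the speed/size trade-off shown in the table, gzip itself accepts compression levels from -1 (fastest) to -9 (smallest). A sketch, assuming the gzip utility is installed (the file names are just examples):

```shell
# Compress the same data at the fastest and best compression levels.
# -c writes the result to stdout, so the original file is left untouched.
seq 1 10000 > /tmp/data.txt
gzip -1 -c /tmp/data.txt > /tmp/data.fast.gz
gzip -9 -c /tmp/data.txt > /tmp/data.best.gz
ls -l /tmp/data.txt /tmp/data.fast.gz /tmp/data.best.gz
```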
By far the most common compression utility used in Linux for tar archive files is the GNU gzip package. To compress a single file, use the gzip
utility with the file name, as shown in Listing 13.2.
Listing 13.2: Compressing a tar archive file
$ gzip test.tar
$ ls -al
total 24
drwxr-xr-x 2 rich rich 4096 Dec 5 08:53 .
drwxr-xr-x 20 rich rich 4096 Dec 5 08:39 ..
-rw-r--r-- 1 rich rich 795 Dec 5 08:19 test1.txt
-rw-r--r-- 1 rich rich 1020 Dec 5 08:19 test2.txt
-rw-r--r-- 1 rich rich 2280 Dec 5 08:20 test3.txt
-rw-r--r-- 1 rich rich 204 Dec 5 08:33 test.tar.gz
$
As seen in Listing 13.2, the gzip
program adds a .gz file name extension to the end of the file that’s compressed.
Often, with compressed tar archive files, you’ll see developers shorten the .tar.gz
file name extension pair to just .tgz.
To decompress a compressed tarball and extract the original files, you have a couple of options. One option is to use a two-step approach. First use the gunzip command directly on the compressed tar file:
$ gunzip test.tar.gz
This restores the original test.tar
file. Then you extract the tar file using the standard -xvf
options of the tar
command.
The second option is to decompress and extract the tarball file in one step by just adding the -z
option to the tar
command line:
$ tar -zxvf test.tgz
test1.txt
test2.txt
test3.txt
$ ls -al
total 24
drwxr-xr-x 2 rich rich 4096 Dec 5 09:03 .
drwxr-xr-x 3 rich rich 4096 Dec 5 09:02 ..
-rw-r--r-- 1 rich rich 795 Dec 5 08:19 test1.txt
-rw-r--r-- 1 rich rich 1020 Dec 5 08:19 test2.txt
-rw-r--r-- 1 rich rich 2280 Dec 5 08:20 test3.txt
-rw-r--r-- 1 rich rich 204 Dec 5 09:02 test.tgz
$
One important thing to note is that when you use the gunzip
program directly, it removes the compressed file and replaces it with the original file, but when you use the -z
option with the tar
program, it retains the compressed file along with decompressing and extracting the original files.
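If you want gzip to behave more like tar -z and keep the compressed file around, newer versions of gzip support a -k (keep) option. A quick sketch (the file name is just an example):

```shell
# gzip normally deletes the original file; -k keeps it alongside the .gz copy.
echo "keep me" > /tmp/keep.txt
gzip -k /tmp/keep.txt
ls /tmp/keep.txt /tmp/keep.txt.gz   # both files now exist
```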
Once you have the source code package files downloaded onto your Linux system, you’ll need to compile them to create an executable file to run the application. Linux supports a wide variety of programming languages, so you’ll need to know just what programming language the application was written in. Once you know that, you’ll need to install a compiler for the program code. A compiler converts the source code into an executable file the Linux system can run.
The most common tool used for compiling programs in Linux is the GNU Compiler Collection (gcc). While originally created to support only the C programming language, gcc now supports an amazing array of different programming languages, such as Ada, C++, Fortran, Go, Objective-C, and Objective-C++ (older releases even included a Java compiler), along with the OpenMP parallel programming extensions.
Most Linux distributions don’t include the gcc program by default, so most likely you’ll need to install it on your Linux system. For Ubuntu, it’s part of the build-essential package, while for CentOS you’ll find it in the Development Tools package group.
To compile simple one-file programs, just run the gcc
command-line command against the source code file to produce the executable file that you run on your system. The -o
command-line option allows you to specify the name of the compiled output file; otherwise it defaults to the ugly a.out
file name:
$ cat hello.c
#include <stdio.h>
int main() {
printf("Hello, this is my first C program!\n");
return 0;
}
$ gcc -o hello hello.c
$ ./hello
Hello, this is my first C program!
$
As mentioned earlier, most larger applications require additional header and library files besides the source code files to build the final application file. Depending on just how many source code, header, and library files are required for an application, the gcc
command process can get very long and complicated. Separate library files need to be compiled in the proper order before the main program file can be compiled, creating a difficult road map to follow to generate the application.
There’s a simple solution available for you to help keep track of all that. The make utility allows developers to create scripts that guide the compiling and installation process of application source code packages so that even novices can compile and install an application from source code.
Usually there are three steps involved with installing an application that uses a make script:

1. Run the configure script to analyze your system and customize the make script to build the application for your environment.
2. Run the make utility by itself to build the necessary library files and executable files for the application.
3. Run make install (usually as the root user) to install the application files in the appropriate locations on your system.

What makes C language programs so complicated is that they often split the application functions into separate library files. Each library file contains one or more specialized functions used in the application code.
The benefit of splitting functions into separate library files is that multiple applications that use the same functions can share the same library files. These files, called shared libraries, make it easier to distribute applications but more complicated to keep track of what library files are installed with which applications.
While not necessary for compiling the application source code, the ldd utility can come in handy if you need to track down missing library files for an application. It displays a list of the library files required for the specified application file:
$ ldd hello
linux-vdso.so.1 (0x00007fff0f378000)
libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007f8e16063000)
/lib64/ld-linux-x86-64.so.2 (0x00007f8e16656000)
$
My simple hello
application requires two external library files, the standard linux-vdso.so.1
and libc.so.6
files, which provide the ability for the printf()
function to display the output. The ldd
utility also shows where those files were found on the Linux system. That in itself can be helpful when troubleshooting issues with applications picking up the wrong library files.
While the tar
, gcc
, and make
programs make it easier to distribute, compile, and install application source code, that’s still somewhat of a messy process for installing new applications. For most Linux users, all they want to do is download an application and use it.
To help solve that problem, Linux distributions have created a system for bundling already compiled applications for distribution. This bundle is called a package, and it consists of all the files required to run a single application. You can then install, remove, and manage the entire application as a single package rather than as a group of disjointed files.
Tracking software packages on a Linux system is called package management. Linux implements package management by using a database to track the installed packages on the system. The package management database keeps track of not only what packages are installed but also the exact files and file locations required for each application. Determining what applications are installed on your system is as easy as querying the package management database.
As you would expect, different Linux distributions have created different package management systems for working with their package management databases. However, over the years, two main package management systems have risen to the top and have become standards: the Debian package management system, used by Debian-based distributions, and the Red Hat package management system, used by Red Hat–based distributions.
Because of their popularity, these are the two package management methods covered by the Linux+ exam, so these are the two package management methods we’ll cover in detail in this chapter.
Each package management system uses a different method of tracking application packages and files, but they both track similar information about each installed package and its files.
The following sections discuss the tools for using each of these package management systems.
Both the Debian and Red Hat package management systems have similar sets of tools for working with software packages in the package management system. We’ll now take a look at both systems and the tools to use with them.
As you can probably guess, the Debian package management system is mostly used on Debian-based Linux systems, such as Ubuntu. Debian bundles application files into a single .deb package file for distribution. The core tool to use for handling .deb
files is the dpkg program.
The dpkg
program is a command-line utility that has options to install, update, and remove .deb
package files on your Linux system. The basic format for the dpkg
command is as follows:
dpkg [options] action package-file
The action
parameter defines the action to be taken on the file. Table 13.2 lists the more common actions you’ll need to use.
Table 13.2 The dpkg command actions

Action | Description
-C | Searches for broken installed packages and suggests how to fix them
--configure | Reconfigures an installed package
--get-selections | Displays currently installed packages
-i | Installs the package
-I | Displays information about an uninstalled package file
-l | Lists all installed packages matching a specified pattern
-L | Lists the installed files associated with a package
-p | Displays information about an installed package
-P | Removes an installed package, including configuration files
-r | Removes an installed package but leaves the configuration files
-S | Locates the package that owns the specified files
Each action has a set of options that you can use to modify the basic behavior of the action, such as to force overwriting an already installed package or ignore any dependency errors.
To use the dpkg
program, you must have the .deb
software package available, from either an installation DVD or downloading the package from the Internet. Often you can find .deb
versions of application packages ready for distribution on the application website, or most distributions maintain a central location for packages to download.
The Debian distribution also provides a central clearinghouse for Debian packages at https://www.debian.org/distrib/packages.
When you download a .deb package for a precompiled application, be careful that you get the correct package for your workstation's processor type. Precompiled binaries are built for specific processors, and trying to run the wrong one on your system will not work. Usually the processor type is included as part of the package name.
Once you download the .deb
package, use dpkg
with the -i
option to install it:
$ sudo dpkg -i zsh_5.3.1-4+b2_amd64.deb
Selecting previously unselected package zsh.
(Reading database ... 204322 files and directories currently installed.)
Preparing to unpack zsh_5.3.1-4+b2_amd64.deb ...
Unpacking zsh (5.3.1-4+b2) ...
dpkg: dependency problems prevent configuration of zsh:
zsh depends on zsh-common (= 5.3.1-4); however:
Package zsh-common is not installed.
dpkg: error processing package zsh (--install):
dependency problems - leaving unconfigured
Processing triggers for man-db (2.8.3-2ubuntu0.1) ...
Errors were encountered while processing:
zsh
$
You can see in this example that the package management software checks to ensure that any packages that are required for the application are installed and produces an error message if any of them are missing. This gives you a clue as to what other packages you need to install.
If you’d like to see all of the packages installed on your system, use the -l
option:
$ dpkg -l
Desired=Unknown/Install/Remove/Purge/Hold
| Status=Not/Inst/Conf-files/Unpacked/halF-conf/Half-inst/trig-aWait/Trig
|/ Err?=(none)/Reinst-required (Status,Err: uppercase=bad)
||/ Name Version Architecture Description
+++-==============-============-============-===========================
ii accountsservic 0.6.45-1ubun amd64 query and manipulate accounts
ii acl 2.2.52-3buil amd64 Access control list utilities
ii acpi-support 0.142 amd64 scripts for handling ACPI
ii acpid 1:2.0.28-1ub amd64 Advanced Config and Power
ii adduser 3.116ubuntu1 all add and remove users
ii adium-theme-ub 0.3.4-0ubunt all Adium message style for Ubuntu
ii adwaita-icon-t 3.28.0-1ubun all default icon theme of GNOME
ii aisleriot 1:3.22.5-1 amd64 GNOME solitaire card game
...
You can also provide a search term on the command line to limit the packages returned in the output:
$ dpkg -l openssh*
Desired=Unknown/Install/Remove/Purge/Hold
| Status=Not/Inst/Conf-files/Unpacked/halF-conf/Half-inst/trig-aWait/Trig
|/ Err?=(none)/Reinst-required (Status,Err: uppercase=bad)
||/ Name Version Architecture Description
+++-==============-============-============-=============================
ii openssh-client 1:7.6p1-4ubu amd64 secure shell (SSH) client
un openssh-server <none> <none> (no description available)
$
If you need to remove a package, you have two options. The -r
action removes the package but keeps any configuration and data files associated with the package installed. This is useful if you’re just trying to reinstall an existing package and don’t want to have to reconfigure things. If you really do want to remove the entire package, use the -P
action, which purges the entire package, including configuration files and data files from the system.
Be very careful with the -p
and -P
options. They’re easy to mix up. The -p
option lists the packages, while the -P
option purges the packages. Quite a difference!
The dpkg
tool gives you direct access to the package management system, making it easier to install applications on your Debian-based system.
The Red Hat Linux distribution, along with other Red Hat–based distributions such as Fedora and CentOS, use the .rpm package file format. The main tool for working with .rpm
files is the rpm program.
Similar to the dpkg
tool, the rpm
program is also a command-line program to install, modify, and remove .rpm
software packages. The basic format for the rpm
program is as follows:
rpm action [options] package-file
The actions for the rpm
command are shown in Table 13.3.
Table 13.3 The rpm command actions

Action | Description
-b | Builds a binary package from source files
-e | Uninstalls the specified package
-F | Upgrades a package only if an earlier version already exists
-i | Installs the specified package
-q | Queries whether the specified package is installed
-U | Installs or upgrades the specified package
-V | Verifies whether the package files are present and unchanged
To use the rpm
command, you must have the .rpm
package file downloaded onto your system. While you can use the -i
action to install packages, it’s more common to use the -U
action, which installs the new package or upgrades the package if it’s already installed. Adding the -vh
option is a popular combination that shows the progress of the update and what it’s doing:
$ sudo rpm -Uvh zsh-5.0.2-31.el7.x86_64.rpm
Preparing... ################################# [100%]
Updating / installing...
1:zsh-5.0.2-31.el7 ################################# [100%]
$
You use the -q
action to query the package management database for installed packages:
$ rpm -q zsh
zsh-5.0.2-31.el7.x86_64
$
If you need to remove an installed package, just use the -e
action:
$ sudo rpm -e zsh
$ sudo rpm -q zsh
package zsh is not installed
$
The -e
action doesn’t show if it was successful, but it will display an error message if something goes wrong with the removal.
The dpkg
and rpm
commands are useful tools, but they both have their limitations. If you’re looking for new software packages to install, it’s up to you to find them. Also, if a package depends on other packages to be installed, it’s up to you to install those packages first and in the correct order. That can become somewhat of a pain to keep up with.
To solve that problem, each Linux distribution has its own central clearinghouse of packages, called a repository. The repository contains software packages that have been tested and known to install and work correctly in the distribution environment. By placing all known packages into a single repository, the Linux distribution can create a one-stop shopping environment for installing all applications for the system.
Most Linux distributions create and maintain their own repositories of packages. There are also additional tools for working with package repositories. These tools can interface directly with the package repository to find new software and even automatically find and install any dependent packages the application requires to operate.
Besides the officially supported distribution package repositories, many third-party package repositories have sprung up on the Internet. Often specialized or custom software packages aren’t distributed as part of the normal Linux distribution repository but are available in third-party repositories. The repository tools allow you to retrieve those packages as well.
The following sections walk through how to use the Debian and Red Hat repository tools.
The core tool used for working with Debian repositories is the apt suite of tools. This includes the apt-cache program, which provides information about the package database, and the apt-get program, which does the work of installing, updating, and removing packages.
The apt suite of tools relies on the /etc/apt/sources.list
file to identify the locations of where to look for repositories. By default, each Linux distribution enters its own repository location in that file, but you can add additional repository locations as well if you install third-party applications not supported by the distribution.
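The repository entries in that file follow a simple one-line format. A sketch of what the entries on an Ubuntu 18.04 system typically look like (your distribution's URLs and release names will differ):

```
# /etc/apt/sources.list entry format: type URI release components
deb http://archive.ubuntu.com/ubuntu bionic main restricted
deb http://archive.ubuntu.com/ubuntu bionic-updates main restricted
# deb-src lines point at source code packages rather than compiled binaries
deb-src http://archive.ubuntu.com/ubuntu bionic main
```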
There are a few useful command options in the apt-cache
program for displaying information about packages:
depends: Displays the dependencies required for the package
pkgnames: Displays all the packages installed on the system
showpkg: Displays information about the specified package
stats: Displays package statistics for the system
unmet: Displays any unmet dependencies for installed packages

The workhorse of the apt suite of tools is the apt-get program. It's what you use to install and remove packages from a Debian package repository. Table 13.4 lists the apt-get commands.
Table 13.4 The apt-get program action commands

Action | Description
autoclean | Removes information about packages that are no longer in the repository
check | Checks the package management database for inconsistencies
clean | Cleans up the database and any temporary download files
dist-upgrade | Upgrades all packages, but monitors for package dependencies
dselect-upgrade | Completes any package changes left undone
install | Installs a package and updates the package management database
remove | Removes a package from the package management database
source | Retrieves the source code package for the specified package
update | Retrieves updated information about packages in the repository
upgrade | Upgrades all installed packages to the newest versions
Installing a new package from the repository is as simple as specifying the package name with the install action:
$ sudo apt-get install zsh
Reading package lists... Done
Building dependency tree
Reading state information... Done
Suggested packages:
zsh-doc
The following NEW packages will be installed:
zsh
...
Setting up zsh (5.4.2-3ubuntu3.1) ...
Processing triggers for man-db (2.8.3-2ubuntu0.1) ...
$
If any dependencies are required, the apt-get
program retrieves those as well and installs them automatically.
The upgrade
action provides a great way to keep your entire Debian-based system up-to-date with the packages released to the distribution repository. Running that command will ensure that your packages have all the security and bug fixes installed. However, that also means that you fully trust the distribution developers to put only tested packages in the repository. Occasionally a package may make its way into the repository before being fully tested and cause issues.
The core tool used for working with Red Hat repositories is the yum tool (short for Yellowdog Updater, Modified; it was originally developed for the Yellow Dog Linux distribution). The yum tool allows you to query, install, and remove software packages on your system directly from a Red Hat repository.
The yum
command uses the /etc/yum.repos.d
folder to hold files that list the different repositories it checks for packages. For a default CentOS system, that folder contains several repository files:
$ cd /etc/yum.repos.d
$ ls -al
total 44
drwxr-xr-x. 2 root root 187 Sep 17 21:47 .
drwxr-xr-x. 142 root root 8192 Dec 15 16:55 ..
-rw-r--r--. 1 root root 1660 Sep 17 21:39 CentOS-Base.repo
-rw-r--r--. 1 root root 1309 Aug 13 10:34 CentOS-CR.repo
-rw-r--r--. 1 root root 649 Aug 13 10:34 CentOS-Debuginfo.repo
-rw-r--r--. 1 root root 314 Aug 13 10:34 CentOS-fasttrack.repo
-rw-r--r--. 1 root root 630 Aug 13 10:34 CentOS-Media.repo
-rw-r--r--. 1 root root 1331 Aug 13 10:34 CentOS-Sources.repo
-rw-r--r--. 1 root root 4768 Aug 13 10:34 CentOS-Vault.repo
$
Each file in the yum.repos.d
folder contains information on a repository, such as the URL address of the repository and the location of additional package files within the repository. The yum
program checks each of these defined repositories for the package requested on the command line.
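Each .repo file uses a simple INI-style layout. A sketch of what such a file might contain, using a hypothetical third-party repository (the name and URLs here are made up for illustration):

```
# /etc/yum.repos.d/example.repo (hypothetical)
[example-repo]
name=Example Third-Party Repository
baseurl=http://repo.example.com/centos/7/x86_64/
enabled=1
gpgcheck=1
gpgkey=http://repo.example.com/RPM-GPG-KEY-example
```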
The yum
program is very versatile. Table 13.5 shows the commands you can use with it.
Table 13.5 The yum action commands

Action | Description
check-update | Checks the repository for updates to installed packages
clean | Removes temporary files downloaded during installs
deplist | Displays dependencies for the specified package
info | Displays information about the specified package
install | Installs the specified package
list | Displays information about installed packages
localinstall | Installs a package from a specified .rpm file
localupdate | Updates the system from specified .rpm files
provides | Displays information about packages that provide a feature
remove | Removes a package from the system
resolvedep | Displays packages matching the specified dependency
search | Searches repository package names and descriptions for a specified keyword
shell | Enters yum command-line mode
update | Updates the specified package(s) to the latest version in the repository
upgrade | Updates the specified package(s), but removes obsolete packages
Installing new applications is a breeze with yum:
$ sudo yum install zsh
[sudo] password for rich:
...
========================================================================
Package Arch Version Repository Size
========================================================================
Installing:
zsh x86_64 5.0.2-31.el7 base 2.4 M
Transaction Summary
========================================================================
Install 1 Package
Total download size: 2.4 M
Installed size: 5.6 M
Is this ok [y/d/N]: y
...
Installed:
zsh.x86_64 0:5.0.2-31.el7
Complete!
$
One nice feature of yum
is the ability to group packages for distribution. Instead of having to download all of the packages needed for a specific environment (such as for a web server that uses the Apache, MySQL, and PHP servers), you can download the package group that bundles the packages together. This makes for an even easier way to get packages installed on your system.
Recently, another RPM package management tool has been gaining in popularity. The dnf program (short for Dandified YUM) is included as part of the Fedora Linux distribution as a replacement for yum. As its name suggests, dnf provides some advanced features that yum is missing, such as faster dependency resolution.
The openSUSE Linux distribution uses the RPM package management system and distributes software in .rpm
files but doesn’t use the yum
or dnf
tool. Instead, openSUSE has created its own package management tool called zypper
.
Both the Debian-based and Red Hat–based package management systems have graphical tools for making it easier to install software in desktop environments. One tool that is available in both the Ubuntu and CentOS distributions is gnome-software.
The gnome-software program is a graphical front end to the PackageKit tool, which itself is a front end that standardizes the interface to multiple package management tools, including apt
and yum
. By including both PackageKit and gnome-software, Linux distributions can provide a standard graphical interface for users to manage their software packages. Figure 13.1 shows the gnome-software package as it appears in the Ubuntu 18.04 Linux distribution.
You can search for packages, view the installed packages, and even view the updated packages available in the repository. If you’re using the CentOS Linux distribution, the gnome-software interface looks the same, as shown in Figure 13.2.
Finally, some standardization is happening across Linux distributions, at least when it comes to graphical software package management tools.
This exercise demonstrates how to work with a package management system to install software.
1. For Debian-based systems such as Ubuntu, display the list of available packages with the command sudo apt-cache pkgnames. For Red Hat–based systems such as CentOS, use the command sudo yum list.
2. For Debian-based systems, install the zsh package with the command sudo apt-get install zsh. For Red Hat–based systems, use the command sudo yum install zsh. If the zsh package is already installed, try installing the tcsh package, which is an open-source version of the C shell found in many Unix systems.
3. For Debian-based systems, remove the zsh package with the command sudo apt-get remove zsh. For Red Hat–based systems, use the command sudo yum remove zsh.
The ability to easily install and remove applications is a must for every Linux system. In the open-source world, developers release their applications as source code bundles, using the tar and gzip utilities to create a tarball file. After you download a tarball file, you must decompress and extract the files it contains before you can compile the application. The gcc program is the most common compiler for open-source applications. You use the configure and make utilities to create and run installation scripts that make it easier to install applications from source code.
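Putting those pieces together, a typical source code installation boils down to a short command sequence. This sketch assumes a hypothetical tarball named myapp-1.0.tar.gz that follows the common configure/make conventions:

```
$ tar -zxvf myapp-1.0.tar.gz
$ cd myapp-1.0
$ ./configure
$ make
$ sudo make install
```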
Most Linux distributions help simplify application installation by precompiling the source code and bundling the necessary application files into a package. Package management software makes it easier to track which applications are installed on your Linux system and where their files are located. Debian-based Linux distributions use the .deb package management format with the dpkg tool, while Red Hat–based Linux distributions use the RPM package management format with the rpm tool.
While package management systems make it easier to install and remove packages, finding packages can still be somewhat of a hassle. Most Linux distributions now maintain their own repository of packages and provide additional tools that make it easier to retrieve and install packages from the repository. For Debian-based systems, the apt suite of tools, including apt-cache and apt-get, is used to retrieve packages from the repository and maintain the package management database. For Red Hat–based systems, either yum or dnf is the package tool to use.
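A typical repository workflow on a Debian-based system looks like this (the package name is just an example):

```
$ sudo apt-get update
$ apt-cache search zsh
$ sudo apt-get install zsh
```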
Describe how developers bundle their open-source applications for distribution. Linux developers bundle source code files, headers, libraries, and documentation files into a single file for distribution. They use the tar utility to archive multiple files and folders into a single archive file and then often compress the archive file using the gzip utility. You can use the wget or cURL program to download the source code distribution files and then use the gzip and tar utilities to decompress and extract the source code files.
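As a quick sketch of how the tar and gzip steps fit together, the following shell commands bundle a small directory into a gzip-compressed tarball and then extract it again. (The file and directory names are made up for the example.)

```shell
#!/bin/sh
set -e

# Work in a scratch directory so nothing is left behind elsewhere.
cd "$(mktemp -d)"

# Create a small project directory to bundle.
mkdir myapp-1.0
echo 'int main(void) { return 0; }' > myapp-1.0/main.c

# -z: gzip compression, -c: create archive, -v: verbose, -f: archive file name
tar -zcvf myapp-1.0.tar.gz myapp-1.0

# Remove the originals, then recover them from the tarball.
rm -r myapp-1.0
tar -zxvf myapp-1.0.tar.gz   # -x: extract
ls myapp-1.0
```

The same -zxvf combination works on any tarball you download, which is why it shows up so often in installation instructions.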
Explain how to generate an executable program from a source code tarball. After you decompress and extract the source code files from a distribution tarball file, you must compile the source code to create an executable file for the application. First, you run the configure utility, which examines your Linux system to ensure that it has the correct dependencies required for the application and configures the installation script to find the dependencies. Next, you run the make utility, which runs a script that uses the gcc compiler to compile the necessary library and source code files to generate the executable file for your system. Once that script completes, you run make with the install option to install the executable file on your Linux system.
Describe how Linux packages applications for distribution. Linux uses a package management system to track what applications are installed on your system. The distribution bundles precompiled application files into a package, which you can easily download and install. The package management database keeps track of which packages are installed and the location of all the files contained within each package. You can also query the package management database to determine what packages are installed and remove packages from the system using the package management tools. Debian-based Linux systems use the dpkg tool to interact with the package management database, while Red Hat–based Linux systems use the rpm tool.
Describe how Linux distributions use repositories. While using packages makes installing, tracking, and removing software applications easier, you still must be able to find the latest packages for your applications. Most Linux distributions help with that by creating a centralized repository of current application packages, along with tools to work with the repository. For Debian-based systems, the apt suite of tools allows you to query the repository for package information and download any new or updated packages. Red Hat–based systems use the yum or dnf tools to interact with their repositories. All three tools allow you to query the remote repository for packages, query the local package management database, and install or remove packages as needed.
Which two programs should you use to download tarballs from an application’s website? (Choose two.)
dpkg
rpm
Fred received an application in source code format. What script should he run to create the executable application program?
dpkg
rpm
yum
make
wget
Sherri is trying to compile an application from source code. Before she can create the application executable file, what script should she run to create the make script?
make
make install
configure
gcc
dpkg
What is the most common compiler used for open-source Linux applications?
gcc
make
configure
dpkg
rpm
Harry has finished writing his application source code but needs to package it for distribution. What tool should he use so that it can be extracted in any Linux distribution?
dpkg
rpm
yum
apt-get
tar
What tar command-line options are commonly used together to extract and decompress files from a tarball file?
-Uvh
-zxvf
-xvf
-zcvf
-cvf
What file name extension does the CentOS Linux distribution use for packages?
.deb
.rpm
.tgz
.tar
.gz
Sally needs to install a new package on her Ubuntu Linux system. The package was distributed as a .deb file. What tool should she use?
rpm
yum
dnf
dpkg
tar
What tools do you use to install packages from a Red Hat–based repository? (Choose two.)
dpkg
tar
yum
apt-get
dnf
Where should you place a new configuration file to add a third-party repository to a Red Hat–based package management system?
/etc/yum.repos.d
/etc/apt/sources.list
/usr/lib
/usr/bin
/proc