An overview of the system

How to think about system tools

Having toured the operating system at the service level, you can now see more clearly how the tools it provides work together. Because all files and devices are equivalent, it is possible to provide software tools with a uniform interface. The generic UNIX system tools treat any file or device as a stream of bytes arriving on their standard input. Their function is simple: they carry out some transformation on that stream of bytes and send the result to an output stream, which may be another file or a device such as a terminal.
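
As a sketch of what this means in practice, the same simple filter can take its input stream from an ordinary file, from a device, or from another program; the file names below are only illustrative.

   tr '[a-z]' '[A-Z]' < notes.txt                   # read from an ordinary file
   tr '[a-z]' '[A-Z]' < /dev/tty                    # read directly from a terminal device
   cat notes.txt | tr '[a-z]' '[A-Z]' > upper.txt   # read from a pipe, write to a file

In each case tr(1) performs the same transformation; it neither knows nor cares where the bytes come from or where they go.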

One of the main design goals of the UNIX system was to enforce this equivalence. A secondary goal was to promote flexibility. There are no obvious limits to the ways in which tools can be connected using pipes and shell scripts. It is possible in principle to take the output from any program and feed it back to the same program as input; it might not be sensible to do so, but it is at least possible. (Under some other operating systems that enforce rules governing file types, this cannot be done.) It is therefore possible to construct self-modifying programs, or arbitrarily complex programs that operate repeatedly on the same data files, from first principles using the tools provided with the system.
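
A short illustration of this flexibility, using a hypothetical file called names: several small tools can be strung together into a single pipeline, and the output of one program can even be fed into another invocation of the same program.

   # count the sessions held by each user currently logged in
   who | awk '{ print $1 }' | sort | uniq -c | sort -rn

   # the output of a program fed straight back into the same program
   sort names | sort -r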

In addition to having standardized inputs and outputs, the software tools follow a common design philosophy: each tool was originally designed to do one task, and to do it as efficiently as possible. This is still visible in the default behavior of many of the programs. For example, grep(1) is designed to search for regular expressions (patterns) in text. Its behavior can be modified by various flags, but in principle it performs adequately with no arguments other than a pattern to search for and a stream to read as its input. Consequently, there are a number of lowest-common-denominator standards that ensure that almost all UNIX-like operating systems can run the same shell scripts, as long as the scripts are written with those standards in mind and do not rely on the value-added features of a particular system's tools.
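
For instance, given nothing but a pattern, grep reads its standard input, so it works equally well on a redirected file or at the end of a pipe:

   grep root < /etc/passwd      # a pattern and a stream on standard input
   ps -e | grep cron            # the same tool fed from a pipe instead of a file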

Some of the tasks carried out by the system tools are quite complex. A lot of research into the theory of computer languages went into the early development of the UNIX system at the Bell Telephone Laboratories and various universities. Consequently, a number of ``power tools'' are provided which, on any other system, would be considered programming languages in their own right. Other tools are unambiguously recognized as languages. To a large extent, these tools share a common core syntax that demonstrates their common ancestry. If you learn the C programming language, you will see great similarities with the C shell and awk(1); if you take care to learn the regular expression syntax recognized by egrep(1) and the search-and-replace operations of vi(1), you will have little difficulty generalizing them to basic commands in sed(1) and pattern-matching commands in awk.
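
As a rough illustration (the file name is hypothetical), the same basic pattern can be carried almost unchanged from one of these tools to the next:

   egrep 'fatal error' logfile              # print every line matching the pattern
   sed -n '/fatal error/p' logfile          # the same match written as a sed command
   awk '/fatal error/ { print }' logfile    # and again as an awk pattern-action rule
   # within vi, the command  :g/fatal error/p  performs the equivalent search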

While the body of knowledge you need to master is not small, there are strong similarities between apparently different tools. After a while, you will be able to learn new system utilities by analogy with those you already know. This is the basis of a comprehensive understanding of UNIX systems.


© 2004 The SCO Group, Inc. All rights reserved.
UnixWare 7 Release 7.1.4 - 22 April 2004