What is concurrent programming?
Most programs are examples of ``sequential programming'':
they consist of a series of operations that are carried out
one at a time.
With ``concurrent programming'' the programmer can specify
sets of instructions that potentially can be executed in parallel
and still provide correct results.
The advantages of this style of programming are:

A powerful programming paradigm
   Programs are often written to emulate or respond to events
   in the real world.
   In the real world,
   concurrency is common and purely sequential events are the exception.
   Modeling such behavior is easier if the programming environment
   supports the notion of concurrency.

Possible performance improvement
   If multiple processors are available, the program might be executed in
   less real time than sequential execution would require,
   because more than one processor is working simultaneously.
   This is called ``true concurrency''.
   Even on uniprocessor machines,
   there may be some performance gain from designing greater concurrency
   into the program:
   while one activity is blocked, others might still be executing.
   Thus there is an advantage to concurrent programming
   even if the resources (processors) are not available
   to provide ``true concurrency'' and the application is only
   ``apparently concurrent''.

Concurrent programming has been available in the UNIX System
since its inception via the ``process model''.
In the UNIX System,
problems are solved not just by running programs but by
running sets of programs (a running program is called a ``process''):
sometimes pre-existing ``tools''
or ``commands'', sometimes specifically written programs
that work together (often concurrently) to solve the problem.

Processes can communicate and synchronize with each other
by mechanisms that include:

   pipes (named and unnamed)
   files and file/record locks
   shared memory (IPC style shared memory or mapped files)
   semaphores (IPC style)
What are threads?
© 2004 The SCO Group, Inc. All rights reserved.
UnixWare 7 Release 7.1.4 - 27 April 2004