"Those who do not understand UNIX
are condemned to reinvent it, poorly."
Henry Spencer, Programmer
An operating system is a piece of software which facilitates
the use of a computer.
To this end, its two main responsibilities are supporting running
programs and maintaining persistent data (files).
GNU/Linux belongs to a large class of operating systems known as
UNIX-like, having an architecture, design philosophy, and
system-call interface similar to the original 1970s-era UNIX.
History:
Time Sharing, Multics
To understand Unix and the origins of GNU/Linux, we have
to step back in time, to the 1960's, when the computer industry
was just beginning.
Computers then were immense— and largely uncontrolled— machines.
They processed users' instructions from punched cards in discrete
sessions called batches,
or supported some lucky user for an interactive session.[1]
Next-generation computers were interactive,
their use being, "...indescribably more pleasant and productive
than batch processing."[2]
Owing to the cost of the machines, though, it was necessary that they
support more than one user at a time, giving rise to the technique
of time-sharing:
The impetus for time-sharing first arose from professional programmers
because of their constant frustration in debugging programs at batch
processing installations. Thus, the original goal was to time-share
computers to allow simultaneous access by several persons while giving
to each of them the illusion of having the whole machine at his disposal.[3]
To that end, time-sharing operating systems were developed.
In 1960, one was devised to "steal" time
from batch jobs on behalf of interactive users, and thereby
support both interactive and batch computing simultaneously.[1]
In 1961, a system capable of supporting four users
was demonstrated.[1]
Taking this notion to its limit, it was widely believed
at the time that computing's future was
in the "information utility":
In a business model similar to telephone or electricity,
customers would connect their keyboards and monitors
to a central mainframe, then pay for computation on an as-used
basis.[4]
Multics
The MULTICS operating system project
was born at MIT,
with development supplemented by General Electric and AT&T.[5]
Its principal goal was to offer many users access to a single powerful
computer, as in the service of an information utility.[3]
The Envisioned Hardware
Multics introduced to the world several novel ideas, some of which have
become standard in computing.
To support the controlled execution of customers' software,
for example, they introduced the "process", a virtual environment
within which software could safely, and privately,
execute.[7]
To serve multiple users, many such processes could run
concurrently.[7]
To manage these, allot CPU time to users, and track each user's
expenditure, they introduced a "supervising" program.
Although Multics was working by 1968,
it could only support a small number of users.
In time, it became clear that Multics would never deliver on its promise
of a profitable information utility.
AT&T withdrew its support.[2]
Multics soldiered on:
1970 saw its first installation, which supported
35 simultaneous users.
One of its later installations, at General Motors, supported
200 users simultaneously.[9]
Unix
Aside:
Bell Laboratories was AT&T's research arm.
In the wake of AT&T's withdrawal from Multics,
operating systems researchers working at Bell Laboratories
found themselves without a clear project.
Still reeling from the disaster of Multics, management
refused to purchase any new computation equipment.
For the researchers of Bell Labs, this period was
frustrating: Their proposals,
"...were never clearly and finally turned down, but yet were certainly never accepted."[10]
After finding an unused
PDP-7 with a nice monitor
and powerful disk drive,
Ken Thompson and Dennis Ritchie decided to port the game
Space Travel
to it.
This served, at least in part, as an introduction to the
"clumsy" practice of writing a program on one machine
(a GE 635) to
run on another (a PDP-7), by way of punched paper tape: Programs had
to be written, assembled and then punched to paper tape using the
GE 635, then carried over and read into the PDP-7 to be executed.[2][10][11]
Afterwards, Ken Thompson continued to work with the machines,
implementing a filesystem that he, Dennis Ritchie and
Rudd Canaday had devised on paper.
A filesystem without a means to interact with its contents is a
"sterile proposition,"
and so Thompson continued working on it, implementing the functionality
necessary to make it useable in practice.[10]
Aside:
These four components correspond to exec(),
as ("assemble"),
ed ("edit"),
and sh ("shell"),
respectively.
About mid-1969, Thompson realized he was close to
an operating system. Four key pieces were needed to take what he had
to a self-supporting system.
They were: A system call to transfer the current thread of execution
to the start of an arbitrary file; an assembler to translate
human-readable code into machine code ("binary files");
a text editor to create and edit assembly-language files; and,
a shell to execute user commands.
The system call was "trivial," and rudimentary
implementations of the other three were done in one week
each.[2][10]
By the time he was done, the software was self-supporting, in the
sense that one could write, assemble, and run software
on the PDP-7 alone.
He and other researchers of Bell Labs quickly improved it
into a multi-user/time-sharing operating
system, later nicknamed UNIX.
About his colleague's work, Dennis Ritchie would later write,
"Thompson wanted to create a comfortable computing environment
constructed according to his own design, using whatever means were
available. His plans, it is evident in retrospect, incorporated many
of the innovative aspects of Multics, including an explicit notion of
a process as a locus of control, a tree-structured file system, a
command interpreter as user-level program, simple representation of
text files, and generalized access to devices."[11]
Owing to anti-monopoly restrictions imposed upon AT&T at the time,
software produced by Bell Laboratories was available to any academic or
research institution that cared to ask, provided they paid for the cost
of media and shipping.
Thus, UNIX made its way to college campuses across the nation, to be
studied by an emerging cadre of computer scientists.[12]
UNIX has a simple, clear architecture that has inspired many
operating systems.
It is organized as a hierarchy of layers,
with each layer building upon the ones beneath it and supporting
those above it:
Hardware:
The core of any computer is its hardware.
Each piece of hardware has a software interface defined by the
engineers that created it.
These interfaces vary widely from device to device.
Aside:
Throughout the eighties and nineties, myriad variants
of Unix were produced.
Referred to as "Unix clones," they each offered their
own features and capabilities.
We are concerned primarily with two such clones: The GNU
operating system and the Linux kernel.
Drivers:
Drivers interact with hardware, and present a simplified
interface to the kernel.
Drivers are the deepest and most remote elements
of operating systems, and are constantly being introduced
as new hardware becomes available.
Kernel:
The heart of any UNIX-like operating system is its kernel.
The kernel is responsible for implementing the filesystems
supported, scheduling the CPU, allotting memory,
maintaining file integrity, and many other tasks.
To say the least, it is an involved piece of software;
it is, effectively, the "brains" of the operation.
System Calls:
The set of operations an operating system is willing
to undertake on behalf of a user's process is defined by
the system calls it supports.
The UNIX architecture is characterized by six essential system
calls:
fork(), exec(), open(),
close(), read(), and
write().[13]
Both fork() and exec() are concerned with
launching programs, while the other four deal with files;
a short illustrative sketch in C follows this list.
Processes:
A process is an executing program.
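The four file-related calls are easiest to appreciate in code.
What follows is a minimal sketch, not taken from any particular source,
of a program that copies one file to another using only open(), read(),
write(), and close(); the file names, buffer size, and permission bits
are arbitrary choices made for the example.

    /* copy.c - a sketch of the four file-related UNIX system calls. */
    #include <fcntl.h>
    #include <stdio.h>
    #include <unistd.h>

    int main(int argc, char *argv[])
    {
        if (argc != 3) {
            fprintf(stderr, "usage: %s source destination\n", argv[0]);
            return 1;
        }

        int in  = open(argv[1], O_RDONLY);                            /* a descriptor for reading    */
        int out = open(argv[2], O_WRONLY | O_CREAT | O_TRUNC, 0644);  /* create or truncate the copy */
        if (in < 0 || out < 0) {
            perror("open");
            return 1;
        }

        char buf[4096];
        ssize_t n;
        while ((n = read(in, buf, sizeof buf)) > 0)   /* read a chunk from the source...    */
            if (write(out, buf, n) != n) {            /* ...and write it to the destination */
                perror("write");
                return 1;
            }

        close(in);                                    /* release both descriptors */
        close(out);
        return 0;
    }

Everything the program does with a file passes through the kernel by way
of these calls; the descriptors returned by open() are the only handles
it ever holds.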
After nearly sixty years of use and development, Unix has several
facets that, while technically conventions, are in practice written
in stone. In the coming pages, we will encounter fork and exec,
which together cause one program to launch another, and the
standard file descriptors, which are so fundamental that they
are embedded in programming languages themselves.
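A minimal sketch may make the fork-and-exec convention, and the inherited
standard file descriptors, concrete. The program below is illustrative
only; the choice of ls and of the execlp() variant of exec are assumptions
made for the example, not part of any particular shell.

    /* spawn.c - a sketch of the fork/exec convention.              */
    /* The child replaces itself with ls; the parent waits for it.  */
    /* Descriptors 0, 1, and 2 (standard input, output, and error)  */
    /* are inherited unchanged, which is why the child's output     */
    /* appears on the parent's terminal with no extra plumbing.     */
    #include <stdio.h>
    #include <sys/wait.h>
    #include <unistd.h>

    int main(void)
    {
        pid_t pid = fork();                     /* one process becomes two */
        if (pid < 0) {
            perror("fork");
            return 1;
        }

        if (pid == 0) {                         /* child: become ls */
            execlp("ls", "ls", "-l", (char *)NULL);
            perror("execlp");                   /* reached only if exec failed */
            _exit(127);
        }

        int status;
        waitpid(pid, &status, 0);               /* parent: wait for the child */
        printf("child exited with status %d\n", WEXITSTATUS(status));
        return 0;
    }

This two-step dance, fork to duplicate and exec to replace, is precisely
how a shell launches the commands a user types.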
The first attempt at producing a deliberately free and open-source Unix clone
was announced
by Richard Stallman in 1983.
At the time, Stallman was working for MIT's AI Laboratory,
and had grown accustomed to its culture of sharing source code:
"If you saw someone using an unfamiliar and interesting program, you
could always ask to see the source code, so that you could read it, change
it, or cannibalize parts of it to make a new
program."[14]
Yet, non-free software was becoming the norm, with students working
on projects intended for profit,
and companies going so far as requiring nondisclosure agreements be
signed to get an executable.[14][15]
On September 27, 1983, Stallman announced his intent:
To produce a "complete UNIX-compatible software system," and to
"give it away free."[16]
He called the project GNU, a recursive
acronym for
GNU's Not UNIX.[17]
Software Freedom
At the time of his announcement, Stallman had not yet spelled out
exactly what he meant by the word free.
Certainly, "a free copy" did not embody the
culture of code-sharing MIT was once home to.
More, he did not wish to put the code in the public domain, because
modified versions of it could become
non-free.[18]
By 1985, Stallman had sorted these issues out:
Everyone will be permitted to
modify and redistribute GNU, but no distributor will be allowed to
restrict its further redistribution. That is to say, proprietary
modifications will not be allowed. I want to make sure that all
versions of GNU remain free.[19]
Thus, the notion of
free software
was born:
Users are permitted to run, study, modify, copy and distribute GNU,
but if they distribute, then those permissions must be preserved.
These stipulations are formalized in the GNU
General Public License, now in its third version.
To create an entire operating system is an enormous task, but
by 1990, the GNU Project had
found or written all major components except one, the kernel.[17]
Today, the GNU Project is
well-regarded for producing high-quality, well-documented free software;
they are as "official" as the open-source community gets.
They remain
active and prominent
among the free software community, with
monetary support from the
Free Software Foundation.[20]
Richard Stallman remains Chief GNUisance.[20]
On August 25, 1991, a Finnish university student by the name of
Linus Torvalds announced his own operating system project.
Again following the Unix architecture,
his project was intended to be "...just a hobby, won't be big and
professional like GNU."[21]
Whereas GNU sought to produce an entire operating system,
Linus restricted his attention to the kernel.
By October of 1991, he had GNU's Bash, GCC, Make and other
programs running on his creation.[21]
That month, he sent out the following invitation,
which has since become famous:
From: torvalds@klaava.Helsinki.FI (Linus Benedict Torvalds)
Newsgroups: comp.os.minix
Subject: Free minix-like kernel sources for 386-AT
Message-ID:
Date: 5 Oct 91 05:41:06 GMT
Organization: University of Helsinki
Do you pine for the nice days of minix-1.1, when men were men and wrote
their own device drivers? Are you without a nice project and just dying
to cut your teeth on a OS you can try to modify for your needs? Are you
finding it frustrating when everything works on minix? No more all-
nighters to get a nifty program working? Then this post might be just
for you :-)
As I mentioned a month(?) ago, I'm working on a free version of a
minix-lookalike for AT-386 computers. It has finally reached the stage
where it's even usable (though may not be depending on what you want),
and I am willing to put out the sources for wider distribution.
Properly speaking, the name Linux refers only to the Linux kernel.
Interest flourished, in large part because
of the informality of the project:
Everybody was invited, if only to tinker.
His development model— "release early and often, delegate everything you can,
be open to the point of promiscuity"[22]— raised eyebrows at the time,
but has since proved sound in virtue of its success:
It now dominates the open-source landscape.
The kernel is released under the GNU General Public License, and
Linus Torvalds remains the maintainer of the Development branch.
A major figure of today's Linux development effort is
Greg Kroah-Hartman, who is maintainer
of the stable branch, in addition to various subsystems.[23]
According to him, interest in the kernel continues to
grow, with around 1200 new developers contributing each
year.[24]
Their main development bottlenecks are code review and
maintainership.[24]
The Linux kernel
is released
by its developers in source-code
form, and must be compiled
before it can be used.
No other software is included, so that
creating a functional Linux-based machine requires somebody
to obtain, compile and assemble a
number of disparate software components.
A distribution is a project that fills this gap.
HJ Lu's Boot/Root Floppies
The first distribution of GNU/Linux was undertaken by
HJ Lu,
in 1992.[25]
He packaged utilities alongside the kernel and distributed the
collection on a pair of 5.25" disks.
Known as the Boot/Root Floppies,
it was never widely known, nor used.[25][26]
Linux's first widely used distribution was, ironically,
a commercial venture— Soft Landing System ("SLS Linux").
Released in late 1992, it was sold under the slogan
"Gentle Touchdowns for DOS Bailouts,"[27]
but users complained that the software was buggy,[26][28] and when its developers
decided to change the executable file format, users began to leave.[26]
Slackware
In late 1992, Patrick Volkerding needed a LISP interpreter for a project
at school, and, having heard that clisp ran on SLS,
downloaded them both.[28]
His professor got wind of this, and asked Patrick to help him install
the same software
on the school's computers.[28]
Volkerding walked him through installing, fixing and patching SLS, and
afterward, his professor asked, "Is there some way we can fix the install disks so
that the new machines will have these fixes right away?"[28]
So, Volkerding began improving SLS, and internally named his project
Slackware, not wishing it to be taken too
seriously.[28]
By May 1993, his improvements were becoming substantial, and
his friends convinced him to release it onto the internet.
The first release proved so popular that the
number of download requests crashed the system hosting it,
and Slackware would be, for several years, the de-facto Linux distribution.[28]
Slackware 1.01, released 1993
Volkerding hadn't intended to produce a distribution,
but still maintains
Slackware today;
it is the oldest distribution still maintained.[28][29]
The Slackware Linux Project is focused on producing "the most UNIX-like
Linux distribution out there."[30]
Debian
At about the same time, Ian Murdock was an undergraduate at
Purdue University.[31]
He saw distributions as essential to the future of Linux, but was
dismayed by their quality.[32]
The single-person and closed-room schemes had consistently
failed to produce quality products.
More, the most popular distribution– SLS Linux– was,
"...quite possibly the most bug-ridden and badly maintained Linux distribution available."[32]
Murdock envisioned a distribution whose maintenance was
community-driven: a modular operating system whose
components were maintained by people with the appropriate interests
and expertise.
He named his project
Debian,
a portmanteau of Debra, his then-girlfriend (later wife), and Ian, himself.
"Debian Linux is a brand-new kind of Linux distribution.
Rather than being developed by one isolated individual or group,
as other distributions of Linux have been developed in the
past, Debian is being developed openly in the spirit of Linux and GNU.
The primary purpose of the Debian project is to finally create a
distribution that lives up to the Linux name.
Debian is being carefully and conscientiously put together and will be
maintained and supported with similar care."[32]
Debian 4 "Etch", released 2007
Debian is one of the most respected distributions of our time.
The quantity of software available through its repositories is
unparalleled, and
it is often used as a basis for derivative distributions.
These days, there are many Linux distributions. There's one for your
router, one for your server, one for your desktop, one for your
grandmother, one for gamers. There's one for space, one for
scientists, one for hackers.
There are Apple lookalikes and Windows
lookalikes and others that look like nothing you've seen before.
My favorite distribution is
Linux Mint, but then,
I'm a simple guy.
Mint is a polished, professional operating system
that focuses on productivity: you'll be up and
running before you know it.
Linux Mint 21 "Vanessa", released 2022
Mint is available with the above Cinnamon desktop environment;
Windows users will feel right at home.
I find it to be an attractive desktop, but it's only moderately
customizable.
Like many users, I want a desktop that stays out of my way,
and Cinnamon does exactly that.
With version 19, Mint began shipping with
Timeshift,
a tool that automatically takes snapshots of the filesystem,
so that users can restore the system to an earlier state.[33]
Version 20 offers automatic updates.[34]
What sets Linux Mint apart from other distributions is that tools like
Timeshift and Cinnamon are preinstalled and ready to use.
When an example operating system is required,
Mint will be used, but similar remarks apply to
virtually any distribution, with a little perusing.
See Also:
balenaEtcher - Flash OS images to SD cards & USB drives
Kernighan, Brian. UNIX: A History and a Memoir. Published 2020, by Brian W. Kernighan via Kindle Direct Publishing.
Corbató, F. J., & Vyssotsky, V. A. (1965). Introduction and Overview of the Multics System. Fall Joint Computer Conference, Las Vegas: Association for Computing Machinery. Retrieved June 23, 2024, from https://multicians.org/fjcc1.html.
Salus, P. H. (2008). The Daemon, the GNU and the Penguin: How Free and Open Software is Changing the World. Reed Media Services.
Torvalds, L., & Diamond, D. (2002). Just for Fun: The story of an accidental revolutionary. New York ; London ; Toronto ; Sydney: Harper.
"The GNU Project." About the GNU Project - GNU Project - Free Software Foundation, Free Software Foundation, Inc., 12 Sept. 2020, 02:00:07,
https://www.gnu.org/gnu/thegnuproject.html.
"Overview of the GNU System." Overview of the GNU System - GNU Project - Free Software Foundation,
Free Software Foundation, Inc., 4 Sept. 2017, 10:21:27,
www.gnu.org/gnu/gnu-history.html.
"The Structure and Administration of the GNU Project." The Structure and Administration of the GNU Project - GNU Project - Free Software Foundation,
Free Software Foundation, Inc., 10 October 2020, 13:07:59,
https://www.gnu.org/gnu/gnu-structure.html.
Anderson, T. (2020, October 26). Linux kernel's Kroah-Hartman: We're not struggling to get new coders, it's code review that's the bottleneck. Retrieved October 30, 2020, from https://www.theregister.com/2020/10/26/linux_kernel_intel/.