
A Short (?) History of Linux and Unix



#1 Naught McNoone

Naught McNoone

  • Members
  • 308 posts
  • OFFLINE
  •  
  • Gender:Male
  • Location:The Great White North
  • Local time:09:46 PM

Posted 15 June 2015 - 10:16 AM

I originally wrote this about 10 years ago.  The most recent update was done for a night extension course last taught in 2005.  Some of it is obviously dated.

 

The original graphics have been removed to condense the post.

 

Cheers!

 

Naught McNoone 

 

-----------------------------------------------------------------------------------------

 
A History of UNIX and Linux
An introduction to the origins of UNIX and Linux Operating Systems
 
Lesson 01 - 05.01.11.01
Computer Programming Languages
 
Ever since Charles Babbage designed his difference engine in 1822, computers have required a means of instructing them to perform a specific task.  This means is known as a programming language.
 
Computer languages were first composed of a series of steps to wire a particular program into the machine; these morphed into a series of steps keyed into the computer and then executed; later, these languages acquired advanced features such as logical branching and object orientation.
 
The computer languages of the last fifty years have come in two stages: the first major languages, and the second major languages, which are in use today.
 
In the beginning . . .
Charles Babbage's difference engine could only be made to execute tasks by changing the gears which executed the calculations. Thus, the earliest form of a computer language was physical motion.
 
Eventually, physical motion was replaced by electrical signals when the US Government built the ENIAC, begun in 1943. It followed many of the same principles as Babbage's engine and hence could only be "programmed" by presetting switches and rewiring the entire system for each new "program" or calculation.
 
This process proved to be very tedious.
 
Entering the modern era...
In 1945, John von Neumann was working at the Institute for Advanced Study. He developed two important concepts that directly affected the path of computer programming languages.
 
The first was known as the "stored program technique".
This technique stated that the actual computer hardware should be simple and not need to be hand-wired for each program.  Instead, complex instructions should be used to control the simple hardware, allowing it to be reprogrammed much faster.
 
The second great idea!
The second concept was also extremely important to the development of programming languages. Von Neumann called it "conditional control transfer". 
 
This idea gave rise to the notion of subroutines, or small blocks of code that could be jumped to in any order, instead of a single set of chronologically ordered steps for the computer to take.
 
The second part of the idea stated that computer code should be able to branch based on logical statements such as IF (expression) THEN, and to loop, as with a FOR statement. "Conditional control transfer" gave rise to the idea of "libraries," which are blocks of code that can be reused over and over.
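 
To make these ideas concrete, here is a minimal sketch in C (my illustration; C itself came two decades later) of a subroutine, conditional branching, and a loop:
 
#include <stdio.h>

/* A subroutine: a small, reusable block of code that can be
   jumped to from anywhere and then returns to its caller. */
static int square(int n)
{
    return n * n;
}

int main(void)
{
    /* Looping, as with a FOR statement. */
    for (int i = 1; i <= 5; i++) {
        int sq = square(i);          /* call the subroutine */

        /* Conditional control transfer: IF (expression) THEN ... */
        if (sq > 10)
            printf("%d squared is %d (big)\n", i, sq);
        else
            printf("%d squared is %d (small)\n", i, sq);
    }
    return 0;
}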
 
A Baby Computer's First Language
In 1949, a few years after von Neumann's work, the language Short Code appeared.
It was the first computer language for electronic devices, and it required the programmer to change its statements into 0's and 1's by hand.  Still, it was the first step towards the complex languages of today.
 
In 1951, Grace Hopper wrote the first compiler, A-0.  A compiler is a program that turns a language's statements into 0's and 1's for the computer to understand.  This led to faster programming, as the programmer no longer had to do the work by hand.
 
The Low Level Language
In 1957, the first of the major languages appeared in the form of FORTRAN.  Its name stands for FORmula TRANslating system.  The language was designed at IBM for scientific computing.
 
The components were very simple, and provided the programmer with low-level access to the computer's innards. Today, this language would be considered restrictive, as it only included IF, DO, and GOTO statements, but at the time these commands were a big step forward.
 
The basic types of data in use today got their start in FORTRAN; these included logical variables (TRUE or FALSE) and integer, real, and double-precision numbers.
 
Then came COBOL . . .
Though FORTRAN was good at handling numbers, it was not so good at handling input and output, which mattered most to business computing.
Business computing started to take off in 1959, and because of this, COBOL was developed. It was designed from the ground up as the language for businessmen.  Its only data types were numbers and strings of text. It also allowed for these to be grouped into arrays and records, so that data could be tracked and organized better.
 
It is interesting to note that a COBOL program is built in a way similar to an essay, with four or five major sections that build into an elegant whole. COBOL statements also have a very English-like grammar, making it quite easy to learn.
 
All of these features were designed to make it easier for the average business to learn and adopt it.
 
. . . and all the rest
In 1958, John McCarthy of MIT created the LISt Processing (or LISP) language.  It was designed for Artificial Intelligence (AI) research.
 
The Algol language was created by a committee for scientific use in 1958.
 
Pascal was begun in 1968 by Niklaus Wirth. Its development was mainly out of necessity for a good teaching tool.
 
C was developed in 1972 by Dennis Ritchie while working at Bell Labs in New Jersey.  
 
In the late 1970s and early 1980s, object-oriented programming developed into the full-featured language C++, which appeared in 1983.
 
Other widely used languages include BASIC (which actually dates back to 1964), Visual Basic, Java, and Perl.
 
Only for Programming Engineers?
Although early programming languages allowed the computer to be reprogrammed much more easily and quickly, with more options, it was still a difficult task.  Some sort of method was needed to allow the operator to interact with the computer.
 
More importantly, a simple method was needed to allow the operator to switch between preset programs without having to shut down the system.  The idea of a task-switching or "Operating" system was born.
 
What, no keyboard or mouse?
Early computers lacked any form of operating system. The user had sole use of the machine; he or she would arrive at the machine armed with his or her program and data, often on punched paper tape.
The program would be loaded into the machine, and the machine set to work until the program stopped or, maybe more likely, crashed.
 
Programs could generally be debugged via a front panel using switches and lights; it is said that Alan Turing (1) was a master of this on the early Manchester Mark I machine.
 
Later, machines came with libraries of support code which were linked to the user's program to assist in operations such as input and output. This would become the genesis of the modern-day operating system.
 
However, machines still ran a single job at a time; at Cambridge University in England the job queue was at one time a washing line from which tapes were hung with clothes pegs. The colour of the pegs indicated the priority of the job.(2)
 
It Cost How Much?
As machines became more powerful, the time needed for a run of a program diminished and the time to hand off the equipment became very large by comparison.
 
Run queues went from people waiting at the door, to stacks of media waiting on a table, to using the hardware of the machine itself, such as switching which magnetic tape drive was online or stacking each job's punch cards on top of the previous job's cards in the reader.
 
Accounting practices were also expanded beyond recording CPU usage to count pages printed, cards punched, cards read, disk storage used, and even the operator actions required by jobs, such as changing magnetic tapes.
 
The cost of operating a computer was beyond the reach of most medium-sized and all small businesses.
 
The SysOp and Why We Need Him.
Operating the computer went from a task performed by the program developer to a job for full-time, dedicated machine operators.
 
Eventually, the runtime libraries became a program that was started before the first customer job; it read in the job, controlled its execution, cleaned up after it, recorded its usage, and immediately went on to process the next job.
 
Jobs also evolved from being binary images produced by hand encoding to symbolic programs that were translated by the computer. 
 
An operating system, or "monitor" as it was sometimes called, permitted jobs to become multi-step, with the monitor running several programs in sequence to effect the translation and subsequent run of the user's program.
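 
As a rough sketch of that multi-step idea (my illustration, not the lesson's; the compiler command and job file names are hypothetical), a monitor's loop translated into modern C might look like this:
 
#include <stdio.h>
#include <stdlib.h>

/* Toy batch "monitor": for each queued job, run two steps in
   sequence (translate, then execute) and record simple accounting. */
int main(void)
{
    const char *jobs[] = { "job1.c", "job2.c", "job3.c" };
    const int njobs = sizeof jobs / sizeof jobs[0];

    for (int i = 0; i < njobs; i++) {
        char cmd[256];

        /* Step 1: translate the symbolic program into a binary. */
        snprintf(cmd, sizeof cmd, "cc -o job.out %s", jobs[i]);
        if (system(cmd) != 0) {
            fprintf(stderr, "monitor: %s failed to translate\n", jobs[i]);
            continue;            /* clean up and move on to the next job */
        }

        /* Step 2: run the translated program. */
        int status = system("./job.out");

        /* Step 3: record usage for accounting, then take the next job. */
        printf("monitor: %s finished with status %d\n", jobs[i], status);
    }
    return 0;
}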
 
So an Operating System is?
The conceptual bridge between the precise description of an operating system and the colloquial definition is the tendency to bundle widely used utilities and applications (such as text editors or file managers) with the basic OS for the sake of convenience.  As OSes progressed, a larger selection of "second-class" OS software came to be included, such that now an OS without a graphical user interface or various file viewers is often considered not to be a true or complete OS.
 
To accommodate this evolution of the meaning, most of what was the original "operating system" is now called the "kernel", and OS has come to mean the complete package.
 
In English, Please!
An operating system is a collection of software designed to allow computer users to perform multiple tasks using the computer.
 
It can be a simple disk operating system like IBM PC DOS or MS-DOS, or it can be a complex arrangement of files and libraries located on a central server, like Windows Server 2003.
 
Some Primitive Systems . . . 
Atlas I Supervisor - Supervisor program for the first computer designed to use an operating system. Introduced system calls and virtual storage.
 
FMS - FORTRAN Monitor System. Operating system developed by North American Aviation for the IBM 709 in the late 1950s.
 
HES - Honeywell Executive System. Operating system for the Honeywell 800. Early 1960s.
 
Input Output Selector - An I/O control system for the DDP-116 minicomputer. One of the earliest OSs for minis, in the mid-1960s.
 
Input Output System - A very early operating system developed by General Motors and North American Aviation for the IBM 704. About 1956.
 
SABRE - Semi-Automated Business Research Environment. The first major transaction processing system, developed by IBM and American Airlines for the IBM 7090.
 
SAGE - Semi-Automatic Ground Environment system. Control program for the IBM AN/FSQ-7 to monitor weapons systems. The first real-time control system. Late 1950s.
 
The MULTICS operating system
Multics (Multiplexed Information and Computing Service)
The MULTICS project started in 1965 with AT&T/Bell Labs, General Electric, and MIT.
The project was funded by ARPA (the US Department of Defense's Advanced Research Projects Agency).
 
The goal of the project was to make a modular, time-sharing computer system, so that parts could be taken offline or added without interrupting service, and the computer could be made faster and more powerful by adding modules.
 
In 1969, MULTICS was far behind schedule, and its creators had promised more than they could deliver.
 
The birth of UNIX
AT&T pulled out of the MULTICS project, along with two of its people who had worked on MULTICS: Ken Thompson and Dennis Ritchie.
 
Unix was developed with the same modular ideas as Multics.
Most commands are programs/applications, and the core operating system (kernel) is separate from the user interface (shell).  In 1969, Ken Thompson created Unix on a DEC PDP-7 minicomputer.
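 
That kernel/shell split is easy to see in code: a shell is just an ordinary program that asks the kernel to run other programs. Below is a bare-bones sketch in C (my illustration, greatly simplified; it handles single-word commands only, with no arguments or job control):
 
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <unistd.h>
#include <sys/wait.h>

/* Minimal shell loop: read a command name, ask the kernel to run it
   in a child process, wait for it to finish, then prompt again. */
int main(void)
{
    char line[256];

    for (;;) {
        printf("$ ");
        fflush(stdout);
        if (fgets(line, sizeof line, stdin) == NULL)
            break;                         /* end of input: exit the shell */
        line[strcspn(line, "\n")] = '\0';  /* strip the trailing newline */
        if (line[0] == '\0')
            continue;

        pid_t pid = fork();                /* create a child process */
        if (pid == 0) {
            execlp(line, line, (char *)NULL); /* replace child with the program */
            perror(line);                  /* reached only if exec failed */
            exit(1);
        }
        waitpid(pid, NULL, 0);             /* the shell waits for the command */
    }
    return 0;
}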
 
AT&T UNIX
In 1973, Thompson rewrote UNIX in Ritchie's C programming language.
In 1977, they started creating (or porting) versions of UNIX that would run on mainframes and minicomputers other than the PDP-11 it was designed on.
 
1977 was also the year the first commercial version of AT&T UNIX was made available.
 
BSD UNIX
Universities were able to purchase the source code to the UNIX operating system.  The University of California at Berkeley paid the $400 fee for UNIX and its source code.
 
Two grad students, Bill Joy and Chuck Haley, started making significant changes to the source code.
In 1978, BSD UNIX was made available to the world for $50 a copy.
 
Multitasking is here!
Advantages BSD UNIX had over the AT&T version it started from:
BSD UNIX could switch between multiple programs running at the same time.
File names of up to 255 characters were supported, where AT&T UNIX allowed only 14.
BSD UNIX could easily connect to a local area network (LAN).
 
Advantages of UNIX
UNIX is considered a real operating system. Two qualifications earn it this description:
More than one person can access the computer UNIX is running on (multi-user).
Each user can run multiple applications/programs (multitasking).
UNIX will also run on just about every platform made.
 
Many companies purchased the source code and developed their own versions. The downside of UNIX is that even the PC version is very large.
 
A history of UNIX versions
Sun Microsystems and Digital Equipment Corporation both built their UNIX OSs (SunOS and Ultrix) from BSD 4.2.
 
In the early 1980s, Microsoft produced a third major version of UNIX, Xenix, and licensed it to the Santa Cruz Operation (SCO).  Xenix/SCO UNIX was based on AT&T's System III UNIX.
 
In the mid-1980s, Data General, IBM, Hewlett-Packard and Silicon Graphics built their UNIX OSs from AT&T's System V UNIX.
 
In the spring of 1988, AT&T and Sun Microsystems began joint development of a merge of System V and BSD.  The result was System V Release 4, the basis of Sun's Solaris.
 
In August 1988, the merge of Xenix and AT&T's System V was released as UNIX System V/386 release 3.12.
 
In 1993, AT&T sold  its UNIX Systems Laboratory (USL) to Novell as part of Novell's attempt to battle Microsoft Windows on the corporate desktop.
 
In 1994, Novell transferred the UNIX trademark to X/Open (later part of The Open Group).
 
In 1995, Novell sold its UNIX business to SCO.
 
In 2000, Caldera Systems purchased parts of SCO.
 
In 2002, Caldera Systems changed its name to the SCO Group.
 
In 2003, Novell bought SuSE.
 
 
 
A Penguin is Born!(3)
 
From: torvalds@klaava.Helsinki.FI (Linus Benedict Torvalds)
Newsgroups: comp.os.minix
Subject: What would you like to see most in minix?
Summary: small poll for my new operating system
Message-ID: <1991Aug25.205708.9541@klaava.Helsinki.FI>
Date: 25 Aug 91 20:57:08 GMT
Organization: University of Helsinki
Hello everybody out there using minix -
I'm doing a (free) operating system (just a hobby, won't be big and
professional like gnu) for 386(486) AT clones. This has been brewing
since april, and is starting to get ready. I'd like any feedback on
things people like/dislike in minix, as my OS resembles it somewhat
(same physical layout of the file-system (due to practical reasons)
among other things).
I've currently ported bash(1.08) and gcc(1.40), and things seem to work.
This implies that I'll get something practical within a few months, and
I'd like to know what features most people would want. Any suggestions
are welcome, but I won't promise I'll implement them :-)
Linus (torvalds@kruuna.helsinki.fi)
PS. Yes - it's free of any minix code, and it has a multi-threaded fs.
It is NOT protable (uses 386 task switching etc), and it probably never
will support anything other than AT-harddisks, as that's all I have :-(.
 
 
LINUX in the evolution
In 1991, Linus Torvalds released the first version of Linux to the public.
Its functionality was based on a PC-based version of UNIX called Minix.
The unusual thing was that he made the source code and the operating system available for free.
 
Today, Linux continues to be developed by a worldwide team, led by Linus, over the internet.
 
LINUX is FREE!?(4)
Linux uses no code from AT&T or any other proprietary source.
Much of the software development for Linux is done by the Free Software Foundation's GNU project.
Linux can be provided to the world free of charge.
 
LINUX system requirements
A base Linux install can run on an 80386 at 16 MHz with 2 megabytes of RAM and 200 megabytes of hard drive space.
 
The realistic minimum machine is an 80486 with 16 megabytes of RAM and 300 megabytes of hard drive space.
Current versions take advantage of multiprocessor motherboards.
 
LINUX advantages
Full multitasking and virtual memory.
Hard drive space is used to hold data in RAM that is not accessed frequently (swapping).
The X Window System GUI for UNIX systems.
Built-in networking support.
Shared libraries - common functions are kept in a central location and accessed by the programs that need them (see the sketch after this list).
Compatibility with the IEEE POSIX.1 standard.
Because of this standard, Linux supports many of the standards set forth for UNIX systems.
Lower cost than most other UNIX systems and UNIX clones.
GNU software support.
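 
As a small illustration of shared libraries (my sketch, not part of the original lesson), a C program on Linux can even attach one at run time through the dl interface; here libm.so.6, the GNU C library's math library, stands in as the shared object:
 
#include <stdio.h>
#include <dlfcn.h>   /* dlopen/dlsym: run-time access to shared libraries */

int main(void)
{
    /* Load the shared math library kept in a central location. */
    void *lib = dlopen("libm.so.6", RTLD_LAZY);
    if (!lib) {
        fprintf(stderr, "dlopen failed: %s\n", dlerror());
        return 1;
    }

    /* Look up the common function "sqrt" that it provides. */
    double (*sqrt_fn)(double) = (double (*)(double))dlsym(lib, "sqrt");
    if (sqrt_fn)
        printf("sqrt(2) = %f\n", sqrt_fn(2.0));

    dlclose(lib);
    return 0;
}
 
On glibc-based systems this sketch would be built with something like "cc demo.c -ldl"; ordinary programs get the same benefit automatically when the loader resolves their calls against shared libraries at start-up.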
 
Author's Notes:
 
1.  Since this was first written, more information has been released about Alan Turing under the "Official Secrets Act." He deserves a new chapter (an entire book!) dedicated to his work alone.
 
2.  Perhaps the first reference to the term "on line"?           :rolleyes: 
 
3.  Not meaning to brag, but I copied this out of an archived message from one of my old CompuServe subscription lists. Wow!  Was I one of the first to know?            :trumpet: 
 
4.  Free as in "Free to Use!", NOT free as in "Free Beer!"  I will post more on licenses at a later date.
 



 


#2 Rocky Bennett

Rocky Bennett

  • Members
  • 2,819 posts
  • OFFLINE
  •  
  • Gender:Male
  • Location:New Mexico, USA
  • Local time:07:46 PM

Posted 15 June 2015 - 01:24 PM

Thanks for that piece. Good read. Will there be a test later?




#3 Naught McNoone

Naught McNoone
  • Topic Starter

  • Members
  • 308 posts
  • OFFLINE
  •  
  • Gender:Male
  • Location:The Great White North
  • Local time:09:46 PM

Posted 15 June 2015 - 01:57 PM

. . . Will there be a test later?

 

Rocky,

 

Funny you should ask . . .    :)

 

During one of our monthly quizzes, a student once raised his hand in protest of another who was looking up all the answers on Google.  I asked if he had read the whole test before starting to answer the questions.  The last line was "This is an open book test."

 

Imagine how dull support forums would be if everyone "Googled" their problems first!

 

Cheers!

 

Naught



#4 cat1092

cat1092

    Bleeping Cat


  • BC Advisor
  • 7,018 posts
  • OFFLINE
  •  
  • Gender:Male
  • Location:North Carolina, USA
  • Local time:09:46 PM

Posted 16 June 2015 - 04:03 AM

Naught, great piece of history you wrote for us! :thumbup2:

 

I had no idea that the first ones were nearly 200 years old, though I knew that these have been in use from the 1950's to date. 

 

Wonder what the next 200 years will bring? I would like to think that at least in the next 10-15, a 128-bit CPU. However, it's been discussed time & time again, and from what I read, the 64-bit CPU will serve us for many years to come. While most are downsizing the physical CPU itself (the die), at the same time many more components are included, especially transistor count, & most consumer-based ones have inbuilt graphics as well, making it all the more amazing. 

 

I do recall my first computer, a 2000-model Dell with a whopping 20GiB HDD at the time, running Windows 2000. Even as recently as 2004, 40GiB HDDs were stock equipment in many computers, both notebooks & desktop PCs. 

 

And to think, today it's practically a given that a 1TiB HDD is included out of the box in desktop PCs & some notebooks. My first HDD upgrade was a 40GB one back in 2002, and it cost $200 at the time. There are flash drives larger than that for $35 or so today. And to think, that was all in the last 15 years. Of course, along with the HDD, though a bit slower, more RAM was added to the package. 

 

I'd say that we've come a long way in the last nearly 200 years.  :thumbup2:

 

Cat


Performing full disc images weekly and keeping important data off of the 'C' drive as generated can be the best defence against Malware/Ransomware attacks, as well as a wide range of other issues. 


#5 NickAu

NickAu

    Bleepin' Fish Doctor


  • Moderator
  • 13,562 posts
  • OFFLINE
  •  
  • Gender:Male
  • Location:127.0.0.1 Australia
  • Local time:12:46 PM

Posted 16 June 2015 - 05:31 AM

 

Charles Babbage's difference engine could only be made to execute tasks by changing the gears which executed the calculations

Isn’t that how Windows still runs today? It's slow enough.

 

Great post.



#6 mremski

mremski

  • Members
  • 498 posts
  • OFFLINE
  •  
  • Gender:Male
  • Location:NH
  • Local time:09:46 PM

Posted 16 June 2015 - 05:41 AM

Naught, nice summary.  The stories around BSD/AT&T/SCO can fill books (and they have).  Current Free/Net/Open/DragonFly BSDs also don't use any code that was deemed AT&T proprietary by the courts :)

 

Nick:  When was the last time anyone tried to run any version of Windows on the "minimum recommended hardware"?

 

cat:  Disk Drives.  Some of us remember working with 10MB drives the size of a dishwasher on a PDP-11, core memory where you could actually watch the bits flip, and running Linux on a 386SX where you had to buy a specific version of a SoundBlaster card if you wanted a CD-ROM.

 

Anyone else remember "a.out to ELF" conversions?  How about paged memory systems where you had to spawn a different process when things got too big?

 

CPUs, Moore's Law.  Today's lowest-end cell phone has more processing power than the computers used to send men to the moon. 


FreeBSD since 3.3, only time I touch Windows is to fix my wife's computer


#7 NickAu

NickAu

    Bleepin' Fish Doctor


  • Moderator
  • 13,562 posts
  • OFFLINE
  •  
  • Gender:Male
  • Location:127.0.0.1 Australia
  • Local time:12:46 PM

Posted 16 June 2015 - 05:53 AM

 

Nick:  When was the last time anyone tried to run any version of Windows on the "minimum recommended hardware"?

Remember Vista? That long ago.


Edited by NickAu, 16 June 2015 - 05:54 AM.


#8 mremski

mremski

  • Members
  • 498 posts
  • OFFLINE
  •  
  • Gender:Male
  • Location:NH
  • Local time:09:46 PM

Posted 16 June 2015 - 06:10 AM

 

 

Nick:  When was the last time anyone tried to run any version of Windows on the "minimum recommended hardware"?

Remember Vista? That long ago.

 

Yes, I remember Vista.  I remember DOS, Windows 3.x and Borland C compilers.


FreeBSD since 3.3, only time I touch Windows is to fix my wife's computer


#9 Naught McNoone

Naught McNoone
  • Topic Starter

  • Members
  • 308 posts
  • OFFLINE
  •  
  • Gender:Male
  • Location:The Great White North
  • Local time:09:46 PM

Posted 16 June 2015 - 09:24 AM

. . . the 64 bit CPU will serve us for many years to come . . .

 

"640K ought to be enough for anyone."

Attributed to Bill Gates, InfoWorld magazine, January 1990 (a quote Gates has denied making)

 

Cheers!

 

Naught



#10 Al1000

Al1000

  • Global Moderator
  • 7,979 posts
  • OFFLINE
  •  
  • Gender:Male
  • Location:Scotland
  • Local time:02:46 AM

Posted 16 June 2015 - 04:35 PM

Good read Naught. Thanks for the post.

#11 cat1092

cat1092

    Bleeping Cat


  • BC Advisor
  • 7,018 posts
  • OFFLINE
  •  
  • Gender:Male
  • Location:North Carolina, USA
  • Local time:09:46 PM

Posted 17 June 2015 - 01:25 AM

 

 

Nick:  When was the last time anyone tried to run any version of Windows on the "minimum recommended hardware"?

Remember Vista? That long ago.

 

 

Nick, bet it took nearly every second of 5 minutes to boot on the 'minimum recommended' settings.  :hysterical:

 

Microsoft was ready for Vista; however, it took the OEMs by storm, & it took until Windows 7 for them to catch up. By that time, most machines except netbooks (a fad with a long & expensive story) were running Windows 7 on 64-bit hardware, though some OEMs still cheaped out on the RAM, installing the 2GB minimum. 4GB made 7 more bearable, & so did SP2 for Vista installed on more modern hardware. 

 

No wonder you moved on to higher & lighter ground in a Linux OS. :thumbup2:

 

 

 

cat:  Disk Drives.  Some of us remember working with 10MB drives the size of a dishwasher

 

mremski, no, I don't recall it, yet it wasn't a lifetime ago either; I have heard many in the business just 12-15 years older than myself speaking of this, the RAM & more. In my younger years, unfortunately, I wasn't into computers; at the time it was a very expensive hobby, & many who did have computers throughout the '80s & '90s had to take out a loan to purchase a PC. It was only in the early 2000s that PCs were available to the masses, as Dell pumped out tens, if not hundreds of millions of $400-500 PCs in the first half of the first decade of this century, in what I presume to be a 'price war' between some of the larger OEMs. 

 

Most were low spec compared to those of today, yet a whole new & very large generation of computer users of all ages was born. Many of them have moved on to more modern standards, while a few have lagged behind. Unfortunately, I suspect that the day will come when even Puppy distros of all types will require PAE (some already do), and then most of these machines will be good only for the recycle center. Linux OSes won't keep on supporting these aging computers forever, or at least modern versions won't. 

 

The last of the LTS versions for these (Ubuntu 12.04 non-PAE/Linux Mint 13) expires in 2017, so we'll see then how things are progressing. There will come a point in time when newer software will need newer hardware (not just CPUs) to run on, & we're already seeing workarounds to force some of these components to run. 

 

 

 

Isn’t that how Windows still runs today? It's slow enough.

 

Install as much RAM as the computer will hold & a fast SSD, and even Windows will run on cruise control, though still a step behind Linux OSes. 

 

Naught, thanks for the article; a great read, with very much education in there :thumbup2:

 

I wonder if they teach the early part (nearly 200 years back) in the IT schools? 

 

Cat


Performing full disc images weekly and keeping important data off of the 'C' drive as generated can be the best defence against Malware/Ransomware attacks, as well as a wide range of other issues. 




