Everyone is familiar with the contributions of technology moguls such as Bill Gates and Steve Jobs. Likewise, computing history remembers people like Charles Babbage and John von Neumann. But there are many lesser-known, equally important figures whose contributions made our current “online” lifestyle possible, and made computing both affordable and physically small enough to be almost ubiquitous.
Number 10: Gordon E. Moore
Moore’s Law
Moore’s Law is the observation that the number of transistors on an integrated circuit doubles roughly every two years. Although it is more of a guideline than a law, the practical result is that as transistor density increases, you can buy more computing power for less money as time progresses.
Intel’s first Central Processing Unit (CPU), the 4004, had about 2,300 transistors on a single chip, whereas modern CPUs have 2 billion or more!
This concept also applies to storage and other aspects of computing: storage capacity seems to double every 24 to 36 months, while costs stay the same or decrease.
From a cost standpoint, a brand-new IBM PC in 1981 cost about $5,000 and had very basic capabilities. Now, you can buy a very decent laptop or tablet for about $500.
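To make the arithmetic concrete, here is a back-of-the-envelope sketch in C. It assumes the roughly 2,300-transistor starting point of Intel’s first CPU in 1971 and a strict two-year doubling period, which is a simplification of the real (and somewhat irregular) industry trend:

```c
/* Rough Moore's Law sketch: double the transistor count every 2 years.
   Starting point (an assumption for illustration): ~2,300 transistors in 1971. */
#include <stdio.h>

int main(void)
{
    double transistors = 2300.0;   /* assumed 1971 starting count */
    int year;

    for (year = 1971; transistors < 2e9; year += 2) {
        printf("%d: ~%.0f transistors\n", year, transistors);
        transistors *= 2.0;        /* one doubling per two-year step */
    }
    printf("~2 billion transistors reached around %d\n", year);
    return 0;
}
```

Run as written, the loop crosses the 2-billion mark around 2011, roughly 20 doublings later, which lines up reasonably well with shipping CPUs of that era.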
Co-founder of Intel Corporation
Intel Corporation made the CPUs that powered the early IBM personal computers, which eventually led to the gold standard of “Windows on a desktop” computing. Intel then went on to produce mobile chipsets and low-power CPUs that acted as a springboard for mobile computing in the form of laptops and tablets.
Although most early personal computers used CPUs from Zilog and Motorola, those competing standards fell by the wayside as Windows-on-Intel competitors such as Apple and Sun eventually switched to Intel CPUs.
Intel’s continued innovation provides low-power, embedded chips and chipsets that are positioned to bring about the next technology revolution: the “Internet of Things” (IoT), also known as machine-to-machine (M2M) computing.
Number 9: Ada Lovelace
First Computer Programmer
A collaborator of Charles Babbage, she developed sequences of instructions for Babbage’s Analytical Engine, a very early design for a mechanical, general-purpose computer and the successor to his Difference Engine calculating machine.
Today, all computer systems run a predefined sequence of instructions that we call “software”.
Ada Lovelace is generally recognized as the first “computer programmer”, and the programming language “Ada” is named after her.
Number 8: Ralph Baer
“Father of Video Games” and inventor of the Magnavox Odyssey
Computers and gaming are historically intertwined. Computing advances were often driven by new types of games that demanded new capabilities or faster hardware.
In the ’70s and ’80s, the first personal computers were often used almost exclusively for gaming. Extra storage meant that developers could make larger games. Faster CPUs meant more elaborate and more realistic games, while better graphics and multimedia capabilities provided entirely new gaming experiences.
Even today, gamers often have the newest technology and the fastest computers, so that they can have a more immersive experience while playing the newest games that demand better hardware performance.
Home video games started with one man’s invention. Before Atari, Ralph Baer built a system that displayed two player-controlled paddles on a television screen, letting players bounce a virtual ball back and forth. He eventually evolved this into the Magnavox Odyssey, one of the first home gaming consoles that could be connected to a TV.
The ball-and-paddle concept was so popular that numerous competitors and copycats, including Atari with Pong, brought out their own versions.
Number 7: Tim Paterson
Wrote 86-DOS
In 1980, IBM wanted a CP/M-like operating system for its in-development “Personal Computer”, the IBM PC. When IBM and Digital Research were unable to reach an agreement, IBM went to Microsoft, which brokered a deal with Seattle Computer Products for an operating system. There, Tim Paterson had written 86-DOS, the predecessor to MS-DOS and PC DOS.
After Seattle Computer Products sold 86-DOS to Microsoft, virtually all IBM-compatible desktop PCs ran MS-DOS or PC DOS (IBM’s licensed version of MS-DOS) through the 1980s and 1990s. Although DOS is no longer part of the operating system, its command-line interface lives on in modern versions of Microsoft Windows as the “Command Prompt”.
Early on, DOS was hard to use and was a real barrier to entry for many people who wanted to purchase and use a computer. Vendors initially bundled proprietary menu software with their computers so that users unfamiliar with DOS could launch their applications. Later, DOS itself included a menu system, and eventually the popularity and ease of use of Windows (which ran on top of DOS at the time) replaced the need for a menu system.
Although DOS was difficult for early PC adopters to use, its importance is that it provided a standard Application Programming Interface (API), allowing software developers to write code against a single specification that could run on virtually any IBM-compatible computer.
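As a purely illustrative sketch of what programming against that DOS API looked like, the fragment below asks DOS itself (via software interrupt 21h) to print a string instead of writing to the video hardware directly. It assumes a 16-bit DOS compiler such as Turbo C or Open Watcom, which supply <dos.h>, union REGS, and int86(), and it assumes the small memory model so that the string’s 16-bit offset alone identifies it:

```c
/* Hedged sketch: calling the DOS API (INT 21h, function 09h) to print a string.
   Assumes a 16-bit DOS compiler (e.g. Turbo C or Open Watcom) and the small
   memory model, so DS already points at the data segment holding msg. */
#include <dos.h>

int main(void)
{
    /* DOS function 09h expects a '$'-terminated string at DS:DX. */
    const char *msg = "Hello from the DOS API\r\n$";
    union REGS regs;

    regs.h.ah = 0x09;              /* AH = 09h: write string to standard output */
    regs.x.dx = (unsigned)msg;     /* DX = offset of the string (small model)   */
    int86(0x21, &regs, &regs);     /* invoke the DOS service dispatcher         */
    return 0;
}
```

Because every DOS machine exposed the same INT 21h services, the same program could run on any vendor’s PC-compatible hardware.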
Number 6: William Shockley / John Bardeen / Walter Brattain
Team that invented the Transistor
The transistor acts as a tiny electronic switch, allowing electricity to flow when a voltage is applied to the base. Transistors can be combined into the basic logic gates and memory circuits used by every digital computer.
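As a toy illustration (not a circuit simulator), the C sketch below treats a transistor pair as an idealized NAND gate and then composes NOT, AND, OR, and XOR from NAND alone, which is essentially how digital logic is built up from switches:

```c
/* Toy sketch: building logic gates out of an idealized transistor switch.
   This is an illustration of the idea, not a real circuit simulation. */
#include <stdio.h>

/* Two idealized switches in series behave like a NAND gate. */
static int nand(int a, int b) { return !(a && b); }

/* Every other gate can be composed from NAND alone. */
static int not_(int a)        { return nand(a, a); }
static int and_(int a, int b) { return not_(nand(a, b)); }
static int or_(int a, int b)  { return nand(not_(a), not_(b)); }
static int xor_(int a, int b) { return and_(or_(a, b), nand(a, b)); }

int main(void)
{
    for (int a = 0; a <= 1; a++)
        for (int b = 0; b <= 1; b++)
            printf("a=%d b=%d  AND=%d OR=%d XOR=%d\n",
                   a, b, and_(a, b), or_(a, b), xor_(a, b));
    return 0;
}
```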
Digital computing would not be possible today without the transistor, which replaced the physically larger and much more power-hungry vacuum tubes in use at the time, as well as other analog components.
The creation of the transistor ushered in the age of digital computing.
Number 5: Jack Kilby
Invented the Integrated Circuit
Integrated Circuits (ICs) allow multiple electronic components, such as transistors, capacitors, and resistors, to be miniaturized and combined on a single silicon chip.
CPUs, memory, and controller circuitry are all designed as ICs, whose small size and low power requirements facilitated the creation of the first “microcomputers” (personal computers).
Increasing levels of integration and component density follow Moore’s Law, delivering ever-increasing CPU power and storage at lower prices.
Today, a “system on a chip” (SoC) means that an entire computer, including all of its subsystems and components, can fit on a single IC! This level of integration and component density makes it possible to have small, mobile devices with vast capabilities. Not limited to tablets and phones, kits such as the Raspberry Pi allow inventors and students to run an entire operating system, and to write embedded applications, on tiny computers the size of a pack of cards.
Number 4: Marc Andreessen
Co-wrote NCSA Mosaic
Long before Internet Explorer, Chrome, or Firefox, there was NCSA Mosaic.
Mosaic was one of the first graphical web browsers and is credited with popularizing the internet. Before browsers, internet users had to rely on a disconnected set of tools to play games, view pictures, share files, and communicate.
The “web browser” was a simple user interface that made the internet insanely popular. Browsing, or “surfing” the ’net, became the “killer application” that drove both PC sales and demand for internet access from the mid-’90s onward.
Andreessen eventually left NCSA and co-founded Netscape, whose Navigator browser, an important and popular successor to Mosaic, helped define early browser and web standards and capabilities.
Secure Sockets Layer (SSL), JavaScript (browser scripting), embedded multimedia content, and many early extensions to HTML all came out of Mosaic and Netscape.
Number 3: Jon Postel
RFC Internet Standards
Jon Postel was the first RFC Editor. RFCs, or “Requests for Comments”, are the standards that govern how computers communicate with each other on the internet. They define standards and protocols for communicating, transferring files, sending e-mail, and even for how computers connect to each other.
Jon Postel wrote many of the early RFCs, effectively standardizing many protocols that were already in common use but varied by implementation.
Having RFC standards allows any type of machine running any operating system or application to talk to any other machine.
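As a small, hedged illustration of that interoperability, the C sketch below speaks one of those RFC-defined protocols, HTTP over TCP, to a remote web server. It assumes a POSIX system for the socket calls, and “example.com” is simply a placeholder host:

```c
/* Sketch: speaking an RFC-standardized protocol (HTTP over TCP) to a server.
   Assumes a POSIX system; "example.com" is just an illustrative host. */
#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <netdb.h>
#include <sys/types.h>
#include <sys/socket.h>

int main(void)
{
    struct addrinfo hints, *res;
    memset(&hints, 0, sizeof(hints));
    hints.ai_family   = AF_UNSPEC;     /* IPv4 or IPv6: the RFCs cover both */
    hints.ai_socktype = SOCK_STREAM;   /* TCP */

    if (getaddrinfo("example.com", "80", &hints, &res) != 0)
        return 1;

    int fd = socket(res->ai_family, res->ai_socktype, res->ai_protocol);
    if (fd < 0 || connect(fd, res->ai_addr, res->ai_addrlen) != 0)
        return 1;

    /* The request line and headers follow the HTTP RFCs, so any conforming
       server on any operating system can answer them. */
    const char *req = "GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n";
    send(fd, req, strlen(req), 0);

    char buf[512];
    ssize_t n = recv(fd, buf, sizeof(buf) - 1, 0);
    if (n > 0) {
        buf[n] = '\0';
        printf("%s\n", buf);           /* e.g. "HTTP/1.1 200 OK" plus headers */
    }
    close(fd);
    freeaddrinfo(res);
    return 0;
}
```

Any server that follows the HTTP RFCs, regardless of its hardware or operating system, can answer this request, which is exactly the point of having the standards.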
Number 2: Dennis Ritchie
Helped develop the “C” programming language
Among other very important contributions, Dennis Ritchie helped develop the “C” programming language.
C is the language used to write most of the Unix, Linux, and Windows operating systems, as well as much of the software that runs on them.
Known for performance and flexibility, C allows very efficient access to a computer’s memory while still providing high-level functionality. C can be extended almost without limit through precompiled libraries, or by including external code via header files.
Microsoft’s C#, Java, PHP, and JavaScript are all C-like languages descended from “C”.
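As a tiny, hedged example of that mix of low-level and high-level capability, the snippet below manipulates raw memory through a pointer and then leans on the standard library, pulled in through header files, for allocation and output:

```c
/* Minimal C sketch: direct memory access via pointers alongside
   high-level library calls pulled in through header files. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    /* Low level: grab a raw block of memory and walk it with a pointer. */
    char *buf = malloc(16);
    if (buf == NULL)
        return 1;
    strcpy(buf, "hello");
    for (char *p = buf; *p != '\0'; p++)  /* pointer arithmetic, byte by byte */
        *p = (char)(*p - 32);             /* crude ASCII lowercase-to-uppercase */

    /* High level: formatted output from the standard library. */
    printf("%s\n", buf);                  /* prints HELLO */
    free(buf);
    return 0;
}
```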
Helped write the Unix operating system
Unix was one of the first multi-user, multitasking operating systems. Linux was originally a Unix-like clone designed to run on IBM PC-compatible systems, but it has since been ported to thousands of platforms.
Many of the utilities and standards created originally for Unix still exist today.
Most of the internet’s servers now run Linux or Unix-like operating systems. Linux also runs on virtually every phone and tablet, on hosts of tiny embedded and appliance systems, and on a steadily growing share of desktops and laptops through operating systems like ChromeOS and Debian Linux.
Number 1: Alan Turing
Father of computing
Alan Turing invented a theoretical mathematical construct, the Turing machine, which provided a formal model of computation and a way to prove that different computing systems are equivalent in what they can compute.
The Turing machine used a hypothetical, infinitely long paper tape on which a series of “1” and “0” symbols could be written; the machine would then read and modify those symbols according to a series of instructions.
Not only is this the blueprint for all modern computers, with input/output to external storage, binary digits representing the on/off state of computer memory, shared program and data space, and a CPU that modifies data; the same construct is also the basis for the virtualization used in cloud computing, video games, business systems, and even application software.
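A minimal sketch of that idea in C appears below. It uses a small bounded array in place of the infinite tape, and its rule table, which is purely an illustrative assumption rather than one of Turing’s original machines, simply inverts every bit of the input and halts at the first blank cell:

```c
/* Minimal Turing machine sketch: a bounded tape standing in for the infinite
   one, a head, a current state, and a table of rules of the form
   (state, symbol) -> (write, move, next state).
   The rules below (an assumption for illustration) invert every bit, then halt. */
#include <stdio.h>
#include <string.h>

enum { LEFT = -1, STAY = 0, RIGHT = 1 };

struct rule { int state; char read; char write; int move; int next; };

static const struct rule rules[] = {
    { 0, '0', '1', RIGHT,  0 },   /* invert 0 -> 1, keep scanning right */
    { 0, '1', '0', RIGHT,  0 },   /* invert 1 -> 0, keep scanning right */
    { 0, ' ', ' ', STAY,  -1 },   /* blank cell: halt (state -1)        */
};

int main(void)
{
    char tape[64];
    memset(tape, ' ', sizeof(tape));      /* blank tape */
    tape[sizeof(tape) - 1] = '\0';
    memcpy(tape, "1101", 4);              /* the input written on the tape */

    int head = 0, state = 0;
    while (state != -1) {
        for (size_t i = 0; i < sizeof(rules) / sizeof(rules[0]); i++) {
            if (rules[i].state == state && rules[i].read == tape[head]) {
                tape[head] = rules[i].write;
                head += rules[i].move;
                state = rules[i].next;
                break;
            }
        }
    }
    printf("final tape: %s\n", tape);     /* prints 0010 followed by blanks */
    return 0;
}
```

Swapping in a different rule table gives a different “program”, which is exactly the sense in which the Turing machine is the blueprint for a stored-program computer.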
Other Contributions
At Bletchley Park, Alan Turing designed an electromechanical code-breaking machine called the “bombe”, which identified the daily settings of German Enigma machines, allowing the Allies to decipher Nazi communications and gain access to critical strategic information.
Anticipating a time when computers would be artificially intelligent, Turing developed a test in which an interviewer asks a series of questions; if the interviewer can’t tell whether the subject is “real” or “artificial”, then the subject is essentially “real”. This concept was illustrated in the movie “Blade Runner”. Today, CAPTCHA, the annoying series of letters and numbers that some web sites require in order to “prove you are human”, is based on the Turing Test.
Turing created or significantly advanced several branches of mathematics, including mathematical biology.
Summary
Although high-profile tech moguls tend to get much of the attention and credit, their success is built on many lesser-known and extremely important contributions to modern computing, and to our “online” lifestyle.