WWII to WWW: The Evolution of The Computer

As a kid in the late 80s, if there was one thing I championed, it was the computer. My grandmother had a Tandy Deskmate and it was my most-prized outlet for exploration and entertainment. There was no colour printing, there was no Internet, and there were only a handful of applications available. But boy, was it fun! That era of computing may seem pretty limited by today’s standards, but even at that point, the computer had come a long way. This article discusses the evolutionary timeline of the computer.

WWII and Military Use

Throughout WWII, many tactical computers were designed and built for military use. In 1941, J.V. Atanasoff, a professor of physics and mathematics at Iowa State, partnered with a graduate student named Clifford Berry to design the Atanasoff-Berry Computer (ABC). The ABC could solve systems of up to 29 simultaneous linear equations, an impressive feat for its time. Despite this advancement, the ABC's place in modern computer history has been debated, as it was neither programmable nor "Turing complete" – a concept named after Alan Turing, describing a machine whose operation can be directed by internally stored code, or instructions, held in the computer's memory.
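For context, solving simultaneous linear equations, the ABC's specialty, is routine on any machine today. The following is a minimal Python sketch of Gaussian elimination, one standard method for the job (the ABC itself used its own drum-based elimination process; the function name here is purely illustrative):

```python
def solve_linear_system(a, b):
    """Solve Ax = b by Gaussian elimination with partial pivoting.

    a: list of n rows, each a list of n coefficients; b: list of n constants.
    """
    n = len(a)
    # Build an augmented matrix so row operations update b alongside A.
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(n):
        # Partial pivoting: bring up the row with the largest coefficient.
        pivot = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[pivot] = m[pivot], m[col]
        # Eliminate this column from every row below the pivot.
        for r in range(col + 1, n):
            factor = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= factor * m[col][c]
    # Back substitution, from the last row upward.
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][c] * x[c] for c in range(r + 1, n))) / m[r][r]
    return x

# A two-equation example: x + y = 3 and 2x - y = 0 give x = 1, y = 2.
print(solve_linear_system([[1, 1], [2, -1]], [3, 0]))
```

The ABC handled up to 29 such equations in 29 unknowns, which at the time meant days of hand computation.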

Konrad Zuse, a German engineer, completed the Z3 in 1941, the first fully programmable digital computer (and one later shown, in principle, to be Turing complete). It performed aerodynamic calculations for two years before it was destroyed in an Allied bombing raid on Berlin. Meanwhile, the British Bombe (not to be confused with bombs!) was designed by Alan Turing and engineered by Harold Keen, building on Polish code-breaker Marian Rejewski's 1938 "Bomba." First run in 1940, the British machine went on to decrypt Nazi Enigma messages.

Construction of the ENIAC began in 1943 under John Mauchly and J. Presper Eckert, two professors at the University of Pennsylvania. The ENIAC is considered the first programmable, general-purpose electronic computer, functioning without the use of slower mechanical parts. While able to perform a multitude of calculations, its primary design function was computing artillery firing tables, making it another WWII relic. At the request of the US Army, Bell Laboratories would develop the Relay Interpolator, a relay-based computer used in directing large guns toward their targets.

The Colossus was designed in 1943 by British engineer Tommy Flowers to crack complex codes used by the Nazis during the war. Composed of over 2,000 vacuum tubes (transistors were still years away), the Colossus significantly lessened the time required to decrypt the Lorenz cipher, which the German High Command used to encrypt its messages, reducing the process from weeks to mere hours. Many historians believe the use of the Colossus shortened the war significantly as a result. It remained secret for over 25 years, revealed to the public only in the 1970s.

The 50s to the 80s

Throughout the 1950s, computers became much more programmable, and as a result, various programming languages such as FORTRAN and COBOL emerged. FORTRAN was developed beginning in 1954 by John Backus and a programming team at IBM. COBOL, which appeared in 1959 and drew heavily on the earlier work of computer scientist Grace Hopper, was among the first computer languages to use English words, as opposed to purely number-based instructions. A 1997 study estimated that more than 200 billion lines of COBOL were still in use at that time.

By the mid-60s, computers were no longer exclusively designed or produced for scientists and mathematicians. For example, Digital Equipment Corporation’s PDP-8 computer saw businesses and manufacturing plants adopting the technology for word processing and mathematical calculations. The PDP-8 is considered to be the first of many commercially successful minicomputers.

Hewlett-Packard (HP), founded by David Packard and Bill Hewlett in a garage in Palo Alto, California, in 1939, wouldn't create its first computer until 1966. The HP 2116A was a commercially available instrument controller for a variety of HP's programmable test products. While originally designed for instrumentation markets (counters, electronic thermometers, voltmeters, etc.), it was unexpectedly adopted in business data processing markets as well. This led to streamlined versions, such as the HP 2115A and 2114A, which removed much of the expansion capability, lowered costs, and increased value in newer commercial markets.

In 1973, Xerox's Alto computer was the first computer to support an operating system based on a graphical user interface (GUI), a decade before such a format's wider adoption. The Alto also combined many common features of today's computers, such as a mouse, printing, networking, word processing, and even email. Its influence on the computer industry, including Apple's later Lisa and Macintosh computers, was truly ground-breaking.

Enter Apple. After catering to the hobbyist community with the Apple I (a bare circuit board with monochrome output and no keyboard or monitor), the company set its sights on something bigger, introducing the Apple II in 1977. Boasting colour graphics, an integrated keyboard, game controllers, and cassette-tape storage, the Apple II and its successors, including the Apple IIe, would go on to sell millions of units over the following 16 years, making it one of the longest-lasting personal computer lines of all time.

Tandy, Atari, IBM and Commodore would all try their hand at the personal computer in the five years following Apple's innovations. IBM released a series of personal computers, starting with the IBM 5150 PC in 1981. Setting the standard for personal computers well into the millennium, the 5150 PC was powered by an Intel CPU, paired with floppy disk drives and a colour monitor, all running Microsoft's Disk Operating System (MS-DOS).

The term “PC,” short for personal computer, started catching on as a result of IBM’s 5150 PC. While IBM attempted to attract the general public to the PC through accessible consumer advertising, the PC found its footing in the corporate world first. Not long after, IBM’s product was cloned by companies such as Compaq. These companies branded their new products as “IBM-compatible.” Genuine IBM PCs, as well as their compatible clones, would eventually achieve 83 percent of the PC market share by 1996.

In 1982, the Commodore 64 was launched with impressive graphics, thousands of software titles, and a low price tag. It was a resounding success, selling an estimated 12 to 22 million units over the 12 years that followed. To this day, the Commodore 64 holds the Guinness World Records title for the best-selling single computer model of all time.

Over the 64's lifespan, Commodore continued to cut production costs, making its price tag increasingly affordable as time went on, which contributed to its sales and success. Thanks to its built-in BASIC programming language, it continued to be revered by hackers and hobbyist video game creators long after the hardware became obsolete. The rest of the 80s only got wilder, bringing the once-alien world of computing to schools, small businesses, and homes around the globe.

In 1983, Apple's Lisa model became the first personal computer sold with a graphical user interface (GUI), featuring both drop-down menus and icons. Despite its trail-blazing nature, the Lisa did not fare well commercially, but it served as a template for the Windows-based and Macintosh computers we know and love today.

The Macintosh introduced itself to the world during the 1984 Super Bowl broadcast, in a commercial that drew on the totalitarian theme of George Orwell's novel 1984 and cast IBM as "Big Brother." Apple conveyed its personal touch to computing as the fun, friendly, down-to-earth alternative to sterile corporate behemoths, a tactic it has continued to pursue ever since.

In competition with Apple's GUI-based operating system, Microsoft released the first version of Windows in 1985. That year also saw the first-ever dot-com domain name registration, Symbolics.com, years before the World Wide Web existed.

The 90s to Now

During the 15 years following the innovations of the 80s, we saw the advent of the laptop. Apple's Macintosh Portable, introduced in 1989, was eventually redesigned and rebranded as the PowerBook, which became a common template for laptop design in the 90s. Within this decade, HyperText Markup Language (HTML), the World Wide Web (WWW), widespread Internet access, and Windows 95 entered our lives. Microsoft launched Windows 95 in 1995 with a $300 million promotional campaign featuring the song "Start Me Up" by The Rolling Stones.

While we have continued to see technology develop and permeate our contemporary culture ever since the late 80s, the PC itself has not evolved as rapidly in the past 20 years as it did in its first 20. The Internet has become exponentially faster, and we have tablets and smartphones acting as mini-computers in our pockets, but the desktop PC has remained roughly the same for 40 years. Between 1981 and now, the PC has consisted of a computer tower (a case containing the CPU and other components), a display, a mouse and a keyboard. Today's PCs are markedly faster and can do far more impressive tasks, but the form factor, the peripherals, and the overarching concept have not changed.

Roughly 10 years ago, people spoke of the elimination, or obsolescence, of the desktop PC. We laugh now, as the PC's tried, tested and true technology is still found in nearly every business (big or small), in stores of all types, and in the basements of all gamers. Considered the linchpin of modern technology, the computer is now more akin to a common household appliance than to a fad, gadget, or gimmick.

Author: Steve Gagnon
