From monstrosity to laptop: the story of the personal computer
What began as a “bizarre” fantasy has now become reality.
In October 1950, the American researcher and peace activist Edmund Berkeley (1909-1988) introduced the world’s first personal computer in Radio-Electronics magazine.
The parts for the computer, which Berkeley called “Simon,” cost US$600, and he sold upwards of 400 assembly guides.
Berkeley was driven by a desire to find everyday applications for the new computer technology and imagined that, one day, we’d all have tiny computers in our homes.
These “giant brains,” as he called them, would be able to help our children with homework, help us remember things, and manage our finances.
Few shared Berkeley’s enthusiasm
Back then, a single computer typically filled up an entire room – and required skilled technicians for operation. Berkeley’s vision was, in other words, bizarre, and few shared his enthusiasm for the home computer.
The expression “personal computer” appeared 12 years later in The New York Times when the computer scientist John W. Mauchly (1907-1980) shared his vision of the computer’s future:
“There is no reason to suppose the average boy or girl cannot be master of a personal computer.”
A few years later, the first programmable calculators entered the market. First the Olivetti Programma 101 in 1965 and then Hewlett-Packard’s 9100A three years later.
Hewlett-Packard initially branded the 9100A as a personal computer but quickly abandoned the concept. People simply couldn’t reconcile the word “computer” with a machine so small and user-friendly.
Building the mainframe computer
During the 1960s, both businesses and research institutions invested in user-friendly computer terminals.
This allowed more users to share the same computer—a mainframe computer. Computer programmers developed early text editors, graphical design programs, and games on these timeshare systems.
Another development was the minicomputer, although there was nothing “mini” about it by modern standards: the average minicomputer back then was the size of a regular fridge.
Nor was the price “mini.” Most minicomputers cost US$16,000 or more. But they were still smaller and cheaper than the mainframes, making them accessible to smaller businesses and research units.
The Xerox Alto minicomputer from 1973 is a good example of an early prototype of the personal computer as we know it today:
A separate display, keyboard, and mouse placed on the desk, an operating system with a graphical user interface, and user-friendly programs.
The microprocessor marked a new direction
It wasn’t until the development of the microprocessor in the late 1960s and early 1970s that computers reached a size and price appropriate for the nickname “personal.”
A microprocessor contains the central processing unit (CPU) on a single integrated circuit. This invention led to drastic decreases in both price and size.
The first microprocessor to reach the market was Intel’s 4004, produced in 1971 and sold for as little as US$60. The year after, Intel released the 8008 with an 8-bit bus—a doubling of the 4004’s 4-bit architecture.
In July 1974, the magazine Radio-Electronics featured the Mark-8 “personal minicomputer” on its front page. The design, based on the Intel 8008, was put together by engineering students who had previously worked on a DEC PDP-8/L minicomputer.
The PDP-8 was one of the first commercially successful minicomputers, and it only took up the space of roughly two modern microwave ovens.
Microcomputers became a hobby
After seeing the Mark-8, the editors of Popular Electronics developed a similar computer design, releasing it in January 1975 as the Altair 8800 at a price of US$621.
Slowly, the word “microcomputer” gained traction as more and more computer designs based on microprocessors hit the market.
They were sold to hobbyists with an interest in electronics—a group which had formed in the late 1960s, before the development of the microprocessor.
As early as 1968, American bookshops sold books like How to Build a Working Digital Computer, which guided readers through the process of creating their own computer with paper clips and aluminium cans.
Clubs for computer enthusiasts spread throughout the country in the early 1970s. In California, enthusiasts gathered for the first time in March 1975 to establish the Homebrew Computer Club.
The highlight of the meeting was the new Altair 8800 computer. One of the participants was Steve Wozniak (b. 1950). Inspired by the meeting, he started developing his own computer kit.
Along came Apple
Together with his friend Steve Jobs (1955-2011), Wozniak established the company Apple, releasing the Apple I computer kit in 1976 at a price of US$666.66.
The market for home computers grew rapidly and demand increased as people beyond computer enthusiasts started to realise the potential. In 1977, the Commodore PET (Personal Electronic Transactor) became a popular product among schools and private homes.
The Apple II was – unlike its predecessor – an assembled product ready for use. Apple launched it the same year as the Commodore PET with the slogan: “the home computer that's ready to work, play and grow with you.”
Commodore marketed the VIC-20, a follow-up to the PET, in 1980 at the affordable price of US$300. It was a competitor to the emerging gaming consoles, and it became the first computer to sell more than one million units.
Meanwhile, the bigger computer companies were also beginning to see the potential in the personal computer market. IBM launched the 5100 model in 1975. The advertisements said it was the first “portable” computer.
Portability was expensive
The cheapest version of the 5100 cost almost US$9,000. IBM launched its successor, the IBM 5150, as the IBM Personal Computer in 1981.
People soon began to refer to the IBM Personal Computer simply as “the PC”. With its competitive price of US$1,565, it was a huge success, and more than half of the units sold went to private customers.
The success inspired other computer manufacturers like Compaq, Dell, and Hewlett-Packard to produce IBM-compatible personal computers. IBM’s PC became a market standard for personal computers with Apple as the only real challenger.
Graphical user interfaces became necessary
IBM initiated a partnership with Bill Gates (b. 1955) and Microsoft to develop a new operating system, DOS (Disk Operating System). As Apple’s Mac OS had a graphical user interface, Microsoft developed one of its own under the trade name Windows.
Apple released Mac OS with the first Macintosh computer in 1984; Microsoft’s Windows 1.0 followed the year after.
In 1993, IBM broke with Microsoft, and Microsoft then began developing what would become Windows 95. Microsoft’s new operating system was a tremendous commercial success and became standard on most PCs around the world.
First laptop is hard to track down
The origins of the personal computer are murky, and things get even hazier when it comes to the laptop. While IBM marketed its 5100 as a “portable” computer, it was very different from what we call a laptop today.
The Gavilan SC computer from 1983 was the first computer sold as a laptop.
It cost US$4,000, weighed four kilograms, and had the characteristic clamshell design of most modern laptops.
IBM created their first laptop in 1986. It was slightly heavier and a bit more expensive than the Gavilan. In 1989, Apple followed suit with the Macintosh Portable.
Stagnation due to smartphones and tablets
Over the past 20 years, the market for personal computers grew from 70 million units sold in 1996 to 316 million in 2013.
In 2011-2012, sales began to stagnate, and in 2013 they decreased by ten per cent.
Many suspect that the influx of tablets and smartphones was a direct cause of this.
In October 2014, however, most of the major computer producers reported that computer sales were growing again.
The personal computer has become something that almost everyone owns. Berkeley’s vision of a small computer in every home is no longer a “bizarre” fantasy; it is reality.
Translated by: Kristian Secher