Saturday, June 9, 2018

40 years of "standard architecture" personal computers


I hadn't remembered the anniversary, but ExtremeTech reminds us that 40 years ago, the Intel 8086 chip came onto the market - and things have never been the same since.  (Image below is courtesy of Wikipedia.)

Forty years ago today, Intel launched the original 8086 microprocessor — the grandfather of every x86 CPU ever built, including the ones we use now.



This, it must be noted, is more or less the opposite outcome of what everyone expected at the time, including Intel.

. . .

Initially, the 8086 was intended to be a stopgap product while Intel worked feverishly to finish its real next-generation microprocessor — the iAPX 432, Intel’s first 32-bit microprocessor. When sales of the 8086 began to slip in 1979, Intel made the decision to launch a massive marketing operation around the chip, dubbed Operation Crush. The goal? Drive adoption of the 8086 over and above competing products made by Motorola and Zilog (the latter founded by former Intel employees, including Federico Faggin, lead architect on the first microprocessor, Intel’s 4004). Operation Crush was quite successful and is credited with spurring IBM to adopt the 8088 (a cut-down 8086 with an 8-bit bus) for the first IBM PC.

One might expect, given the x86 architecture’s historic domination of the computing industry, that the chip that launched the revolution would have been a towering achievement or quantum leap above the competition. The truth is more prosaic. The 8086 was a solid CPU core built by intelligent architects backed up by a strong marketing campaign. The computer revolution it helped to launch, on the other hand, transformed the world.

All that said, there’s one other point we want to touch on.

It’s Been 40 Years. Why Are We Still Using x86 CPUs?

. . .

Intel tried to replace or supplant the x86 architecture multiple times ... Far from refusing to consider alternatives, Intel literally spent billions of dollars over multiple decades to bring those alternative visions to life. The x86 architecture won these fights — but it didn’t just win them because it offered backwards compatibility. We spoke to Intel Fellow Ronak Singhal for this article, who pointed out a facet of the issue I honestly hadn’t considered before. In each case, x86 continued to win out against the architectures Intel intended to replace it because the engineers working on those x86 processors found ways to extend and improve the performance of Intel’s existing microarchitectures, often beyond what even Intel engineers had thought possible years earlier.

. . .

Will we still be using x86 chips 40 years from now? I have no idea. I doubt any of the Intel CPU designers that built the 8086 back in 1978 thought their core would go on to power most of the personal computing revolution of the 1980s and 1990s. But Intel’s recent moves into fields like AI, machine learning, and cloud data centers are proof that the x86 family of CPUs isn’t done evolving. No matter what happens in the future, 40 years of success are a tremendous legacy for one small chip — especially one which, as Stephen Morse says, “was intended to be short-lived and not have any successors.”

There's more at the link.

I feel old again . . . I can recall personal computers prior to the 8086.  They were basically hobbyist machines, without real commercial potential.  The 8086, plus MS-DOS (the operating system IBM had Microsoft develop for its new PC), changed everything.  Suddenly there was enough horsepower to run practical applications such as word processing, spreadsheets and rudimentary databases.  The office environment was almost completely transformed within a decade.

At the time of the 8086's introduction, I was involved with mainframe computers (IBM System/370 architecture under OS/VS1 and VM, for those of you who remember such things).  In the early 1980s I "cross-shipped" to minicomputers (DEC VAX) and microcomputers (the IBM PC architecture), and ended up managing an end-user computing department before returning to more complex computers.  I was thus one of the originals when it came to implementing the 8086 architecture as a business solution rather than a technological gee-whiz gizmo.  Today, of course, the (relatively low-powered, not-top-end) smartphone in my pocket has many times more computing power than those original IBM PCs and XTs, and their clones.

Ah, memories . . .

Peter

20 comments:

Flugelman said...

Memories indeed. I started out in the civilian computing world working with Ohio Scientific systems, writing BASIC-in-ROM applications around a 6502. From there to Digital Research CP/M, trying to build a multi-user system for medical office support. The advent of the IBM PC was a competitive threat to our systems at the time, but I could see what was coming. Convincing the bean counters to invest in a new architecture at a time when we were making money with old technology was a futile exercise.

Old NFO said...

Hacker's Club at Stanford 1973... sigh

Unclezip Is Pointing&Laughing said...

I recall playing with the 4004, 8008, 8085, Z80...

So old.

Uncle Lar said...

Space Shuttle flight systems were run off 386SX processors long after more powerful chips were available.
Two reasons: first, because changing the configuration of flight systems is a massive undertaking; but the main reason was that the 386SX was remarkably impervious to space radiation upsets. Faster, more powerful processors, such as those incorporated into the onboard ThinkPad laptops, would regularly require a reset and often a full reboot - not something really desirable in your flight or life support systems.

MrGarabaldi said...

Hey Peter;

my first "real computer" was an "XT" I bought in 1991 and I still have the keyboard. There is something for compatibility. My keyboard uses a DIN5 plug, with a PS2 adapter on it followed by a PS2 to USB adapter. I am running Windows 10 and using the same keyboard, LOL. There is something for the continuity I suppose.

Tregonsee said...

Peter, I think you'll find that your average cell phone of today would kick ass and take names amongst the DEC VAX-11/780, IBM 370s, etc. Heck, I sat and did the math, and I think a $35 Raspberry Pi 3 (based on a Broadcom phone chip) would crush a late-'80s Cray-1 without breathing hard. And it is amazing the x86 architecture did so well. What helped, I think, is that moving the pile of existing software was just too hard. The x86 is losing dominance, though. Just heard last year that tablet sales exceeded those of desktop PCs. But people have thought that before, and the Intel designers have pulled a rabbit out of their hats...

Jon said...

So, the core really isn't the same. That's been redesigned a number of times. What's the same is the addressing architecture and the instruction set. The 8086 had no math coprocessor (floating point was done in software), no cache, no pipelining, no multi-threaded cores, no branch prediction.

It's sort of like saying we are really still little lemur-like animals, instead of fully developed humans.

RG Ainslee said...

About 1958, I visited the computer room at Kelly Air Force Base in San Antonio. The computer was about the size of a small mobile home. They said the same computing power after WWII would have filled an aircraft hangar.

1964 in the Army, our computer was the size of a compact SUV. I spent many frustrating hours banging out data input on yellow teletype tape. One mistake and you had to start over.

My first real PC was a Radio Shack machine that ran MS-DOS - a real disk operating system; I had to load the operating system from a floppy disk. It worked fine and got me through grad school.

Borepatch said...

RACF?

Peter said...

@Borepatch: Yes, among other things. Early RACF - the one with all the bugs.

The Lab Manager said...

Too bad CP/M was not competitive enough with MS-DOS. It would have been nice to have more competition in computer architectures and operating systems: Amiga, Atari ST, NeXT, Mac, and the PC; GEM, GEOS, Windows, and then Linux.

Ray - SoCal said...

Intel did a brilliant job of marketing against Motorola. Marketing High Technology is a great book that covers Operation Crush:


https://www.amazon.com/Marketing-High-Technology-William-Davidow/dp/002907990X

Andrew Smith said...

Who can forget the sound of a dot matrix printer late at night? A sure sign that somebody had an assignment due the next day.

CGR710 said...

The problem with dominant technologies is that they tend to propagate their bugs as well. See the whole Meltdown/Spectre story, which is going to plague us for a looong time...

Andy said...

Memories indeed. Years ago, but long after IBM "legitimized" the PC market with their platform, I had a conversation with a friend who owns a manufacturing biz. As they were fairly diverse, I cannot say exactly what, but he said they still used outdated Commodore 64 computers to control certain aspects of their manufacturing because they were cheap and they did the job. To me this answers your question about the x86.

On another note, I remember when we finally broke down and purchased our first IBM clone - a 286. I convinced my wife to splurge and buy a 20 MB hard drive, explaining that it had all the storage we would ever need.

TheOtherSean said...

IIRC there were GPUs and ARM CPUs impacted by at least some of the problems in the Meltdown/Spectre mess, so it wasn't just having a dominant technology that was the problem (unless you mean register-based processors generally, as opposed to the x86 architecture).

Roy said...

My first personal hard disk PC was also a 286 XT with the 20 MB hard drive. Back then, it would have been hard to fill that space, but today I have a digital camera whose picture files are bigger than that.

Anybody here remember the 80186? It predated the 286 but was almost entirely used in embedded systems such as process controllers.

I am now 64 years old and I have been in computing a long time - since the mid-'70s - and I am still amazed at how far the technology has come. I have a thumb drive I keep on my key ring that holds over a thousand times more data, and is much faster, than the old 10 MB, washing-machine-sized hard drives from those days.

Ben Yalow said...

Note that the IBM/360 (and later /370) series mainframes still used a 24-bit addressing scheme (at most 2^24 bytes, i.e. 16 MB), so memory was limited (and expensive). But Assembler E was designed to run on a 32K 360, and even high-level languages like PL/I were designed to run on an F (64K) machine -- note that once the smallest of the operating systems was included, that left about 45K for the compiler to fit in.

CGR710 said...

I was thinking of the pipelined speculative execution approach, which is part of the dominant architecture of practically all CISC processors...

rick said...

When we bought the PC/AT for our physics lab at college, we sprang for the extra 80287 math co-processor.