I hadn't remembered the anniversary, but ExtremeTech reminds us that 40 years ago, the Intel 8086 chip came onto the market - and things have never been the same since. (Image below is courtesy of Wikipedia.)
Forty years ago today, Intel launched the original 8086 microprocessor — the grandfather of every x86 CPU ever built, including the ones we use now.
This, it must be noted, is more or less the opposite outcome of what everyone expected at the time, including Intel.
. . .
Initially, the 8086 was intended to be a stopgap product while Intel worked feverishly to finish its real next-generation microprocessor — the iAPX 432, Intel’s first 32-bit microprocessor. When sales of the 8086 began to slip in 1979, Intel decided to launch a massive marketing operation around the chip, dubbed Operation Crush. The goal? Drive adoption of the 8086 over and above competing products made by Motorola and Zilog (the latter founded by former Intel employees, including Federico Faggin, lead architect on the first microprocessor, Intel’s 4004). Operation Crush was quite successful and is credited with spurring IBM to adopt the 8088 (a cut-down 8086 with an 8-bit bus) for the first IBM PC.
One might expect, given the x86 architecture’s historic domination of the computing industry, that the chip that launched the revolution would have been a towering achievement or quantum leap above the competition. The truth is more prosaic. The 8086 was a solid CPU core built by intelligent architects backed up by a strong marketing campaign. The computer revolution it helped to launch, on the other hand, transformed the world.
All that said, there’s one other point we want to touch on.
It’s Been 40 Years. Why Are We Still Using x86 CPUs?
. . .
Intel tried to replace or supplant the x86 architecture multiple times ... Far from refusing to consider alternatives, Intel literally spent billions of dollars over multiple decades to bring those alternative visions to life. The x86 architecture won these fights — but it didn’t win them on backwards compatibility alone. For this article we spoke to Intel Fellow Ronak Singhal, who pointed out a facet of the issue I honestly hadn’t considered before. In each case, x86 won out against the architectures Intel intended to replace it with because the engineers working on those x86 processors found ways to extend and improve the performance of Intel’s existing microarchitectures, often beyond what even Intel’s own engineers had thought possible years earlier.
. . .
Will we still be using x86 chips 40 years from now? I have no idea. I doubt any of the Intel CPU designers who built the 8086 back in 1978 thought their core would go on to power most of the personal computing revolution of the 1980s and 1990s. But Intel’s recent moves into fields like AI, machine learning, and cloud data centers are proof that the x86 family of CPUs isn’t done evolving. No matter what happens in the future, 40 years of success is a tremendous legacy for one small chip — especially one which, as Stephen Morse says, “was intended to be short-lived and not have any successors.”
There's more at the link.
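Singhal's point about extending rather than replacing the architecture is easy to see from the software side, by the way. Here's a minimal C sketch of what that backwards-compatible evolution looks like in practice, assuming GCC or Clang on an x86 machine (the __builtin_cpu_* calls are compiler-specific, not standard C): the same program runs integer code whose instructions descend directly from the 1978 chip, while probing at run time for extensions that were bolted on decades later.

#include <stdio.h>

int main(void) {
    /* This loop compiles to mov/add/cmp/branch instructions whose direct
     * ancestors were already present on the original 8086. */
    unsigned sum = 0;
    for (unsigned i = 1; i <= 10; i++)
        sum += i;
    printf("8086-style integer math: sum = %u\n", sum);

    /* Run-time detection of later ISA extensions. Each of these was added
     * to x86 over the decades without breaking older code. */
    __builtin_cpu_init();
    printf("mmx:     %s\n", __builtin_cpu_supports("mmx")     ? "yes" : "no");
    printf("sse2:    %s\n", __builtin_cpu_supports("sse2")    ? "yes" : "no");
    printf("avx2:    %s\n", __builtin_cpu_supports("avx2")    ? "yes" : "no");
    printf("avx512f: %s\n", __builtin_cpu_supports("avx512f") ? "yes" : "no");
    return 0;
}

On a recent desktop CPU the first three checks will typically report "yes", and a binary compiled this way will still run (minus the newer features) on much older x86 hardware - which is precisely the compatibility story the article describes.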
I feel old again . . . I can recall personal computers prior to the 8086. They were basically hobbyist machines, without real commercial potential. The 8086 family (the first IBM PC actually used its cut-down sibling, the 8088), plus Microsoft MS-DOS (the operating system IBM commissioned from Microsoft for the PC), changed everything. Suddenly there was enough horsepower to run practical applications such as word processing, spreadsheets, and rudimentary databases. The office environment was almost completely transformed within a decade.
At the time of the 8086's introduction, I was involved with mainframe computers (IBM System/370 architecture under OS/VS1 and VM, for those of you who remember such things). In the early 1980s I "cross-shipped" to minicomputers (DEC VAX) and microcomputers (the IBM PC architecture), and ended up managing an end-user computing department before returning to more complex computers. I was thus one of the originals when it came to implementing the 8086 architecture as a business solution rather than a technological gee-whiz gizmo. Today, of course, the (relatively low-powered, not-top-end) smartphone in my pocket has many times more computing power than those original IBM PCs and XTs, and their clones.
Ah, memories . . .