These days, AMD CPUs can be found in a myriad of products: laptops, PCs, even single-board computers, consoles, phones, and Teslas. Many of these products are compelling, offering competitive performance against Intel systems, particularly in integrated graphics.
But AMD wasn’t always competitive. For almost ten years before Ryzen, AMD CPUs were completely pointless to buy; and before that, they were the only good choice. Want to know when? Read on, as this article outlines AMD’s history.
Early History: pre-1995
AMD has existed as a company since 1969. Some of its earliest products were the Am2900 series of chips. These weren’t whole CPUs; rather, they were chips that implemented individual parts of a system, such as the Am2901 ALU, the Am2905 bus transceiver, or the Am2914 priority interrupt controller. As building computers from individual chips was common in the 1970s, this made sense. They were used in many computers, including some early Hewlett-Packard systems and even some Atari arcade machines.
Later, AMD made clones of the Intel 8080 as the Am9080. Initially, these were reverse-engineered, but AMD later obtained a license from Intel to produce them as a second-source manufacturer.
They also cloned later Intel designs, such as the 386 and 486, and produced the enhanced 486-class Am5x86, which offered performance comparable to a Pentium while remaining compatible with most later 486 motherboards. It wasn’t as fast in gaming as a Pentium, but because it was an enhanced 486, a system built around it was cheaper, and it made a good upgrade path for existing 486 systems. Its ability to be overclocked to 160 MHz meant it could compete with 90 MHz Pentiums as well.
Attempts To Compete: 1995-1999
In 1996, AMD released its first completely in-house chip, the K5. While the design was architecturally advanced, its clock speeds were held back by the design’s long logic paths. It arrived too late to seriously challenge the Pentium and was never widely accepted as a good option.
On the other hand, the later AMD K6 was an amazing processor. It delivered performance similar to a Pentium II but was much cheaper and, importantly, was compatible with Intel’s Socket 7 motherboards. Quoting Wikipedia, “The K6 had a considerable impact on the PC market and presented Intel with serious competition.” Like Ryzen, it forced Intel to innovate in a stagnating market.
Despite being named in a way that suggests it was developed from the K5 architecture, it was actually based on an entirely different processor design that AMD acquired when it bought the company NexGen.
A later version of the K6, the K6-2, competed with Intel’s flagship Pentium II. It finally narrowed the floating-point gap (important for 3D games in an era when graphics still relied heavily on the CPU) by adding the 3DNow! SIMD instruction set. 3DNow! was decently fast when used, but not all applications supported it.
The K6-2 was very financially successful, giving AMD the funds to develop the initial Athlon processors. Many revisions followed; eventually the line moved to the budget segment, where it competed with the Intel Celeron. It was succeeded by the AMD Duron in 2000, but death didn’t come until 2003.
The final K6-branded chip series, the K6-III, was released in 1999. It competed fairly evenly with Intel, being a bit faster than a Pentium II in business applications and slightly slower than a Pentium III. For gaming, however, it was slower, as many games didn’t support 3DNow!. Like the K6-2, death came in 2003.
AMD as King: 1999-2003, The First Part
In 1999, AMD released the Athlon Classic. It was revolutionary, being much faster than anything Intel had. From August 1999, when it first became widely available, until February 2002, the Athlon line held the title of fastest x86 CPU. That’s almost two and a half years! It was much faster in both integer and FPU performance; AMD finally had an FPU that could truly compete with Intel’s. 3DNow! remained as well, now improved and branded Enhanced 3DNow!.
In addition, it was AMD’s first locked-multiplier CPU, so it couldn’t be overclocked that way. This was done for a good reason: resellers were overclocking and remarking chips, which caused stability and performance issues in all but the beefiest of systems. Eventually, devices that attached to the cartridge’s “Golden Fingers” connector were created to unlock the multiplier.
The next-generation chip, the Athlon Thunderbird, was another excellent CPU. It was “cherished for its overclockability” despite having a locked multiplier; there was a workaround, reconnecting the L1 bridges on the chip package, that allowed it to be overclocked significantly.
Meanwhile, AMD finally replaced the K6-2 as a budget option with the Duron line of processors, which competed with Intel’s Celeron and lower-end Pentium III processors. They offered only about 10% less performance than many of the more expensive Athlon Thunderbird processors they were based on, and so were enthusiast favorites. They were killed off in 2004, replaced by the Sempron.
The next chips to come out of team red were the Athlon XP processors, which dropped in 2001, with new versions arriving for years as clock rates climbed. They were another in a string of good AMD chips from the Athlon series, typically performing about 10% better than the Athlon Thunderbird while consuming 20% less power. Clock speeds eventually reached a maximum of 2.2 GHz, and they added support for SSE and 3DNow! Professional. The last Athlon XP processors dropped in 2004.
AMD as King: 2003-2006, The Second Part
The next chips to drop were the Opteron series, specifically the Sledgehammer-based models. They were super-high-performance chips for workstations, servers, and super-enthusiasts, with models available for 2-CPU and even 4-CPU and 8-CPU systems. In addition, they were AMD’s first 64-bit CPUs. They weren’t discontinued for many years; specifically, not until the Zen generation, including Threadripper, arrived in 2017.
On the heels of the Opteron came AMD’s first consumer-level 64-bit CPUs, the Athlon 64 processors. They were the fastest AMD CPUs yet, and better yet, their 64-bit support meant they could take full advantage of the new 64-bit version of Windows XP. They demolished the competition, which at the time was the Pentium 4 and the Pentium 4 Extreme Edition; the latter was nicknamed the “Emergency Edition,” as many considered it a marketing ploy (although it did perform well, but that’s another article!). They also supported SSE2 and, in later revisions, SSE3. Death was postponed, with the series ending in 2009.
Along with the Athlon 64, AMD released the Athlon 64 FX: a super-high-performance enthusiast chip designed specifically for gamers. It had the highest clocks of any Athlon at its release, a trend that continued as new versions came out. Unlike other Athlons, its multiplier was unlocked.
After that, AMD decided to kill the Duron and released the Sempron as its new entry-level line (the K6-2 had already been killed in 2003). These chips were based on mid-range Athlon XP cores, such as Thoroughbred, and as such had similar specifications. Budget and mid-range gamers would often use these Semprons, as they offered decent performance.
As time went on, AMD continued adding performance by increasing clock rates and cache across its CPUs. The Sempron moved to an Athlon 64 core, initially without the 64-bit instructions; then, in 2005, it gained full 64-bit support. Also in 2005, AMD’s first dual-core CPUs appeared in the Opteron lineup.
At the same time, the Athlon gained dual-core models as well: the AMD Athlon 64 X2. They had high performance and were likely the chip of choice for the kind of gamer who today would buy a Core i9-12900K. The release of these chips caused some confusion, so AMD clarified the lineup: the X2 was for “prosumers,” the Athlon FX was for gamers, the Athlon 64 was for the average consumer, and the Sempron was for budget buyers; basically AMD’s versions of the i7, i5, i3, and Celeron tiers, with the Opteron as its Xeon.
More versions of these same AMD CPUs came out, with more cache and higher clocks, and eventually DDR2 support. With Intel no longer the performance champion in any way, AMD didn’t have to do much to remain king. However, that would soon change.
If you enjoyed this article, please read more like this below. Better yet, share it with your friends! Stay tuned for part 2 of this series.