Guest Opinion: The Deconstruction of Falling Stars
By John Kocurek
Date: September 28, 2001
In The Beginning
Intel came into this world as a memory chip manufacturer, but its mission was to examine everything it thought it could make a profit on, not unlike most any startup that actually survives. It managed to survive all right through the late '60s and early '70s, but its big break came when it was approached by Busicom to design and manufacture a chipset for a desktop calculator. Seeing an opportunity to "think outside the box", Intel engineers pitched, and got approved for, a programmable 4-bit processor-based chipset, what was to become known as the 4004. This was in 1971.
While not the first microprocessor designed, it was the first one that you could actually go out and purchase -- a big difference from the previous paper-only designs. The mark of a successful company is the ability to recognize an opportunity when one drops into its lap, and Intel managed to do that. It developed newer and more complicated microprocessor chip families, and it expanded its memory chip lines to provide the memory for those microprocessors. Intel eventually became a power in the EPROM and the static and dynamic memory markets, in addition to the microprocessor market, thus putting it on the road to becoming one of the largest companies in the world.
By the late '70s, Intel had a strong lock on the 8-bit processor market, and it was looking to its next generation. Deciding to be very ambitious, Intel started what was to be known as the iAPX432. It was based on all of the latest research in that poorly named field, Computer Science (it's not a science), and would have support for Ada, a new language that was being specified at that time. The iAPX432 was called a micro-mainframe, and was going to have a long list of features, including the ability to have its computational power increased by adding more processors. Because it was so easy to add new processors, it would also have the ability to check itself and switch out failed processors, giving it high availability. With it, Intel was going to sweep the market from the top to the bottom. Intel could leverage off of its manufacturing ability to challenge IBM itself, cutting the cost of producing high end computing products by orders of magnitude.
The iAPX432 was a very complicated design. It was taking so much time that Intel decided on a crash project to do a replacement for its 8-bit line. Two engineers, Stephen Morse and Bruce Ravenel, were reportedly able to do the specifications for a new processor, the 8086, in the amazing time span of a week. Then it took a year to get the chip laid out and into production, and Intel was able to hit the market with one of the first 16-bit processors.
And that is the chip, the 8086, that truly made Intel as a company.
But, you ask, what happened to the iAPX432? Well, it finally made it to market in 1981. It hit all of its design goals; sadly, performance was not one of them. The thing brought a whole new meaning to the word slow. And it was complicated to design for. As a result, IBM was more amused than threatened, and the iAPX432 quietly and peacefully died in 1986. It had quickly become apparent to Intel that the iAPX432 was not going to make it, so Intel designed and brought to market the 80286 in 1982, thus immortalizing the 8086 (a.k.a. x86) architecture.
Born to the Purple
Intel kept working on its original market, computer memory. It was one of the original manufacturers of bubble memory, and it also produced EEPROM and other memory technologies. But the memory market was becoming more and more of a commodity market, and that meant there was less and less profit in it. Intel had (and has) a very conservative approach: whenever manufacturing equipment becomes obsolete, Intel just starts using it to produce chips that don't need the latest production techniques. As a result it squeezes every last penny out of its investments, and many companies have copied this approach.
Despite all of this, it was becoming uneconomic for Intel to stay in the memory business. Intel started to shed most of its memory lines: bubble, static, dynamic and EPROMs all went. Eventually Intel was left with just flash memory. The chipmaker, especially under Andy Grove [ed: the Darth Vader of the computing world], was not very attached emotionally to its product lines. When they didn't pay anymore, it killed them. More and more of Intel's business became microprocessors; eventually it would come to dominate that market.
As its microprocessor market grew, Intel naturally became more focused on it. As its focus increased, Intel began to take more control over it. First, Intel tried to eliminate its second sources. In the early days, it was common to license out designs because many companies were uncomfortable depending on only a single source for key components. But, as semiconductors became more complex, true second sourcing became more and more difficult to do. Finally, by the end of the '80s, fewer and fewer companies were insisting on it.
And Intel decided that the time had come to cut out its loyal second-sourcer, AMD.
After a series of lawsuits, Intel was able to limit all of its previous second-sourcing licenses to the 80486 generation. All of its previous licensees would be able to come out with derivatives of the 80486, but they could no longer copy the later generations of chips from Intel. Intel went so far as to change its naming scheme, and eventually the basic interface, so that the competitors' future chips could not carry the same names as the Intel chips and eventually would not even be able to plug into the same sockets and work as the Intel chips did. And this is when Intel began to really prosper.
In the early '90s, Intel and HP started to work on a new processor based on the latest research in that poorly named field, Computer Science, and Intel was going to sweep the market from the top to the bottom. Intel could leverage off of its manufacturing ability, and, well, you probably know this story.
This new processor would eventually be called Itanium (where do they get these names?), and was initially slated for introduction in 1997. As the '90s progressed, the x86 market slowly picked up some competitors, and Intel was able to eliminate them, one by one. Finally, by the end of the decade, all that was left was Intel and AMD, and AMD was on the verge of bankruptcy. And then it changed...
Midnight On The Firing Line
I won't go into the background here, but in 1999 the Athlon (code-named K7) was introduced by AMD. For the first time in a long time, Intel had a viable competitor for its microprocessor line. And it could not have come at a worse time for the chip titan. Up until late 1999, everything had been going Intel's way. The Santa Clara chip firm was in total command. They had the technology and the products, and were in the position to be able to dictate to the industry what they wanted -- and they exercised this position often.
Intel has a long history of maximizing their profits. I already mentioned their fabrication equipment, and they also lovingly hand-optimized every transistor (ok, an exaggeration, but not by much) in every processor so that they could get the most performance for the least cost. The only downside was that it took time, but Intel had always had all the time they needed.
When the Athlon was introduced, Intel was in the process of milking their flagship processor and bringing their next generation, the Itanium, to market. The Itanium's introduction was a difficult one: it was already 2 years behind schedule, but there was still some life in the current generation. The Pentium III could be used for another couple of years. A refresh was already planned; it would cut their costs and would eventually be able to move to 800MHz, maybe even higher over the course of its lifetime. And if the Itanium was delayed even further, Intel had a novel processor, code-named Willamette, that they could always introduce in a year or two if needed.
Intel had crafted a complex and segmented market strategy. They had different support chipsets and different variations of their processors, and by combining chipsets and processors they could target a wide variety of price points and performance levels.
AMD didn't operate that way. Their processors weren't nearly as well optimized, and they didn't have as many types. Because of this, if a change was needed, a new version could be brought out in a matter of a few months, not the years that Intel required. AMD could try stuff, botch it, and they would be working on a fixed version before all of the pieces hit the ground. But AMD's cost structure was not as optimized, nor did they have the resources that Intel had.
The Athlon, though, turned out to be a killer processor. It was bigger, drew more power and was more expensive to produce than the equivalent Intel processor. But it would kick that Intel processor around and, to add insult to injury, would even clock to higher speeds and yield [ed: yield is the percentage of good processors on a wafer] in good numbers.
Suddenly, Intel was in a bind. Their response was the natural one, and it was the wrong choice. Intel decided to speed up certain schedules and to trot out some of the more experimental processes that they were always working on. Intel then started to experience what I have come to call "The Morton Salt Syndrome", i.e. "when it rains, it pours". Intel had taken that quantum leap from everything working out well to everything failing horribly. So Intel moved to the Coppermine-core Pentium III, and that worked out okay. But the new chipsets, the i820 and the i840, well, they had problems from the beginning. Not only was the Rambus RDRAM memory they used staggeringly expensive [ed: $1,000 / 128 MB stick at the time], but only two out of every three of the available memory slots were usable [ed: Intel ended up dropping the third memory slot].
To increase the speed of the processors, a "notched transistor" process was implemented, but it apparently caused production problems while not improving the chip speed by much. It was soon to earn the name "botched transistor" process.
And that was just in 1999 -- 2000 would prove to be even worse for them.
The Geometry of Shadows
The year 2000 started out with a bang. Total global economic collapse due to Y2k bugs. Chaos, cannibalism as services broke down... oh wait, it didn't happen that way.
With the concerns over the Y2k stuff behind the industry, it looked to be a promising year for computers. Intel announced early on that it would be production limited for the first part of the year. Manufacturers started to use AMD processors in more of their equipment. For example, Gateway went from 100% Intel to 50% AMD by April of 2000. And Intel was having more and more trouble competing with AMD on clock speed. This culminated in the February introduction of 1GHz processors from both companies. Within a month or so, you could find 1GHz AMD machines easily, but the 1GHz Intel machines would prove elusive through at least September.
Then there was the problem with the i820 chipsets. Those weren't the mainstream chips, and the market was rejecting RDRAM, so the MTH (the so-called Memory Translator Hub) was designed so that cheaper and more popular SDRAM could be used on i820 and i840 motherboards. Unfortunately for Intel, the MTH chip turned out balky and would cause memory corruption under some circumstances, prompting Intel to recall the affected motherboards. This snafu cost Intel millions.
The chip giant was starting to feel pressure. Even though AMD did not have the production capacity to really squeeze Intel, AMD could produce enough to essentially take over the high end of the market. So Intel was looking at being squeezed out of the high end / high margin market, and then eventually the middle tiers, leaving only the low end, low margin market.
By mid year, most consumer PC manufacturers had Intel on the bottom end of their product line, and AMD at the top. But that wasn't the worst. In July, Intel announced limited availability of their 1.13GHz processor, but they had to withdraw it from the market only a few weeks later. It was more of a PR disaster than anything else, some reports have the number that actually got shipped to consumers as low as 10, but it really hurt that the recall was announced on the exact day that AMD announced their 1.1GHz Athlon.
So by September of 2000 the situation was that AMD ruled the consumer market from 800MHz and above, while Intel held the 800MHz-and-below segment. AMD had its own set of problems -- expensive motherboards, for example -- but, for many, Intel had been relegated to the bargain bin.
And the rest of the year looked no better. Willamette, now named the Pentium 4, had not yet shipped, but was rumored to have low performance for its clock rate -- the 1.4GHz Pentium 4 was likely to be on par with a 1.1GHz Athlon: a little slower on some things, a little faster on others, much faster on a few and much slower on a few.
AMD had committed themselves to increasing the top speed grade of the Athlon by 100MHz or so every 5-6 weeks through the early part of 2001. This pressured Intel to either do the same or drop behind -- even with their spiffy new processor. That is assuming they can actually ship significant quantities of the Pentium 4 -- as consumers learned with Intel's infamous "paper launches," what good does a fast processor do if you cannot actually buy one?
And it only gets worse for Intel. AMD is now planning to come out with a refresh of its Athlon line that will have lower power requirements, a smaller chip size and possibly even more clock rate headroom. A smaller chip is important because it costs (roughly) the same to process a wafer whether there is one good chip on it or a thousand -- the smaller the chip, the more of them fit on a wafer, and the cheaper each one is to make. [ed: but let's not forget Intel's Northwood]
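The wafer arithmetic above can be sketched in a few lines. The numbers below (wafer cost, wafer diameter, die sizes, yield) are purely illustrative assumptions, not actual Intel or AMD figures; the point is only that shrinking the die cuts the cost per good chip, because a fixed-cost wafer holds more of them.

```python
import math

def cost_per_good_die(wafer_cost, wafer_diameter_mm, die_area_mm2, yield_frac):
    """Rough cost per working chip: wafer processing cost divided by
    the number of good dies. Ignores edge losses and scribe lines."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    dies_per_wafer = wafer_area // die_area_mm2   # whole dies only
    good_dies = dies_per_wafer * yield_frac       # yield = fraction that work
    return wafer_cost / good_dies

# Hypothetical numbers: a 200 mm wafer costing $3,000 to process, 80% yield.
big = cost_per_good_die(3000, 200, 184, 0.80)    # larger die, ~$22 each
small = cost_per_good_die(3000, 200, 120, 0.80)  # smaller die, ~$14 each
```

Edge losses and scribe lines are ignored here, so real die-per-wafer counts come out lower, but the direction of the effect is the same: the die shrink alone drops the per-chip cost by roughly a third in this toy example.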
In addition, there are now versions for the server and workstation market, chipsets with more memory bandwidth and, worst of all, a chipset that can handle two processors on the same motherboard -- a high margin area Intel had all to itself. So the only areas where Intel is still making money are now under attack.
And as far as Itanium goes, well the good news is that it has finally been released this year -- if you can buy it. Many have tried and have actually been turned down because they had to be approved for this tardy toaster. The bad news is that it met all of its design goals except for one. Hint: this time performance was a design goal...
Meditations on the Abyss
So Intel is besieged. Isolated. Alone. It has nowhere to turn.
The bad thing is that it takes several years to design a new processor and get it into production. That is the rock that Intel broke on. Two to three years ago there was no real competition for its product line except on the low end. And it had a plan to run all of the competition out of the low end and leave the whole market to its products. Since it had, in the Itanium, a product that would allow it to dominate the high end, all that was left was to make plans on exactly how it was going to shape the computing market. Intel examined the market and came to the conclusion that the industry was moving into the post-PC era.
Processor speed exceeded what was really needed and the attention was shifting from the desktop to the palmtop and the Internet. So Intel slowed investment in its bread and butter processor lines, and started to invest in other areas where it saw the future growth to be. It was a good, sound plan. Unfortunately, reality has a nasty way of intruding and ruining good plans. Like the Y2k bug predictions, Intel's projections were wrong, the future did not turn out like it thought. It may yet, but that day is not here. Intel forgot the cardinal rule of being the market leader in the computer industry, "Uneasy sits the butt that bears the Boss". As soon as you get comfortable in the Big Chair, it tips over backwards.
Right now the only advantages that Intel has over AMD are that it can outproduce it and that it still has a lock on the corporate market. What it doesn't have is a couple of years to get a new design out and into the market. So in a single year Intel has gone from the Colossus of Computing to the primary producer of key fobs and paperweights.
So what is a proto-monopolist to do? An interesting question is whether or not Intel will stick with microprocessors, which are rapidly approaching commodity pricing. Or will it pick up its marbles and go home? It is currently behind and dropping back daily. If AMD executes for another year as well as it has in the past year, Intel may not be in a position to catch up. By no means has AMD executed perfectly. But when you have momentum upward, mistakes only slow you a little. If you have momentum downward, things that go well only slow the decline a little. However, things aren't totally bleak for Intel. For one, it has a chip line called StrongARM, now called XScale (where do they get these names?), that has excellent features for the embedded space. Also, it is well positioned for growth once the post-PC era becomes established, but so far the industry hasn't cooperated in that. In addition, it has a lot of semiconductor manufacturing capability, expertise and engineering talent, albeit under the whip by some accounts.
Intel's biggest strength remains its big pile of money, which it is burning through like jet fuel. The big problem Intel has is that it is not as flexible as it used to be, but aren't we all? The problems that face the company are not as bad as those it had in the '70s, when it was battling it out with Motorola, MOS Technology, RCA, Fairchild and all of the others in the early microprocessor industry.
What Intel lacks is the instincts and mindset of a competitor. It is more attuned now to dominating its industry, issuing directives and controlling the destiny of the marketplace. This is not to say that Intel will go bankrupt or even lose money, but it is going down the same path to irrelevancy that IBM, DEC, Cray and others have traveled before it. The only question is whether or not it can turn around and take the high ground back, or even whether it wants to anymore. It may be that the other areas it has moved into are more lucrative than the microprocessor industry will be over the next few years. AMD will likely push the microprocessor segments it is trying to enter down into a commodity space with little or no profit growth, and then we will really enter the post-PC era that Intel believes is the near future. Only time will tell.