Andy - "They have such huge R&D resources I find it difficult to understand how significantly smaller chip companies can keep beating them"
You have to have the perspective of time to understand that...
Jerry Sanders III (founder of AMD) put it best...
"Playing in the semiconductor industry is like playing Russian roulette with a twist: You put a gun to your head and pull the trigger, and four years later you find out if you blew your brains out."
You can have all the R&D possible, but it still takes many years to implement that into manufacturing...and by the time you do the whole landscape has changed once again!
Remember that Intel was designing Prescott before x86-64 had ever been heard of. Changing their design to incorporate 64 bit was a HUGE ask (much harder than designing it from scratch).
Also, prior to the release of the Athlon (which is still very recent by semiconductor standards), Intel had never had any real competition...
I have a question off topic, when will the non-micro-atx form factor mobos, of the ati bullhead aka xpress-200 come out, and how much will they cost???
The problem is when certain tasks are linear and sequential in nature. For example you can't begin rendering a scene until the physics and AI have been processed, so there is very little opportunity to use concurrency. The traditional solution is to pipeline such that one core is working on the physics for the next frame while the previous frame is being drawn, but the corresponding AI for that next frame may depend on physics, so you cant run physics and AI concurrently, rather you have to finish one as quickly as possible, etc. You can only pipeline so far ahead before certain results must be completed in order and in finite time.
In linear and sequential tasks which do not translate well to multiple concurrent tasks will see poor performance on multi-core chips whose clock speed is lower than a single chip :(
Games fit into this category of being very linear in nature since the composition of the next frame depends on everything from the previous frame. The exception is limited to a few cases of threading for the sake of threading (to brag that multithreading as a feature), asynchronous file loading, network operations, etc. It can be used, but due to the non-concurrent nature of a game engine, it only serves to convolute the code.
There are a few GHz-enhancing (i.e. leakage-reducing) technologies around like tri-gate and metal gate. The last I saw these were an option for the 45nm process at the earliest, which is probably a 2008/9 thing (Intel say 2007 but they've been a year late with 90nm and it looks 65nm now too in terms of shipping).
This is probably known to most people reading this, but the Prescott core would have rocked _in a universe where leakage didn't exist_ and would have hit 5GHz by the sounds of it. They seemed to plan the NetBurst idea back around 2000 when leakage was less of a problem.
So now Intel are on the defensive. Marketing and spin is everywhere, but I'd guess they only have to become _really_ scared once AMD's new fab is up and running. That ought to be pretty interesting.
I'm more excited about Intel finally adopted AMD64 in a full line of their desktop processors. Sure, 6xx doesn't make huge gains where many would like, but the most critical thing about it seems to have been missed.
And the upcoming dual core processors (the 8xx) will also be AMD64 capable.
Now bring on the 64-bit Doom 3 (or Quake 4!) with a 512MB video card. :P
#12> Because all their money is going into Marketing rather than R&D. And judging by their success at selling crap just proves that the only thing that matters is marketing.
#7 - Oh, I understand the technical reasons, I was questioning along the lines of "why can't they make something better??" They have such huge R&D resources I find it difficult to understand how significantly smaller chip companies can keep beating them.
I think Intels problems run a bit deeper then everyone thinks. The same can be said about AMD. The leakage power of the smaller manufactoring processes is much more a problem now then it was previously. Check that out:
http://www.anandtech.com/cpuchipsets/showdoc.aspx?... This problem will not go away with the introduction of dual core processors. A bigger problem for the multicore movement might be that there are not many user apps that benifit from multiple threads.
I am more interested in what the Cell processor will and will not be able to do. I also head that IBM and Sony are writting an operating system...for Cell perhaps. We'll have to wait and see, but it would be pretty cool if Cell made its way into the PC.
#8, that is the whole system, not just the CPU. So, the whole P4 system uses 233 watts. Doesn't really change the whole outlook, but just wanted to point it out.
So you can run 11 Mac minis on the power necessary to run the latest Pentium 4 CPU
Add the rest of the PC and you'll be nearing 500watts
or 25 Mac minis running Xgrid
Seems like distributed power efficient CPUs is the way to go. Of course, you can't fit the I/O gear you need to the mini (no optical) - but it's the principle I'm pointing out.
I still can't fathom why Intel is having such trouble getting more performance out of it's chips. The last few releases have an air of desperation about them, with all sorts of ultimately unrewarding 'features' being applied to compensate for the lack of GHz. Hurry up dual core!
We’ve updated our terms. By continuing to use the site and/or by logging into your account, you agree to the Site’s updated Terms of Use and Privacy Policy.
22 Comments
Viditor - Tuesday, March 1, 2005 - link
Andy - "They have such huge R&D resources I find it difficult to understand how significantly smaller chip companies can keep beating them"You have to have the perspective of time to understand that...
Jerry Sanders III (founder of AMD) put it best...
"Playing in the semiconductor industry is like playing Russian roulette with a twist: You put a gun to your head and pull the trigger, and four years later you find out if you blew your brains out."
You can have all the R&D possible, but it still takes many years to implement that into manufacturing...and by the time you do the whole landscape has changed once again!
Remember that Intel was designing Prescott before x86-64 had ever been heard of. Changing their design to incorporate 64-bit support was a HUGE ask (much harder than designing it in from scratch).
Also, prior to the release of the Athlon (which is still very recent by semiconductor standards), Intel had never had any real competition...
The_Necromancer - Thursday, February 24, 2005 - link
Off-topic question: when will non-micro-ATX form factor mobos for the ATI Bullhead (aka Xpress 200) come out, and how much will they cost?

Chris - Thursday, February 24, 2005 - link
The problem is when certain tasks are linear and sequential in nature. For example, you can't begin rendering a scene until the physics and AI have been processed, so there is very little opportunity to use concurrency. The traditional solution is to pipeline, such that one core is working on the physics for the next frame while the previous frame is being drawn. But the AI for that next frame may depend on its physics, so you can't run physics and AI concurrently; you have to finish one as quickly as possible, then the other. You can only pipeline so far ahead before certain results must be completed in order and in finite time.

Linear and sequential tasks which do not translate well to multiple concurrent tasks will see poor performance on multi-core chips whose clock speed is lower than a single-core chip's :(
Games fit into this category, being very linear in nature, since the composition of the next frame depends on everything from the previous frame. The exceptions are limited to a few cases: threading for the sake of threading (to brag about multithreading as a feature), asynchronous file loading, network operations, etc. It can be used, but due to the non-concurrent nature of a game engine, it often only serves to convolute the code.
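A minimal sketch of that two-stage pipeline, with hypothetical names (FrameState, simulate, render) standing in for a real engine's physics/AI and draw steps: one thread simulates frame N+1 while the other draws frame N.

    #include <cstdio>
    #include <thread>
    #include <utility>

    struct FrameState {
        int frame = 0;  // physics and AI results for this frame would live here
    };

    // Within a single frame the work stays sequential: physics must finish
    // before AI, because the AI reads the physics results.
    FrameState simulate(int frame) {
        FrameState s;
        s.frame = frame;  // placeholder for the physics + AI work
        return s;
    }

    void render(const FrameState& s) {
        std::printf("rendering frame %d\n", s.frame);
    }

    int main() {
        FrameState current = simulate(0);
        for (int next = 1; next <= 5; ++next) {
            FrameState pending;
            // Overlap across frames: a second thread simulates frame 'next'
            // while this thread draws the frame that is already complete.
            std::thread sim([&pending, next] { pending = simulate(next); });
            render(current);
            sim.join();  // rendering can never get ahead of simulation
            current = std::move(pending);
        }
        render(current);
        return 0;
    }

The join() is where the pipeline stalls: rendering can never get more than one frame ahead of simulation, which is exactly the serial limit described above.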
Stephen Brooks - Thursday, February 24, 2005 - link
There are a few GHz-enhancing (i.e. leakage-reducing) technologies around like tri-gate and metal gate. The last I saw, these were an option for the 45nm process at the earliest, which is probably a 2008/9 thing (Intel say 2007, but they were a year late with 90nm and it looks like 65nm will be too, in terms of shipping).

This is probably known to most people reading this, but the Prescott core would have rocked _in a universe where leakage didn't exist_ and would have hit 5GHz by the sounds of it. They seemed to plan the NetBurst idea back around 2000 when leakage was less of a problem.
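For anyone who hasn't seen it, the standard first-order CMOS power breakdown makes the problem concrete (alpha = switching activity, C = switched capacitance, V = supply voltage, f = clock frequency):

    P_total = P_dynamic + P_leakage ≈ alpha·C·V²·f + V·I_leak

Dynamic power is the price you pay for clock speed; leakage is paid even by idle transistors, and its subthreshold component grows roughly exponentially as feature sizes and threshold voltages shrink. That is why "GHz-enhancing" here really means "leakage-reducing".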
So now Intel are on the defensive. Marketing and spin is everywhere, but I'd guess they only have to become _really_ scared once AMD's new fab is up and running. That ought to be pretty interesting.
JK - Thursday, February 24, 2005 - link
I'm more excited about Intel finally adopting AMD64 in a full line of their desktop processors. Sure, the 6xx doesn't make huge gains where many would like, but the most critical thing about it seems to have been missed.

And the upcoming dual core processors (the 8xx) will also be AMD64 capable.
Now bring on the 64-bit Doom 3 (or Quake 4!) with a 512MB video card. :P
Jim - Wednesday, February 23, 2005 - link
#12> Because all their money is going into marketing rather than R&D. And their success at selling crap just proves that the only thing that matters is marketing.
The_Necromancer - Wednesday, February 23, 2005 - link
Ahh, well Intel is going downhill. The death to Craig, the death to Bill, THE death to Nvidia, Muahahahahhaha

Andy Bellenie - Tuesday, February 22, 2005 - link
...well, beating them in performance, anyway.

Andy Bellenie - Tuesday, February 22, 2005 - link
#7 - Oh, I understand the technical reasons, I was questioning along the lines of "why can't they make something better??" They have such huge R&D resources I find it difficult to understand how significantly smaller chip companies can keep beating them.

Rick - Tuesday, February 22, 2005 - link
I think Intel's problems run a bit deeper than everyone thinks. The same can be said about AMD. The leakage power of the smaller manufacturing processes is much more of a problem now than it was previously. Check this out:
http://www.anandtech.com/cpuchipsets/showdoc.aspx?...

This problem will not go away with the introduction of dual core processors. A bigger problem for the multicore movement might be that there are not many user apps that benefit from multiple threads.

I am more interested in what the Cell processor will and will not be able to do. I also heard that IBM and Sony are writing an operating system...for Cell perhaps. We'll have to wait and see, but it would be pretty cool if Cell made its way into the PC.
Mephisto - Tuesday, February 22, 2005 - link
#9 Jonny: "that is the whole system, not just the CPU"Well, that looks a lot better - thanks for the correction.
Jonny - Tuesday, February 22, 2005 - link
#8, that is the whole system, not just the CPU. So, the whole P4 system uses 233 watts. Doesn't really change the whole outlook, but just wanted to point it out.

Mephisto - Tuesday, February 22, 2005 - link
So one Pentium 4 3.8GHz EE CPU under load uses 233 watts (http://www.anandtech.com/cpuchipsets/showdoc.aspx?...

One Mac mini under load uses 20 watts (http://www.tomshardware.com/howto/20050216/apple-m...
So you can run 11 Mac minis on the power necessary to run the latest Pentium 4 CPU
Add the rest of the PC and you'll be nearing 500 watts
or 25 Mac minis running Xgrid
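For the record, the arithmetic behind those figures (taking the ~500 W whole-PC figure above as the estimate it is):

    233 W / 20 W ≈ 11.6, so 11 minis per P4 CPU
    500 W / 20 W = 25 minis per full P4 system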
Seems like distributed, power-efficient CPUs are the way to go. Of course, you can't fit the I/O gear you need into the mini (no optical) - but it's the principle I'm pointing out.
I can see Intel's next slogan:
"Pentium - doing our bit for global warming"
or, perhaps,
"Pentium - hotter than a Sun"
viditor - Tuesday, February 22, 2005 - link
Andy - "I still can't fathom why Intel is having such trouble getting more performance out of it's chips"You should read Johan's article here on AT. It will give you a better idea why Prescott has failed...
http://www.anandtech.com/cpuchipsets/showdoc.aspx?...
radeonguy - Tuesday, February 22, 2005 - link
#1 Even if Intel hurries up and releases a dual core CPU, the way Intel has been making processors lately it's going to suck.

Michael2k - Tuesday, February 22, 2005 - link
#1: Maybe because, like Anand implies, they don't need to?

bob - Monday, February 21, 2005 - link
"aging Anand" ??He isn't that old, is he?
Live - Monday, February 21, 2005 - link
"I don't get the impression that Intel is going to be any more attractive even by the end of this year"Unfortunately it looks like you are right aging Anand
crtfanboy - Monday, February 21, 2005 - link
What's up with the "AnandTech Deals" price finder looking up P3 600mhz's? More importantly, why are they still $100?Andy Bellenie - Monday, February 21, 2005 - link
I still can't fathom why Intel is having such trouble getting more performance out of its chips. The last few releases have an air of desperation about them, with all sorts of ultimately unrewarding 'features' being applied to compensate for the lack of GHz. Hurry up dual core!