41 Comments

  • silverblue - Tuesday, October 3, 2017 - link

    Very sad to hear.
  • HStewart - Tuesday, October 3, 2017 - link

    Sorry to hear about this, but it sounds like he made a huge difference during his time at Intel, including the transformation to the i3/i5/i7 series, up through probably the Haswell era.

    One thing I'm curious about: I wonder if Sandy Bridge was named after his wife, Sandy.
  • nathanddrews - Tuesday, October 3, 2017 - link

    Paul's wife + translated Hebrew codename = game changer!
  • IGTrading - Wednesday, October 4, 2017 - link

    Personal opinion:

    Let's also remember that during his tenure Intel paid 6 billion USD in bribes to Dell alone, and more than ten times that to companies all over the world.

    Let's not forget how Intel put the corporate boot on the throat of healthy market competition and made its whole client base pay billions more for its technology (because those bribes ultimately came from us, the clients).

    Let's not forget that this company had to be raided in multiple countries on multiple continents, like an organized crime syndicate, before the proof of its behavior could be obtained.

    And yes, let's acknowledge its technological achievements as well.

    So may Paul rest in peace, but we have to remember him with the good and the bad alike.

    For those who lack the documentation or were just too young when all of this was happening, this guy made a very well-documented video on the whole affair: https://www.youtube.com/watch?v=osSMJRyxG0k
  • ddriver - Wednesday, October 4, 2017 - link

    Yeah, his death is a huge loss to the cause of unfair and illegal business practices, impeding progress, abusing monopoly power, and milking the consumer.
  • nathanddrews - Wednesday, October 4, 2017 - link

    Be adults.
  • ddriver - Wednesday, October 4, 2017 - link

    You mean be cattle? By which I mean "do what you are expected to do and don't put any thought into what is going on".

    My previous post was 100% true, and if you felt it put things in a negative light, that's not on me but on him.

    I'm not dismissing the personal side of this tragedy; that seems to get plenty of attention. I just don't feel it offsets the amount of evil he played a central role in committing. And since AT is too biased to tell the truth, that's left to conscientious citizens.
  • Topweasel - Wednesday, October 4, 2017 - link

    I am going to second the calls to look back at his body of work. His passing is notable because of what he did, over his life, in the industry he worked for (and pretty much founded). With that comes all the damage that he, along with his subordinates, is directly responsible for. For all the trouble IBM/Bell/MS have gotten into for monopolistic practices, Intel's actions while he was at the helm were some of the dirtiest seen in the corporate world, and that is a high bar to clear.

    He might have helped build up the microprocessor industry, but he is also responsible for decades of damage to it, just so his company never had to compete on a technical level.
  • bigboxes - Wednesday, October 4, 2017 - link

    Yeah, I went Intel in 2006, but maybe I wouldn't have if not for this man's actions.
  • extide - Tuesday, October 3, 2017 - link

    Funny that they mention increasing the speed of their iGPUs many times over -- when you start from such a low bar, that is not hard. There is an interesting story behind the origin of Intel's integrated graphics, though. It started out in the chipset, in the northbridge, back on platforms that still had an FSB, an external memory controller, and so on. Since the northbridge was mostly I/O, its die was commonly pad-limited, meaning it required so many pins that the die had to be bigger than the logic alone needed. That left an essentially free transistor budget, so Intel decided to toss in a basic GPU. Since the GPU was literally an afterthought filling spare space, it was never prioritized, and that showed. It wasn't until the Arrandale and Sandy Bridge era that Intel got serious about making its iGPU decent, and today the iGPU takes up a significant amount of die space, more than the processor cores themselves on the 2C dies.
  • CaedenV - Tuesday, October 3, 2017 - link

    Not to be blindly in the Intel fanboy camp...
    But the GPU criticism is a bit unfair in this case. Yes, Intel's on-board graphics stank for a very long time. But let's also keep in mind that they went from pretty much nothing to making some of the best (and most efficient) on-board graphics in an 8-10 year period. Much of that struggle came from the IP base they started with, and from trying to dramatically improve performance without stepping on the toes of AMD/ATI and NVIDIA, who were viewed at the time as close partners rather than direct rivals (at least in the GPU department). I was just as frustrated as anyone at how bad onboard GPUs were when they started improving, but at the same time I am glad they made their move, and did it in a way that served the company well without making too many waves with their partners.
  • Zingam - Wednesday, October 4, 2017 - link

    All criticism is fair. I have never had so many driver issues with any other vendor. Even normal everyday desktop applications have rendering issues on Intel's latest, greatest, and expensive hardware!
  • FunBunny2 - Tuesday, October 3, 2017 - link

    -- more than the processor cores themselves on the 2C dies.

    For some years now, if you look at the die shots, cores are down to maybe 10% of the area. Too bad there aren't many embarrassingly parallel user-space problems; if there were, and cores were reduced to just ALUs, much more of that budget would go to actual computing. Wait... isn't that what GPU programming really is? (A toy sketch of that kind of workload is below.)
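
    A minimal CPU-side sketch, in Python, of what such an embarrassingly parallel user-space job looks like; the work function and data are purely illustrative, not anything from the comment above. Each item is independent, so throughput scales with core count, which is roughly the shape of problem a GPU's many simple ALUs are built for:

    from multiprocessing import Pool
    import math

    def work_item(x: float) -> float:
        # A pure, independent computation: no shared state, no ordering between items.
        return math.sqrt(x) * math.sin(x)

    if __name__ == "__main__":
        data = [float(i) for i in range(1_000_000)]
        with Pool() as pool:  # one worker process per CPU core by default
            results = pool.map(work_item, data, chunksize=10_000)
        print(f"checksum: {sum(results):.3f}")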
  • vailr - Tuesday, October 3, 2017 - link

    Intel's integrated GPU should be an optional feature, instead of what's now standard: being forced onto virtually all retail desktop machines, unless you go to the expense of buying a workstation or swap out the CPU that came with the machine.
  • Zingam - Wednesday, October 4, 2017 - link

    iGPUs are a terrible idea in the first place. Well, maybe not on smartphone SoCs.
  • BrokenCrayons - Thursday, October 5, 2017 - link

    iGPUs are a good idea for a wide variety of computing scenarios where the performance of a dGPU is unnecessary. iGPUs have brought down the cost and hardware complexity of PCs, which has helped them spread far and wide. The majority of computers sold contain no dedicated graphics. That was the case when the awful Extreme 3D was being foisted on us, and it is still the case today.

    There's certainly a place for dedicated graphics processors (a lot of places, really), but offering "just enough" graphics power to toss a Windows desktop onto a low-resolution screen was sufficient for a lot of people. Since the iGPUs released after the GMA 950 were progressively given more priority and more performance, it's possible to pick up any relatively modern Intel processor, skip the dedicated GPU, and still play a few older games or watch HD video without worrying about buying a graphics card.
  • Jon Tseng - Wednesday, October 4, 2017 - link

    Yeah, one thing I've never understood is why high-end enthusiast CPUs (6700K, 7700K, etc.) have vast amounts of die area wasted on an iGPU when virtually every buyer is going to pair them with a discrete GPU anyway.

    AMD actually cottoned on to this with Ryzen: create dedicated enthusiast parts that spend that transistor budget on CPU, which lets them offer more cores for less.

    Has always seemed nuts to me...
  • DanNeely - Wednesday, October 4, 2017 - link

    Because they don't sell enough of the K variants to justify a separate die; with a much smaller customer base to spread the fixed cost of creating it over, the unit cost would be massively inflated. The LGA 115x K series parts are the top-of-the-line mainstream dies. If you want something explicitly designed for enthusiasts, without the IGP baggage, step up to LGA 20xx. (A toy illustration of that amortization math is below.)
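
    A toy illustration of that amortization argument, in Python, with entirely made-up placeholder figures (not real Intel numbers); only the shape of the math matters:

    TAPE_OUT_COST = 100_000_000    # hypothetical one-time cost of a dedicated die, in USD
    MAINSTREAM_UNITS = 50_000_000  # hypothetical volume of the shared mainstream die
    K_SERIES_UNITS = 2_000_000     # hypothetical volume if K parts got their own die

    def amortized_cost(fixed_cost: float, units: int) -> float:
        # Fixed cost carried by each unit sold.
        return fixed_cost / units

    print(f"Shared die:    ${amortized_cost(TAPE_OUT_COST, MAINSTREAM_UNITS):.2f} per chip")
    print(f"Dedicated die: ${amortized_cost(TAPE_OUT_COST, K_SERIES_UNITS):.2f} per chip")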
  • Jon Tseng - Wednesday, October 4, 2017 - link

    Yeah that kinda makes sense given tape-out costs.
  • cwolf78 - Tuesday, October 3, 2017 - link

    Aha!! So HERE's the schmuck who was responsible for (or at least turned a blind eye to) Intel's anti-competitive practices throughout the years.
  • hansmuff - Tuesday, October 3, 2017 - link

    You're the schmuck who talks shit about a dead person you didn't know, whose opinions you don't know, whose actions you don't know. It's disgraceful of you. He might very well have railed against those practices but had no way to influence them.
  • Gigaplex - Tuesday, October 3, 2017 - link

    Being the CEO, he definitely had a way to influence them.
  • hansmuff - Wednesday, October 4, 2017 - link

    Because that's how business works in a publicly traded company. The CEO makes all the decisions and then that's what happens... yeah.
  • maximumGPU - Wednesday, October 4, 2017 - link

    That's exactly how publicly traded companies work. The CEO does absolutely have the influence to change practices if he's against them. He's the one who has to explain any fallout from those practices.
  • hlm - Wednesday, October 4, 2017 - link

    Public companies may not always work that way. For example, in the case of Intel, there is a Board of Directors. Sometimes, the CEO becomes more of a front-man for the Board than a real leader. With Intel, I am not sure. Here is the current member list for Intel's Board of Directors: https://newsroom.intel.com/biographies/board-of-di...
  • Zingam - Wednesday, October 4, 2017 - link

    Ah, OK! Good thinking! Osama bin Laden and Hitler are now absolved!
  • hansmuff - Wednesday, October 4, 2017 - link

    Oh, right. Absolutely on the same scale.
  • Hurr Durr - Wednesday, October 4, 2017 - link

    Well, both are blamed for many things they didn't do.
  • ddriver - Wednesday, October 4, 2017 - link

    If the truth about him is shit, I guess that makes him a shitty person. He paid billions to bribe OEMs not to sell AMD; if we had any actual justice, he would have died in prison.
  • IGTrading - Wednesday, October 4, 2017 - link

    We believe he was directly responsible for the illegal practices of the corporation, because you cannot "turn a blind eye" to bribe payments of tens of billions of USD over multiple consecutive years.

    Great respect for the technical achievements of Intel's teams, but its business strategies were often the definition of what a fair-play competitor should never do, and they were found illegal in every country, on every continent, where Intel was investigated and proven guilty.
  • RedGreenBlue - Tuesday, October 3, 2017 - link

    Well, that's a sad day. I know he had a huge impact on the markets Anand reviewed at the time. Up through the Pentium 4 era, that's all well and good, but the fact that ten years later you can expect to see someone comment on his tech obituary about threatening OEMs to buy only Intel says a lot about the goodwill he helped Intel destroy in the enthusiast market. I hope current Intel executives want to be remembered as something better than a nice story with an asterisk in people's minds, like "* also held back the advancement of computers".
  • CaedenV - Tuesday, October 3, 2017 - link

    He brought Intel out of the Pentium 4 swamp and led the team through the Sandy Bridge/Ivy Bridge era... and things really have not improved since he left. Some may complain about Intel's business practices at the time, but he also knew how to keep the machine moving forward for a very long time without slowing down. Now they are floundering with no clear direction or purpose. Paul will surely be missed.
  • Zingam - Wednesday, October 4, 2017 - link

    You, Sir, have no clue what you are talking about. The P4 was fine. I had one. It did a decent job and warmed up my hands during cold winters!!!
    Mr. Otellini's decisions had a direct influence on what Intel conceived up until at least Kaby Lake.
    The only good thing about Intel is their best-in-the-business manufacturing process. Architecture-wise they are still churning over and over the same old PC architecture that IBM made up in the '80s. Zero invention since that time!

    I just watched a presentation by an NVIDIA architect who explained very well the bottlenecks of the PC and how, together with IBM, they overcame them on the server by inventing NEW technology! IBM has that built into their POWER CPUs.

    IBM created monsters - Intel and Microsoft.
  • damonlynch - Tuesday, October 3, 2017 - link

    Condolences to Mr. Otellini's family and friends. I learned only now that Mr. Otellini graduated from Cal with an MBA in 1974. Perhaps he took a class or two with C. West Churchman, who was a truly great thinker.
  • atirado - Wednesday, October 4, 2017 - link

    Wow. Essentially, the first 10 years of my professional career were spent using CPUs spearheaded by his customer-first view...
  • wolfemane - Wednesday, October 4, 2017 - link

    Without fame, he who spends his time on earth leaves only such a mark upon the world as smoke does on air or foam on water. -
    Durante degli Alighieri
  • entity279 - Wednesday, October 4, 2017 - link

    Surely the article should also mention the anti-competitive practices Intel was accused of (and was charged for, in some cases) during his tenure. I'm not accusing AT of any bias, but we readers deserve a more balanced view.
  • ddriver - Wednesday, October 4, 2017 - link

    "and was charged for , in some cases"

    Intel was found guilty on every continent where they sell their products.

    If anyone expects objectivity from AT... talk about unrealistic expectations.
  • entity279 - Wednesday, October 4, 2017 - link

    Technically, they were not found guilty on all of the charges, since Intel & AMD agreed on a settlement and some charges were dropped as a result.

    I think the lack of objectivity is not the issue here. Who reads truly objective journalism in an industry flooded by money, exclusives, and free review samples? I'd have been fine even if the article had only mentioned something along the lines of "some of the business practices during his time were controversial and were targeted by lawsuits". But we didn't get a single word.
    I repeat, I just would have expected factual information, no drama. The time for that kind of passion has passed, as Paul's death so eloquently shows.

    Mentioning only the positive achievements of a person, while omitting other very relevant information (they did touch on Intel's deals, so why be silent about Dell, one of their key partners?), is much, much worse. It insults the reader and shows a disconnect between the journalist and the reality they are supposed to be bound by.
  • abufrejoval - Wednesday, October 4, 2017 - link

    Privately: I feel sorry for his family, especially his wife, who probably saw far too little of him during his tenure at Intel. At 66, I feel he didn’t get the deal he deserved from his maker.

    Publicly: What he did for Intel was great for Intel. I’m not so sure it was that great for us, the professional and private consumers. Intel has developed a lot of great technology as well as a lot of admirable failures (the i432 was neither the first nor the last). But it has also severely abused its power (most notoriously against AMD) and continues to do so. It is very hard to admire the CEO of a company that fights so dirty, even if many of its products are good or even great. I have always felt that they wouldn’t be far from where they are today even if they hadn’t gone dirty.

    But that’s what I used to think about VW, too.

    Chipset graphics: I’ve always tried to keep things fair around my home lab and had AMD/ATI and Intel/NVIDIA run side by side. And I was more focused on the new features and capabilities each generation brought than on one being “better” than the other.

    But just this week I took an old Core 2 low-power Mini-ITX motherboard out of the closet, because an Atom J1900 turned out to be too slow handling pfSense on a 300Mbit broadband connection. It features a QX9100 quad-core at 2.3GHz and a GM45 chipset, and I installed Ubuntu 16.04 with a Cinnamon desktop on it, mostly to run it through Geekbench 4 and to gauge its performance relative to a potential Goldmont upgrade (it’s a 35W TDP part after all, vs. 6W for the 14nm Atoms, which could amount to a noticeable difference on a 24x7 appliance).

    That’s true northbridge graphics, definitely nothing to game on, but in terms of desktop experience it was astonishingly good. With Cinnamon you can quickly switch between accelerated and pure software rendering without changing the visual style much, and the difference was between “I’d rather have something better” and “not bad at all” at 1920x1200.

    (BTW: Snort and ClamAV still throttled the 300Mbit/s connection down to 200Mbit/s in pfSense, twice as good as the J1900, so it will go back into the closet soon.)

    The GM45 might have driven several iGPU driver developers to insanity with the bugs it had, but they had evidently covered for all of them nicely, because the end-user experience was flawless and quite acceptable for OpenOffice, Firefox, and general “desktop work”. Xonotic (a Quake-engine-based first-person shooter) ran well enough to explore the maps (“Google Earth”-style, so to speak), but wasn’t fit for actual gameplay.

    At the other end, I have intensively compared a Kaveri A10-7850K with pretty optimal DDR3-2400 against a mobile Skylake i5-6267U (Iris 550 with 64MB eDRAM) running DDR3-1866 modules at 1333 for lack of BIOS support, and found that on pretty much every benchmark I could throw at the two (graphics, compute, or both) they performed identically, except at the power socket, where the notebook drew a third of the power.

    For all the bad press Intel graphics have received over the years, in terms of watts per visual satisfaction I think they have done rather well: you simply can’t squeeze GDDR5X or HBM2 performance out of DDR3 at 10 watts of power when the opposition is allowed to spend 30x as much.

    But I’d also be the first to prefer that Intel had produced desktop SoCs where the majority of the silicon real estate went into CPUs. There is nothing wrong with having a bit of GPU on an SoC, but I would have gladly traded most of the real estate currently wasted on the iGPU for additional cores on my E3 Xeon desktops, or even on my GTX 1080m notebook: with twice the cores there would still have been enough space left for GM45-equivalent desktop graphics, whereas now at least 60% of the CPU die lies dormant because I use dGPUs.

    I’m now thinking about converting a cheap Lenovo Skylake i5 notebook into the pfSense appliance, because it’s easy on power and has a UPS already built in. A comparable Xeon-D or Atom C3000 mini-server costs much more, but I would have preferred two extra cores or four extra threads instead of the iGPU power such an appliance won’t ever need.

    And I’ll probably have to run pfSense as a VM under KVM (with VirtualBox as the “GUI”), because nobody sells dual-NIC laptops, and due to its FreeBSD 10 base pfSense doesn’t know enough about USB 3 to get better than 100Mbit out of a USB3 Gigabit NIC. There was no problem reaching 950Mbit of iperf3 throughput with a venet adapter from that “stick NIC” under KVM on Linux, though... (a quick way to sanity-check that throughput from inside a guest is sketched at the end of this comment).

    Perhaps there is another “fanboy” lesson here: It’s very difficult to find a better free home firewall appliance than pfSense, which unfortunately runs on a rather outdated OS base. And it’s very difficult to find a better OS than Linux, which unfortunately doesn’t always have the best applications running natively on it. But that’s no motive to start a flame war, because you can still combine them into something that just works well.

    Intel’s technical savvy expresses itself best in the fact that whatever your use case, you’ll find one of their products delivers 90% of what you’d want, but they’ll wave another product at you for 3x the price to get the last 10%. And from what I now hear and see, that’s Otellini’s main legacy to the vast majority of us.
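
    A minimal sketch of how one might sanity-check that guest NIC throughput figure, assuming Python 3 in the guest, iperf3 installed on both ends, and an iperf3 server (iperf3 -s) already listening on the far side; the server address below is a placeholder:

    import json
    import subprocess

    SERVER = "192.168.1.1"  # placeholder: address of the iperf3 server on the LAN side

    def measure_throughput_mbit(server: str, seconds: int = 10) -> float:
        # Run a timed iperf3 client test, ask for JSON output (-J), return Mbit/s received.
        proc = subprocess.run(
            ["iperf3", "-c", server, "-t", str(seconds), "-J"],
            capture_output=True, text=True, check=True,
        )
        report = json.loads(proc.stdout)
        return report["end"]["sum_received"]["bits_per_second"] / 1e6

    print(f"{measure_throughput_mbit(SERVER):.0f} Mbit/s")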
  • Jon Tseng - Wednesday, October 4, 2017 - link

    Sad news.

    I think the guy had a tough gig. Obviously Intel achieved a great deal on his watch, and even before: even though he wasn't CEO earlier, he had a big hand in their rise to PC dominance.

    Unfortunately his legacy will likely be overshadowed by the facts that 1) he wasn't Andy Grove and 2) Intel missed mobile on his watch. The first he couldn't do anything about. The second he candidly admitted was a mistake.

    As I said though, sad news.
