Wednesday, May 22, 2024

apple, arm, tsmc, intel, 15 years, Paul Otellini (historical)

 

  ── learning curve advantage over time
  ── (Apple, ARM, TSMC), Intel, 15 years
  ── why this moment was 15 years in the making
  ── former chief of staff to Andrew Grove (later general manager of Intel China)
  ── mobile phone industry (growth s-curve)
     ── mobile phone (a personal computing device with a tiny screen)   
  ── Intel (manufacturing [the microprocessor - CPU] for the whole computer industry?)
  ── microsoft windows + intel duopoly (oligopoly) over the PC industry


How a Decision by Apple 15 Years Ago Hurts Intel Now (scmp.com)

Posted by EditorDavid on Sunday August 23, 2020 @01:50PM

The former chief of staff to Intel CEO Andrew Grove (and later general manager of Intel China) explains why this moment was 15 years in the making:
Learning curve theory says that the cost of manufacturing a product declines as the volume increases. Manufacturing chips for the whole computer industry gave Intel huge advantages of scale over any other semiconductor manufacturer and resulted in the company becoming the world's largest chip manufacturer with enviable profit margins.
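The learning curve claim can be made concrete with Wright's law, under which unit cost falls by a fixed fraction with every doubling of cumulative volume. The sketch below is illustrative only: the 80% learning rate, starting cost, and volumes are assumed for the example, not figures from the article.

```python
import math

def unit_cost(cumulative_units, first_unit_cost=100.0, learning_rate=0.80):
    """Wright's law: unit cost falls by a fixed fraction (here 20%)
    every time cumulative production doubles."""
    b = math.log(learning_rate, 2)              # negative cost elasticity
    return first_unit_cost * cumulative_units ** b

# A manufacturer with 10x the cumulative volume sits further down the curve:
pc_scale = unit_cost(100_000_000)        # PC-era volumes (hundreds of millions)
mobile_scale = unit_cost(1_000_000_000)  # mobile-era volumes (billions)

print(f"cost at 100M cumulative units: {pc_scale:.3f}")
print(f"cost at   1B cumulative units: {mobile_scale:.3f}")
print(f"cost advantage from 10x volume: {1 - mobile_scale / pc_scale:.0%}")
```

With these assumed parameters, the 10x-volume producer ends up at roughly half the unit cost, which is the structural advantage the passage attributes to manufacturing at mobile-industry scale.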

Chaos theory says that a small change in one state of a system can cause a large change in a later stage. In Intel's case, this was not getting selected by Apple for its iPhones. Each successive era of computing was 10x the size of the previous era, so while Intel produced hundreds of millions of microprocessors per year, the mobile phone industry sells billions of units per year. Apple's decision in 2005 to use the ARM architecture instead of Intel's gave Taiwan-based TSMC, the foundry chosen to manufacture the processor chips for the iPhone, the learning curve advantage which over time enabled it to pull ahead of Intel in manufacturing process technology.

Intel's integrated model, its competitive advantage for decades, became its vulnerability. TSMC and ARM created a tectonic shift in the semiconductor industry by enabling a large number of "fabless" chip companies such as Apple, AMD, Nvidia and Qualcomm, to name a few. These fabless companies began to out-design Intel in the mobile phone industry and accelerated TSMC's lead over Intel in high volume manufacturing of the most advanced chips. Samsung, which also operates a foundry business, has been another beneficiary of this trend.
   ____________________________________

https://appleinsider.com/articles/15/01/19/how-intel-lost-the-mobile-chip-business-to-apples-ax-arm-application-processors

Daniel Eran Dilger | Jan 19, 2015


Between 2005 and 2014, Intel fumbled the ball in mobile chips, losing its position as the world's leading processor supplier by failing to competitively address the vast mobile market, and in the process enabling Apple to incrementally develop what are now the most powerful mainstream Application Processors to ship in vast volumes. Here's how it happened, the lessons learned, and how Apple could make it happen again.

How Intel lost the mobile chip business
Apple strategically targeted mobile Application Processors as a technology it wanted to own back around 2007, when the iPhone was barely a year old. It was effectively a reversal of a previous strategy that intended to simplify Apple's hardware operations by leveraging its 2005 partnership with Intel.

The initial development of the original iPhone made Apple realize that abandoning its history of custom chip development and delegating all silicon design to Intel had been a mistake. Prior to 2005, Apple had maintained a fluctuating but significant in-house custom chip design team. In the move from PowerPC to Intel, Steve Jobs eliminated that team.

However, while Intel was interested in selling its new Core x86 chips to Apple for use in Macs (and developing the supporting chipsets for them), it wasn't interested in building mobile chips for Apple's iPhone, at least not at the price Apple wanted to pay and in the quantity Intel expected Apple to buy.

Intel's former chief executive Paul Otellini (below) revealed last year that he didn't believe his company would be able to earn enough money building mobile chips for Apple's new iPhone to cover its development costs, largely because he couldn't imagine Apple selling iPhones in large quantities.

[Image: Paul Otellini]

Intel gives up XScale
Intel at the time actually owned XScale, an ARM chip producer, but it announced plans to sell off the group to Marvell in the summer of 2006 after any hope of a deal with Apple was lost.

Intel's inability to foresee the potential of Apple's new iPhone may have been colored by its disappointing experiences with XScale, the rebranded StrongARM group it announced plans to acquire from Digital Equipment Corporation in 1997 as part of a patent infringement settlement.

StrongARM had been a collaboration between ARM and DEC to build a new class of higher end ARM processors. Apple had been using StrongARM chips in its Newton mobile devices, but Jobs terminated that tablet product line just after Intel took ownership of its chip supply.

Intel had planned to use XScale to expand its influence into mobile and embedded devices where x86 compatibility was not necessary. StrongARM's modern "RISC" architecture appeared suited to replace Intel's own failed attempts to introduce non-x86 processor families, including the iAPX 432, i860 and i960.

[Image: Intel XScale ARM]

However, after almost ten years of investment into XScale, Intel had seen few hit products using the chips and lots of duds (Palm Treos, Pocket PCs from Compaq and Dell, and Creative Zen MP3 players). Compared to the fat profit margins of its PC x86 chips, the XScale operation appeared to be nothing less than a money pit. That makes it more understandable why Intel wasn't exactly beating down Apple's door to supply it with a few million $30 ARM chips that would cost it many millions to develop and manufacture.

In hindsight, however, Intel's failure to see the potential of the iPhone recalls HP's lack of interest in building the original personal computer designed by Steve Wozniak and Jobs; the two subsequently founded Apple to build it themselves. Thirty years later, thanks to Intel's lack of interest, the iPhone helped launch another new business for Apple: mobile Application Processors.

Intel begins badmouthing ARM, touting Atom
In 2008, a couple years after Intel exited the ARM business, two minor Intel executives made public comments dismissing Apple's iPhone and the ARM chip that powered it as being underpowered, at least compared to what it could be, were it to use an x86 Atom mobile processor from their company instead.

By that time, Apple's iPhone had already proven itself to be revolutionary and a major new feather in ARM's cap. Having sold off its own XScale ARM operations, Intel planned to target the potential for mobile devices using a new scaled down version of its desktop x86 processor (branded Atom), first in partnership with Microsoft's Windows Mobile, and later (after losing its bid for building the brain inside Apple's iPad) in 2011 partnerships with mobile Linux (using Intel's own Moblin distribution) and Google's rival Android.

[Image: Intel Linux 2011]

Given those circumstances, it's no wonder why Intel representatives were badmouthing ARM in 2008. However, the poor optics of Intel mocking Apple, a major client, resulted in effectively a public apology by Intel's senior vice president Anand Chandrasekher, who candidly "acknowledged that Intel's low-power Atom processor does not yet match the battery life characteristics of the ARM processor in a phone form factor," adding, "Apple's iPhone offering is an extremely innovative product that enables new and exciting market opportunities."

iPad appears without an x86 chip
Shortly before Apple launched its original iPad in 2010, Intel appeared confident that Apple would select its x86 compatible Atom chips to power it, as the ARM Application Processors currently being used in iPhones appeared to be very limited compared to the tablet chips Windows Tablet and UMPC licensees were using.

Even Samsung— which had manufactured the chips for Apple's iPhone, iPhone 3G and iPhone 3GS based on designs licensed directly from ARM— was building its own UMPC tablets (such as the Q1, below) powered by Intel's x86 Celeron M chip, designated as "Ultra Low Voltage."

Nobody expected Apple to develop a tablet powered by a wimpy ARM chip intended for cell phones, in part because of the nonstop x86 propaganda radiating from Intel through the media via press releases, and in part because of Apple's fairly fresh partnership with Intel in Macs, which was then barely four years old. Apple had even developed its Apple TV set top box using a similar, low power Intel Pentium M processor.

[Image: Samsung Apple copy]

Intel's Silverthorne (aka Atom) mobile x86 chip widely seemed to be the most logical choice for a new Apple tablet, particularly given the expectations set by Samsung and other Windows licensees adapting Microsoft's Tablet PC reference designs. Instead, Apple developed its own new A4 chip for iPad, and subsequently reused it in iPhone 4 and for a redesigned, iOS-based second generation Apple TV.

Five years later, Intel's Atom is still not competitive with Apple's rapidly advancing custom ARM chips. Intel's rather desperate recent efforts to pay Android manufacturers to use its chips have resulted in more than $7 billion in losses from the company's mobile division over the past two years.

Intel's latest earnings report notes that the company's mobile group again lost an astounding $1.11 billion in the winter quarter (on "negative revenues" of $6 million, meaning Intel was paying its clients to use its products). That makes Intel's Atom group responsible for cumulative losses within 2014 totaling over $4.2 billion.

[Image: Intel 2014 earnings]

It's no wonder why, for 2015, Intel has announced it will no longer detail to investors how much money it loses from mobile (just like Google) as it gives away products in hope of someday creating a business that can turn a profit. Within one year after restructuring to emphasize its "Internet of Things" strategy, Intel is now reworking its earnings reporting again, shoveling its mobile losses into the still burning furnaces of its PC processors to incinerate any evidence of failure.

There is simply no basis for arguing that Intel— the world's most sophisticated processor maker— didn't lose out big to Apple in the relatively new market for mobile processors over the past five years. This is particularly incredible given the fact that Apple assembled its chip design team essentially from scratch in just a few years after realizing how important it would be to own the supply for its new mobile devices.

How Apple got into the mobile chip business
With Intel disinterested in developing chips for iPhone, Apple sourced the Application Processor for the first iPhone from Samsung, which was already a major component supplier for Apple, having produced hundreds of millions of the simpler ARM chips used in iPods. However, Apple's 2005 decision to rely on Intel for silicon design expertise complicated its increasingly sophisticated ARM ambitions in mobile devices.

John C Randolph explained that Apple "not having their own chip design experts in-house made for very poor communication with Samsung, which is why the H1 processor in the iPhone wasn't quite what they wanted, although it was exactly what they'd asked for; in other words, mostly Apple's fault, not Samsung's."

The relatively generic, Samsung-manufactured APL0098 chip that Apple used in the original iPhone (featuring an ARM11 CPU using the ARMv6 instruction set, built on a 90 nm process) was far more powerful than the ARM7TDMI (ARMv4) processors Apple used in the first iPod (and which powered most Nokia phones and Nintendo's Game Boy Advance). Those were, in turn, far more powerful than the original ARM6 (ARMv3) and StrongARM (ARMv4) chips used in Newton MessagePads in the 1990s.

However, back in 2007 no ARM chip was anywhere near as powerful as Intel's Core processors used in Macs. That made it an astounding feat that Apple was able to effectively port the entire essential OS X Mac environment to run on such an ARM chip in the original iPhone, along with an entirely new multitouch-based user interface.

Even after Apple first demonstrated iPhone, high-level executives, pundits and discussion boards all expressed disbelief that the company actually had the Mac's full Unix environment running on a mobile device. At the time, it just did not seem possible.

[Image: iPhone 2007]

Other ARM devices had been running a far simpler OS environment such as Nokia's Symbian, Palm OS or Microsoft's Windows CE (which was related to desktop Windows PCs in name only).

Now that Apple had shown it could be done, there would inevitably be a race to duplicate its work. Microsoft ineffectually tried to beef up Windows CE; Nokia initiated efforts to improve Symbian or replace it with Linux; Google repositioned its initially far less ambitious JavaVM project into an iOS clone branded as Android; and both Palm and BlackBerry set out to develop "real" mobile operating systems for the modern era.

With so many fast followers behind it (all of them better capitalized and better connected than the Apple of 2007), it is now clear in hindsight that to stay ahead of the pack, Apple needed to not only drive a rapid OS development cycle, but also needed to drive hardware advances on its own.

By 2009, Palm webOS and Android would be looming as potential threats to iOS; a year later Microsoft and Nokia launched Windows Phone, followed by the 2011 release of the QNX-based BlackBerry Tablet OS.

Apple builds a chip design team
Apple's Jobs quickly realized that the company needed to rebuild an internal silicon design team and line up architectural licensing agreements with both ARM and Imagination Technologies that would enable it to work with Samsung to build its own optimized mobile chips, iterating new technology as rapidly as possible to stay ahead of competitors.

Somewhat ironically, over 15 years earlier Apple had co-founded ARM (in a joint venture with British computer maker Acorn and chipmaker VLSI Technology) with the express intent of creating a new, mobile-optimized chip architecture capable of powering 1993's handheld "Personal Digital Assistant," the Newton MessagePad (below). Newton wasn't a tremendously successful product, but its openly licensed ARM processor architecture took off (thanks largely to adoption by Nokia) and subsequently took over the entire mobile industry.

[Image: Newton Message Pad]

In the late 1990s, Jobs not only shuttered Newton but also liquidated Apple's holdings in ARM, gaining the cash needed to keep the company alive until it could return to strong profitability. A major driver in that push was the iPod, which used ARM processors built by Samsung. That made Samsung a natural partner for sourcing a more powerful Application Processor for the iPhone.

AppleInsider exclusively reported on Apple's secret licensing agreements made within a year of the original iPhone's launch, and covered the company's acquisitions of fabless chip designers including PA Semi and Intrinsity.

Apple's custom Ax series Application Processors
Those investments began to pay off with the A4, introduced in 2010 and manufactured on a 45 nm process. It incorporated clock speed and RAM data bus enhancements that enabled it to drive the increased resolution of iPad. While many in the media shrugged off the new iPad as "just a big iPod touch," nobody else could copy it.

Even Samsung, with its own version of the A4 chip (S5PC110, later rebranded as an Exynos 3), struggled to bring its own Galaxy Tab to market eleven months later, with a smaller screen to shave off costs. The next year, Motorola used Nvidia's comparable Tegra 2 chip to deliver its Xoom tablet, but it was still not really ready for sale.

Meanwhile, Apple had not only put its A4 in iPhone 4 and Apple TV, but was ready to ship iPad 2 the following year with its dual core A5, a chip featuring twice the CPU power and eight times the GPU performance of A4. It was subsequently used in iPhone 4S, then followed up by the A5X powering the Retina Display "New iPad" that debuted in March 2012.

Later that year in September, Apple shipped iPhone 5 with A6, a new chip featuring an entirely custom "Swift" core design and manufactured at a 32 nm process. One month later, Apple released its A6X variant powering iPad 4.

[Image: A6]

Apple's rapid advancement of Application Processors not only kept it ahead of Intel's x86 Atom chips, but also kept it competitive with rival ARM chipmakers. In fact, by the release of A6, Texas Instruments, one of the primary ARM chip suppliers, whose products powered the Amazon Kindle Fire, Palm Pre, RIM BlackBerry PlayBook, Motorola's Xyboard tablet and MOTOACTV music player, Nokia's N9, and Google's Nexus Q and Galaxy Nexus, was ready to exit the consumer market, largely because it was unable to justify the expense of developing new generations of OMAP chips in competition with Apple.

Apple recruited chip design talent away from Texas Instruments, along with other struggling chipmakers including AMD, IBM and Freescale. Meanwhile, Apple had acquired Anobit and would later gobble up Passif Semiconductor.

Apple didn't just become a respectable, competitive chip designer; in 2013 it surpassed the rest of the Application Processor industry (within three years of releasing its first custom A4 chip in 2010) by introducing A7, the first 64-bit ARMv8 processor to reach real production, using an entirely new Cyclone core design and a 28 nm process.

[Image: A7]

This year, Apple further refined its 64-bit Cyclone architecture in A8 and A8X, manufactured at a 20nm process. Apple's closest competitors— Samsung, Qualcomm and Nvidia— still haven't produced a 64-bit ARM chip suitable for use in phones. Nvidia has dropped out of the phone business entirely. And when Samsung and Qualcomm do ship their first 64-bit chips later this year, they'll be using generic core designs by ARM.

[Image: A8X]

Designing its own Application Processors has given Apple enormous vertical advantages, and it recycles its investment and vast economies of scale in a way that benefits itself exclusively rather than making it easy for competitors to catch up.

Apple's integration of hardware and software technology is not only keeping its iOS devices competitive, but also wiping out alternative supplies of higher end chip offerings. Intel has been forced to lose billions of dollars each year while paying manufacturers to use its chips. That's something most chip makers simply can't afford to do, and even Intel has said it will not keep subsidizing its Atom chips this year on a similar scale.

The tremendous investment expense required to keep up with Apple's Ax design has effectively given the company the luxury of keeping the premium high end market to itself, via its vertically integrated ownership of the chip supply. And increasingly, Apple's profits will enable it to buy up more and more of the finite chip production capacity of fabs such as TSMC and Samsung/GlobalFoundries, particularly at the latest and greatest chip process fabrication node.

Google and Microsoft have already reached a point where they have few options for assembling Android or Windows tablets that can compete with Apple's latest iPad at similar price points, and the chip supply for advanced phones is also down to a trickle.

While it's grown popular to predict that Apple will duplicate its incredible success in Application Processors to next replace Intel's x86 chips in Macs with its own custom ARM chips, there are significant barriers in the way and a series of more valuable opportunities available to Apple's silicon design team, as the next article will examine.

   ____________________________________

Paul Otellini (Intel CEO, 2005-2013): his exit interview with The Atlantic, and coverage of it, pasted below


How Intel blew the opportunity to be inside the original iPhone
By Anupam Saxena | Updated: 17 May 2013 13:06 IST

As Intel's former CEO Paul Otellini hands the reins to Brian Krzanich, he reveals that he missed a chance to get an Intel processor inside the first iPhone.

In an interview with The Atlantic, Otellini disclosed that Apple had approached Intel to source a chip for the iPhone, but there was a gap between the price Apple was willing to pay and what Intel estimated the chip would cost.

"We ended up not winning it or passing on it, depending on how you want to view it. And the world would have been a lot different if we'd done it," Otellini said.

He added that his gut had told him to say yes, and there was a note of regret. "The lesson I took away from that was, while we like to speak with data around here, so many times in my career I've ended up making decisions with my gut, and I should have followed my gut," said Otellini. "My gut told me to say yes."

"The thing you have to remember is that this was before the iPhone was introduced and no one knew what the iPhone would do... At the end of the day, there was a chip that they were interested in that they wanted to pay a certain price for and not a nickel more and that price was below our forecasted cost. I couldn't see it. It wasn't one of these things you can make up on volume. And in hindsight, the forecasted cost was wrong and the volume was 100x what anyone thought."
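Otellini's reasoning reduces to a break-even calculation. The figures below are hypothetical, invented purely for illustration (the interview gives no numbers beyond "the volume was 100x"); the point is only the shape of the decision: a price below forecast cost looks like a guaranteed loss at forecast volume, but flips to a large profit once volume is 100x and scale drives unit cost down.

```python
def program_profit(price, unit_cost, volume, dev_cost):
    """Profit of a chip program: per-unit margin times volume,
    minus the fixed development cost."""
    return (price - unit_cost) * volume - dev_cost

PRICE = 10.0        # hypothetical: what Apple would pay, "not a nickel more"
DEV_COST = 500e6    # hypothetical fixed development cost

# Intel's forecast: unit cost above the offered price, modest volume.
forecast = program_profit(PRICE, unit_cost=12.0, volume=10e6, dev_cost=DEV_COST)

# Hindsight: 100x the volume, with unit cost driven down by that scale.
hindsight = program_profit(PRICE, unit_cost=6.0, volume=1000e6, dev_cost=DEV_COST)

print(f"forecast:  ${forecast / 1e6:,.0f}M")   # a loss at forecast volume
print(f"hindsight: ${hindsight / 1e6:,.0f}M")  # a large profit at 100x volume
```

Under these invented numbers the forecast shows a loss no volume seems able to recover, while the hindsight case is hugely profitable, matching Otellini's "the forecasted cost was wrong and the volume was 100x what anyone thought."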

It's not clear from the interview if Apple wanted the iPhone to be powered by Intel's x86 chips or, as is more likely, Apple wanted Intel to manufacture ARM chips on contract. Going by the Steve Jobs biography by Walter Isaacson, Jobs had his reasons for not going with Intel chips for the iPhone.

"There were two reasons we didn't go with them. One was that they [the company] are just really slow. They're like a steamship, not very flexible. We're used to going pretty fast. Second is that we just didn't want to teach them everything, which they could go and sell to our competitors," said Jobs.

This implies that Apple did not choose Intel as it was not sure of the company's existing chips and of the company's ability to offer a customised solution in time. It also didn't want Intel using its know-how to help competitors.

He is quoted as saying, "At the high-performance level, Intel is the best. They build the fastest, if you don't care about power and cost." Jobs also added that "We [Apple] tried to help Intel, but they don't listen much."

However, the book also features a rebuttal from Otellini, saying that the two companies did not agree on the price or on who would control the design of the chip.

Intel has been a late entrant to the smartphone chip segment, as its early Atom chips were not optimised for battery life. Still, some reports before the launch of the original iPhone quoted an Apple executive saying that the phone ran on an Intel-powered chip, and Intel itself had said that Apple had committed to using its 'Silverthorne' chip in multiple products.

Apple's first iPhone ended up using an ARM chip produced by Samsung. Through the latest generation, all iPhones have been powered by chips designed by Apple and manufactured by Samsung.

A year prior to the release of the iPhone, Apple had released the first Macs powered by Intel processors, before moving to a completely Intel-based architecture in 2009.

Meanwhile, Intel is trying to get its share of the mobile chipset segment by partnering with device manufacturers to power Android-based smartphones, as it faces tough competition from the likes of Samsung, Qualcomm and even Nvidia.
   ____________________________________


Intel ex-CEO looks back at biggest blown call: Missing out on the iPhone
Paul Otellini, who retired as the chip giant's CEO on Thursday, tells The Atlantic that he didn't do what it took for Intel to be in Apple's initial smartphone.

Shara Tibken | May 16, 2013 3:35 p.m. PT
[Image: Paul Otellini retired as Intel's CEO on Thursday. (Dan Farber/CNET)]
Paul Otellini passed up one of the biggest opportunities in Intel's history -- supplying chips for the first iPhone, the chipmaker's former CEO said.
Otellini "decided against doing what it took" to make the chips for Apple's smartphone, The Atlantic reported, based on an interview with the newly retired executive. Here's what he told the publication:

We ended up not winning it or passing on it, depending on how you want to view it. And the world would have been a lot different if we'd done it. The thing you have to remember is that this was before the iPhone was introduced and no one knew what the iPhone would do...At the end of the day, there was a chip that they were interested in that they wanted to pay a certain price for and not a nickel more and that price was below our forecasted cost. I couldn't see it. It wasn't one of these things you can make up on volume. And in hindsight, the forecasted cost was wrong and the volume was 100x what anyone thought.
He added that "my gut told me to say yes."

Otellini stepped down as Intel's CEO on Thursday, handing the reins over to the company's manufacturing head, Brian Krzanich. During his time, Intel dominated the PC and server chip business, but it largely missed out on mobile devices. The company has gained some traction in recent months, but it still isn't in any blockbuster, flagship devices. In addition, it's late with the development of 4G LTE, which will continue to hinder its chances in the market.

As Otellini noted, it wasn't clear several years ago how well Apple's first iPhone would sell or that such a device would change the entire computing industry. One thing Intel has long prided itself on -- and investors have come to expect -- is high margins. It gets those lofty margins from pricing its chips higher than processors from some rivals, and mobile chips sell for much less than the typical Intel PC processor. Apple often negotiates attractive component pricing, but Intel likely didn't want to give a price cut for an unproven product.
And while Otellini's comments make it sound like the main reason Intel wasn't in the iPhone was pricing, battery life likely also played a role. Intel chips generally have been much more power hungry than processors based on ARM Holdings technology, like those from Qualcomm and Apple itself. Intel only recently started focusing on lowering the power consumption of its chips, many years after the iPhone first launched.

Speculation has popped up recently that Intel may one day manufacture Apple's chips for the company, but it's pretty likely that pricing remains a sticking point for any agreement. Krzanich, Intel's new CEO, would be smart to remember the biggest regret of his predecessor.
   ____________________________________

https://www.theatlantic.com/technology/archive/2013/05/paul-otellinis-intel-can-the-company-that-built-the-future-survive-it/275825/


Technology
Paul Otellini's Intel: Can the Company That Built the Future Survive It?
As the CEO steps down, he leaves the Intel machine poised to take on the swarming ecosystem of competitors who make smartphone chips.

By Alexis C. Madrigal
May 16, 2013
[Image: otopop.jpg]

In his cubicle at Intel headquarters, Paul Otellini shows a chip-circuitry diagram that is stitched into the lining of his sportcoat. (Alexis Madrigal)
Forty-five years after Intel was founded by Silicon Valley legends Gordon Moore and Bob Noyce, it is the world's leading semiconductor company. While almost every similar company -- and there used to be many -- has disappeared or withered away, Intel has thrived through the rise of Microsoft, the Internet boom and the Internet bust, the resurgence of Apple, the laptop explosion that eroded the desktop market, and the wholesale restructuring of the semiconductor industry.

For 40 of those years, a timespan that saw computing go from curiosity to ubiquity, Paul Otellini has been at Intel. He's been CEO of the company for the last eight years, but close to the levers of power since he became then-CEO Andy Grove's de facto chief of staff in 1989. Today is Otellini's last day at Intel. As soon as he steps down at a company shareholder meeting, Brian Krzanich, who has been with the company since 1982, will move up from COO to become Intel's sixth CEO.

It's almost certain that the chorus of goodbyes for Otellini will underestimate his accomplishments as the head of the world's foremost chipmaker. He's a company man who is not much of a rhetorician, and the last few quarters of declining revenue and income have brought out detractors. They'll say Otellini did not get Intel's chips into smartphones and tablets, leaving the company locked out of computing's fastest growing market. They'll say Intel's risky, capital-intensive, vertically integrated business model doesn't belong in the new semiconductor industry, and that the loose coalition built around ARM's phone-friendly chip architecture has bypassed the once-invincible Intel along with its old WinTel friends, Microsoft, Dell, and HP.

And yet, consider the case for Otellini. Intel generated more revenue during his eight-year tenure as CEO than it did during the rest of the company's 45-year history. If it weren't for the Internet bubble-inflated earnings of the year 2000, Otellini would have presided over the generation of greater profits than his predecessors combined as well. As it is, the company machinery under him spun off $66 billion in profit (i.e. net income), as compared with the $68 billion posted by his predecessors. The $11 billion Intel earned in 2012 easily beats the sum total ($9.5 billion) posted by Qualcomm ($6.1), Texas Instruments ($1.8), Broadcom ($0.72), Nvidia ($0.56), and Marvell ($0.31), not to mention its old rival AMD, which lost more than a billion dollars.
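The profit comparison in that paragraph is simple arithmetic; as a quick check of the quoted figures (2012 net income, in billions of dollars):

```python
# 2012 net income in $ billions, as quoted in the paragraph above.
competitors = {
    "Qualcomm": 6.1,
    "Texas Instruments": 1.8,
    "Broadcom": 0.72,
    "Nvidia": 0.56,
    "Marvell": 0.31,
}
combined = sum(competitors.values())
print(f"combined: ${combined:.2f}B versus Intel's $11B")
```

The five figures sum to about $9.5 billion, matching the article's stated total and confirming that Intel's $11 billion exceeded all five rivals combined.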

[Image: sweetchart2.jpg]

Of course, Otellini has both his predecessors' ambition and inflation to thank for his gaudy numbers, but he kept Intel a powerhouse. Under his watch since 2005, it created the world's best chips for laptops, assumed a dominant position in the server market, vanquished long-time rival AMD, retained a vertically integrated business model that's unique in the industry, and maintained profitability throughout the global economic meltdown. The company he ran was far larger, more complex and more global than anything Bob Noyce and Gordon Moore could have imagined when they founded it in 1968. And the business environment was certainly no easier than any encountered by the other four Intel CEOs. Yet he delivered quarter after quarter of profits alongside increasing revenue. In the last full year before he ascended to chief executive, Intel generated $34 billion in sales. By 2012, that number had grown to $53 billion.
"By all accounts, the company has been incredibly successful during his tenure on the things that made them Intel," said Stacy Rasgon, a senior analyst who covers the semiconductor industry at Sanford C. Bernstein. "Tuning the machine that is Intel happened very well under his watch. They've grown revenues a ton and margins are higher than they used to be."

Even Otellini's natural rival, former AMD CEO Hector Ruiz, had to agree that Intel's CEO "was more successful than people give him credit for."

But, oh, what could have been! Even Otellini betrayed a profound sense of disappointment over a decision he made about a then-unreleased product that became the iPhone. Shortly after winning Apple's Mac business, he decided against doing what it took to be the chip in Apple's paradigm-shifting product. 

"We ended up not winning it or passing on it, depending on how you want to view it. And the world would have been a lot different if we'd done it," Otellini told me in a two-hour conversation during his last month at Intel. "The thing you have to remember is that this was before the iPhone was introduced and no one knew what the iPhone would do... At the end of the day, there was a chip that they were interested in that they wanted to pay a certain price for and not a nickel more and that price was below our forecasted cost. I couldn't see it. It wasn't one of these things you can make up on volume. And in hindsight, the forecasted cost was wrong and the volume was 100x what anyone thought."

It was the only moment I heard regret slip into Otellini's voice during the several hours of conversations I had with him. "The lesson I took away from that was, while we like to speak with data around here, so many times in my career I've ended up making decisions with my gut, and I should have followed my gut," he said. "My gut told me to say yes."

"My gut told me to say yes" to the iPhone.
In person, Otellini is forthright and charming. For a lifelong business guy, his affect is educator, not salesman. He is the kind of guy who would recommend that a junior colleague read a book like Scale and Scope, a 780-page history of industrial capitalism. To his credit, he fired back responses to nearly all my questions about his tenure, company, and industry at a dinner during CES in Las Vegas and later at Intel's headquarters. And when he wasn't going to answer, he didn't duck, but repelled: "I'm not going to talk about that."

On stage, however, during the heavily produced keynote talks CEOs are now required to give, Otellini's persona and company do not inspire legions of cheering fans. When he steps on stage, there is no Jobsian swell of emotion, no one screams out, "We love you, Paul!" And yet, this is the outfit that pushes the leading edge of chip innovation. They are the keepers of (Gordon) Moore's Law, ensuring that the number of transistors on an integrated circuit continues to double every couple of years or so. If Otellini's CV is lacking a driverless car project or rocketship company, it may be because the technical challenges Intel faces require a different kind of corporation and leader.
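Moore's Law is a compounding rule of thumb, and the compounding is what makes it so consequential. A minimal sketch of the arithmetic (the starting transistor count below is purely illustrative, not an actual Intel figure):

```python
# Moore's Law as compounding: the transistor count on a leading-edge chip
# doubles roughly every two years. The starting count is hypothetical.
def transistors(start: int, years: float, doubling_period: float = 2.0) -> int:
    """Project a transistor budget forward under a fixed doubling cadence."""
    return int(start * 2 ** (years / doubling_period))

base = 1_000_000  # a hypothetical 1M-transistor chip
for years in (2, 10, 20):
    print(years, transistors(base, years))
```

A thousand-fold increase over twenty years is the engine behind each computing era arriving bigger and cheaper than the last.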

"He's a super low-key guy. He's not a Steve Jobs. He's not a Bill Gates. But his contribution has been just as big," said the new president of Intel, Renee James, who has worked with Otellini for 15 years.

His management secret was his own exemplary drive, discipline, and humility. He came in early, worked hard, and demanded excellence of himself. "He didn't yell and scream. He never dictated. He never asked me to come in on a Sunday. He never asked me to stay late on a Friday. But he had this way of getting you to rise to the occasion," said Navin Shenoy, who served as Otellini's chief-of-staff from 2004 to 2007. "He'd challenge you to do something that we'd all be proud of."

Peter Thiel might complain that the Valley hasn't invented rocket packs and flying cars because investors and entrepreneurs have been focused on frivolous nonsense. But Paul Otellini's Intel spent $19.5 billion on R&D during 2011 and 2012. That's $8 billion more than Google. A substantial amount of Intel's innovation comes from its manufacturing operations, and the company spent another $20 billion building factories during the last two years. That's nearly $40 billion dedicated to bringing new products into being in just two years! These investments have continued because of Otellini's unshakeable faith that eventually, as he told me, "At the end of the day, the best transistors win, no matter what you're building, a server or a phone." That's always the strategy. That's always the solution.

"At the end of the day, the best transistors win, no matter what you're building, a server or a phone."
Intel's kind of business and Otellini's brand of competent, quiet management are not in fashion in Silicon Valley right now. And yet, almost no one can claim the Valley more than Otellini. Every day for four decades -- in a career that spans the entirety of the PC era -- Intel's Santa Clara headquarters have been the center of his working world.

As we stood outside Otellini's corner cubicle, marked by a makeshift waiting room with a television, a couple of display cases, and a plucky plant, I asked him to reflect on what the end might feel like. "It is strange. I've been pinning this badge on every day for 40 years," he said. "But I won't miss the commute from San Francisco." After making thousands of trips down 101 and racking up 1.2 million miles on United through hundreds of trips around the world, he seemed ready to stop going.

[Image: Otellini-Office.jpg]

The "hallway" to Otellini's "corner office."
The Many Computer Revolutions

Despite the $53 billion in revenue and all the company's technical and business successes, the question on many a commentator's mind is, Can Intel thrive in the tablet and smartphone world the way it did during the standard PC era?

The industry changes ushered in by the surge in these flat-glass computing devices can be seen two ways. Intel's James prefers to see the continuities with Intel's existing business. "Everyone wants the tablet to be some mysterious thing that's killing the PC. What do you think the tablet really is? A PC," she said. "A PC by any other name is still a personal computer. If it does general purpose computing with multiple applications, it's a PC." Sure, she admitted, tablets are a "form factor and user modality change," but tablets are still "a general purpose computer."

On the other hand, the industry changes that have surrounded the great tablet upheaval have been substantial. Consumer dollars are flowing to different places. Instead of Microsoft's operating system dominating, Apple's and Google's do. The old-line PC makers have struggled, while relative upstarts such as Samsung and Amazon have pushed millions of units.

The chip challenges are different as well. Rather than optimizing for the maximum computational power of a device, it's energy efficiency that's most important. How much performance can a processor deliver per watt of power it sucks from a too-small battery?
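That question is just a ratio, but it inverts the old design target. A toy comparison with entirely made-up numbers (neither "chip" below corresponds to a real product):

```python
# Performance per watt: the figure of merit for battery-powered devices.
# Benchmark scores and wattages here are invented for illustration; the
# point is that the faster chip can lose once power enters the denominator.
def perf_per_watt(benchmark_score: float, watts: float) -> float:
    return benchmark_score / watts

desktop_class = perf_per_watt(benchmark_score=1000.0, watts=35.0)  # ~28.6
phone_class = perf_per_watt(benchmark_score=300.0, watts=1.5)      # 200.0
print(phone_class > desktop_class)  # the slower chip wins on efficiency
```

By this metric, a chip with less than a third of the raw performance can be several times more attractive to a phone designer.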

The semiconductor industry itself has seen perhaps even larger changes. In the early days of Silicon Valley, chipmakers had their foundries right there in the Valley, hence the name. During the 1980s, Japanese chipmakers battled American ones, beating them badly until Intel turned the tide in the latter half of the decade. The factories moved out of the Valley, domestically to places like Chandler, Arizona, and Folsom, California, as well as to Asia, mostly Taiwan.

Meanwhile, each generation of chips got technically more challenging and the foundries required to build them got more expensive. Chipmakers needed to sell massive amounts of chips in order to make up the huge capital equipment costs. The industry became cruelly cyclical, booming and busting with a regularity that defied managerial skill. For all those reasons and more, during the last twenty years, the chipmaking industry has been consolidating. Almost all semiconductor companies are now "fabless," choosing to outsource the production of their silicon to Taiwan Semiconductor Manufacturing Company (TSMC), United Microelectronics Corporation (UMC), or GlobalFoundries, a venture backed by the United Arab Emirates. The new fabless chip designers don't have to build plants, which allows them to have more stable businesses, but they lose the ability to gain competitive advantage by tweaking production lines. The transition to this state of affairs killed off many companies and allowed others to thrive.

Add it all up and there are only a few chipmakers left standing: the aforementioned contract manufacturers like TSMC, plus Samsung and, of course, Intel.

These two structural trends at the consumer and industry levels intersect at a formerly obscure British company called ARM Holdings. Originally founded as a partnership between Acorn Computers (remember them?), VLSI (remember them?), and Apple, ARM now just creates and licenses the chip architectures that other companies tweak and have manufactured. In a sense, they sell a chip "starter kit" that companies like Apple, Qualcomm, Broadcom, Marvell, and Nvidia build upon to create their own products.

Chips based on the ARM intellectual property are generally not as high-performance as Intel's, but they're fantastically energy efficient. While ARM designs did power Apple's ill-fated Newton device, in the early 2000s ARM became the dominant architecture supplier to the so-called "embedded" market. These chips are not general computing devices, but have specific jobs in (for example) cars, hard drives, and factories. This specialization is also one of the reasons that ARM chips are cheap. An Intel microprocessor could sell for $100. ARM-based chips might sell for $10, and often less than a dollar. In the first quarter of this year, 2.6 billion chips using ARM's architecture were shipped.

The two key attributes of ARM's architecture -- energy efficiency and low cost -- developed before cell phones, but they were exactly what mobile designers were looking for. As the smartphone market exploded, so did ARM's share price as investors realized what a key node ARM had become in the burgeoning computer-on-glass phone and tablet market.

As the smartphone market exploded, so did ARM's share price.
For companies trying to decide whether to go with Intel or an ARM licensee, it's a bit like being asked whether you'd rather deal with Switzerland or the Aztec empire. "With ARM, when you are tired of Qualcomm you can go to NVIDIA or another company," Linley Gwennap, the boss of the Linley Group, a research firm, told The Economist last year. "But in Intel's case, there's nobody else on its team."

ARM-based designs are now found in more than 95 percent of smartphones. ARM may not be dominant in the way Intel is dominant in PCs, but the system it underpins is.

Simon Segars is the man who will have to deal with the fallout from all of ARM's successes. He begins as the new CEO of the company on July 1. I met him after he spoke on a panel about "multi-industry business ecosystems" at the Parc 55 hotel in the heart of San Francisco. He was tall and genial, happy to patiently and thoroughly explain why ARM had found itself in possession of so many friends and so much good fortune.

"I can genuinely say that our approach is to work within an ecosystem that is a healthy ecosystem. By that I mean the people in it are making money from what they do," he said. "We get questions on a regular basis, Why don't you quadruple your royalty rates? Because you're so strong, what are your customers going to do? We could do that and we could probably enjoy some more revenue for some time, but our customers would go off and do something else or have less healthy businesses. If we tried to extract lots of money out of the ecosystem, we'd have less companies supporting the ARM architecture and that would limit where it could go."

ARM is a company that finds itself in the right place at the right time with a philosophy of innovation that lots of companies want to believe in.

"Through the '90s and early 2000s, we saw an explosion in the number of people who could build a chip. That led to a lot of innovation and all the electronic devices that we see today," Segars said. "The role we've played is providing this core building block, this microprocessor, that many of these devices require. We've provided that in a very cost-effective way to anybody who wanted it. And that's allowed people to put intelligence into devices that they couldn't have afforded to do because they would have had to do it all themselves."

The Mobile Mystery: What Did Otellini See and When Did He See It?

Many of the structural changes that occurred in these industries now seem predictable. It feels like somebody else could have positioned Intel differently to take advantage of these trends. At the very least, Otellini should have seen where the changes were leading the silicon world.

And the thing is, he did. He just wasn't able to get the Intel machine turning fast enough. "The explosion of low-end devices, we kinda saw as a company and for a variety of reasons weren't able to get our arms around it early enough," he admitted.

It was Otellini, after all, who had made the call to start developing the very successful low-power Atom processor for mobile computing applications. And it was Otellini who, upon ascending to the throne, drew a diagram that I'll call the Otellini Corollary to Moore's Law at the company's annual Strategic Long Range Planning Process meeting, or SLRP. He duplicated it for me in an appropriately anonymous Intel conference room, calling it half-jokingly "the history of the computer industry in one chart."

On the Y-axis, we have the number of units sold in a year. On the X-axis, we have the price of the device, beginning with the $10,000 IBM PC at the far left and extending to $100 on the far right. Then, he drew a diagonal line bisecting the axes. As Otellini sketched, he talked through the movements represented in the chart. "By the time the price got to $1000, sort of in the mid-90s, the industry got to 100 million units a year," he said, circling the $1k. "And as PCs continued to come down in price, they got to be an average price of 600 or 700 dollars and we got up to 300 million units." He traced the line up to his diagonal line and drew an arrow pointing to a dot on the line. "You are here," he said. "I don't mean just phones, but mainstream computing is a billion units at $100. That's where we're headed."

"What I told our guys is that we rode all the way up through here, but what we needed to do was very different to get to [a billion units]... You have to be able to build chips for $10 and sell a lot of them."
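One way to read the diagonal (my gloss, not Otellini's own framing) is as a rough constant-revenue line: the waypoints he cites multiply out to the same order of magnitude, about $100 billion a year in price times volume. A quick check using only the numbers quoted above:

```python
# Waypoints from Otellini's chart: ~100M units/year at ~$1,000 in the
# mid-90s, and a projected billion units/year at $100. Price times volume
# stays near $1e11, so the diagonal behaves like a constant-revenue curve.
waypoints = [
    (1_000, 100_000_000),   # mid-90s: $1,000 PCs, 100 million units a year
    (100, 1_000_000_000),   # "You are here": $100 devices, a billion units
]
revenues = [price * units for price, units in waypoints]
print(revenues)  # both on the order of $100 billion a year
```

Under that reading, getting to a billion units at $100 is less about growing the pie than about holding revenue roughly constant while each chip gets ten times cheaper to build and sell.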

[Image: Otellini_lecture2.gif]

"This is what I had to draw to get Intel to start thinking about ultracheap," Otellini concluded.

"How well do you think Intel is thinking about ultracheap?" I asked.

"Oh they got it now," he said, to the laughter of the press relations crew with us. "I did this in '05, so it's [been more than] seven years now. They got it as of about two years ago. Everybody in the company has got it now, but it took a while to move the machine."

It took a while to move the machine. The problem, really, was that Intel's x86 chip architecture could not rival the performance per watt of power that designs licensed from ARM based on RISC architecture could provide. Intel was always the undisputed champion of performance, but its chips sucked up too much power. In fact, it was only this month that Intel revealed chips that seem like they'll be able to beat the ARM licensees on the key metrics.

No one can quite understand why it's taken so long. "I think Intel is still suffering with the inability of this very fine company to enter a new major segment that changes the game," Magnus Hyde, former head of TSMC North America told me. "That's been a problem before Paul, been a problem during Paul, and will probably be a problem going forward. They have all the things they need on the paper: the know-how, the customers, the cash to take over whatever they need. But somehow a little piece is missing."

"This is a company with 100,000 employees with a 40-year legacy. They are unbelievably good at what they do. No one can touch them," said Rasgon, the analyst. "There is a certain degree of arrogance that goes along with that."

"As CEO, that's your job: steer [the ship]," he continued. "It doesn't necessarily mean [Otellini had] a failure of vision, but he couldn't get the ship to turn."

Ruiz, who led AMD's last battle with Intel while he was CEO from 2002 to 2008, told me he thought Intel's mobile progress had been slowed by their concentration on his company. "The focus the company has had for the past three decades on squashing AMD caused them to lose sight of the important trends towards mobility and low power," he said. "They should have focused more on their customers and the future than on trying to outdo AMD."

Some people seem to think someone else could have done better. And it's nice to believe in the transformative leader. Call it the Fire-the-Coach Fallacy. Sometimes, installing a new leader of an organization leads to better performance. But far more often, as some simple Freakonomics blogpost would tell you, we overestimate the importance of changing the coach or the CEO. It's not that CEOs are not important, but the preexisting conditions within and surrounding a company are just more important.

Unlike a lot of leaders, Otellini seems aware of this fact. "Intel's culture is blessedly not the culture of a CEO, nor has it ever been," he told me. "It's the Intel culture."

Otellini, of course, knew the Intel culture well. It had formed the substrate of his entire career. Starting out in finance in 1974, he'd worked his way up the chain on the business side of the operation, eventually landing the key gig of managing Intel's IBM account in 1983. It was right before Intel abandoned the memory business. He'd worked closely with Andy Grove, watching how he processed information, managed, and made decisions. He'd spent two years in the executive suite with Craig Barrett, watching him steer Intel in the rocky days after the Internet bust.

The Intel culture has been remarkably successful, of course. But it has also shown a resistance to change. It has managed to successfully surf massive transitions like getting out of the memory business in 1985 to focus on microprocessors and retaining a leading position in the move from desktop processors to laptops, but the same focus and scale that make Intel so powerful also prevent it from changing tacks quickly. If you've got 4,000 PhDs and 96,000 other people working for you, it's hard to turn on a dime.

Perhaps, though, the transformation that Otellini began in 2005 will finally be complete during Brian Krzanich's tenure. Intel's technical lead, perfectionism, and scale will create amazing chips at prices that cause phone and tablet makers to give up their commitments to the ARM ecosystem.

"They already have products in the marketplace that are competitive and I would not be surprised if they had best-in-class products in a few years," Rasgon said. "What they are doing on the [manufacturing] process has really driven that."

Otellini sees an analogy to the current situation in Intel's performance with Centrino laptop chips. "Intel made the big bet. [Chief Product Officer] Dadi [Perlmutter] and I made the big bet in 2001 to bet on mobile. This was when the desktop was 80 percent of all PCs, maybe 90 percent, with unabated growth, and notebooks were luggables," Otellini said. "And we thought that there was an argument about what a computer could be and that led to what would become Centrino."

Centrino chips won over Apple's Steve Jobs because the silicon was so good they could not be ignored. "The head-to-head comparison of an Intel-based notebook and an Apple notebook was night and day in terms of performance, battery life, etc," he said. "That's what got their attention."

And if Apple -- so notoriously anti-Intel that a 1996 Mac commercial showed a burning Intel mascot -- could come to love Intel processors, couldn't all the current ARM licensees see the blue Intel light?

A Battle of Innovation Cultures: The Lab Vs. The Ecosystem

[Image: revolutioninprogress.jpg]

The cover image of Intel's 1983 history of itself, "A Revolution in Progress."
Silicon Valley has been, rightly or wrongly, synonymous with innovation for four decades. Now, it's as much a notion as a place. When Paul Otellini joined Intel in 1974, a year of bloodletting at the company that also saw two of its future CEOs hired (Otellini and his predecessor Craig Barrett), the peninsula south of San Francisco and the Santa Clara Valley had merged in the American mind into the crucible for the future. Though Intel would only make $20 million that year, it was clear that these chips, and their tendency to get cheaper so quickly, were a new force unto the world. The whole enterprise was shaped by individual humans, structured by capitalism, and aided by Cold War R&D money, but the effects of all this memory and computation, its exponentiality, were hard to predict. A story led the New York Times business section a couple years later with the banner headline, "Revolution in Silicon Valley." The subheadline read, "'The basic thing that drives technology is the desire to make money,' says one executive. Now, where can they use the technology?" 

Think of that as a kind of ur-mainstream media Silicon Valley story. It's got all the elements: an early reference to the orchards that used to exist, "low-slung" buildings as the unlikely seat of revolution, hot consumer products, hypercompetitive industries, massive innovation, great men, something like a formulation of Moore's Law, and the exceptionalist sense that this could only happen in this one place in California. 

There are two conflicting narratives about all this Silicon Valley innovation. On the one hand, there is the notion that Silicon Valley is an ecosystem of entrepreneurs and inventors, financiers and researchers. Companies can break up and reassemble. Spinoffs can pop out of larger corporations. Startups can disrupt whole industries. Competitors can cooperate and then compete and then cooperate. And when you add up all these risk-taking, failure-forgiving people, the sum is greater than the parts. Fundamental to this notion is the idea that innovation happens best in networks of firms and individuals, in an ecosystem (a word that itself gained credence thanks, in part, to Stanford ecologist Paul Ehrlich in the late 1960s). 

[Image: ecosystem.jpg]

The ARM biosphere, from the cover of the company's 2009 annual report (ARM).
On the other hand, we have Intel. Intel structured and thought of itself like a research laboratory, according to long-time Silicon Valley journalist Michael S. Malone, in his 1985 book, The Big Score. "The image of a giant research team is important to understanding the corporate philosophy Intel developed for itself," Malone wrote. "On a research team, everybody is an equal, from the project director right down to the person who cleans the floors: each contributes his or her expertise toward achieving the final goal of a finished successful product."

[Image: 1984annualreport.jpg]

From Intel's 1984 annual report (Intel)
Malone went on to say that the culture of Intel was not that of a bunch of loosey-goosey risk takers, but of true believers, almost robotic in their dedication to Intel's goals. "Intel was in many ways a camp for bright young people with unlimited energy and limited perspective," he continued. "That's one of the reasons Intel recruited most of its new hires right out of college: they didn't want the kids polluted by corporate life... There was also the belief, the infinite, heartrending belief most often found in young people, that the organization to which they've attached themselves is the greatest of its kind in the world; the conviction they are part of a team of like-minded souls pushing back the powers of darkness in the name of all mankind."

This is a very different vision of innovation. This is an army of people tightly coordinated, highly organized, and hardened by faith. It was this side that competitors and suppliers have long encountered and complained about (sometimes appealing to the regulatory authorities).

"They are tough to deal with. I know some of the executives privately and they say, 'We're not really nice people to deal with.' They admit it. And it's true," Magnus Hyde, former head of Taiwan Semiconductor North America, told me. "They are really nasty when you get into negotiations."

And as for this whole "failure's cool!" mantra that seems to echo around Silicon Valley, Intel's Andy Grove enshrined what he called "creative confrontation," which encouraged and rewarded people for getting after each other for flagging performance or mistakes.

[Image: tenkeytrends.jpg]

The cover of Intel's 1982 annual report (Intel).
Taken as a whole, Intel is a self-contained research, development, and deployment machine. That is not an ecosystem. Though obviously Intel has many partners with whom it makes money and has good relationships, on the leading edge of innovation, Intel goes it alone. 

Time and again, this strategy has worked as almost all of their competitors have fallen by the wayside. Intel is the only chip company in the world that's been able to hang on to its vertically integrated business model. "They have these methods, these Intel methods, that have worked very well for them," Hyde said.

The way Otellini vanquished AMD is a classic example of the Intel way. AMD had always played Brooklyn to Intel's Manhattan. Otellini himself had offers from both companies coming out of business school, and the competition remained fierce all the way until he took the reins. AMD was resurgent then. They had beaten Intel to market with excellent 64-bit chips that were perceived to provide more performance for less money than Intel's processors. AMD's stock was on a climb that would take it to dizzying heights. By the end of 2008, Intel had destroyed AMD's momentum and sent the company into a tailspin. Finally, in early 2009, AMD spun out its fabrication facilities, exiting the chipmaking game. It was a TKO in the longest-running bout in Silicon Valley. "They buried AMD," Rasgon put it bluntly.

[Image: vanquishedAMD.jpg]

Of course, there were several ugly court battles about Intel's hardball tactics in keeping AMD out of more machines. Intel eventually paid AMD $1.25 billion to settle the case in late 2009.

What's clear is that when Intel has a single competitor to focus on, they are hard to beat. "The thing about Intel is that we always come back," Otellini told me. "We put resources on it. We get focused. And watch out." They outinnovate, outmanufacture, and outcompete any company that comes into their sights.

Which brings us back to the question of mobile, the space that has eluded Intel for a decade. What's fascinating is that it's a battle between Intel and a swarm of companies licensing chip designs from a relatively small IP company, ARM. Intel has bulk and strength, but they've come up against that other model of innovation: the ecosystem. It's two ideas about how Silicon Valley works locked in combat. If you're the swarm, with Qualcomm as the queen bee, the question is: How do you hold the coalition together? 

If you're Intel, which fly do you fire the shotgun at? Not ARM, that's for sure.

"ARM is an architecture. It's a licensing company," Otellini said. "If I wanted to compete with ARM, I'd say let's license Intel architecture out to anyone that wants it and have at it and we'll make our money on royalties. And we'd be about a third the size of the company."

"It's important for me, as the CEO, that I tell our employees who it is that we have to compete with and who we're focused on, and I don't want them focused on ARM. I want them focused on Qualcomm or Nvidia or TI," he continued. "Or if someone like Apple is using ARM to build a phone chip, I want our guys focused on building the best chip for Apple, so they want to buy our stuff."

I asked ARM's Segars about what I'd heard from Otellini, namely that Intel would beat the individual members of his coalition because they make the best transistors, and that would ultimately carry the day.

"There is a long track record of Intel investing very heavily on the leading edge of technology and implementing innovations of process technologies ahead of everybody else. That is a statement of fact and nobody would dispute that," Segars responded. "The transistors are, of course, important. The way in which the transistors are used is very important and really what the explosion of the technology space over the last couple of decades has shown is that there is a need to innovate and you can't focus innovation in just one company. If all the world's chips came from one vendor, whether it's Intel or anybody else, naturally that's going to limit innovation because there are only so many people and there will be a philosophy that's followed."

But Otellini, or Krzanich, can't focus Intel on ARM's "intangible" rhetoric. The questions industry watchers should be asking, Otellini said, are these: "Do you think Intel can beat Qualcomm? Do you think Intel can beat Nvidia? Do you think Intel can compete with Samsung?"

The answer might be yes, Intel can compete with each one, but maybe not with them all.

Or, maybe, the great machine will dominate once again. That's how Stacey Rasgon, the analyst who's been watching Intel and its rival chipmakers for two decades, sees it: "If I'm looking out five, ten years, they could potentially bury everybody else."
[Image: goinggoinggone.gif]

Alexis Madrigal is a contributing writer at The Atlantic and the host of KQED’s Forum.
   ____________________________________

Otellini in 2010
Born: October 12, 1950, San Francisco, California, U.S.
Died: October 2, 2017 (aged 66), Kenwood, California, U.S.
Nationality: American
Education: St. Ignatius College Preparatory
Alma mater: University of San Francisco; University of California, Berkeley
Occupation: Former President & CEO of Intel
Predecessor: Craig Barrett
Successor: Brian Krzanich
Board member of: Google
Website: Paul Otellini - Intel.com

Paul Stevens Otellini (October 12, 1950 – October 2, 2017) was an American businessman and one-time president and CEO of Intel. He was also on the board of directors of Google.

Early life and education
Otellini was born and raised in San Francisco, California, United States.[1] His family is of Italian origin.[1] Otellini graduated from St. Ignatius College Preparatory and earned a bachelor's degree in economics from the University of San Francisco in 1972.[1] He received an MBA from the Haas School of Business at the University of California, Berkeley in 1974.[1]

Employment at Intel
Otellini joined Intel in 1974.[2] From 1998 to 2002, he was executive vice president and general manager of the Intel Architecture Group, responsible for the company's microprocessor and chipset businesses and strategies for desktop, mobile and enterprise computing.[3] From 1996 to 1998, Otellini served as executive vice president of sales and marketing and from 1994 to 1996 as senior vice president and general manager of sales and marketing.[4]

Previously, he served as general manager of the Microprocessor Products Group, leading the introduction of the Pentium microprocessor in 1993.[5] He also managed Intel's business with IBM, served as general manager of both the Peripheral Components Operation and the Folsom Microcomputer Division, where he was responsible for the company's chipset operations, and served as a technical assistant to then-Intel president Andrew Grove.[6]

Otellini was appointed an operating group vice president in 1988, elected as an Intel corporate officer in 1991, made senior vice president in 1993, and promoted to executive vice president in 1996.

In 2002, he was elected to the board of directors and became president and Chief Operating Officer at the company.[7]

On May 18, 2005, he replaced Craig Barrett as the new CEO of Intel.[8] Otellini was considered a departure from the norm when he was promoted to CEO because he was not an engineer.[9] Otellini is reported to have been a major force in convincing Apple Inc. in the Apple-to-Intel transition, and being very fond of Mac OS X, saying Microsoft's Windows Vista is "closer to the Mac than we've been on the Windows side for a long time".[10]

In 2006, he oversaw the largest round of layoffs in Intel history when 10,500 (or 10% of the corporate workforce) employees were laid-off.[11] Job cuts in manufacturing, product design, and other redundancies, were made in an effort to save $3 billion/year in cost by 2008. Of the 10,500 jobs, 1,000 layoffs were at the management level.[12] In 2006, Otellini was named Haas Business Leader of the Year.[3]

In 2007, Otellini announced plans to build a $3 billion semiconductor manufacturing plant in the port city of Dalian, China.[13]

On November 19, 2012, Otellini announced his intention to retire in May 2013.[14][15]

Personal life
Otellini died in his sleep on October 2, 2017, at his home in Sonoma County, California.[16][17] He was survived by his second wife, of 30 years, Sandy Otellini; his son, Patrick; and his daughter, Alexis.[18]
   ____________________________________
https://www.aspeninstitute.org/videos/roundtable-paul-s-otellini/

Intel President and CEO Paul Otellini discusses the economy, innovation, and education at an Intel Innovation Economy Roundtable.
“A lot of the best engineers in China are from MIT,” said Intel President and CEO Paul Otellini as he emphasized the need to retain the best math and science minds in the world. They are the “lifeblood of our company.” Speaking at an Innovation Economy Roundtable, a partnership between the Institute and Intel, Otellini stressed the connection between high-level math and science innovators and the nation’s GDP. He also expressed concern that United States may no longer be a “magnet for great innovative companies.”

Participating in the discussion were FCC Chairman Julius Genachowski, New York Times columnist Tom Friedman, “The NewsHour”’s Judy Woodruff, and Amb. Elizabeth Bagley, the State Department’s Special Representative for Global Partnerships, Global Partnership Initiative. Fielding questions from all of them, Otellini engaged on topics ranging from the need for widespread broadband infrastructure (“It’s like having an electric car with no roads”) to China’s growing economic power (“Japan was just the warm-up for the real game: China. … There are more English speakers in China than there are Americans!”). But Otellini was particularly focused on driving the changes necessary to fix the economy—and fast. Specifically, Otellini worried that the Obama administration was too distracted by issues like health care and energy, saying that fixing the economy should take priority over all other legislative issues right now. “I’m worried that, by the time we wake up from this crisis, we’ll be in the abyss,” he said.
   ____________________________________

computer (desktop)
computer (laptop)
computer (workstation)
computer (rack - server - data center)
tablet (computing device)
smartphone (computing device)
watch (computing device)
fitness tracker (application specific computing device)
networking equipment (application specific computing equipment) 
   ____________________________________

Theodore Modis., Prediction : society's telltale signature reveals the past and forecasts the future, 1992.

p.12
A few months later, I was asked to forecast the life cycles of computer products and the rate at which recent models substitute for older ones.

pp.37-38
p.37
  Critics of S-curves have always raised the question of uncertainties as the most serious argument against forecasts of this kind.  
p.37
Obviously the more good-quality measurements available, the more reliable the determination of the final ceiling.  But I would not dismiss the method for fear of making a mistake.  
p.37
Alain Debecker and I carried out a systematic study of the uncertainties to be expected as a function of the number of data points, their measurement errors, and how much of the S-curve they cover.
p.37
We did this in the physics tradition through a computer program simulating “historical” data along S-curves smeared with random deviations, and covering a variety of conditions.  The subsequent fits aimed to recover the original S-curve. 
p.37
We left the computer running over the weekend. 
pp.37-38
On Monday morning we had piles of printout containing over forty thousand fits.  
p.38
The results were published,12  and a summary of them is given in Appendix B, but the rule of thumb is as follows:

  If the measurements cover about half of the life cycle of the growth process in question and the error per point is not bigger than 10 percent, nine times out of ten the final niche capacity will turn out to be less than 20 percent away from the forecasted one. 

p.38
The results of our study were both demystifying and reassuring.  The predictive power of S-curves is neither magical nor worthless. 

p.38
Bringing this approach to industry could be of great use in estimating the remaining market niche of well-established products with quoted uncertainties.  Needless to say life is not that simple.
products are not sufficiently differentiated, they are sharing the niche with others, in which case the combined populations must be considered. 

p.38
Like any powerful tool, it can create marvels in the hands of the knowledgeable, but it may prove deceptive to the inexperienced. 

p.57
The first computer model I tried turned out to be a showcase, one of Digital's early successful minicomputers, the VAX 11/750.  The cummulative number of units sold is shown at the top of Figure 3.1.  An S-curve passes quite closely to all twenty-eight trimesterly data points.  In the lower graph we see the product's life cycle, the number of units sold each trimester.  The bell-shaped curve is the  life cycle as deduced from the smooth curve at the top.
p.57
  When I produced this graph in 1985, I conclude that the product was phasing out, something that most marketers opposed vehemently at the time.  They told me of plans to advertise and repackage the product in order to boost sales.  They also spoke of seasonal effects, which could explain some of the recent low sales. 
p.57
  The data points during the following three (3) years turned out to be in agreement with my projections.  To me this came as evidence that promotion, price, and competition were condition present  throughout  a product's life cycle and have no singular effect.  The new program that marketers put in place were not significantly different from those of the past and therefore did not produce a modification of the predicted trajectory. 

p.128
   The first time I was impressed by such clustering of otherwise random events was when I read the book titled  Stalemate in Technology by Gerhard Mensch  in which there was a graph showing the time and number of the appearances of all basic innovations in Western Society.  Mensch classified what he regarded as basic innovations over the last two hundred [200] years and found that they did not come into existence at a steady rate but at a rate that went through well-distinguished periods of peaks and valleys. 
  Mensch defines basic innovation as a something that starts a new industry or results in a new kind of product ── the phonograph, for example. 

p.128 
Gerhard Mensch in Stalemate in Technology: innovations overcome the depression (Cambridge, MA: Ballinger, 1979).
German edition, Das technologische Patt (Frankfurt: Umschau Verlag, 1975)

p.129
The emerging clustering pattern, however, persists among several other attempts at such classification.4 

p.129
It has also been noted that innovations appear to be seasonal, like agricultural crops.  Throughout the winter after harvest a fruit tree undergoes a slow nurturing process, and in the spring its branches blossom in order to produce the next crop.  According to Marchetti, innovations in a social system are analogous:  The social system plays the role of the tree, and the innovations that of its fruit.5 

p.155
  Many of the problems the automobile industry is currently encountering around the world can be explained as follows.8  When the number of automobiles reaches the saturation level, the industry becomes a supplier of replacement only.  Productivity keeps increasing, however, because of competition, tradition, and expected pay increases.  A constant level of production coupled with increasing productivity creates redundant personnel and eventual layoffs.  Since saturation coincides with recession, so do the layoffs. 
p.155
  It is not only the automobile industry that is in trouble today.  Most technological breakthroughs become exhausted more or less at the same time because the cluster of basic innovations born together saturate together.  Even if they reach a geographical location late, they usually grow faster there.  The simultaneous saturation in many sectors of the economy generates a progressive reduction in employment and low growth in the gross national product ── in other words a recession.  

p.155
The new activities coming into existence must grow significantly before they have an impact on unemployment and economic development.  The energy clock says that we are now approaching the rock bottom of the recession, the mid-1990s. 
p.155
From then onward the rate of growth will progressively increase, but it will only reach a maximum ── the next boom ── in the early 2020s. 

p.158
affluence, decadence, alcoholism
cirrhosis of the liver
Its peaks coincide with periods of maximum prosperity, 

pp.166-167
  It was mentioned earlier that car population will saturate their niche in society for most European countries, Japan and, to a large extent, the United States by 1995. 

p.167
As for the construction of paved roads and highways, the main effort will be in maintaining and improving what already exists rather than adding more. 

p.167
We are at the end of the era in which people were preoccupied by the automobile. 

p.167
  In the 1960 the automobile was at its zenith. 

p.167
Cesare Marchetti believes that the automobile, in spite of its dominant position at the time, “felt” the rising threat of airplanes.1  He whimsically adds that at the moment when the automobile's market share started declining, cars were “masquerading themselves as airplanes with Mach 0.8 aerodynamics, ailerons, tails, and ‘cockpit’ instrument panels.  The competitor is the devil ── in this case the airplane ── and a peasants still do in the Austrian Alps, to scare the devil, one has to dress like the devil.” 

p.169
  The group at IIASA (Cesare Marchetti, Nebojsa Nakicenovic, and Arnulf Grubler) has indulged in fitting logistic functions on growth processes by the hundreds for over a decade. 

p.169
Arnulf Grubler's book, The Rise and Fall of Infrastructures, which contains many of the IIASA results, 

pp.169-170
p.169
  We then compared these natural-growth curves to the fifty-six-year {56-year} cycle of energy consumption, which coincides with the economic cycle.  We observed a remarkable correlation between the time these growth curves approach their ceiling and the valleys of the economic cycle. 

p.169
recession coincides with saturation of these technologies. 

p.170
saturation coincides with economic recession. 

p.185
Car users seem to be satisfied with an average speed of thirty miles per hour, and all performance improvements serve merely to counter balance the time lost in traffic jams and at traffic lights. 
  This invariant can be combined with another universal constant, the time available for traveling.  Yacov Zahavi studied the equilibrium between travel demand and supply, and urban structure.12 
He and his collaborators  showed the following: 
(a) Inhabitants of cities around the world devote about one hour and ten minutes per day to urban transport, be it by car, bus, or subway. 
(b) For this transport they all spend a rather constant percentage of their income, 13.2 in the United States, 13.1 percent in Canada, 11.7 percent in England, and 11.3 percent in Germany. 
(c) What varies from one individual to another is the distance they cover; the more affluent they are, the further they commute. 

p.218
  The french have a delightful fable written by La Fontaine about a busy fly that buzzes around the horses pulling a heavy carriage up a hill.  Once at the top, the fly feels exhausted abut self-satisfied with his good work.  The fable seems to apply to much of the behavior of decision makers.  

p.219
  No, there are things to be done and those who market products do them most of the time, but they believe they are shaping the future, while in reality they are only compensating for changes in the environment in order to maintain the established course.  They are reacting to change in the same way a biological system would in nature.  Whenever there is a danger of falling behind the established path, they bring out good ideas from manufacturing, product design, and advertising.  These ideas may have been dormant for some time; they are called upon as needed.  This is how it happens in biological systems.  Mutants are kept in a recessive state until a change in the environment brings them into the foreground. 

p.219
Innovation and promotion do not create new markets.  They are part of the competitive struggle, which becomes more and more important as growth processes saturate, and the only to way to maintain “healthy” growth is by taking the food from a competitor's mouth. 

p.224
  Once growth is complete, the level reached reflects an equilibrium.  Its signature becomes an invariant or constant that, despite erratic fluctuations, 

p.225
From half of a growth process one can intuitively predict the other half. 

p.234
Appendix B
Expected Uncertainties on S-Curve Forecasts 

p.235
In other words, a slower rate of growth correlates to a larger niche size, and vice versa.  This implies that accelerated growth is associated with a lower ceiling, bringing to mind such folkloric images as short life spans for candles burning at both ends. 

  (Prediction : society's telltale signature reveals the past and forecasts the future / Theodore Modis.,  1. forecasting., 2. creation (literary, artistic, etc.)., 3. science and civilization.,  CB 158.M63, 303.49--dc20, 1992, )
   ____________________________________
 

No comments:

Post a Comment

Libya, Ukraine, North Korea, and Iran situation

  https://copilot.microsoft.com/chats/4G4N26B9TUqUDSnMhqMVG Great approach! Comparing North Korea to Libya and Ukraine shows how different g...