NVIDIA was the first company to truly popularize discrete graphics processing units way back in 1999, so you could say the folks there are probably a bit sentimental about their core discrete GPU business.
I suppose it also helps that NVIDIA's GPU segment accounted for nearly 88% of the company's total revenue last quarter, while at the same time single-handedly driving NVIDIA to its fourth consecutive quarter of achieving record margins across the board.
Let it suffice to say, then, that any threat to NVIDIA's discrete GPU market is a threat to its entire business.
That's where Apple comes in
As it stands, most would agree integrated graphics have long been the most significant threat to NVIDIA's discrete GPUs. So far, NVIDIA has done a remarkable job of making sure the performance gap between integrated and discrete graphics consistently remains large enough to justify the upsell for graphics and gaming enthusiasts -- especially against the widening backdrop of continued declines in PC unit sales.
However, if Apple has any say in the matter, that may all be about to change.
You see, Apple has made a series of interesting moves in the graphics arena lately.
First, a few weeks ago, Apple announced that the entry-level version of its new 21.5-inch iMac will feature integrated Iris Pro 5200 graphics chips from Intel. To be sure, that's a solid win for Intel, especially considering NVIDIA scored all of the coveted spots in Apple's iMac lineup last year. What's more, this also marks the first time one of Apple's iMacs won't have its own discrete graphics chip.
To NVIDIA's credit, however, and much to Intel's chagrin, it's worth noting each of the higher-end iMac models will still feature various versions of NVIDIA's GeForce 700 series GPUs.
For those of you keeping track, we should also remember that Apple notably underwent the same transition to integrated graphics with its entire Mac Mini line last year, even though that move resulted in a slight performance regression when comparing the latest higher-end Mac Mini models to those available in 2011, the latter of which sported a discrete graphics chip designed by AMD.
But if both the iMac and Mac Mini weren't enough, Apple also came out one week ago and did nearly the exact same thing with its newly announced 15-inch Retina MacBook Pro, which from now on will only use -- you guessed it -- integrated Iris Pro graphics from Intel. Once again, though, the upgraded version of Apple's 15-inch Retina MacBook Pro will still boast a discrete NVIDIA GPU.
All in all, this is clearly part of a broader trend at Apple, given the company has made no secret of its unabated love for tightly integrated components. And while it's safe to say integrated graphics will almost certainly never cut it for hardworking graphics professionals and hard-core gamers looking for an edge, Apple knows all too well it'll suffice for the ever-growing ranks of everyday consumers and casual gamers.
NVIDIA won't go away anytime soon, but...
Don't get me wrong: Apple remains a relatively small player in the broader PC market, and iMacs, MacBooks, and Mac Minis aren't exactly NVIDIA's bread and butter. Windows-based PC gamers are, and that's not likely to change in the near future.
NVIDIA is also working feverishly to reduce its reliance on discrete graphics, most notably with the impending ramp of its new Tegra 4 line of mobile chips. If all goes as planned, then, and if NVIDIA's able to secure significant market share outside of discrete GPUs with Tegra 4, you can bet NVIDIA shareholders will happily reap the rewards.
But investors also need to recognize Apple's motivations are very different from those of other OEMs, which love to upsell their own systems to include more expensive discrete graphics and boost their own slim margins. Apple obviously doesn't have that problem, so it can afford to shun the expensive add-ons where possible in favor of a more seamless, integrated experience.
So, in the end, while Apple's influence shouldn't have NVIDIA shareholders running for the hills just yet, I think investors would still do well to remain cognizant of the growing risk posed by the increasing utility of integrated graphics.
The article Apple Keeps Moving Away From NVIDIA originally appeared on Fool.com. Fool contributor Steve Symington owns shares of Apple and NVIDIA. The Motley Fool recommends NVIDIA. It recommends and owns shares of Apple and Intel. Try any of our Foolish newsletter services free for 30 days. We Fools may not all hold the same opinions, but we all believe that considering a diverse range of insights makes us better investors. The Motley Fool has a disclosure policy.
Copyright © 1995 - 2013 The Motley Fool, LLC. All rights reserved.