
Oracle sets sights on chips

So, Oracle’s in the market for chip companies. And AMD quickly leapt to the top of the list of prospective targets.

Oracle CEO Larry Ellison told financial analysts late last week that the company–which has bought something like 60 companies over the past few years–is still in shopping mode. And that chip and vertical software companies top the wish list.

AMD carries a market cap of around $4.7 billion. For its most recently closed quarter, Oracle had more than $12 billion in cash. There’s no reason this deal couldn’t happen.

One might say that Oracle’s already bought a chip company in Sun Microsystems and its SPARC franchise. Of course, Sun no longer did any chip manufacturing per se. It retained the IP, but fabrication of SPARC chips was left to Fujitsu.

In any case, a nagging chip question has dogged Oracle since it closed the Sun deal last January. For one thing, the only hardware it’s promoting–Exadata and Exalogic servers–is based on Intel microprocessors.

And, at a meeting with a few Sun and Oracle partners in August, a hardware channel exec said Oracle has no interest in investing R&D dollars into Intel machines. The question from the VARs was obvious: “What about Exadata?”

Answer: “We’re looking at other platforms.”

Other platforms? Why not say SPARC? There’s something odd going on here.

In the meantime, the critiques of Oracle’s data center appliance plays are not all glowing. While Ellison painted Exalogic as the latest and greatest in hardware-and-software technology, knit together from inception, others called it a return to the mainframe, which may be a good or bad thing depending on what you think of mainframes.

John McCarthy rejected that analogy.

“It’s worse than that! It’s a return to the VAX! It’s the hardware, the OS, the database all combined. They’re putting all their wood behind one arrow,” said McCarthy, who is VP and principal analyst for Forrester Research.


How big of an issue do you think data center power consumption really is?
This would have been spot-on 2-3 years ago, but we've made great strides.
Without knowing the current designs and metrics used within the industry, their conclusions are flawed.
The swipe at flywheels and the failure to mention initiatives like PUE or groups like the Green Grid struck me as huge omissions.
The cost of technology has finally arrived at a point where the major cost of delivering information is electrical power. As a result, there is a major focus on power consumption, and the industry is working to "right the ship".
While it points to a serious issue - the need for energy efficiency and preventing waste of resources - the article seems to miss the massive development done in recent years to improve power consumption through increased utilisation and density, as well as cooling technologies. At HP Critical Facilities Services we have assessed and improved the energy efficiency profiles of hundreds of our clients' data centers worldwide. A further point that needs a more rational look is business continuity. Demand for reliability of service and data drives the corresponding design. Mixing different architectural components such as dual feeds, UPS design and emergency power doesn't stack redundancy; it eliminates single points of failure and builds the reliability required by the business. Sure, if that was the point the NYT intended to raise, it's not perfect, and further developments are going on - some of them in an emerging state, such as the revival of direct current, the application of fuel cells, broad use of free cooling, and frameworks such as CMM.

It seems the 5-to-8-year-old topic of "Energy use in DC/ICT" has finally reached the mainstream.
The industry has made huge investments in improving energy efficiency and IT Utilization that have all but eliminated the concerns raised in this article.
Data centers are becoming more efficient. One thing the article does not mention is the cost of downtime. Besides productivity or production losses, a company's reputation can be damaged.
My biggest objection is that the data is presented all out of proportion. It's 2% of our energy supply, and it saves five times its use in carbon (Smart 2020 report) by doing things smarter. Contrast this with cell phones, which use as much or more energy, for instance, or lighting, or big-screen TVs, or heating and cooling. Data centers are making our world better, but you won't get that out of this article.
Who would believe anything from the NYT?
Clearly the NYT is not taking into consideration many technologies within data centers that reduce waste (see the 1st response here, the VMware argument). I have to believe that optimizing data centers is something most enterprises constantly look to address. Maybe most importantly, don't we have incentives to improve inefficiencies? For example, it saves the average enterprise/SMB money on energy. The average data center is far from perfect, but it is definitely conscious of energy consumption.
We are very mindful of our electric bill and automatically bring up resources when needed and turn them off automatically when not needed.
The truth is generally more nuanced than a simple yes-or-no answer, and whenever you try to boil things down to that simple a level, you end up losing a lot of what is true...
Power used in the data centre has been topical in recent years, but there is far more energy efficiency to be realised in the way that data is received and distributed. There is far more hardware "out there"; the data centre at least has efficiency through economy of scale. Small inefficiencies distributed widely on a large scale will account for just as much of the global share as data centres do, if not more. In that respect the data is old.
There are plenty of legacy data centers out there, but the economics of high energy costs and stakeholder demands will see the continued evolution toward greener DCs.
Crap NYT Technology column, as usual.