Intel expands custom chip work for big cloud providers

Intel is developing custom chips for select customers, combining its Xeon processors with FPGAs

Intel is stepping up its efforts to build custom chips for online giants like eBay and Facebook, giving them a boost in performance and, Intel hopes, another reason to keep buying its microprocessors.

The chipmaker is working with the big online firms to embed programmable chips, called FPGAs, in the same chip package alongside its Xeon server processors. Those FPGAs can be programmed to run unique algorithms that each online firm depends on to run its services.

"For the right application, we believe FPGAs can provide an order-of-magnitude improvement in performance" over standard Xeon chips alone, said Diane Bryant, senior vice president and general manager of Intel's data center division.

It's a shift from Intel's past strategy. Until a few years ago, all its customers got essentially the same general-purpose processors. They could select different clock speeds, core counts and other features, but everyone got the same basic chip design.

The rise of online giants like Google, Facebook, Amazon and eBay has changed that. Those companies run a relatively small set of applications, but they do so on a massive scale -- a single workload might run across tens of thousands of servers. They also have a lot of buying power.

That alters the economics of the chip business. If a customer is buying thousands of servers for a particular workload, it becomes viable for Intel to optimize a processor for that task. And customers will pay a bit more if it lets them squeeze out a bit more performance or consume less power.

Intel has built custom chips for customers before. Last year it delivered about 15 unique designs, including processors for Facebook and eBay. But they involved relatively minor changes, such as disabling cores and adding extra I/O ports.

Integrating an FPGA (field-programmable gate array) with its Xeon chips is a step further. And in some cases, Intel will hardwire the instructions for the algorithm directly onto the Xeon itself, Bryant said.

It's a new way for Intel to deliver custom chips, and this year it expects to ship more than 30 unique designs, Bryant said. She was due to make the announcement at Gigaom's Structure conference in San Francisco on Wednesday.

It's a smart move by Intel, said analyst Nathan Brookwood of Insight64. It gives its largest customers less incentive to license a competing chip design, such as the ARM architecture, and optimize that to run their algorithms instead, he said. IBM has also opened its Power8 design, which Google has been testing.

There are two ways customers can use the FPGAs, Bryant said. In one case, the online service provider deploys a Xeon package with the FPGA and tests a workload to ensure it delivers the desired benefits. If it does, Intel will burn the instructions onto the Xeon itself and manufacture the part without the FPGA.

The other use case takes advantage of the fact that FPGAs can be reprogrammed in the field. The service provider buys servers with the dual-chip package inside and programs the FPGA for the workload it needs to optimize. If its needs change later, it can reprogram the chip again.
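Intel hasn't said what programming model it will expose for the integrated FPGA. As a purely hypothetical illustration of the "reprogram in the field" idea, the following minimal host-side sketch in C uses the standard OpenCL API to load one precompiled FPGA bitstream and later swap it for another, the way Altera's OpenCL SDK handles reconfiguration today; the file names and kernel names are made up.

```c
/* Hypothetical sketch: swapping FPGA workloads by loading different
 * precompiled bitstreams through the standard OpenCL host API.
 * Assumes an FPGA OpenCL runtime is installed; file and kernel
 * names are placeholders, not anything Intel has announced. */
#include <stdio.h>
#include <stdlib.h>
#include <CL/cl.h>

static unsigned char *read_file(const char *path, size_t *len) {
    FILE *f = fopen(path, "rb");
    if (!f) { perror(path); exit(1); }
    fseek(f, 0, SEEK_END);
    *len = (size_t)ftell(f);
    rewind(f);
    unsigned char *buf = malloc(*len);
    if (fread(buf, 1, *len, f) != *len) { perror(path); exit(1); }
    fclose(f);
    return buf;
}

/* Program the FPGA with the bitstream in 'path' and return a kernel handle. */
static cl_kernel load_workload(cl_context ctx, cl_device_id dev,
                               const char *path, const char *kernel_name) {
    size_t len;
    unsigned char *bits = read_file(path, &len);
    const unsigned char *bins[] = { bits };
    cl_int err;
    cl_program prog = clCreateProgramWithBinary(ctx, 1, &dev, &len,
                                                bins, NULL, &err);
    clBuildProgram(prog, 1, &dev, "", NULL, NULL);  /* triggers reconfiguration */
    cl_kernel k = clCreateKernel(prog, kernel_name, &err);
    free(bits);
    return k;
}

int main(void) {
    cl_platform_id plat;
    cl_device_id dev;
    clGetPlatformIDs(1, &plat, NULL);
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_ACCELERATOR, 1, &dev, NULL);
    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);

    /* Program the FPGA for today's workload... */
    cl_kernel rank = load_workload(ctx, dev, "search_rank.aocx", "rank_pages");
    /* ...run it, then, when needs change, reprogram the same part. */
    clReleaseKernel(rank);
    cl_kernel deflate = load_workload(ctx, dev, "compress.aocx", "deflate_block");
    clReleaseKernel(deflate);

    clReleaseContext(ctx);
    return 0;
}
```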

Using FPGAs to accelerate workloads isn't new, but they're usually discrete components on the motherboard linked to the processor via PCIe. Integrating them into the chip package with Intel's QPI interconnect reduces latency and allows the FPGA to access the Xeon's on-chip cache and its main memory, Bryant said.

That doubles the performance gain that can normally be derived from the FPGA, compared to using it as a discrete component, she said.
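Bryant didn't describe the software interface for the coherent link, but the difference between a discrete PCIe card and an in-package accelerator can be pictured with OpenCL 2.0's shared virtual memory as a stand-in: one path stages data in device buffers and copies it across the bus, the other lets both sides touch the same allocation. This is an analogy only, not Intel's announced programming model; the fragment assumes a context, queue and kernel set up as in the earlier sketch, and the kernel itself is hypothetical.

```c
/* Conceptual contrast only -- not Intel's announced API. */
#include <CL/cl.h>

enum { N = 1 << 20 };

/* Discrete PCIe style: explicit transfers to and from device memory. */
void run_over_pcie(cl_context ctx, cl_command_queue q, cl_kernel k, float *data) {
    cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE, N * sizeof(float), NULL, NULL);
    clEnqueueWriteBuffer(q, buf, CL_TRUE, 0, N * sizeof(float), data, 0, NULL, NULL);
    clSetKernelArg(k, 0, sizeof(cl_mem), &buf);
    size_t global = N;
    clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(q, buf, CL_TRUE, 0, N * sizeof(float), data, 0, NULL, NULL);
    clReleaseMemObject(buf);
}

/* Coherent in-package style: the accelerator sees the same memory the CPU
 * uses, so there are no bulk copies (assumes fine-grained SVM support). */
void run_coherent(cl_context ctx, cl_command_queue q, cl_kernel k) {
    float *data = clSVMAlloc(ctx, CL_MEM_READ_WRITE | CL_MEM_SVM_FINE_GRAIN_BUFFER,
                             N * sizeof(float), 0);
    for (size_t i = 0; i < N; i++) data[i] = (float)i;  /* CPU writes in place */
    clSetKernelArgSVMPointer(k, 0, data);
    size_t global = N;
    clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
    clFinish(q);
    /* CPU can read results from 'data' directly, no read-back transfer. */
    clSVMFree(ctx, data);
}
```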

Bryant said a handful of cloud providers are testing the FPGAs, though she wouldn't name them. She also wouldn't say whose FPGAs Intel will use, though Intel's manufacturing partnership with Altera makes that company a likely candidate.

Intel plans to begin production of the Xeon-FPGA chip packages soon, she said. They'll be socket-compatible with standard Xeons, meaning customers can use them in standard servers.

She pointed to two trends that are driving the need for custom chips: the rise of large-scale cloud applications running across huge pools of servers, and the move to a more flexible, software-defined infrastructure.

Applications are changing faster than new chips can be designed and brought to market, Bryant said. "This is a great way for the silicon to keep up with the pace of software innovation," she said.

James Niccolai covers data centers and general technology news for IDG News Service. Follow James on Twitter at @jniccolai. James's e-mail address is james_niccolai@idg.com
