
Digitimes Research: FPGAs getting a foothold in deep learning inference chip market

Osiris Hu, DIGITIMES Research, Taipei [Wednesday 27 September 2017]

While Nvidia is currently the leading player in the deep learning inference chip market, powered by its GPGPU (general-purpose computing on graphics processing units) processors, other vendors are offering FPGA-based solutions to compete for market share.

For example, Alibaba Cloud has chosen Intel Altera Arria 10 FPGAs to power its F1 instance as well as Xilinx KU115 FPGAs for its F2 instance.

Microsoft's FPGA-based Project Catapult servers are likewise designed to improve its Bing search engine and Azure cloud computing services, further highlighting the increasing influence of FPGAs in the deep learning inference processor market.

Because an FPGA can be reconfigured and reprogrammed at any time, FPGA-based products can be developed rapidly, shortening time to market. Meanwhile, development in a hardware description language (HDL) lets FPGAs accelerate deep learning inference while adapting to rapidly evolving algorithms.
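To illustrate the kind of workload these chips target: deep learning inference is dominated by multiply-accumulate operations, which FPGAs typically implement in low-precision fixed-point logic described in an HDL. The following is a minimal Python sketch of 8-bit quantized inference for a single neuron; the function names and scale factor are hypothetical and chosen for illustration only, not drawn from any vendor's design.

```python
# Hypothetical sketch: low-precision multiply-accumulate, the core
# primitive that FPGA-based inference accelerators implement in hardware.

def quantize(values, scale=127.0):
    """Map floats in [-1, 1] to int8-range integers, as a fixed-point datapath might."""
    return [max(-127, min(127, round(v * scale))) for v in values]

def int_dot(xs, ws):
    """Integer multiply-accumulate over quantized inputs and weights."""
    return sum(x * w for x, w in zip(xs, ws))

def dequantize(acc, scale=127.0):
    """Convert the integer accumulator back to a float activation."""
    return acc / (scale * scale)

# One neuron's worth of inference on quantized inputs and weights.
inputs  = [0.5, -0.25, 1.0]
weights = [0.8,  0.4, -0.5]

q_in, q_w = quantize(inputs), quantize(weights)
approx = dequantize(int_dot(q_in, q_w))
exact  = sum(x * w for x, w in zip(inputs, weights))
print(round(approx, 3), round(exact, 3))
```

The quantized result closely tracks the full-precision dot product while using only small-integer arithmetic, which is far cheaper to lay out in FPGA logic than floating-point units.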

Meanwhile, Google is developing custom ASICs for deep learning inference processing. Such ASIC-based solutions can have their configurations optimized for specific algorithms and tasks, and their streamlined hardware designs and smaller chip sizes give them an advantage in terminal-end devices, Digitimes Research believes.


© DIGITIMES Inc. All rights reserved.