Datacenter Trends

Qualcomm Back in Datacenter Fray with AI Chip

The chip maker joins a crowded field of vendors that are designing silicon for processing AI inference workloads in the datacenter.

There's more than one way to skin a cat -- and, apparently, to compete in the datacenter silicon market.

Qualcomm Technologies -- which last year seemed to be retreating from that market, abandoning its Centriq line of server CPUs and losing its datacenter technology chief, Anand Chandrasekher -- has just unveiled a new line of artificial intelligence (AI) processors aimed at the datacenter: the Qualcomm Cloud AI 100 platform for inference acceleration.

Qualcomm Cloud AI 100 is an all-new, highly efficient family of chips designed specifically for processing AI inference workloads. "Inference processing" refers to a trained neural network's ability to draw conclusions from new data inputs on which it was not specifically trained.
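To make the distinction concrete, here is a minimal sketch -- in PyTorch, not Qualcomm's own toolchain -- of what inference looks like in code: an already-trained model, switched into evaluation mode, producing a prediction for an input it never saw during training. The tiny classifier and its dimensions are hypothetical placeholders for illustration only.

import torch
import torch.nn as nn

# Hypothetical tiny classifier standing in for an already-trained network.
model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10))
model.eval()  # evaluation/inference mode: disables training-only behavior such as dropout

new_input = torch.randn(1, 128)        # a data point the model was never trained on
with torch.no_grad():                  # no gradient bookkeeping is needed at inference time
    scores = model(new_input)
    prediction = scores.argmax(dim=1)  # the class the trained network "infers"
print(prediction.item())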

Datacenter demand for AI accelerators -- and inference acceleration in particular -- has been nothing short of explosive. Many analysts predict that the market for inference processing will outstrip the market for neural network training, which is currently one of the top AI investments in the datacenter.

Cloud AI 100 was "built from the ground up" to meet the demand for AI inference processing in the cloud, Qualcomm said in a statement. It leverages Qualcomm's "heritage in advanced signal processing and power efficiency" to "facilitate distributed intelligence from the cloud to the client edge and all points in between."

The new line of inference accelerators was announced last week at Qualcomm's AI Day event. Keith Kressin, senior vice president of product management at Qualcomm Technologies, promised that the new accelerator "will significantly raise the bar for the AI inference processing relative to any combination of CPUs, GPUs and/or FPGAs [field programmable gate arrays] used in today's datacenters."

He added: "Qualcomm Technologies is now well-positioned to support complete cloud-to-edge AI solutions all connected with high-speed and low-latency 5G connectivity."

The company is promising a lot with this release, including greater than 10x performance per watt over the industry's most advanced AI inference solutions deployed today; a 7nm process node to provide performance and power advantages; and support for industry-leading software stacks, including PyTorch, Glow, TensorFlow, Keras and ONNX.
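Because ONNX appears on that list, one plausible handoff from a framework such as PyTorch to a dedicated inference accelerator is to export the trained model into ONNX's framework-neutral format. The sketch below shows that export step only; it is a generic illustration, not Qualcomm's documented Cloud AI 100 workflow, and the model and the output file name ("classifier.onnx") are made up for the example.

import torch
import torch.nn as nn

# Hypothetical trained model to be handed off to an inference accelerator.
model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10))
model.eval()

dummy_input = torch.randn(1, 128)  # example input that fixes the tensor shapes in the exported graph
torch.onnx.export(
    model,
    dummy_input,
    "classifier.onnx",             # hypothetical output file for a vendor compiler/runtime to consume
    input_names=["input"],
    output_names=["scores"],
)

Vendor toolchains then typically compile such an exported model for their hardware and serve it through their own runtime.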

Qualcomm also plans to support the new product line with a full stack of tools and frameworks, and to facilitate the development of an ecosystem around the distributed AI model for everything from personal assistants for natural language processing and translations, to advanced image search and personalized content and recommendations.

Qualcomm faces plenty of competition in this space, where many vendors are already designing silicon for inference processing in the datacenter. NVIDIA and Xilinx, for example, have been in the market for months with solid partnerships in place: NVIDIA unveiled its Turing-based Tesla T4 GPUs for inference processing last year, and AMD and Xilinx joined forces last year to set an AI inference record of 30,000 images per second. Amazon Web Services (AWS), meanwhile, is marketing its Inferentia chip as a machine learning inference solution designed to deliver high performance at low cost.

The Qualcomm Cloud AI 100 is expected to begin sampling to customers in the second half of 2019.

About the Author

John has been covering the high-tech beat from Silicon Valley and the San Francisco Bay Area for nearly two decades. He serves as Editor-at-Large for Application Development Trends (www.ADTMag.com) and contributes regularly to Redmond Magazine, The Technology Horizons in Education Journal, and Campus Technology. He is the author of more than a dozen books, including The Everything Guide to Social Media; The Everything Computer Book; Blobitecture: Waveform Architecture and Digital Design; John Chambers and the Cisco Way; and Diablo: The Official Strategy Guide.
