Intel's 'Cascade Lake' Datacenter Chips Tackle AI Inference
- By John K. Waters
Amid the noise generated by the expected flood of consumer gadgets at the massive annual Consumer Electronics Show (CES) in Las Vegas this month, there was an unlikely datacenter product announcement: Intel is now shipping its new Xeon Scalable CPU, code-named "Cascade Lake."
First announced in November, Cascade Lake scales up to 48 cores, improving server performance for demanding workloads, and features enhanced artificial intelligence (AI) and memory capabilities. It also adds support for Intel's Optane DC persistent memory and a new instruction set, Intel Deep Learning Boost (DL Boost), which accelerates deep learning (DL) inference.
Inference is the process of taking an AI algorithm and applying it to new data. Navin Shenoy, executive vice president of Intel's Datacenter Group, told CES attendees during his keynote presentation that inference "tends to be a more difficult problem" to solve than some others in AI.
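To make the distinction concrete: training produces a model's parameters, while inference simply applies those fixed parameters to data the model has never seen. A minimal sketch in plain Python (the weights and sample below are invented for illustration, not taken from any real model):

```python
# Minimal illustration of inference: applying fixed, already-trained
# parameters to new data. All values here are made up for the example.
def predict(features, weights, bias):
    # Weighted sum followed by a threshold -- a single "neuron".
    score = sum(f * w for f, w in zip(features, weights)) + bias
    return 1 if score > 0 else 0

# Training would have produced these values; inference just reuses them.
trained_weights = [0.8, -0.5, 0.3]
trained_bias = -0.1

# A previously unseen input is classified with no further learning.
new_sample = [1.0, 0.2, 0.5]
label = predict(new_sample, trained_weights, trained_bias)
```

The hard part Shenoy alludes to is doing this forward pass at datacenter scale with tight latency budgets, which is what features like DL Boost target.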
He also described Optane DC persistent memory as a "truly groundbreaking" innovation his company has been working on for about a decade.
"Cascade Lake and Optane persistent memory have been designed for data from the ground up," he said.
Cascade Lake retains many of the basic design elements of the Xeon Scalable lineup, including a 28-core ceiling per die, up to 38.5MB of L3 cache, the Ultra Path Interconnect (UPI), up to six memory channels, AVX-512 support and up to 48 PCIe lanes. The Cascade Lake chips fit the same socket as the previous generation.
Shenoy also showed off the company's new custom dual-die Cascade Lake chip, which scales up to 48 cores. The new processors will soon contend with AMD's EPYC "Rome" processors, fabbed on TSMC's 7nm process.
Alibaba CMO Chris Tung joined the keynote via video to talk about how his company is using the Cascade Lake processors and Optane persistent memory DIMMs. Tung announced that Alibaba, the world's largest online commerce company, will work with Intel to enable "3-D" athlete tracking for the Olympic Games, using a system capable of analyzing an athlete's performance without special sensors or suits.
Shenoy also said the long-awaited 10nm datacenter chips (code-named "Ice Lake") will be available in 2020. According to Intel, Ice Lake brings "a new level of integration" with Intel's new Sunny Cove microarchitecture, instruction sets to accelerate AI usage, and an Intel Gen11 graphics engine to improve graphics performance for richer gaming and content creation experiences.
"The product, innovation and partnership announcements we're making today highlight that Intel's strategy is working," Shenoy said in an earlier statement. "We are making excellent progress in pursuing a massive $300 billion data-driven market opportunity spanning the most important workloads, such as AI, 5G and autonomous driving. And on a scale unmatched by others."
John has been covering the high-tech beat from Silicon Valley and the San Francisco Bay Area for nearly two decades. He serves as Editor-at-Large for Application Development Trends (www.ADTMag.com) and contributes regularly to Redmond Magazine, The Technology Horizons in Education Journal, and Campus Technology. He is the author of more than a dozen books, including The Everything Guide to Social Media; The Everything Computer Book; Blobitecture: Waveform Architecture and Digital Design; John Chambers and the Cisco Way; and Diablo: The Official Strategy Guide.