Meta Taps Latest Azure Virtual Machines for AI Research

Meta, formerly known as "Facebook," is using the latest Azure virtual machines (VMs) with graphics processing unit (GPU) compute power for its "large-scale AI research workloads," Microsoft announced on Wednesday.

Meta has selected the Azure VM series "NDm A100 v4," which offers the supercomputing power of Nvidia A100 80GB GPUs, to boost performance and scale for its AI research. This series offers the "flexibility to configure clusters of any size automatically and dynamically from a few GPUs to thousands," according to the announcement.

Meta has already used the NDm A100 v4 series of Azure VMs to train its OPT-175B language model, a 175-billion-parameter transformer.

The two companies are AI research collaborators, particularly on PyTorch, the open source deep learning library fostered by Meta AI. PyTorch uses GPUs to accelerate tensor computations, the multidimensional-array operations used to train AI models built on so-called "neural networks," which are loosely modeled on the way neurons in the brain process information.
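The announcement itself includes no code, but as a rough illustration of what "using GPUs for tensor computing" means in practice, here is a minimal PyTorch sketch (assuming a machine with a CUDA-capable GPU such as an Nvidia A100; it falls back to the CPU otherwise):

```python
import torch

# Pick the GPU if one is available, otherwise fall back to the CPU.
device = "cuda" if torch.cuda.is_available() else "cpu"

# Create two random matrices as tensors directly on the selected device.
a = torch.randn(4096, 4096, device=device)
b = torch.randn(4096, 4096, device=device)

# A single matrix multiplication runs on the GPU when one is present.
c = a @ b
print(c.shape, c.device)
```

Operations like this matrix multiplication are the building blocks of neural network training, which is why GPU-heavy VM clusters matter for large models.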

Microsoft, for its part, is planning to bolster the use of PyTorch on Azure by building "new PyTorch development accelerators," which will be arriving "in the coming months." Microsoft also pledged to continue its support program for PyTorch developers, which is known as "PyTorch Enterprise."

Microsoft uses PyTorch in its own products, "such as Bing and Azure Cognitive Services," according to a Microsoft document. It also contributes to various PyTorch open source projects, including "PyTorch Profiler, ONNX Runtime, DeepSpeed, and more."
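For readers unfamiliar with the tooling named above, here is a minimal sketch (not from the announcement) of how PyTorch Profiler is typically used to inspect where a model spends its time:

```python
import torch
from torch.profiler import profile, ProfilerActivity

# A small example model and input batch.
model = torch.nn.Linear(512, 512)
x = torch.randn(64, 512)

# Profile a single forward pass on the CPU; add ProfilerActivity.CUDA to capture GPU activity.
with profile(activities=[ProfilerActivity.CPU], record_shapes=True) as prof:
    model(x)

# Print the operators that consumed the most time.
print(prof.key_averages().table(sort_by="cpu_time_total", row_limit=5))
```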

Meta is using its AI research, in part, to support "immersive" ad campaigns built using its Spark AR augmented reality toolkit. With these ad campaigns, a user scans a QR code with a mobile phone and can then interact with augmented reality scenes, such as diving with sharks or visiting King Tut's tomb. The toolkit also lets furniture shoppers place a to-scale, three-dimensional representation of an object in their home to see how it might look.

Meta's collaboration with Microsoft on the Azure VM series will benefit "more developers around the world," said Jerome Pesenti, vice president of AI at Meta, per Microsoft's announcement.

"With Azure's compute power and 1.6 TB/s of interconnect bandwidth per VM we are able to accelerate our ever-growing training demands to better accommodate larger and more innovative AI models," Pesenti added.

About the Author

Kurt Mackie is senior news producer for 1105 Media's Converge360 group.
