
Azure Gets Bot Service Preview and New Compute Services

Microsoft announced a few Azure milestones today and showcased developer options to tap into its new compute capabilities.

Additionally, Microsoft offered a vision statement about its efforts to "democratize" artificial intelligence. To that end, Microsoft has partnered with OpenAI, a nonprofit research organization that aims to "advance digital intelligence" to "benefit humanity." OpenAI announced today that it will use Microsoft Azure infrastructure for "most" of its deep learning and artificial intelligence research.

By democratizing artificial intelligence, Microsoft means "making it accessible to everyone." Artificial intelligence systems today can recognize words in conversations and "provide real-time translation." These systems are getting better because of "advances in fields such as reinforcement learning," Microsoft contended.

Microsoft has a number of Azure services that can support artificial intelligence and deep learning capabilities. Some milestones associated with those services were announced today.

Azure N-Series General Availability
For instance, Microsoft is planning to make its Azure N-Series of virtual machines generally available on Dec. 1, with rollouts planned for Azure datacenters in "South Central US, East US, West Europe and South East Asia" regions, according to an Azure blog post.

There are two virtual machine types under the N-Series, namely "NC" and "NV." The NC virtual machines use Nvidia Tesla K80 graphics processing units (GPUs) for high-performance computing and artificial intelligence workloads. The NV virtual machines run Nvidia Tesla M60 GPUs.

The announcement by OpenAI indicated that it was "impressed" by the availability of "K80 GPUs with InfiniBand interconnects at scale," which is an option under the Azure NC24r virtual machine subscription plan. This option is good for "tightly coupled jobs," according to Microsoft's Azure blog post:

InfiniBand provides close to bare-metal performance even when scaling out to 10s, 100s, or even 1,000s of GPUs across hundreds of machines. This will allow you to submit tightly coupled jobs using frameworks like the Microsoft Cognitive Toolkit (CNTK), Caffe, or TensorFlow, enabling training for natural language processing, image recognition, and object detection.

The City of Hope is using the N-Series for biological modeling. Algorithmia, a marketplace provider of algorithms, is able to offer products based on Azure's GPU-accelerated capabilities with the new N-Series.

Bot Service Preview
Microsoft announced the preview today of a new Azure Bot Service. The service taps the Bot Framework that was announced at the Microsoft Build developer event in late March, as well as Azure Functions, which was in preview at that time but has since reached general availability status.

The Azure Bot Service lets developers integrate text or short messaging service bots into Web sites and applications via a REST-based API. Examples of the supported channels include "Slack, Facebook Messenger, Skype, Teams, Kik, Office 365 mail and other popular services," according to Microsoft's announcement.
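
To illustrate that messaging model, here is a minimal sketch of an echo bot written against the Bot Framework v3 C# SDK (Microsoft.Bot.Connector) and hosted as a Web API controller; the Azure Bot Service templates wrap equivalent message-handling logic in an Azure Function, and the class name here is simply a placeholder.

    using System;
    using System.Net;
    using System.Net.Http;
    using System.Threading.Tasks;
    using System.Web.Http;
    using Microsoft.Bot.Connector;

    [BotAuthentication]
    public class MessagesController : ApiController
    {
        // The Bot Framework connector POSTs an Activity to this endpoint whenever
        // a user sends a message from any configured channel (Skype, Slack,
        // Facebook Messenger and so on).
        public async Task<HttpResponseMessage> Post([FromBody] Activity activity)
        {
            if (activity.Type == ActivityTypes.Message)
            {
                // Reply through the same REST-based connector the message arrived on.
                var connector = new ConnectorClient(new Uri(activity.ServiceUrl));
                Activity reply = activity.CreateReply($"You said: {activity.Text}");
                await connector.Conversations.ReplyToActivityAsync(reply);
            }

            return Request.CreateResponse(HttpStatusCode.OK);
        }
    }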

The Azure Bot Service preview supports C# and Node.js development directly in the browser. The service also provides templates for building bots. Microsoft is promising that coders can use "the IDE and code editor of your choice." Deployment includes some automated capabilities, according to the announcement:

The Azure Bot Service uses an Azure Resource Manager (ARM) template to create an Azure Function App for easy deployment and automatically registers your bot in the Microsoft Bot Framework, which provides a public bot directory to increase the exposure of your bot.

The announcement suggested that bots can be further enhanced using other Azure services, such as Microsoft Cognitive Services and Azure Search.
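
For instance, a C# bot built with the v3 Bot Builder SDK can route incoming messages through a Language Understanding (LUIS) model from Microsoft Cognitive Services by deriving from LuisDialog. The following is a rough sketch; the application ID, subscription key and intent names are placeholders.

    using System;
    using System.Threading.Tasks;
    using Microsoft.Bot.Builder.Dialogs;
    using Microsoft.Bot.Builder.Luis;
    using Microsoft.Bot.Builder.Luis.Models;

    // The LUIS application ID and subscription key below are placeholders for a
    // real model created through Microsoft Cognitive Services.
    [LuisModel("<luis-app-id>", "<luis-subscription-key>")]
    [Serializable]
    public class SupportDialog : LuisDialog<object>
    {
        // Invoked when LUIS maps the user's utterance to the "ResetPassword"
        // intent (a hypothetical intent defined for this example).
        [LuisIntent("ResetPassword")]
        public async Task ResetPassword(IDialogContext context, LuisResult result)
        {
            await context.PostAsync("I can help you reset your password.");
            context.Wait(MessageReceived);
        }

        // Fallback for utterances LUIS cannot classify.
        [LuisIntent("")]
        public async Task None(IDialogContext context, LuisResult result)
        {
            await context.PostAsync("Sorry, I didn't understand that.");
            context.Wait(MessageReceived);
        }
    }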

Azure Functions General Availability
Azure Functions has reached general availability status in 12 regions. Full-price billing will start on Jan. 1, 2017. Customers that used Microsoft's Azure Functions service early on included Accuweather and Plexure, according to the announcement.

Microsoft describes Azure Functions as a "serverless compute on Azure" capability that can be "triggered by events in Azure or third party services as well as on-premises systems." Developers can use the service to "build HTTP-based APIs" that can be tapped by applications and devices.
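
As a rough illustration, assuming the C# scripting (run.csx) model used by the portal's templates, an HTTP-triggered function backing a simple API could look like the following; the "name" query-string parameter is just an example.

    // run.csx -- C# script for an HTTP-triggered Azure Function. The HTTP trigger
    // and response bindings are declared in the accompanying function.json file.
    using System.Net;

    public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
    {
        log.Info("HTTP trigger function processed a request.");

        // Read an example "name" query-string parameter.
        string name = req.GetQueryNameValuePairs()
            .FirstOrDefault(q => string.Compare(q.Key, "name", true) == 0)
            .Value;

        return name == null
            ? req.CreateResponse(HttpStatusCode.BadRequest, "Please pass a name on the query string")
            : req.CreateResponse(HttpStatusCode.OK, $"Hello, {name}");
    }

Once deployed, the function's HTTP endpoint can be called from an application or device like any other REST API.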

Azure Functions is generally available today, but it currently supports just C# and JavaScript. The ability to use "F#, PowerShell, PHP, Python and Bash" is still at the preview stage.

Also at the beta test stage is a new command-line interface for Azure Functions, which can be used for "creating, running and debugging Functions locally on Windows." Microsoft is working on future Linux and Mac command-line interface tooling for Azure Functions as well, but the timeline wasn't provided in the announcement.

Microsoft is touting the ability of Azure Functions users to easily create bindings to services with just a few clicks. Azure Functions currently has this capability for some Azure services, such as "Blob Storage, Event Hub, Service Bus, [and] Storage Tables," according to Microsoft's announcement. However, Microsoft plans to preview a new "binding extensibility framework" next year for developers and independent software vendors, aimed at creating an "extension ecosystem." That framework will add binding extension support for applications such as "SendGrid, Twillio, Box, DropBox and Google Drive," Microsoft's announcement suggested.
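
As a sketch of how those declarative bindings surface in code, again assuming the C# scripting model, a function wired to a Storage queue trigger and a Blob Storage output binding might look like this, with the actual queue and container names declared in the accompanying function.json file rather than in code.

    // run.csx -- the queueTrigger input and blob output bindings used below are
    // declared in function.json; the queue and container names there are
    // placeholders chosen for this example.
    public static void Run(string queueMessage, out string outputBlob, TraceWriter log)
    {
        log.Info($"Copying queue message to blob: {queueMessage}");

        // Whatever is assigned to the output parameter is written to the bound
        // blob when the function completes.
        outputBlob = queueMessage;
    }

Because the bindings are resolved by the Functions runtime, the function body never needs to call the storage SDK directly.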

The pricing for Azure Functions is based on runtime and memory use. Microsoft claims that developers can "set a maximum daily spending cap to prevent runaway functions."

About the Author

Kurt Mackie is senior news producer for 1105 Media's Converge360 group.
