The Schwartz Report


Microsoft Helps Developers Build Emotion Detection into Apps

In most cases it's relatively easy to get a sense of how others are reacting or feeling in a live setting, but online or via videoconference, such subtleties are much harder to detect. Microsoft this week released the public beta of an API and tooling that let developers build emotion detection into their apps.

The new emotion API, which debuted at Microsoft's Future Decoded conference in London, was developed by the company's Project Oxford team and demonstrated by Chris Bishop, head of Microsoft Research in Cambridge, U.K., during his keynote address. Microsoft first revealed Project Oxford at its Build conference back in April.

Microsoft describes Project Oxford as a portfolio of REST-based APIs and SDKs that let developers add intelligence to their applications and services using machine learning technology from Microsoft Research. Among the APIs now in beta are facial detection, speech and computer vision.

This week's new tool, released to beta testers, is designed to let developers detect the eight most common emotional states: anger, contempt, disgust, fear, happiness, neutrality, sadness and surprise. Each state is inferred by reading the kinds of facial expressions that typically convey those feelings. In a blog post, Microsoft described some scenarios where the API would be useful, such as building systems that let marketers gauge reactions to a store display, movie or food, or creating apps that render options based on the emotion they recognize in a photo or video.
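Since Project Oxford exposes its services as REST APIs, calling the emotion endpoint from an app is essentially an HTTP POST plus some JSON parsing. The sketch below illustrates the idea; the endpoint URL, subscription-key header name and response shape are assumptions for illustration, not details from this article, so check Microsoft's own documentation before relying on them.

```python
"""Hedged sketch of calling the Project Oxford Emotion API beta.

The ENDPOINT URL, the Ocp-Apim-Subscription-Key header and the JSON
response shape are assumptions based on Project Oxford's REST
conventions, not taken from the article.
"""
import json
import urllib.request

ENDPOINT = "https://api.projectoxford.ai/emotion/v1.0/recognize"  # assumed beta URL


def recognize(image_bytes, subscription_key):
    """POST a raw image to the service and return the parsed JSON reply."""
    req = urllib.request.Request(
        ENDPOINT,
        data=image_bytes,
        headers={
            "Content-Type": "application/octet-stream",
            "Ocp-Apim-Subscription-Key": subscription_key,  # assumed header name
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


def dominant_emotion(face_result):
    """Pick the highest-scoring of the eight emotion labels for one face."""
    scores = face_result["scores"]
    return max(scores, key=scores.get)


# Illustrative response entry for one detected face (made up, not captured
# from the live service): a bounding box plus a score per emotion.
sample_face = {
    "faceRectangle": {"left": 68, "top": 97, "width": 64, "height": 64},
    "scores": {
        "anger": 0.01, "contempt": 0.0, "disgust": 0.0, "fear": 0.0,
        "happiness": 0.95, "neutral": 0.03, "sadness": 0.0, "surprise": 0.01,
    },
}
print(dominant_emotion(sample_face))  # happiness
```

An app in one of Microsoft's marketing scenarios would loop over the faces in the response and react to each face's dominant emotion.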

Microsoft also showcased a scenario tied to the facial hair fundraising effort Movember, releasing MyMoustache, an app that rates facial hair. Microsoft also released a spell check API beta: a context-aware programmatic interface that can detect slang as well as proper word usage (such as when "four," "for" or "fore" is correct). It also supports brand names and commonly used terms.

By year's end, Microsoft will release additional tools from the Project Oxford team, including a video API, based on some of the same technology found in Microsoft Hyperlapse, that can automatically clean up video. Also coming in that timeframe are speaker recognition tools and the Custom Recognition Intelligent Service (CRIS), which can detect speech in noisy environments.

Posted by Jeffrey Schwartz on 11/13/2015 at 12:46 PM

