Is That Person Mad or Sad? Google Glass Will Tell You
One day, Google Glass could help you tell how the people around you are feeling.
The Sentiment Analysis prototype app, facial expression recognition software for Google Glass, was announced Thursday by San Diego-based company Emotient. The technology works on anyone within the Glass camera's field of view.
"We believe there is broad applicability for this service to improve the customer experience, particularly in retail," said Emotient CEO Ken Denman.
The app will be able to classify expressions as positive, negative or neutral. It will also detect the primary emotions: joy, fear, contempt, anger, sadness, disgust and surprise. The Glass wearer will receive a report on how the app thinks the people around them are feeling.
Emotient says that "the aggregated emotional responses are the only data that is stored," The Next Web reported.
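To make the described behavior concrete, here is a minimal Python sketch of that classify-then-aggregate flow. Only the seven primary emotions and the "aggregate-only storage" claim come from the article; the function names, the positive/negative grouping, the report shape and the stand-in classifier are hypothetical illustrations, not Emotient's actual software.

```python
from collections import Counter

# The seven primary emotions named in the article. The grouping of
# emotions into positive vs. negative below is our assumption for
# illustration, not Emotient's.
PRIMARY_EMOTIONS = {"joy", "fear", "contempt", "anger", "sadness", "disgust", "surprise"}
POSITIVE = {"joy", "surprise"}


def valence(emotion: str) -> str:
    """Map a primary emotion to an overall positive/negative/neutral label."""
    if emotion not in PRIMARY_EMOTIONS:
        return "neutral"
    return "positive" if emotion in POSITIVE else "negative"


def aggregate_report(faces, classify):
    """Run a classifier over each visible face and keep only the totals.

    `classify` is any callable mapping a face image to a primary emotion.
    Per-face results are discarded after counting, echoing the claim that
    only aggregated emotional responses are stored.
    """
    emotions = Counter()
    sentiment = Counter()
    for face in faces:
        emotion = classify(face)
        emotions[emotion] += 1
        sentiment[valence(emotion)] += 1
    return {"emotions": dict(emotions), "sentiment": dict(sentiment)}


# Toy usage with a stand-in classifier that always reports "joy".
if __name__ == "__main__":
    print(aggregate_report(["face_1", "face_2"], classify=lambda face: "joy"))
```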
Last year, Google said that it would not "add facial recognition features to our products without having strong privacy protections in place."
Emotient is focusing on healthcare and retail to begin with, but it plans to expand. It has raised $6 million to fund the effort, its first venture into wearable technology. The company eventually plans to become the go-to facial expression recognition software for "any connected device with a camera," according to TechCrunch.
The company will use the new funding to further commercialize its technology, and it says it differs from Affectiva, another company working on facial analysis, in how its data is delivered.
"We believe our technology is differentiated in its ability to deliver sentiment and emotional insights in real-time and in its accuracy in uncontrolled environments, such as a crowded store," said spokeswoman Vikki Herrera to TechCrunch.