A Google Glass App That Detects People's Emotions

Sometimes people are hard to read. Why not leave all that work to a computer? Perhaps you could use this experimental app that runs on Google Glass. Aim Glass's camera at a person's face, and the app reads that person's facial expression and tells you to what extent he or she is feeling happy, sad, angry, or surprised.

As a bonus, the app guesses the person's age and gender. Evaluating whether you want to hit on that person is still up to you.

Kidding aside, an app like this could help people with conditions, such as autism, that make it hard for them to read emotions. The app is supposed to run entirely on Google Glass's CPU, so it doesn't need to send the images Glass records to the cloud. That means it could work even when the glasses don't have a data connection, which is nice. It also keeps those images more secure: they are supposed to stay on the device and never enter the cloud.
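For the curious, here's a minimal sketch of what such an on-device pipeline could look like, written in Java since Glass apps are built on Android's Java stack. Everything in it is an illustrative assumption, not the app's actual code: FaceFrame is a hypothetical stand-in for a camera frame, and the scoring logic is a toy placeholder where a real classifier would go. The point is simply that every step runs on the local CPU and no pixels ever leave the device.

```java
import java.util.EnumMap;
import java.util.Map;

public final class OnDeviceEmotionSketch {

    enum Emotion { HAPPY, SAD, ANGRY, SURPRISED }

    /** Hypothetical stand-in for one grayscale camera frame held in device memory. */
    record FaceFrame(int[] pixels) {}

    /** Scores a frame entirely on the local CPU; the pixels never touch a network. */
    static Map<Emotion, Double> score(FaceFrame frame) {
        // Toy placeholder for a real classifier: derive fake logits from the
        // pixel values, then normalize them into probabilities with a softmax.
        double[] logits = new double[Emotion.values().length];
        for (int i = 0; i < frame.pixels().length; i++) {
            logits[i % logits.length] += frame.pixels()[i] / 255.0;
        }
        double maxLogit = Double.NEGATIVE_INFINITY;
        for (double l : logits) maxLogit = Math.max(maxLogit, l);
        double sum = 0;
        double[] exps = new double[logits.length];
        for (int i = 0; i < logits.length; i++) {
            exps[i] = Math.exp(logits[i] - maxLogit); // numerically stable softmax
            sum += exps[i];
        }
        Map<Emotion, Double> scores = new EnumMap<>(Emotion.class);
        for (Emotion e : Emotion.values()) {
            scores.put(e, exps[e.ordinal()] / sum);
        }
        return scores;
    }

    public static void main(String[] args) {
        // Fake six-pixel "frame"; a real app would pull frames from Glass's camera.
        FaceFrame frame = new FaceFrame(new int[] { 120, 200, 64, 30, 255, 90 });
        score(frame).forEach((e, p) -> System.out.printf("%s: %.0f%%%n", e, p * 100));
    }
}
```

Swap the placeholder scoring for a trained model and the structure stays the same: frames in, per-emotion probabilities out, with nothing sent to a server.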

Over the past few years, engineers at several universities and companies have tried to build emotion-reading algorithms, and some are already on the market. Usually, the idea is that such algorithms could go into software for marketing departments (How is this new ad making viewers feel?) or into adaptive computer games (How is this level making players feel?). It's also a step toward giving robots the ability to read people's emotions. It would certainly behoove a customer service robot, for example, to be able to sense frustration and confusion in people's faces.

Making a face-reading algorithm for private individuals to use is an unusual, but not unheard-of, idea. The market for this may not be large at present (in addition to needing the software, potential buyers have to be able to afford Google Glass), but perhaps the most useful takeaway here is that this kind of computing can be miniaturized to run on something as small and light as Google Glass. That means it could show up anywhere.
