It’s nothing to be sad about. Instead, you should be filled with joy.
Why? Because soon your technology will be able to understand your emotional state.
This is a welcome application of technology, and I’ll tell you why below. But first, let’s take a look at examples of technology that can read — and convey — emotion.
Social Networks
Facebook is testing (US only) a new feature that lets you add an emotional component to your status updates. Clicking a smiley face icon drops down a menu of emotions and also activities (like eating, drinking, etc.), which will be conveyed to Facebook friends along with the update.
The emotion feature catches Facebook up with Myspace, which added a similar feature seven years ago.
The idea is that you can post a picture of, say, a puppy, then also convey your emotion about the picture.
Facebook is doing this, no doubt, because they know emotion is an important element of human communication, and they’re just not ready with technology to detect it yet. But Google is.
Google announced a new feature of Google+ this month called +Emotion.
Here’s how it works: Just open one of your own photos in the Google+ “lightbox” and click on the smiley face icon. (Have a backup because there’s no “undo” feature.) Google’s supercomputers will analyze the photo and put thought bubbles right on the picture showing everybody’s emotions. The emotion cartoons also show glasses, sunglasses, facial hair and other stuff. It can even detect dogs and cats.
Emotions are even detectable on Twitter, although not by Twitter – yet.
Researchers at the University of Vermont are already using Twitter to detect emotions en masse. They do it by analyzing the words people use in their tweets.
Once they gather data about “mood” or emotional states, they can then crunch other data, such as location, to find correlations. For example, they recently announced that people are happier the farther they get from home. So much for “home is where the heart is.”
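The word-analysis approach can be sketched very simply: score each word in a tweet against a list of rated “happiness” values and average the matches. The valence numbers below are invented for illustration; the real research uses a large, human-rated word list.

```python
# Toy word-valence table (scores are invented for illustration).
WORD_VALENCE = {
    "happy": 8.3, "love": 8.4, "great": 7.5,
    "sad": 2.4, "hate": 2.2, "terrible": 1.9,
}

def tweet_mood(text):
    """Average the valence of known words; None if nothing matches."""
    scores = [WORD_VALENCE[w] for w in text.lower().split() if w in WORD_VALENCE]
    return sum(scores) / len(scores) if scores else None

print(tweet_mood("I love this great vacation"))  # averages "love" and "great" -> 7.95
print(tweet_mood("what a sad terrible day"))     # averages "sad" and "terrible"
```

Crunched over millions of tweets tagged with location or time of day, averages like this are what let researchers map mood at population scale.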
Social networking will increasingly deal in the communication of emotion. Someday they will combine technologies, using cameras, microphones and the words people use to figure out how users are feeling.
Our emotional states might be broadcast on our social media profiles in real-time and at all times, if we choose.
Smartphones
A researcher at the University of Rochester named Na Yang built a prototype smartphone app (not publicly available) called Listen-n-Feel while she interned at Microsoft Research last summer.
You can bet that major smartphone platform vendors, especially Google and Apple, are working on smartphone emotion-sensing as well.
Some rudimentary version of this kind of software is already deployed in some advanced call-center phone systems. When they detect a customer freaking out during the inevitable “press one if you’re calling about a warranty issue, press two if you’d like to take our survey” phase, they might accelerate access to a live human.
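The routing logic behind that call-center behavior is simple once a stress estimate exists. In this sketch, the `stress_score` input is assumed to come from a voice-analysis module (hypothetical here); the escalation rule itself is the point.

```python
# Hypothetical escalation rule: highly stressed callers skip the menu maze.
HOLD_QUEUE = "hold_queue"
LIVE_AGENT = "live_agent"

def route_caller(stress_score, threshold=0.7):
    """Route a caller based on an estimated stress score in [0, 1]."""
    return LIVE_AGENT if stress_score >= threshold else HOLD_QUEUE

print(route_caller(0.9))  # agitated caller goes straight to a human
print(route_caller(0.2))  # calm caller waits in the normal queue
```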
Building voice-based emotion detection into phones might be generally useful during regular calls. It might alert you to a distressed friend, or enable a friend to bypass voicemail.
Most of all, it will be deployed by future versions of Siri, Google Now and other virtual assistants. By reading your emotions, they’ll understand you much better. And they’ll “empathize” by sharing your joy – and your pain – in the tone of their voices.
PCs
There are many uses for emotion detection in PCs, but one of the best is for real-time avatar-based chat.
We have videoconferencing, of course, via Google+ hangouts or Skype. But many people, possibly a majority, feel too shy to be on camera. They might be more comfortable being represented by a real-time avatar — a cartoon character that works as a stand-in.
For example, researchers at Keio University are working on an avatar chat system that conveys voice in real time, like a video chat, but uses an avatar in place of the video.
The avatar’s lips would move in sync with the voice, and its body and head orientation would instantly mimic the chatter’s. But more impressively, emotions would be conveyed through the avatar’s facial expressions.
Chatting with an avatar will be like chatting by video. Except your friend may be a cartoon rabbit, a celebrity or a Japanese cartoon character.
TV
The ultimate application for marketers and social science researchers is the coming ability of TVs to read the emotions of people watching.
MIT researchers are working on software that can harvest emotional states of millions of TV watchers at the same time.
First, MIT’s software identifies the face, then specific features about that face. It then tracks changes on 22 points around the face, and even monitors texture and color of skin to determine emotion.
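The geometric idea behind tracking points on a face can be shown with a toy example. Given (x, y) coordinates of tracked mouth landmarks (with image y growing downward, as is conventional), checking whether the mouth corners sit above the mouth center gives a crude smile test. MIT’s system tracks 22 points plus skin texture and color; this sketch only illustrates the landmark-comparison step, and the coordinates are made up.

```python
# Crude smile check from three tracked mouth landmarks.
# Image coordinates: y increases downward, so "higher" means smaller y.
def looks_like_smile(left_corner, right_corner, mouth_center):
    lx, ly = left_corner
    rx, ry = right_corner
    cx, cy = mouth_center
    # Both corners lifted above the mouth center suggests a smile.
    return ly < cy and ry < cy

print(looks_like_smile((40, 98), (60, 98), (50, 104)))   # corners lifted
print(looks_like_smile((40, 106), (60, 106), (50, 100))) # corners drooped
```

A real system would track how all 22 points move over time, not just compare one frame, which is what lets it separate a fleeting grimace from sustained boredom.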
Microsoft has applied for a patent that would enable a Kinect for Xbox 360, or some future version of it, to track the emotions of people in their living rooms.
The reason companies want this is that it could become the ultimate tool for improving both the quality of programming and the relevance of advertising.
The MIT research specifically looks at replacing Nielsen-type ratings, which have been vitally important to the television industry even though they are the bluntest of survey tools. Real-time emotion harvesting could tell the studios exactly where audiences were thrilled, bored or horrified. For example, a show might be successful if not for one character or one scene that ruins it for audiences.
The Microsoft patent is all about advertising. The idea is to combine location, gender, age and other metrics with the specific moods of people in the room. This would enable advertising to reflect people’s feelings and become more relevant.
But people will want this too, and not only because it will improve both shows and commercials.
A new era of social TV watching is coming soon, where you’ll be able to share a movie or watch TV together with your friends, even if they’re in a different location. Sharing the emotions of everyone watching via icons or avatars on the edge of the screen will make it more social – and more enjoyable.
Why the Public Will Love Emotional Technology
Technology that understands, responds to and conveys emotion is a certainty. It’s coming. And almost everyone will accept and even enjoy it. Here’s why.
We’re emotional creatures. If we’re frustrated with our laptops or phones, deep down we want them to empathize with us and treat us differently because of it. Our brains are hard-wired that way.
We use our gadgets to communicate. And communication that fails to convey our emotions causes problems. E-mail is a perfect example. People constantly misread the intent of email because they can’t tell the emotional state of the sender.
Companies hoping to sell us things will embrace this technology, because relevant advertising is effective advertising.
Emotional computing is coming. And it’s nothing to cry about. On the contrary, we should be happy about it!