On a recent family outing, my mother and sister got into a shouting match. They weren't mad at each other, though - they were yelling at the iPhone's turn-by-turn navigation system. I interrupted to say that the phone didn't understand - or care - that they were upset.
"Honey, we know," my mom replied. "But it should!"
She had a point. After all, computers and technology are becoming smarter, faster and more intuitive. Artificial intelligence is creeping into our lives at a steady pace. Devices and apps can anticipate what we need, sometimes even before we realize it ourselves. So why shouldn't they understand our feelings? If emotional reactions were measured, they could be valuable data points for better design and development. Emotional artificial intelligence, also called affective computing, may be on its way.
But should it be? After all, we're already struggling to cope with the always-on nature of the devices in our lives. Yes, those gadgets would be more efficient if they could respond when we are frustrated, bored or too busy to be interrupted, yet they would also be intrusive in ways we can't even fathom today. It sounds like a science-fiction movie, and in some ways it is. Much of this technology is still in its early stages, but it's inching closer to reality.
Companies like Affectiva, a startup spun out of the MIT Media Lab, are working on software that trains computers to recognize human emotions based on people's facial expressions and physiological responses. A company called Beyond Verbal, which has just raised close to $3 million in venture financing, is working on a software tool that can analyze speech and, based on the tone of a person's voice, determine whether it indicates qualities such as arrogance or annoyance, or both.
Microsoft recently revealed the Xbox One, the next-generation version of its flagship game console, which includes an update of Kinect, its motion-tracking device that lets people control games by moving their hands and bodies. The new Kinect, which goes on sale later this year, can be controlled by voice but is not programmed with software to detect emotions in those interactions.
It does include a higher-definition camera capable of tracking fine skeletal and muscular changes in the body and face, though. The machine can already detect the physics behind bodily movements and calculate the force behind a punch or the height of a jump. In addition, one of the Kinect's new sensors uses infrared technology to track a player's heartbeats. That could eventually help the company detect when a player's pulse is racing during a fitness contest - or when it's racing from excitement after winning a game. For avid gamers like me, the possibilities for more immersive, interactive play are mind-boggling.
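To make the idea concrete, here is a minimal sketch of how a console might turn an infrared brightness signal into a pulse reading by counting peaks. Microsoft has not published how the Kinect actually does this; the function name, the threshold, and the synthetic signal below are all invented for illustration.

```python
# Hypothetical sketch: estimating heart rate from an infrared
# intensity signal by counting upward threshold crossings.
# The Kinect's real pipeline is unpublished; this only shows the idea.
import math

def estimate_bpm(samples, sample_rate_hz, threshold=0.5):
    """Count upward crossings of `threshold` and convert to beats per minute."""
    beats = 0
    above = False
    for value in samples:
        if value > threshold and not above:
            beats += 1          # a new pulse peak has started
            above = True
        elif value <= threshold:
            above = False       # signal dropped back below threshold
    duration_min = len(samples) / sample_rate_hz / 60.0
    return beats / duration_min

# Synthetic 10-second signal, 30 samples per second, simulating a 72 bpm pulse.
rate = 30
signal = [math.sin(2 * math.pi * (72 / 60.0) * (i / rate)) for i in range(rate * 10)]
print(round(estimate_bpm(signal, rate)))  # -> 72
```

A real sensor signal would be far noisier and need filtering first, but the core step - peaks per unit time - is the same.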
Albert Penello, a senior director of product planning at Microsoft, says the company intends to use that data to give designers insight into how people feel when playing its games - a kind of feedback loop that can help shape future offerings and experiences. He says Microsoft takes privacy very seriously and will require game developers to receive explicit permission from Xbox One owners before using the data.
Microsoft says games could even adapt in real time to players' physical responses, amping up the action if they aren't stimulated enough or tamping it down if it's too scary. "We are trying to open up game designers to the mind of the players," Penello said. "Are you scared or are you laughing? Are you paying attention and when are you not?"
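The real-time adaptation Penello describes could, in principle, be as simple as a feedback rule like the one below. The function, thresholds, and 1-to-10 difficulty scale are all invented for illustration - nothing here reflects Microsoft's actual implementation.

```python
# Hypothetical sketch of the adaptive-difficulty idea described above.
# All names, thresholds and the difficulty scale are assumptions.

def adjust_difficulty(difficulty, heart_rate_bpm,
                      calm_below=70, stressed_above=110):
    """Nudge difficulty up when the player seems bored (low pulse)
    and down when they seem overwhelmed (high pulse)."""
    if heart_rate_bpm < calm_below:
        difficulty = min(10, difficulty + 1)  # not stimulated enough: amp it up
    elif heart_rate_bpm > stressed_above:
        difficulty = max(1, difficulty - 1)   # too scary: tamp it down
    return difficulty

print(adjust_difficulty(5, 65))   # bored player -> 6
print(adjust_difficulty(5, 120))  # overwhelmed player -> 4
print(adjust_difficulty(5, 90))   # engaged player -> unchanged, 5
```

A shipping game would smooth the signal over time rather than react to single readings, but the feedback loop is the essence of what Penello is describing.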
Eventually, he said, the technology embedded in the Kinect camera could be used for a broader range of applications, including tracking reactions while someone is looking at ads or shopping online, in the hope of understanding what is or isn't capturing the person's interest. He said those applications were not a top priority for the company, however. (Some companies have experimented with technologies such as eye-tracking software to see what parts of commercials draw the most attention from viewers.)
Online media companies like Netflix, Spotify and Amazon already have access to real-time consumer sentiment, knowing which chapters, parts of songs, movies and TV shows people love, hate, skip and rewatch. Netflix drew on exactly that kind of viewing-habit data to engineer its popular online series "House of Cards."
So it is not much of a leap to imagine Kinect-like sensors, and tools like the ones Affectiva and Beyond Verbal are developing, being used to create new entertainment, Web browsing and search experiences.
The possibilities go far beyond that. Prerna Gupta, chief product officer at Smule, a development studio that makes mobile games, spoke about the subject at South by Southwest, the conference in Austin, Texas, in March. She called her talk "Apps of the Future: Instagram for Cyborgs" and gazed far into the future of potential applications.
She says she thinks industries like health care may be revolutionized by emotionally aware technology - particularly as we enter a time when laptops, smartphones, smart watches, fitness trackers and home media and game consoles interact with one another.
"Tracking how our bodies are responding throughout the day could allow you to tailor your life according to what's happening to your body throughout the day," she said. It could allow nutritionists to build meal plans for clients, or doctors to come up with more efficient medical treatments.
That could be just a start, though. "When we are wearing five different computers and they can all talk to each other, that sort of input information will cause an exponential increase" in what humans can do, Gupta said.
Of course, the range of ethical and privacy concerns is enormous.
Clive Thompson, author of a forthcoming book, "Smarter Than You Think: How Technology Is Changing Our Minds for the Better," says these exciting possibilities need to be explored very carefully.
"We are talking about massive archives of personal data that are really revealing," Thompson said, "not to mention that there is definitely something unsettling about emotion recognition becoming another part of our lives that is archived and scrutinized."
He said an insurance company, for example, might want to know its customers' moods - so it can raise their fees if they show signs of becoming depressed or sick. Employers might also want to know when their staff members are bored, so they can give them more work or reprimand them if their attention wanders during an important meeting. He wondered whether we would all become better at masking our emotions if we knew that we were being watched and analyzed. And could machines use what they know about our emotions to manipulate us into buying things?
Once a phone really does understand our emotions, the possibilities - good and bad - seem to spiral without limit. We're not there yet, but the future starts now.
© 2013 New York Times News Service