Empathy: The Killer App for AI
When psychologist Dr. Paul Ekman visited the Fore tribe in the highlands of Papua New Guinea in 1967, he probably didn’t imagine that his work would become the foundation for one of the most important developments in computing.
After studying the tribe, which was still living in the preliterate state it had been in since the Stone Age, Ekman believed he had found the blueprint for a set of universal human emotions and related expressions that crossed cultures and were present in all humans. A decade later, he created the Facial Action Coding System, a comprehensive tool for objectively measuring facial movement. Ekman’s work has been used by the FBI and police departments to identify the seeds of violent behavior in nonverbal expressions of sentiment. He has also developed the online Atlas of Emotions at the behest of the Dalai Lama.
And today his research is being used to teach computer systems how to feel.
Facial expressions are just one set of data that’s fueling the rapid advancement of a subset of AI called “affective computing.” Researchers and developers are creating algorithms that try to determine the emotional state of the human on the other side of the machine based on input such as gestures, facial expressions, text, and tone of voice.
More importantly, they’re using machine learning techniques to develop increasingly emotionally intelligent interfaces that can not only accurately detect a person’s mood but also respond to it appropriately. A number of startups have already amassed databases of millions of human facial reactions and libraries of written communication and are actively hunting for patterns to predict human emotion – and resulting behavior – on a large scale.
Just as once-novel voice recognition technology is now a ubiquitous part of human–machine relationships, so too could this kind of mood recognition technology soon pervade digital interactions and help businesses peer into our inner feelings.
“Once you can detect and analyze what is causing a person’s affective state, you potentially can respond to it and influence it,” says Stacy Marsella, a professor in Northeastern University’s College of Computer and Information Science with a joint appointment in psychology. However, Marsella warns, challenges and ethical concerns can arise at every stage of the process, from analyzing human emotions to responding to them or trying to shape them.
At the most basic level, organizations that want to collect and analyze affective data from customers or employees will want to get their approval to do so, and those individuals may have ethical concerns about how their emotional states are used or manipulated. “Similar concerns arise for us when we interact with each other,” Marsella says. “We often treat our affective states as private, privileged information and react negatively if others comment on what we are feeling.”
The customer experience is the most obvious sweet spot for affective computing capabilities. Forrester analyzed its customer experience data from 2018 and 2019 and found that emotion was the number-one factor in determining brand loyalty across industries – far more important than the ease or effectiveness of customers’ interactions with a company. Yet most businesses have focused more on the functional experience their customers have with them than on the emotional one, in large part because, until now, there has been no easy way to assess or address the latter.
But the potential benefits of affective computing go beyond building a better customer service bot. Low-cost, wearable sensors could enable companies to measure how environment and experiences affect employee mood. Organizations could use this knowledge to design more effective work settings and processes to increase productivity and employee satisfaction. Empathy could be built into enterprise software systems to improve the user experience by, for example, sensing when employees become frustrated with a task and offering feedback or suggestions for help.
Indeed, emotion is already big business and is expected to become much bigger. The global affective computing market is estimated to grow from just over US$22.2 billion in 2019 to $90.0 billion by 2024, according to market research firm MarketsandMarkets. However, while we are already seeing some novel applications of affective computing in business, it will take some time for it to reach its full potential.
“These are still very early days for affective computing and social robotics,” says Richard Yonck, founder of Intelligent Future Consulting and author of Heart of the Machine: Our Future in a World of Artificial Emotional Intelligence. “But we will see progress that will lead to affective computing moving incrementally into our lives.”
Yonck predicts the emergence of digital personal assistants exhibiting increasing emotional awareness in the next two to five years, followed by similar improvements in office and accounting software. “The auto industry, healthcare, and education are also seeing innovative development using this technology,” Yonck says. “Attention tracking in cars, monitoring mood changes in clinical drug trials, and early autism detection are just a few examples of applications being developed.”
Driven by emotion
A central goal of AI is to make machines more like humans. And humans are driven as much by emotion as by intellect. Indeed, neuroscientific research has revealed that emotions are a crucial component of perception, decision-making, learning, and more. Those discoveries led to the birth of affective computing research and development 20 years ago.
Since then, affective computing experts have sought not merely to mimic emotions but to build applications that can adjust to the changing moods of their human counterparts. “One of the key elements of affective computing is being able to make inferences about the emotional states of the person who is interacting with the system – are they frustrated or happy or annoyed? – and then tailoring the response of the system based on those inferences,” says Marsella.
Machines that can connect with humans bring benefits that go far beyond computer-generated compassion. Results of a simulator study found that an emotional voice assistant with the ability to empathize with automobile drivers is the most promising approach to improving negative emotional states and, thus, driver safety.
Just as AI does not exactly replicate human intelligence in a system or device, affective computing systems are not emotional in the same ways that humans are. Rather, through the application of machine learning, Big Data inputs, image recognition, and in some cases robotics, artificially intelligent systems hunt for affective clues, just as a human might capture and correlate any number of sensory inputs: widened eyes, quickened speech, crossed arms, and so on. Additionally, some researchers are looking at physiological signs of mood, such as heart rate or skin changes, that could be monitored through wearable devices.
Indeed, when it comes to data on human emotions, an incredible amount is already available today, Marsella says. “We’ve gotten really good at collecting and analyzing these large amounts of data and, using machine learning techniques, mapping things like facial or vocal expressions to inferences about the underlying mental state of the person. A lot of researchers have taken a data-driven approach to this and it’s starting to pay off to some extent,” he says.
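In its very simplest form, the data-driven mapping Marsella describes amounts to comparing an observed set of facial measurements against averaged profiles learned from labeled examples. The sketch below uses invented facial action unit (AU) intensities and a nearest-centroid rule purely for illustration; real systems learn far richer models from millions of examples.

```python
import math

# Hypothetical per-emotion centroids: mean facial action unit intensities,
# in the order (brow raise, lip corner pull, jaw drop).
# The numbers are illustrative, not real FACS data.
CENTROIDS = {
    "happy":     (0.1, 0.9, 0.2),
    "surprised": (0.9, 0.2, 0.8),
    "neutral":   (0.1, 0.1, 0.1),
}

def infer_emotion(au_vector):
    """Return the emotion whose centroid lies nearest the observed AUs."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(CENTROIDS, key=lambda label: dist(CENTROIDS[label], au_vector))

print(infer_emotion((0.2, 0.8, 0.3)))  # → happy
```

A trained classifier replaces the hand-set centroids, but the shape of the inference – measured features in, mental-state label out – is the same.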
The emotion economy
Many businesses are already putting affective computing to use in customer-facing processes and functions. After all, in an era in which customer experience is the competitive differentiator, empathy may be the killer app for digital business. “Improving the customer experience is among the primary benefits of these technologies,” Yonck says. “Ultimately, tying emotional awareness into the customer experience is going to enhance brand loyalty and improve customer relations.”
Today, companies such as the BBC, Coca-Cola, and Disney are already using the simplest form of affective computing systems – emotion analytics – to assess how consumers react to advertisements, film trailers, and TV shows. They can measure, frame by frame, how a person responds to the content in order to optimize it or determine how best to allocate media spend, for example. Ad agencies are using such analytics to measure response and correlate that to key performance indicators like brand name recall or purchase intent.
But the big opportunity for businesses lies in enabling their customer-facing applications to respond in real time to how an individual feels. The most obvious and easiest-to-implement scenarios are chatbots and other digital or digitally enhanced customer service and support interactions. Text sentiment analysis is already standard in most digital assistants. The goal is to improve and expand emotion recognition capabilities so that customers receive the most appropriate response at any point in time, delivering a more meaningful, personalized, or authentic experience.
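At its most basic, the text sentiment analysis described above can be sketched as a lexicon lookup that nudges a chatbot’s tone. The word lists, weights, and canned replies below are invented for illustration; production assistants use trained language models.

```python
# Toy lexicon-based sentiment scorer. Word lists are assumptions.
POSITIVE = {"great", "thanks", "love", "perfect", "helpful"}
NEGATIVE = {"broken", "terrible", "angry", "useless", "waiting"}

def sentiment(message):
    """Score a message as positive, negative, or neutral by word counts."""
    words = message.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

def respond(message):
    """Tailor the chatbot's reply to the inferred sentiment."""
    mood = sentiment(message)
    if mood == "negative":
        return "I'm sorry about the trouble. Let me escalate this for you."
    if mood == "positive":
        return "Glad to hear it! Anything else I can help with?"
    return "Got it. How can I help?"

print(respond("this is useless and i am still waiting"))
```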
Emotionally intelligent systems could deal with a range of frequent human reactions in an automated way, or they could monitor interactions between customers and human agents for emotional cues and then prompt certain responses, elevate calls, or alert supervisors to provide help.
“Scripts could branch according to whether the customer was seeking a genuine solution or simply was someone who likes to rant. As a human operator, the way you treat the two is very different, and this should be the case with a bot as well,” Yonck says.
Insurer MetLife uses AI software that can detect conversational cues to guide call center workers through difficult customer calls. The system recognizes that a steady rise in the pitch of a customer’s voice or instances of an agent and customer talking over one another are causes for concern. The system also grades each customer call experience from 1 to 10 based on the number and types of alerts it recognizes. The company says that the software actually enables its agents to be more human, improving first call resolution metrics by 3.5% and customer satisfaction by 13%.
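Alert-driven grading of the kind described above could be sketched as a simple penalty scheme: each detected cue subtracts from a perfect score. The alert types and weights here are assumptions for illustration, not MetLife’s actual model.

```python
# Hypothetical penalties per detected alert; weights are invented.
ALERT_PENALTIES = {
    "pitch_rise": 2,    # steady rise in the customer's vocal pitch
    "crosstalk": 1,     # agent and customer talking over each other
    "long_silence": 1,  # extended dead air on the line
}

def grade_call(alerts):
    """Grade a call from 1 (worst) to 10 (best) based on detected alerts."""
    penalty = sum(ALERT_PENALTIES.get(a, 0) for a in alerts)
    return max(1, 10 - penalty)

print(grade_call(["pitch_rise", "crosstalk", "crosstalk"]))  # → 6
```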
Affective computing could also be used to induce desired emotional states. Researchers at Carnegie Mellon University (CMU) have been working on ways for digital agents to recognize and respond to subtle cues in conversation to build more rapport with their users. They have developed the Socially-Aware Robot Assistant (SARA) with the goal of creating a chatbot that’s more effective at conveying task-related information.
Designed to collaborate with human users, SARA uses several microphones and video cameras to track a person’s nonverbal communications. “SARA is capable of detecting social behaviors in conversation, reasoning about how to respond to the intentions behind those particular behaviors, and generating appropriate social responses,” according to the CMU lab where SARA was built. SARA’s voracious appetite for computing power limits its use in the wild today but suggests the advances that could be achieved with the right inputs.
The CMU group also developed its Rapport-Aware Virtual Peer Tutor (RAPT) software tool with the same socially-aware AI approach. RAPT models not only the student’s knowledge but also the interactions that develop between system and student, using the data to generate the most appropriate responses to maximize students’ learning and the bond between students and the virtual agent. The researchers are running a study to evaluate the impact on students’ goal setting, motivation, and engagement.
Emotional awareness in the organization
More sensitive chatbots and customer service lines are just a start, however. Observers anticipate the rapid development of an ecosystem of hardware, software, and services that build artificial emotional awareness into other aspects of the digital organization.
One recruiting technology company is using affective computing to record and analyze facial expressions and word choice during job applicant interviews. It can then provide its Fortune 500 clients with an additional data point to measure candidates’ levels of engagement, motivation, and empathy. The deep-learning emotion analytics engine it uses is built on a database of six million faces. It analyzes video, audio, text, and natural language, leveraging emotion-sensing analytics to assess basic human emotions with roughly 90% accuracy.
The recruiting company says the system helps its clients rank the best candidates, identify overlooked high-potential candidates, and assess softer traits, such as personality, motivation, and ambition. The software can also be used to measure and improve the performance of interviewers and hiring managers, making them more effective and reducing bias.
Employers could monitor employee moods to make organizational adjustments that increase productivity, effectiveness, or satisfaction. A sense of belonging and collaborative problem solving can spark a positive mood, which has a direct impact on retention and the bottom line. A study of 1,800 call center workers by the University of Oxford’s Saïd Business School found that happier workers made more calls per hour, adhered more closely to their workflow schedule, and converted more calls into sales.
Affective computing tools are also being developed to help those on the autism spectrum better understand and interact with the socio-emotional world around them, alerting them to other people’s emotions and prompting them to react in appropriate ways. Meanwhile, researchers at North Carolina State University have developed a teaching program that tracks the emotions of students engaged in interactive online learning in order to predict the effectiveness of online tutoring sessions.
The perils of collecting feelings
Whether customers and employees will be comfortable having their emotions logged and broadcast by companies is an open question. Customers may find some uses of affective computing creepy or, worse, predatory. Today, when you buy a car or a house, you interact with another person. Negotiations take place human to human. But what happens when one of the human negotiators has an emotionally aware assistant in their corner?
“Once you begin interacting with a system that can read your subtlest emotional response more rapidly and accurately than any person ever could and then shift its script and strategy instantly, well, we’re probably not going to get as good a deal on that car as we would’ve in the good old days,” Yonck says.
Affective computing vendors are quick to insist that their systems are all opt-in. And other proponents of emotionally aware systems point out that affective state is just another data point among hundreds being collected on individuals by companies around the world.
The biggest limiting factor from a tactical point of view is the availability of data required to infer a person’s emotional state. In many cases, the signals that would provide the best emotional cues either aren’t being collected, can’t be collected, or aren’t permitted to be used for this purpose.
Indeed, that’s why chatbots and other text-based applications are at the forefront of affective computing development. Text interactions are relatively easy to collect, parse, and interpret, and customers are comfortable with what appears to be benign data collection and analysis to make their customer experience better.
Further, because some emotions are expressed through both vocal inflections and facial expressions, machines may need to learn to track both in real time. It can be difficult to capture the confluence of physical cues that may be relevant to an interaction, such as facial expression, tone of voice, and even posture. And while some affective computing experts believe, as Ekman posits, that there are universal cues to human emotion, that claim remains hotly debated in the psychological community. We humans are, after all, fairly complicated when it comes to emotions. The connection between physical cues and affective state may differ by situation, individual, and culture.
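One common way systems combine vocal and facial cues is late fusion: score each modality separately, then blend the scores into a single estimate. The per-modality probabilities and the weighting below are hypothetical; real systems learn these weights from data.

```python
# Late-fusion sketch: blend per-modality emotion scores by weighted average.
# The 0.6/0.4 weighting is an assumption, not a recommendation.
def fuse(face_scores, voice_scores, face_weight=0.6):
    """Combine face and voice emotion scores into one fused distribution."""
    return {
        emotion: face_weight * face_scores[emotion]
                 + (1 - face_weight) * voice_scores[emotion]
        for emotion in face_scores
    }

face = {"frustrated": 0.7, "calm": 0.3}    # e.g. from a facial model
voice = {"frustrated": 0.4, "calm": 0.6}   # e.g. from a vocal model
fused = fuse(face, voice)
print(max(fused, key=fused.get))  # → frustrated
```

Here the modalities disagree, and the fused estimate sides with whichever carries more weight – which is exactly why tracking both in real time matters.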
Finally, affective computing systems work best under controlled conditions, says Yonck. Noise, poor lighting, and odd perspectives all remain challenging. “It’s easy for the lay public to read certain headlines and think we’re talking about machines that can actually internalize and experience emotion. We’re still a long way off from that in any true sense, and the ability to work well in the wild is still going to take time,” he says.
Assess the value
Despite the limitations and ethical issues to be worked out, the emotional machines are coming. Companies that want to take advantage of these emerging capabilities as they digitally transform more aspects of their business can take a number of steps to assess the potential value:
- Evaluate the business problem you want to solve. Companies will want to figure out where they might get the most value from affective computing capabilities. Functions that are already being transformed by cognitive computing or other emerging digital technologies may be the best areas to start.
- Determine what affective data is available. There may be sources of data already being collected that could be mined effectively or additional data sources that could be integrated inexpensively. Companies should determine what inferences about mental states they want the system to make and how accurately those inferences can be made using the inputs available. They should also consider what non-emotional data they will need in order to get value from an emotionally aware system.
- Scope out the technical challenges. Involve IT and engineering groups to figure out the challenges of integration with existing systems for collection, assimilation, and analysis of large volumes of emotional data.
- Consider the level of emotional complexity. A like or dislike may be relatively straightforward to determine or act on. Other emotions may be more difficult to discern or respond to. Context is also key. An emotionally aware machine would need to respond differently to user frustration in an educational setting than to user frustration in a vehicle or on a customer service line.
- Weigh the application relative to personal privacy concerns. Detecting emotions of an employee with an HR app has very different privacy considerations than tracking consumer responses in a retail setting. “Companies will need to be sensitive that customers may react negatively to their affective states being assessed,” Marsella says.
- Understand the current or near-term limitations of the technology. Interpretation and labeling of specific emotions will require greater precision for certain applications. The current state of emotion-detection technology may be suitable for certain uses, while some applications may require a level of accuracy that is still many years away.
The transformative power of empathy
In the race to build the most effective digital organization, the ability to understand and respond to human emotion may actually be the key differentiator. Companies have been using technology to rework and improve their business processes and models for decades, but there has always been a disconnect between human and machine. Those enterprises that effectively integrate emotionally aware and responsive systems into their processes will be able to transform their organizations by creating more intuitive, customized, and empathetic customer interactions and boosting employee happiness, productivity, and retention.