
Emotional Intelligence


Everyone gets emotional now and again; the difference lies in how we choose to cope. For some, it takes a good cry to soothe the blues, or a dry martini, shaken, not stirred; for others, all it takes is talking to someone who understands and, hopefully, makes the situation all better. After you’ve been transferred through what seems like one incompetent agent after another, then put on hold only to have the line drop, the call center agent is probably your least likely source of emotional solace. The problem is, your one-year warranty is almost up, your computer just blue-screened, and your bank statement shows purchases from a store in Florida called Missterious.

Customer service representatives (CSRs) may not be the best shoulders to cry on, but you’re counting on them, and it would certainly help if they knew how to respond to your torrent of emotional baggage.

Necessity is the mother of invention, and because customer calls have been notoriously emotional since the inception of call centers, speech analytics providers are beginning to incorporate emotion detection into their systems. The technology, however, is admittedly new, having become feasible only within the last few years. Emotion detection in the contact center is this year’s hot topic, according to Donna Fluss, founder and president of DMG Consulting.

In fact, emotion detection technology was first implemented in Israel in 1996 for security purposes, used much like a lie detector to combat terrorists. Now the most obvious application seems to be in a business-to-consumer (B2C) environment, but the technology is still very limited. Given the lack of real-world applications, the claims speech technology vendors make about their emotion detection solutions are just that—claims, Fluss says.

Nevertheless, she professes to be a champion of emotion detection solutions. "I’m anything but cynical about this," she says. "I think it has great potential."

But as it stands today, emotion detection cannot stand alone. When used as a component of a complete speech analytics and workforce optimization (WFO) suite, however, it can provide tremendous insight into why a customer is emotional and how to react. It contributes to the bigger puzzle when it comes to deciphering a customer call. With this added dimension, companies can diagnose and treat the pain points not only of products and services, but of CSR performance as well. That knowledge feeds into a perpetual cycle of improved customer service.

Techno-Voice

The potential of emotion detection has gradually pushed it into the spotlight, becoming one of the top three features customers ask for from call monitoring software provider Autonomy etalk. It follows behind concerns about speech processing and speaker recognition—most commonly used to distinguish between the agent and the caller. Still, Roger Woolley, vice president of marketing at Autonomy etalk, admits emotion detection is not "prevalent" in most call centers. He estimates that only two out of 10 companies even inquire about real-time capabilities, while most opt instead to perform analytics on recorded speech.

But don’t be too quick to blame technology. Reasons behind low market penetration revolve primarily around budget and operational priorities, Woolley says.

For its part, Autonomy etalk converts calls into a .WAV format and then dives in to determine their audio makeup. The solution has an emotion detection component that evaluates the call for the speaker’s tone, loudness, pitch, and rate of speech, as well as cross-talk, which occurs when more than one person is speaking at the same time. Based on these factors, the solution can then be set to highlight areas of emotional inflection, making it easier and more efficient to identify the moments in a call when a dispute might have occurred.
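To make the idea concrete, here is a minimal sketch, in Python, of how acoustic cues like the ones described above might be turned into highlighted spans on a recorded call. It is an illustration only, not Autonomy etalk’s implementation: the feature values are assumed to have already been extracted from the .WAV file by an upstream audio pipeline, and the names and thresholds are invented for the example.

# Hypothetical acoustic-cue flagger; not any vendor's actual product.
from dataclasses import dataclass

@dataclass
class CallSegment:
    start_sec: float
    end_sec: float
    loudness_db: float     # loudness relative to the call's baseline
    pitch_delta_hz: float  # deviation from the speaker's baseline pitch
    words_per_min: float   # rate of speech
    cross_talk: bool       # both parties speaking at once

def is_emotional(seg: CallSegment,
                 loudness_jump_db: float = 6.0,
                 pitch_jump_hz: float = 40.0,
                 fast_speech_wpm: float = 180.0) -> bool:
    """Flag a segment when at least two acoustic cues spike at once."""
    cues = [
        seg.loudness_db >= loudness_jump_db,
        abs(seg.pitch_delta_hz) >= pitch_jump_hz,
        seg.words_per_min >= fast_speech_wpm,
        seg.cross_talk,
    ]
    return sum(cues) >= 2  # requiring two cues cuts down on false positives

segments = [
    CallSegment(0, 30, 1.5, 10, 140, False),
    CallSegment(30, 60, 8.0, 55, 195, True),
]
print([(s.start_sec, s.end_sec) for s in segments if is_emotional(s)])  # [(30, 60)]

Requiring more than one cue is a crude stand-in for the corroboration a commercial engine would perform, and it echoes the caution later in this piece that loudness alone is easy to misread.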

“In the early days, companies had to staff organizations with quality assurance teams and supervisors and manually listen to those calls,” Woolley says. By listening to calls, call centers derive a plethora of lessons that feed best practices, coaching opportunities, and even the tracking of a particular campaign’s development. “The promise of speech analytics,” he adds, “is that you let the technology do the listening for you…you no longer have to look for the needle in the haystack. You can go directly to the calls that you’re most interested in.”

This isn’t to suggest, however, that the early days of speech analytics are over, especially when it comes to emotion detection. Even if the technology is available, there’s still a learning curve to overcome. Amir Liberman, CEO of Israel-based speech analytics solutions provider Nemesysco, recounts how one company he spoke to recently questioned the value in analyzing voice. His response? "How can you not analyze voice?" After all, voice analysis happens almost instinctively in everyday conversations, cluing us into the intricacies that aren’t directly verbalized.

Happy? Sad? Angry?

While it’s convenient to assume, say, an increase in volume as indicative of heightened emotion, it’s also easy to see the obvious drawbacks of such thinking. What’s the difference between happy-loud and angry-loud? What about those who aren’t volatile, but rather, become very quiet when they’re angry? Emotion detection in speech analytics systems often doesn’t recognize these intricacies, Fluss says.

Autonomy etalk looks for large inflections in a call. "If I just say, ‘I’m really getting tired of you asking me questions,’ in a low, monotone way, it might not pick that up," Woolley says.

Therefore, emotion detection cannot depend on acoustics alone, says Daniel Ziv, vice president of customer interaction analytics at Verint. Even in everyday speech, emotion is defined by a combination of acoustics and linguistics, or what Woolley refers to as the context. By looking just at the acoustics (i.e., loud and soft), analytics systems can often be thrown off by the interference of background noise (e.g., rushing traffic, a loudspeaker announcement, or a crying baby).

When the technology tags a call as characteristically emotional, other tools in a speech analytics solution can take that call, mine it for specific words and phrases, and compare it to calls that were considered nonemotional. This helps pinpoint what’s uniquely driving emotional calls, whether it’s the product, the agent, or a billing issue.
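As a rough illustration of that comparison step, the sketch below contrasts word frequencies in calls flagged as emotional against ordinary calls to surface candidate drivers. The transcripts and the simple ratio used here are made up for the example; a real speech analytics engine would work from its own transcription and phrase-mining output.

# Hypothetical comparison of emotional vs. nonemotional call transcripts.
from collections import Counter

emotional_calls = [
    "i was double billed and nobody will fix my bill",
    "this is the third time my bill is wrong",
]
ordinary_calls = [
    "i want to check my account balance",
    "please update my shipping address",
]

def term_rates(calls):
    counts = Counter(word for call in calls for word in call.split())
    total = sum(counts.values())
    return {word: n / total for word, n in counts.items()}

emotional = term_rates(emotional_calls)
ordinary = term_rates(ordinary_calls)

# Words that show up far more often in emotional calls hint at the root cause.
drivers = sorted(
    ((word, rate / ordinary.get(word, 1e-6)) for word, rate in emotional.items()),
    key=lambda item: item[1],
    reverse=True,
)
print(drivers[:3])  # billing-related terms float toward the top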

Emotion detection can even be used to compare what a customer seemed to feel with what she says she felt. Most experts recommend pairing calls with postcall surveys that ask the customer to rate the quality of the call: How did we do? Did we resolve your problem? Would you recommend us to a friend?

Ziv recommends keeping the survey quick—preferably no more than three questions—and to the point. Then you can match the survey to the call and identify what’s driving people to give high or low scores, he says.
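A simple sketch of that matching step might look like the following, with invented call records and one survey score per call; the field names are illustrative, not any vendor’s schema.

# Hypothetical join of postcall survey scores back to their calls.
calls = {
    "c1": {"emotional": True,  "hold_time_sec": 310, "topic": "billing"},
    "c2": {"emotional": False, "hold_time_sec": 45,  "topic": "address change"},
    "c3": {"emotional": True,  "hold_time_sec": 400, "topic": "billing"},
}
surveys = {"c1": 2, "c2": 5, "c3": 1}  # call_id -> 1-to-5 overall rating

def average(values):
    return sum(values) / len(values) if values else 0.0

low = [calls[cid] for cid, score in surveys.items() if score <= 2]
high = [calls[cid] for cid, score in surveys.items() if score >= 4]

print("avg hold time, low scores :", average([c["hold_time_sec"] for c in low]))
print("avg hold time, high scores:", average([c["hold_time_sec"] for c in high]))
print("topics behind low scores  :", {c["topic"] for c in low})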

“Agents can learn the relevant skills needed to handle emotional situations by real-life examples,” Ziv says. In general, emotional calls occur much less frequently than regular calls, but they tend to make a much bigger splash. That’s because, according to behavioral psychologists, a loss often feels two to two-and-a-half times worse than a gain of similar magnitude. Applied to the contact center, a bad customer experience will likely attract more negative exposure than a good one attracts positive exposure, making it critical for CSRs to be properly trained.

It has been Liberman’s observation that most people tend to confuse stress and anger. Stress, he says, is derived from feeling trapped. Anger, on the other hand, translates into: "I’m trapped, but I’m getting out no matter what." Knowing this, Nemesysco’s solution tracks the development of stress and anger separately. "When a customer calls to complain about something, he expects to find empathy," Liberman says. "Once he doesn’t find it, his frustration is growing bigger, and so will his stress." That is genuine emotion, and the key is to get in before the point of no return, right before stress turns to anger, he notes.
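To show how that distinction could be acted on, here is a minimal sketch that tracks stress and anger as separate signals over the course of a call and flags the moment to step in, just before stress escalates. The per-segment scores are assumed to come from an emotion detection engine, and the thresholds are invented for the example; this is not Nemesysco’s algorithm.

# Hypothetical escalation check: intervene while stress is climbing,
# before anger surfaces.
def needs_intervention(stress_scores, anger_scores,
                       stress_alert=0.7, anger_alert=0.5):
    """Return the first segment index at which someone should step in."""
    for i, (stress, anger) in enumerate(zip(stress_scores, anger_scores)):
        if anger >= anger_alert:
            return i  # too late: anger has already surfaced
        rising = i > 0 and stress > stress_scores[i - 1]
        if stress >= stress_alert and rising:
            return i  # stress is high and climbing: step in now
    return None

stress = [0.2, 0.4, 0.6, 0.75, 0.9]  # climbs steadily through the call
anger  = [0.0, 0.0, 0.1, 0.2,  0.6]  # lags behind, then spikes
print(needs_intervention(stress, anger))  # 3: act before the final segment turns angry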

All in Real Time

Ideally, emotion detection will one day be used to provide real-time intervention. "If you can turn around a situation, you’ve made a friend versus an enemy," Fluss says. Solve the problem proactively, rather than having to clean up the mess after a customer has defaced your brand across blogs, Twitter, or any of the many other social networking channels that continue to pop up, she suggests.

NICE Systems has opted for the term "interaction analytics" to encapsulate the variety of channels through which call centers are connecting with customers beyond speech, such as faxes, email, and live chat sessions. "When you’re interacting with a call center, you’re doing it in real time, but what does a call center do in most cases?" Fluss asks. "It goes back to reactive processes and procedures. It doesn’t take advantage of that real-time interaction."

Though assuaging an emotional caller during the call itself is an attractive possibility, questions of practicality prove to be a hindrance. Intervening in real time requires an extensively networked environment and infrastructure, which can be both resource-intensive and expensive. And even if a supervisor were to intervene at every roadblock, customers don’t like having to repeat their stories, Ziv says. With some call centers receiving 200,000 calls a day, he adds, real-time intervention is just not possible.

Companies can take a quasiproactive approach by identifying the root cause of calls a few hours after they first start coming in and resolving the situation before more arrive. At the 2008 eTail East conference in Washington, Finlay Robb, chief marketing officer of LEGO Group’s Global Direct to Consumer Division, recounted a situation in which the company began getting calls from customers complaining that they had been double-billed. LEGO then began contacting customers who were potentially affected, alerting them to the mistake and apologizing. In addition, the company offered a certificate worth 5 euros. With the exception of one woman who attempted to sue, Robb says, he was surprised by how many customers actually thanked the company for the notification.

To Janet Ryan, director of call center operations at AAA Washington, real-time intervention sounds like a great opportunity, but she doesn’t see it taking hold in a meaningful way—at least not right now. During peak driving season, AAA Washington receives approximately 5,000 calls a day. The majority of those calls come from its Emergency Road Service, and nearly all callers are emotional.

At her call center, Ryan is characterized as having "dog ears." Standing over her agents, she is acutely attuned to the differences between volume and emotion—a skill that she says many customer service supervisors still lack.

Even if an emotion detection solution were able to call for real-time intervention, Ryan remains skeptical that call centers would take advantage of the functionality. “We’re always asking for more and more and more tools,” she says, “but the reality is that we don’t use them all.”

Conference after conference, Ryan listens to motivational speakers tell audiences that they should be listening to at least 10 calls a day and calling five customers a week to check in, regardless of whether the call was good or bad. "Everyone writes it down and says, ‘Oh yes, we should,’ but does it really happen?" she asks. Resources are already thin enough as it is. "Yeah, I think it could work, but…," Ryan says with a laugh.

Looking Ahead

However, some vendors are eager to win Ryan, and many others, over. According to Fluss, Nemesysco’s is one of only two offerings that can be considered standalone emotion detection solutions; the other comes from Israel-based VoiceSense. As a technology provider, Nemesysco does not supply the internal architecture for such a system, but works with companies to create seemingly endless possibilities. “We don’t want to limit people to certain ideas. We can encourage people to think out of the box here,” Liberman says.

He describes the premise of a basic implementation aimed at real-time action. The technology attaches a "core priority number" to each call, gauging and quantifying its emotional degree. In other words, the system ranks calls that demand immediate intervention with a higher priority. The expectation is that by being able to monitor and resolve problems in real time, call centers will save countless hours otherwise spent listening to uneventful conversations, Liberman says.
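A bare-bones version of that ranking might look like the sketch below, where each live call gets a single priority score and the most urgent calls bubble to the top of the monitoring queue. The weighting is an assumption made for illustration, not Nemesysco’s formula.

# Hypothetical "core priority number" queue for live calls.
import heapq

def core_priority(stress: float, anger: float, duration_min: float) -> float:
    """Fold emotion scores (0-1) and call length into one priority number."""
    return 0.5 * anger + 0.3 * stress + 0.2 * min(duration_min / 30.0, 1.0)

live_calls = [
    ("call-101", 0.2, 0.1, 4),   # routine question
    ("call-102", 0.8, 0.6, 22),  # long, angry call
    ("call-103", 0.7, 0.2, 9),   # stressed but not yet angry
]

# heapq is a min-heap, so negate the score to pop the highest priority first.
queue = [(-core_priority(s, a, d), call_id) for call_id, s, a, d in live_calls]
heapq.heapify(queue)

while queue:
    neg_priority, call_id = heapq.heappop(queue)
    print(f"{call_id}: priority {-neg_priority:.2f}")
# call-102 surfaces first, telling a supervisor where to intervene before anything else.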

He imagines another application where instead of concentrating on customer emotions, the technology can be used to better understand agent emotions. Call him in the morning and Liberman admits you won’t get much of an interview. "I’m brain-dead before my fourth cup of coffee, and I assume there are agents like that as well," he says. So what better way to increase efficiency than by assigning agents to work when they function best?

On top of that, the problem of agent turnover attributable to common feelings of boredom or frustration with work can be remedied by identifying an agent’s stress level during calls. To prevent burnout, Liberman suggests managers monitor emotional trends to signal when to give agents a day off or some kind of bonus to restore their motivation at the moment they need it most.

Though promising and seemingly feasible, these ideas remain, for now, science fiction. “We don’t have any systems like this at the moment,” he says. “It’s something we’re trying to preach to our vendors. If you can extend every agent’s service by an average of one month, you can save tons of money.”

Moving forward, industry experts anticipate that even outside the contact center, emotion detection will prove valuable in other customer service applications. Woolley illustrates futuristic scenarios where everyday technology—from automobile navigation devices, to household appliances, to even personal computers—can incorporate emotion detection that automatically determines user frustration levels and instantly offers assistance.

"If we don’t ‘wow’ the customer," Fluss says, "we don’t earn the right to upsell or cross-sell them." It’s not hard to see how emotion detection will play a key role in delivering an excellent customer experience. Imagine the improvements in productivity when call centers can avoid engaging in a verbal boxing match with the customer.

So while emotion detection technology in the call center has yet to make its grand debut, experts foresee a bright future. “The massive amount of data generated in the call center must be sorted in some way,” Liberman says. And what better way to sort it than by emotional distress, since it’s often only then that call centers really have a problem? Most of the time, believe it or not, call centers are actually doing a good job, he notes.

Having come from a call center background herself, Fluss is determined to convince enterprises of the value of such solutions. Call centers are evolving, she says, and in the next eight to 10 years they will be among the most important contributors to revenue in an organization.

The rationale behind this prediction is not just the value emotion detection brings, she says, but the fact that solutions that fail to justify their cost will be eliminated. “Call centers aren’t the most respected organizations,” she says. “People are always saying, ‘What’s wrong with these people? They’re being yelled at all day. They must be doing something wrong,’ rather than, ‘Oh my God, these people are fantastic. They’re building our branding and representing us in the marketplace in a completely stressful situation.’” It’s easy to undervalue the customer service representative while completely overlooking the fact that, as Fluss states matter-of-factly, “Consumers aren’t very nice.”
