New Evaluation Framework

We used a new Evaluation Framework for our latest Product Evaluation Report, which covers Salesforce Service Cloud. We introduced the new Framework to make our reports shorter and more easily actionable. Shorter for sure: our previous report on Service Cloud was 57 pages including illustrations, while this one is 22 pages including illustrations, shorter by more than 60 percent!

We don’t yet know whether the Report is more easily actionable; it was just published. But our approach to writing it was to minimize descriptions and to bring our most salient analyses, conclusions, and recommendations to the front.

Why?

Our Product Evaluation Reports had become increasingly valuable, but to fewer readers. Business analysts facing a product selection decision, analysts for bankers and venture capitalists considering an investment decision, and suppliers’ competitive intelligence staff keeping up with the industry have always appreciated the reports, especially their depth and detail.

However, the suppliers whose products were the subjects of the reports complained about their length and depth. Requests for more time to review the reports became the norm, extending our publishing cycle. Then, when we finally got their responses, we’d see heavy commenting at the beginning of the reports but light or no commenting at the end, as if the reviewers had lost interest. Our editors have made the same complaints.

More significantly, readership, actually reading in general, is way down. Fewer people read…anything. These days, people want information in very small bites. To get personal for a moment: I loved Ron Chernow’s 800-page Hamilton, but I have spoken to so many people who told me that it was too long. They couldn’t get through it and put it down unfinished, or, more typically, they wouldn’t even start it. I’m by no means comparing my Product Evaluation Reports to this masterpiece of American history. I’m just trying to emphasize the point.

Shorter Reports, No Less Research

While the Product Evaluation Report on Salesforce Service Cloud was 60 percent shorter, our research to write it was the same as our research for those previous, much longer Product Evaluation Reports. Our approach to research still has these elements, listed in order of increasing importance:

  • Supplier presentations and demonstrations
  • Supplier web content: web site, user and developer communities
  • Supplier SEC filings, especially Forms 10-Q and 10-K
  • Patent documentation, if appropriate
  • Product documentation, the manuals for administrators, users, and developers
  • Product trial

Product documentation and product trial are the most important research elements, and we spend most of our research time in these two areas. Product documentation, the “manuals” for administrators, users, and developers, provides complete, accurate, and spin-less descriptions of how to set up and configure a product, of what a product does (its services and data), and of how it works. Product trials give us the opportunity to put our hands on a product and try it out for customer service tasks.

What’s In?

The new Framework has these four top-level evaluation criteria:

  • Customer Service Apps lists and identifies the key capabilities of the apps included in a customer service software product or added to it via features and/or add-ons.
  • Channels, Devices, Languages lists supported assisted-service and self-service channels, devices attachable to those channels, and languages that agents and customers may use to access the customer service apps on those devices.
  • Reporting examines the facilities to measure and present information about a product’s usage, performance, effectiveness, and efficiency. Analysts use this information continually to refine their customer service product deployments.
  • Product, Supplier, Offer. Product examines the history, release cycle, development plans, and customer base for a customer service product; these are the factors that determine product viability. Supplier examines the factors that determine the supplier’s viability. Offer examines the supplier’s markets for the product and the product’s packaging and pricing.

This is the information that we use to evaluate a customer service product.

What’s Missing?

Technology descriptions and their finely granular analyses are out. For example, the new reports do not include tables listing and describing the attributes/fields of the data models for key customer service objects/records, like cases and knowledge items, or the services that products provide for operating on those data models to perform customer service tasks. Nor do the new reports present analyses of individual data model attributes or individual services. Rather, the reports present a coarsely granular analysis of data models and services with a focus on strengths, limitations, and differentiators. We explain why data models might be rich and flexible, or we identify important missing types, attributes, and relationships, and then summarize the details that support our analysis.
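For readers who want to picture the level of detail we’ve dropped, here is a minimal, hypothetical sketch of a case data model in Python. The object and attribute names are ours, invented for illustration; they do not describe Salesforce Service Cloud or any other evaluated product.

```python
# Hypothetical customer service case data model, for illustration only.
# Attribute names are ours, not those of any evaluated product.
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

@dataclass
class Case:
    case_id: str
    subject: str
    description: str
    status: str = "new"             # e.g., new, open, pending, resolved, closed
    priority: int = 3               # 1 (highest) through 5 (lowest)
    channel: str = "web"            # channel of origin: web, email, phone, social
    customer_id: Optional[str] = None
    assigned_agent_id: Optional[str] = None
    created_at: datetime = field(default_factory=datetime.utcnow)
    related_knowledge_items: list[str] = field(default_factory=list)
```

The old reports would have tabulated and analyzed every attribute of this kind individually; the new reports analyze the model as a whole.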

“Customer Service Technologies” comprised more than half the evaluation criteria of the previous Framework and two thirds of the content of our previous Framework-based reports. These criteria described and analyzed case management, knowledge management, findability, integration, and reporting and analysis. For example, within case management, we examined the case model, case management services, case sources, and case management tools. These criteria are out of the new version, and they’re the reason the reports are shorter. But they remain the basis of our analysis of the Customer Service Apps criterion. If a product has a rich case model and a large set of case management services, then those strengths will be listed among the case management app’s key capabilities in our Customer Service Apps Table, and we’ll explain why we listed them in the analysis following the Table. On the other hand, if a product’s case model is limited, then the case model will be absent from the Table’s list of key capabilities, and we’ll call out the limitations in our analysis. As a reminder, the bases for our evaluation of the Customer Service Apps criterion (the subcriteria of Customer Service Technologies in the old Framework) are shown in the Table below:

Table 1. We present the bases for the evaluation of the Customer Service Apps criterion in this Table.

Trustworthy Analysis

We had always felt that we had to demonstrate that we understood a technology to justify our analysis of it. We had also felt that you wanted and needed our analysis of all of that technology at the detailed level of every individual data attribute and service. You have taught us that you’d prefer higher-level analyses, with low-level detail only where it’s needed to understand the most salient strengths, limitations, and differentiators.

The lesson that we’ve learned from you can be found in a new generation of Product Evaluation Reports. Take a look at our latest Report, our evaluation of Salesforce Service Cloud, and let us know if we’ve truly learned that lesson.

Remember, though, if you need more detail, then ask us for it. We’ve done the research.


Voices of Customers

With this week’s report, the 4Q2013 Customer Service Update, we complete our tenth year of quarterly updates on the leading suppliers and products in customer service. These updates have focused on the factors that are important in the evaluation, comparison, and selection of customer service products:

  • Customer Growth
  • Financial Performance
  • Product Activity
  • Company Activity

Taking from the framework of our reports: for Company Activity, we cover company-related announcements, press releases, and occurrences that are important to our analysis of quarterly performance. In 4Q2013, three of our suppliers, Creative Virtual, KANA, and Nuance, published the results of surveys that they had conducted or sponsored over the previous several months. All of the surveys were about customer service, and the answers to the survey questions demonstrated customers’ approach, behavior, preferences, and issues in their attempts to get service from the companies with which they’ve chosen to do business. The responses to these surveys are the Voices of the Customers for and about customer service. This is wonderful stuff.

Now, to be sure, suppliers conduct surveys for market research and marketing purposes. Suppliers’ objective for a survey is to use the Voice of the Customer to prove/disprove, validate, demonstrate, or even promote their products, services, or programs. Certainly, all of the surveys our suppliers published achieved those objectives. For this post, though, let’s focus on the broader value of the surveys: the Voice of the Customer for Customer Service.

Surveys

The objectives in many of the surveys represent the activities that customers perform, the steps that customers follow to get customer service from the companies with which they choose to do business. By getting customer service, we mean getting answers to their questions and (re)solutions to their problems. Ordering our examination and analysis of the surveys in customers’ typical sequence of these steps organizes them into a Customer Scenario. Remember that a Customer Scenario is the sequence of activities that customers follow to accomplish an objective that they want or need to achieve. For a customer service Customer Scenario, customers typically:

  • Access Customer Service. Customers log in to their accounts or to the customer service section of their companies’ web sites, or call their companies’ contact centers and get authenticated to speak with customer service agents.
  • Find Answers and (Re)solutions. Customers use self-service, social-service, virtual-assisted-service, and/or assisted-service facilities to try to help themselves, to seek the help of their peers, or to seek the help of customer service agents for answers and (re)solutions.
  • Complain. If customers cannot get answers or (re)solutions using these facilities, they complain to their companies.

Here, in Table 1, below, are the surveys that examine how customers perform these activities and how companies support those activities. Note that these surveys are a subset of those published by our suppliers; not all of their surveys mapped directly to customer activities. Note also that our analyses of survey results are based on the content of the press releases about the surveys, which is a bit removed from the actual survey data.

| Sponsor | Survey Objective | Activity | Respondents |
| --- | --- | --- | --- |
| Nuance | Privacy and security of telephone credentials | Access | Smartphone users |
| Nuance | Telephone authentication issues and preferences | Access | US consumers |
| KANA | Email response times for customer service | Find answers and (re)solutions | N/A |
| KANA | Twitter response times for customer service | Find answers and (re)solutions | N/A |
| Nuance | Resolving problems using web self-service | Find answers and (re)solutions | Web self-service users, 18–45 years old |
| Nuance | Issues with web self-service | Find answers and (re)solutions | Windstream Communications customers |
| KANA | Usage of email vs. telephone for complaints | Complain | N/A |
| KANA | Customer communication channels for complaints | Complain | UK consumers |
| KANA | Customer complaints | Complain | US consumers, 18 years old and older |

Table 1. We list and describe customer service surveys published by KANA and Nuance during 4Q2013 in this Table.

Let’s listen closely to the Voices of the Customers as they perform the activities of the customer service Customer Scenario. For each of the surveys in the Table, we’ll present the published survey results, analyze them, and suggest what businesses might do to help customers perform the activities faster, more effectively, and more efficiently.

Access

If questions and problems are related to their accounts, customers have to be authenticated on the customer service system that handles and manages those questions and problems before they can ask or present them. Authentication requires login credentials: usernames and passwords. In these times of rampant identity theft, the security of credentials has become critically important.

Nuance’s surveys on privacy and security of telephone credentials and on telephone authentication shed some light on customers’ issues with authentication.

  • 83 percent of respondents are concerned or very concerned about the misuse of their personal information.
  • 85 percent of respondents are dissatisfied with current telephone authentication methods.
  • 49 percent of respondents stated that current telephone authentication processes are too time consuming.
  • 67 percent of respondents have more than eleven usernames and passwords.
  • 80 percent of respondents use the same login credentials across all of their accounts.
  • 67 percent of respondents reset their login credentials between one and five times per month.

Yikes! Consumers spend so much time and effort managing and, then, using their credentials. We’ve all experienced the latest account registration pages that grade our new or reset passwords from “weak” to “strong” and reject our weakest passwords. While strong passwords improve the security of our personal data, they’re hard to remember, and they increase the time we spend managing them.
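For the curious, here is a minimal sketch of what a weak-to-strong password grader might look like. The scoring rules are our own assumptions for illustration; real registration pages apply their own, typically stricter, policies.

```python
# Minimal, hypothetical password-strength grader, for illustration only.
import re

def grade_password(password: str) -> str:
    score = 0
    if len(password) >= 8:
        score += 1
    if len(password) >= 12:
        score += 1
    if re.search(r"[a-z]", password) and re.search(r"[A-Z]", password):
        score += 1  # mixed case
    if re.search(r"\d", password):
        score += 1  # contains a digit
    if re.search(r"[^a-zA-Z0-9]", password):
        score += 1  # contains a special character
    return ["weak", "weak", "fair", "good", "strong", "strong"][score]

# A registration page would reject the weakest grades outright.
assert grade_password("puppies") == "weak"
```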

With voice biometrics, Nuance offers technology to address many of these issues. On voice devices, after a bit of training, customers simply say, “My voice is my password,” to authenticate account access based on voiceprints, which are unique to an individual.

Find Answers and (Re)solutions

KANA’s surveys on email response times for customer service and Twitter response times for customer service examine response times for “inquiries.” When customers make inquiries, they’re looking for answers or (re)solutions. In the surveys, KANA found:

  • According to Call Centre Association members, response times to email inquiries were greater than eight hours for 59 percent of respondents and greater than 24 hours for 27 percent of respondents.
  • According to a survey by Simply Measured, a social analytics company, the average response time to Twitter inquiries was 5.1 hours, and response times were less than one hour for 10 percent of respondents.

While it’s dangerous to make cross-survey analyses, it seems reasonable to conclude that customer service is better on Twitter than on email. That’s not surprising. Companies have become very sensitive to the public shaming by dissatisfied customers on Twitter. They’ll allocate extra resources to monitoring social channels to prevent the shame. Customers win.

However, remember that these are independent surveys. The companies that deliver excellent customer service on Twitter might also deliver excellent customer service on email and the companies that deliver not so excellent customer service on email might also deliver not so excellent customer service on Twitter. The surveys were not designed to gather this data. That’s the danger of cross-survey analysis.

If your customers make inquiries on both email and social channels, then you should deliver excellent customer service on both. Email management systems and social listening, analysis, and interaction systems, both widely used and well proven customer service applications, can help. These are systems that should be in every business’s customer service application portfolio.

Email management systems help businesses manage inquiries that customers make via email. These systems have been around for well more than ten years, helping businesses respond to customers’ email inquiries. Businesses configure them to respond to common and simple questions and problems automatically and to assign stickier questions and problems to customer service staff, as in the sketch below. Business policies are the critical factor in determining response times to customers’ email inquiries.
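As a minimal sketch of that configuration idea (our own simplification, not any vendor’s API), rule-based email triage might look like this:

```python
# Hypothetical rule-based email triage, for illustration only.
AUTO_REPLIES = {
    "reset password": "To reset your password, visit our account recovery page.",
    "store hours": "Our stores are open 9am-9pm, Monday through Saturday.",
}

def triage_email(subject: str, body: str) -> tuple[str, str]:
    """Return (disposition, payload): an auto-reply or an agent queue name."""
    text = f"{subject} {body}".lower()
    for phrase, reply in AUTO_REPLIES.items():
        if phrase in text:
            return ("auto_reply", reply)
    # Stickier questions and problems go to customer service staff.
    return ("assign_to_agent", "tier1_queue")

print(triage_email("Help", "How do I reset password?"))       # auto-reply
print(triage_email("Billing dispute", "I was double charged"))  # agent queue
```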

Social listening, analysis, and interaction systems have been around for about five years. They help businesses filter the noise of the social web to identify Tweets and posts that contain questions and problems, along with the customers who Tweet and post them. These systems then include facilities to interact with the Tweeters and posters or to send the Tweets and posts to contact center apps for that interaction.
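And here is a toy sketch of the filtering step, again our own simplification rather than any product’s actual analytics: keep only the posts that look like questions or problems, along with the customers who posted them.

```python
# Toy filter for customer service posts on the social web; illustration only.
PROBLEM_MARKERS = ("help", "broken", "not working", "refund", "complaint", "?")

def find_service_posts(posts: list[dict]) -> list[dict]:
    """posts: [{'author': ..., 'text': ...}, ...] from a social feed."""
    hits = []
    for post in posts:
        text = post["text"].lower()
        if any(marker in text for marker in PROBLEM_MARKERS):
            hits.append(post)  # route to an agent or a contact center app
    return hits

feed = [
    {"author": "@amy", "text": "Loving the new phone!"},
    {"author": "@raj", "text": "My order arrived broken. Help!"},
]
print(find_service_posts(feed))  # only @raj's post survives the filter
```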

Find Answers and (Re)solutions Using Web Self-Service

Nuance’s surveys about web self-service really show the struggles of customers trying to help themselves to answers and (re)solutions.

In the survey about consumers’ experiences with web self-service, the key findings were:

  • 58 percent of consumers do not resolve their issues.
  • 71 percent of consumers who do not resolve their issues spend more than 30 minutes trying.
  • 63 percent of consumers who do resolve their issues spend more than 10 minutes doing so.

In Nuance’s survey of Windstream Communications’ customers about issues with web self-service, the key findings were:

  • 50 percent of customers who did not resolve their issues escalated to a live agent.
  • 71 percent of customers prefer a virtual assistant over static web self-service facilities.

The most surprising and telling finding of these surveys was the time and effort that customers expend trying to find answers and (re)solutions using web self-service facilities. Thirty minutes spent without finding an answer or a solution seems like a very long time. Customers really want to help themselves.

By the way, Windstream’s customers’ preference for a virtual assistant is not a surprise. Windstream Communications, a Little Rock, AR networking, cloud-computing, and managed services provider, has deployed Nina Web, Nuance’s virtual agent offering for the web. Wendy, Windstream’s virtual agent, uses Nina Web’s technology to help answer customers’ questions and solve their problems. The finding is a proof point for the value of virtual agents in delivering customer service. Companies in financial services, healthcare, and travel, as well as in telecommunications, have improved their customer service experiences with virtual agents. We cover the leading virtual agent suppliers—Creative Virtual, IntelliResponse, Next IT, and Nuance—in depth. Check out our Product Evaluation Reports to find the virtual agent technology best for your business.

Complain

Customers complain when they can’t get answers to their questions and (re)solutions to their problems. KANA’s surveys about complaints teach so much about customers’ behavior, preferences, and experiences.

  • In KANA’s survey on usage of email or telephone channels for complaints, 42 percent of survey respondents most frequently use email for complaints and 36 percent use the telephone for complaints.
  • In KANA’s survey of UK consumers on communications channels for complaints, 25 percent of UK adults used multiple channels to make complaints. Fifteen percent of their complaints were made face-to-face.

The surprising finding in these surveys is the high percentage of UK consumers willing to take the time and make the effort to complain face-to-face. These customers had to have had very significant issues, and they were very serious about getting those issues resolved.

The key results in KANA’s survey about customer complaints by US consumers were:

  • On average, US consumers spend 384 minutes (6.4 hours) per year lodging complaints
  • In the most recent three years, 71 percent of US consumers have made a complaint. On average, they make complaints six times per year and spend one hour and four minutes resolving each complaint.
  • Thirty-nine percent of US consumers use the telephone channel to register their complaints. Thirty-three percent use email. Seven percent use social media.
  • Millennials complained most frequently—80 percent of 25 to 34 year old respondents. Millennials are also most likely to complain on multiple channels—39 percent of them.
  • Survey respondents had to restate their complaints (retell their stories) 69 percent of the time as the responsibility for handling their complaints was reassigned. On average, consumers retold their stories three times before their issues were resolved, and 27 percent of consumers used multiple channels for the retelling.

The surprising findings in this survey are the time, volume, and frequency of complaints. More than six hours a year complaining? Six complaints every year? Yikes!

No surprise about the low usage of social channels to register complaints. Customers may vent on the social web, but they bring their complaints directly to their sources, the companies that can resolve them.

Lastly and most significantly, it’s just so depressing to learn that businesses are still making customers retell their stories as their complaints cross channels and/or get reassigned or escalated. We’ve been hearing this issue from customers for more than 20 years. Customers hate it.

Come on, businesses. All the apps in your customer service portfolios package the facilities you need to eliminate this issue: transcripts of customers’ activities in self-service apps on the web and on mobile devices, threads of social posts, transcripts of customers’ conversations with virtual agents, and, most significantly, case notes. Use these facilities. You’ll shorten the time to solve problems and resolve customers’ complaints. Your customers will spend less time trying to get answers and (re)solutions (and more time using your products and services or buying new ones).

4Q2013 Was a Good Quarter for Customer Service

By the way, Customer Service had a good quarter in 4Q2013. Customer growth was up. Financial performance was up as a result. Product activity was very heavy. Nine of our ten suppliers made product announcements. Company activity was light. Five suppliers did not make any company announcements. Most significantly, KANA was acquired by Verint. And of course, three suppliers published customer service surveys.

Product Evaluation: Oracle Service Cloud Social Experience


Our evaluation of the August 2013 Release of Oracle Service Cloud Social Experience is this week’s report. You may be more familiar with the product under its former names, RightNow CX Social Experience and Oracle RightNow Cloud Service Social Experience. Oracle acquired RightNow in January 2012 and, without a formal announcement, renamed the product sometime during 2Q2013. One other point about the acquisition: the former RightNow R&D team has continued to develop the product, has continued to work out of the former RightNow headquarters site in Bozeman, and has continued the regular, quarterly releases of the product.

Social Experience is one of three “Experiences” in Oracle Service Cloud. The other two are Agent Experience and Web Experience. Each is aptly named for the channel that it supports. The three share a base of common data (customers, accounts, cases, and knowledge items, for example) and services, including business rules, process management, user management, and reporting. Also, product packaging and pricing puts Social Experience “in the box” with Agent and Web Experience. So, social customer service is really built into Oracle Service Cloud, and that’s its key strength and differentiator.

Social Experience has these three components:

  • Communities, which supports internal community capabilities of posts and responses on topic threads. Oracle Service Cloud Social Experience Communities is based on technology developed by HiveLive, which RightNow acquired in 2009.
  • Social Monitor, which provides capabilities to monitor posts on the social web (Facebook, Twitter, YouTube, and RSS feeds) as well as on Communities, to analyze the content of monitored social posts, and to interact with social posters.
  • Self Service for Facebook, which lets organizations deploy Oracle Service Cloud Web Experience and Communities capabilities on their Facebook pages, helping Facebook users access Oracle Service Cloud Social Experience Communities and the knowledgebase as well as create cases.

Facebook, Twitter, YouTube, RSS, and Social Experience Communities are the social sources monitored by Social Experience. While these are certainly the key social networks, the product does not monitor some sources that are critical to customer service, particularly external communities, forums, and blogs. These are sources that customers very commonly use to get answers to questions and solutions to problems. That Social Experience doesn’t monitor them is a serious limitation. Oracle already has the technology to address this limitation, technology that came with its June 2012 acquisition of Collective Intellect. Collective Intellect’s IP was social monitoring and analysis technology. Oracle told us that it’s working on integrating this technology with Oracle Service Cloud.

Twitter for Customer Service

On the topic of Twitter: last week, Patty Seybold published “Four Reasons Why Customers Prefer Twitter for Customer Service,” a report about how businesses and their customers use Twitter as a key channel for customer service. Patty proposes seven best practices for Twitter-based customer service. Oracle Service Cloud Social Experience can help implement four of the seven: “Treat Twitter as an Integrated Customer Service Channel,” “If You Have Lots of Customers, Establish Customer Service Twitter Accounts,” “Defuse Anger Publicly; Take the Issue Private,” and “Gather Customers’ Ideas for Next-Gen Products.” You’ll implement the other three, “Set Customers’ Expectations Re: Times of Day You’ll Respond to Tweets in Real Time,” “Respond within Minutes,” and “Don’t Use Automated Responses!”, with customer service policies, standards, and procedures. Here are the four, with brief descriptions of how Oracle Service Cloud Social Experience helps implement them.

  • Treat Twitter as an Integrated Customer Service Channel

Social Experience Social Monitor searches Twitter for Tweets that are relevant to customer service. Agents and/or analysts specify search queries as strings of language-specific terms of 255 characters or fewer. Query strings may include the exact match (“”), AND, or OR operators. Analysts can save search queries for execution at a later time or for (regularly) scheduled execution.
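To illustrate that syntax, here are a few hypothetical query strings of our own invention. They follow the rules stated above (exact-match quotes, AND, OR, and the 255-character limit) but are not taken from Oracle’s documentation.

```python
# Hypothetical Social Monitor-style query strings, for illustration only.
queries = [
    '"order status" AND delayed',
    'refund OR chargeback OR "money back"',
    '"MyWidget Pro" AND broken',
]
for q in queries:
    assert len(q) <= 255  # queries must be 255 characters or fewer
```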

Social Experience Social Monitor can automatically create customer service cases from the Tweets in search results and automatically append the info in subsequent Tweets from the same Twitter account to those cases.
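A rough sketch of that behavior, under our own simplified assumptions (real products key cases on much more than the account handle):

```python
# Simplified sketch of Tweet-to-case creation and appending; illustration only.
cases_by_account: dict[str, list[str]] = {}

def ingest_tweet(account: str, text: str) -> None:
    if account not in cases_by_account:
        cases_by_account[account] = []       # open a new case for this account
    cases_by_account[account].append(text)   # append to the existing case

ingest_tweet("@raj", "Your app keeps crashing on launch")
ingest_tweet("@raj", "Still crashing after the update")
print(cases_by_account)  # one case, two Tweets, keyed by Twitter account
```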

Social Experience captures customers’ Twitter account info within search results and includes it within Oracle Service Cloud customer data.

  • If You Have Lots of Customers, Establish Customer Service Twitter Accounts

Social Experience supports multiple corporate Twitter accounts that it shares among its users. (It supports corporate Facebook accounts, too.) Businesses can create a hierarchy of corporate Twitter accounts for customer service, organizing them in any appropriate manner: by customer or customer company, by product, by customer service level, or by severity or priority, for example. And Social Experience’s corporate Twitter accounts can be set to follow customers’ Twitter accounts.

  • Defuse Anger Publicly; Take the Issue Private

Agents specify whether each of their Tweets on their corporate accounts is public or private.

  • Gather Customers’ Ideas for Next-Gen Products

Cases generated from Social Monitor search results can be ideas for next-gen products as well as the representation of questions and problems.

Pretty good, although a bit of content-based alerting on search results could automate Twitter monitoring. Note that these Twitter-supporting capabilities of Social Experience are capabilities that we’ve seen in other social monitoring and analysis offerings, including Attensity Analyze and Respond; Clarabridge Analyze, Collaborate, and Engage; and KANA Experience Analytics. All of these offerings have been available for a few years. They’re widely used and well proven. Any of them can help make Twitter an integrated customer service channel.

Going forward, we’ll extend our framework for evaluating social customer service products to include Patty’s best practices as evaluation criteria.