Microsoft Dynamics 365 for Customer Service

Serious Customer Service Capabilities

In our more than 10 years of customer service research, publishing, and consulting, we’d never before published a report about a Microsoft offering. It’s not because Microsoft hasn’t had a customer service offering or that the company hasn’t had success in business applications. Since 2003, its CRM suite has always included a customer service app. And, its Dynamics CRM brand has built a customer base of tens of thousands of accounts and millions of users. But, Dynamics CRM had always been more about its sales app and that app’s integration with Office and Outlook. Customer service capabilities have been a bit limited. No longer.

Beginning in November 2015, the improvements in two new releases—CRM 2016 and CRM 2016 Update 1—and, in November 2016, the introduction of the new Dynamics 365 brand have strengthened, even transformed, Microsoft’s customer service app and have made Microsoft a player to consider in the high end of the customer service space.

Our Product Evaluation Report on Microsoft Dynamics 365 for Customer Service, published December 1, 2016, will help that consideration. These are the new and/or significantly improved customer service components:

  • Knowledge management
  • Search
  • Customer service UI
  • Web self-service and communities
  • Social customer service

Let’s take a closer but brief look at each of them.

Knowledge Management

Knowledge Management is the name of a new customer service component. Introduced with CRM 2016, it’s a comprehensive knowledge management system with a rich and flexible knowledge model, a large set of useful knowledge management services, and an easy-to-learn, easy-to-use toolset. The best features of Knowledge Management are:

  • Visual tools of Interactive Service Hub, the customer service UI
  • Knowledge lifecycle and business processes that implement and support the lifecycle
  • Language support and translation
  • Version control
  • Roles for knowledge authors, owners, and managers

For example, Knowledge Management comes with a predefined but configurable knowledge lifecycle with Author, Review, Publish, and Expire phases. The screen shot in Figure 1 shows the steps in the Author phase.

Figure 1. This screen shot shows the steps in the Author phase of the knowledge management process.

Note that Knowledge Management is based on technology from Parature, a Reston, VA-based supplier with a customer service offering of the same name that Microsoft acquired in 2014. Beginning with the introduction of Dynamics 365, Microsoft no longer offers the Parature customer service product.
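The predefined lifecycle can be pictured as a simple state machine over the four phases. This is an illustrative sketch only; the class and method names are our own, not Dynamics 365 API names:

```python
# Minimal sketch of the four-phase knowledge lifecycle described above
# (Author -> Review -> Publish -> Expire). Names are illustrative, not
# Dynamics 365 API names.

class KnowledgeArticle:
    PHASES = ["Author", "Review", "Publish", "Expire"]

    def __init__(self, title):
        self.title = title
        self.phase = "Author"          # every article starts in Author

    def advance(self):
        """Move the article to the next phase in the lifecycle."""
        i = self.PHASES.index(self.phase)
        if i < len(self.PHASES) - 1:
            self.phase = self.PHASES[i + 1]
        return self.phase

article = KnowledgeArticle("Resetting your password")
article.advance()   # Review
article.advance()   # Publish
```

In the real product the phases are configurable and each phase carries its own business process steps, as Figure 1 shows for Author.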

Search

Search is not a strength of Dynamics 365. Search sources are limited, search query syntax is simple, and there are few search analyses and few facilities for search results management. However, with the Dynamics 365 rebranding, Microsoft has made improvements. Categorized Search, the new name of the search facility in Dynamics 365, retrieves database records with fields that begin with the words in search queries and lets administrators and seekers facet (categorize) search results. The new Relevance Search adds relevance and stemming analyses. Microsoft still has work to do, but faceting, stemming, and relevance are a start toward addressing these limitations.
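To make the distinction concrete, here is a small sketch, on hypothetical records, of the begins-with matching and faceting that Categorized Search is described as providing. The data and function names are our own illustrations, not Dynamics 365 code:

```python
# Illustrative sketch of "begins with" search plus faceting by record type,
# on hypothetical records. Relevance Search would add stemming and ranking
# on top of behavior like this.

from collections import defaultdict

records = [
    {"type": "Case", "title": "Printer offline after update"},
    {"type": "Case", "title": "Print queue stuck"},
    {"type": "Article", "title": "Printing troubleshooting guide"},
]

def begins_with_search(query, records):
    """Return records with a title word that begins with the query."""
    q = query.lower()
    return [r for r in records
            if any(w.startswith(q) for w in r["title"].lower().split())]

def facet_by_type(hits):
    """Group search hits by record type, as a facet would."""
    facets = defaultdict(list)
    for r in hits:
        facets[r["type"]].append(r)
    return facets

hits = begins_with_search("print", records)   # matches all three titles
facets = facet_by_type(hits)                  # Cases and Articles grouped
```

Note that a begins-with match on "print" also catches "printer" and "printing"; it is stemming and relevance ranking, not prefix matching, that Relevance Search adds.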

Customer Service UI – Interactive Service Hub

Interactive Service Hub (ISH) provides several useful and very attractive capabilities in Dynamics 365. It’s the UI for Knowledge Management, one of two UIs for case management, and a facility for creating and presenting dashboards. For the case management and knowledge management UIs, ISH provides visual tools that are easy to learn and easy to use. The tools let agents perform every case management task and let authors and editors perform every knowledge management function. For example, Figure 2 shows a screen shot of ISH’s presentation of an existing Case: the Name of the Case at the top left; the Case information tabs “SUMMARY | DETAILS | CASE RELATIONSHIPS | SLA” under the Name; the phases of the deployment’s case management process, “IDENTIFY QUALIFY RESEARCH RESOLVE,” within a ribbon near the top of the screen; and the (SUMMARY) Case information in the center.

Figure 2. This screen shot shows the Interactive Service Hub display of an existing Case.

In addition to tools for building dashboards, ISH packages four useful predefined dashboards: two for case management and two for knowledge management. They help customer service managers, agents, and knowledge management authors and editors manage their work. Figure 3 shows an example of the My Knowledge Dashboard, which presents information useful to authors and editors visually and interactively.

Figure 3. This screen shot shows an example of the My Knowledge Dashboard.

Web Self-service and Communities

We were quite surprised to learn that, prior to the May 2016 introduction of CRM 2016 Update 1, Dynamics 365 for Customer Service and all of its predecessor products did not include facilities for building and deploying web self-service or communities sites. This limitation was addressed in Update 1 with the then-named CRM Portal service, renamed the Portal service in Dynamics 365. Portal service is a template-based toolkit for developing (web development skills are required) and deploying browser-based web self-service and communities/forums sites. It’s based on technology from Adxstudio, which Microsoft acquired in September 2015, and it packages templates for a Customer Service Portal and a Community Portal. Note that Dynamics 365 for Customer Service licenses include one million page views per month for runtime usage of sites built on the Portal service (licenses may be extended with additional page views per month).

Social Customer Service

Microsoft Social Engagement is a separately packaged and separately priced social customer service offering that Microsoft introduced early in 2015. Social Engagement provides facilities that listen for social posts across a wide range of social sources (Instagram, Tumblr, WordPress, and YouTube as well as Facebook and Twitter), that analyze the content and sentiment of those posts, and that interact with social posters. In addition, Social Engagement integrates with Dynamics 365 for Customer Service. Through this integration, the automated or manual analysis of social posts can result in creating and managing customer service Cases. It’s a strong social customer service offering. What’s new is that Microsoft now bundles Social Engagement with Dynamics 365 for Customer Service. That’s a very big value-add.
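The listen-analyze-create flow that this integration enables can be sketched, in much simplified form, as follows. The keyword-based sentiment test and all names here are our own illustrative assumptions, not Social Engagement's actual analysis:

```python
# Hypothetical sketch of the listen -> analyze sentiment -> create-Case
# flow described above. A real system would use trained sentiment models,
# not a keyword list.

NEGATIVE_WORDS = {"broken", "terrible", "refund", "angry"}

def sentiment(post_text):
    """Naive sentiment check: negative if any flagged word appears."""
    words = set(post_text.lower().split())
    return "negative" if words & NEGATIVE_WORDS else "neutral"

def triage(posts):
    """Create a customer service Case for each negative post."""
    cases = []
    for post in posts:
        if sentiment(post["text"]) == "negative":
            cases.append({"source": post["network"],
                          "description": post["text"]})
    return cases

posts = [
    {"network": "Twitter", "text": "My router is broken and support won't answer"},
    {"network": "Facebook", "text": "Thanks for the quick delivery!"},
]
cases = triage(posts)   # one Case, created from the Twitter post
```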

All This and More

We’ve discussed the most significant new and improved capabilities of Dynamics 365 for Customer Service. Knowledge Management, Interactive Service Hub, improved Search, the Portal service, and bundled Social Engagement certainly strengthen the offering. Although not quite as significant, Microsoft added and improved many other capabilities, too. For example, there are language support improvements, improvements to integration with external apps, new Customer Survey and “Voice of the Customer” feedback capabilities, and the use of Azure ML (Machine Learning) to suggest Knowledge Management Articles as Case resolutions automatically based on Case attribute values. Bottom line, Microsoft Dynamics 365 for Customer Service deserves serious consideration as the key customer service app for large businesses and public sector organizations, especially those that are already Microsoft shops.

Evaluating Customer Service Products

Framework-based, In-depth Product Evaluation Reports

We recently published our Product Evaluation Report on Desk.com, Salesforce’s customer service offering for small and mid-sized businesses. “Desk” is a very attractive offering with broad and deep capabilities. It earns good grades on our Customer Service Report Card, including Exceeds Requirements grades in Knowledge Management, Customer Service Integration, and Company Viability.

We’re confident that this report provides input and guidance to analysts in their efforts to evaluate, compare, and select customer service products, and we know that it provides product assessment and product planning input for the product’s managers. Technology analysts and product managers are the primary audiences for our reports. We research and write to help exactly these roles. Like all of our Product Evaluation Reports about customer service products that include multiple apps—case management, knowledge management, web self-service, communities, and social customer service—it’s a big report, more than 60 pages.

Big is good. It’s their depth and detail that make them so. Our research for them always includes studying a product’s licensed admin, user, and, when accessible, developer documentation, the manuals or online help files that come with a product. We read the patents or patent applications that are a product’s technology foundation. Whenever offered, we deploy and use the products. (We took the free 30-day trial of Desk.) We watch suppliers’ demonstrations, but we rely on the actual product and its underlying technologies.

On the other hand, we’ve recently been hearing from some, especially product marketers when they’re charged to review report drafts (we never publish without the supplier’s review), that the reports are too big. Okay. Point taken. Perhaps it is time to update our Product Evaluation Framework, the report outline, to produce shorter, more actionable reports: reports with no less depth and detail but with less descriptive content and more salient analytic content. It’s also time to tighten up our content.

Product Evaluation Reports Have Two Main Parts

Our Product Evaluation Reports have had two main parts: Customer Service Best Fit and Customer Service Technologies. Customer Service Best Fit “presents information and analysis that classifies and describes customer service software products…speed(ing) evaluation and selection by presenting easy to evaluate characteristics that can quickly qualify an offering.” Customer Service Technologies examines the implementations of a product’s customer service applications and their foundation technologies as well as its integration and reporting and analysis capabilities. Here lies the reports’ depth and detail (and most of their content). Going forward, we’ll continue with this organization.

Streamlining Customer Service Best Fit

We will revamp and streamline Customer Service Best Fit, improving naming and emphasizing checklists. The section will now have this organization:

  • Applications, Channels, Devices, Languages
  • Packaging and Licensing
  • Supplier and Product
  • Best Prospects and Sample Customers
  • Competitors

Applications, Channels, Devices, Languages are lists of key product characteristics, characteristics that quickly qualify a product for deeper consideration. More specifically, applications are the sets of customer service capabilities “in the box” with the product—case management, knowledge management, and social customer service, for example. Channels are assisted-service, self-service, and social. We list apps within supported channels to show how what’s in the box may be deployed. Devices are the browsers and mobile devices the product supports for internal users and for end customers. Languages are two lists: one for the languages in which the product deploys and supports for its administration and internal users and one for the languages it supports for end customers.

Packaging and Licensing presents how the supplier offers the product, the fees that it charges for the offerings, and the consulting services available and/or necessary to help licensees deploy the offerings.

Supplier and Product presents high-level assessments of the supplier’s and the product’s viability. For the supplier, we present history, ownership, staffing, financial performance, and customer growth. For the product, we present history, current development approach, release cycle, and future plans.

Best Prospects and Sample Customers are lists of the target markets for the product—the industries, business sizes, and geographies wherein the product best fits. This section also contains the current customer base for the product, a list of typical/sample customers within those target markets and, if possible, presents screen shots of their deployments.

Competitors lists the product’s closest competitors, its best alternatives. We’ll also include a bit of analysis explaining what makes them the best alternatives and where the subject product has differentiators.

Tightening-up Customer Service Technologies

Customer Service Technologies is our key value-add and the most significant differentiator of our Product Evaluation Reports. It’s why you should read our reports, but, as we mentioned, it’s also the main reason why they’re big.

We’ve spent years developing and refining the criteria of our Evaluation Framework. The criteria are the results of continuing work with customer service products and technologies and of our complementary work with the people who are products’ prospects, licensees, suppliers, and competitors. We’re confident that we evaluate the technologies of customer service products by the most important, relevant, and actionable criteria. Our approach creates common, supplier-independent and product-independent analyses. These analyses enable the evaluation and comparison of similar customer service products and result in faster, lower-risk selection of the product that best fits a set of requirements.

However, we have noticed that the descriptive content that is the basis for our analyses has gotten a bit lengthy and repetitive (repeating information in Customer Service Best Fit). We plan to tighten up Customer Service Technologies content and analysis in these ways:

  • Tables
  • Focused Evaluation Criteria
  • Consistent Analysis
  • Reporting

Too much narrative and analysis has crept into Tables. We’ll make sure that Tables are bulleted lists with little narrative and no analysis.

Evaluation criteria have become too broad. We’ve been including detailed descriptions and analyses of related and supporting resources along with the resource that is the focus of the evaluation. For example, when we describe and analyze the details of a case model, we’ll no longer also describe and analyze the details of user and customer models. Rather, we’ll just describe the relationships between the resources.

Our analyses will have three sections. The first will summarize what’s best about a product. The second will present additional description and analysis where Table content needs further examination. The third will be “Room for Improvement,” areas where the product is limited. This approach will make the reports more actionable and more readable as well as shorter.

In reporting, we’ll stop examining instrumentation, the collection and logging of the data that serves as report input. The presence (or absence) of reports about the usage and performance of customer service resources is really what matters. So, we’ll call the criterion “Reporting” and we’ll list the predefined reports packaged with a product in a Table. We’ll discuss missing reports and issues in instrumentation in our analysis.

Going Forward

Our Product Evaluation Report about Microsoft Dynamics CRM Online Service will be the first to be written on the streamlined Framework. Expect it in the next several weeks. Its Customer Service Best Fit section really is smaller. Each of its Customer Service Technologies sections is smaller, too, more readable and more actionable as well.

Here’s the graphic of our Product Evaluation Framework, reflecting the changes that we’ve described in this post.

Figure: Our Product Evaluation Framework.

Please let us know if these changes make sense to you and please let us know if the new versions of the Product Evaluation Reports that leverage them really are more readable and more actionable.

Next IT Alme: Helping Customers Do All Their Work

On September 2, 2004, we published my article, “May I Help You?” It was a true story about my experience as a boy working in my dad’s paint and wallpaper store. The experience taught me all about customer service.

The critical lesson that I learned from my dad and from working in the store was customers want and need your help for every activity that they perform in doing business with you from their first contact with you through their retirement.

That help was answering customers’ questions and solving customers’ problems. That’s the usual way that we think of customer service: helping with exceptions, the times that customers cannot do their work. But, that help was also performing “normal” activities on customers’ behalves—providing the right rollers, brushes, and solvents for the type of paint they wanted to use, for example—or collaborating with customers to perform normal activities together—selecting a paint color for trim or a wallpaper pattern.

At Kramer’s Paint, my dad or I delivered all of that help—normal work and exceptions work. In your business, you deliver the help to perform customers’ normal planning, shopping, buying, installing/using, and (account) management activities through the software of self-service web sites and/or mobile apps or through the live interactions of your call center agents, in-store associates, or field reps. And, you deliver the help for customers’ exception activities through customer self-service apps on the web, social networks, or mobile devices or through the live interactions of customer service staff in call centers, stores, and in the field.

Virtual Assistants Cross Over to Perform Normal Activities

Recently, in our customer service research, we’ve begun to see virtual assistant software apps cross over from helping customers with exception activities to performing normal activities on customers’ behalves, activities like taking orders, completing applications, and managing accounts. We wrote about this crossover a bit in our last post about IBM Watson Engagement Advisor’s Dialog facility. And, we provided links to crossover examples of Creative Virtual V-Person at Chase Bank and Nuance Nina Mobile at Domino’s.

Alme, the virtual assistant software app from Spokane, WA-based supplier Next IT, can cross over to help customers perform normal activities, too. In fact, Alme has always performed normal activities for customers. One of our first reports about virtual assistants, a report that we published on March 13, 2008, discussed Jenn, Alaska Airlines’ Alme-based virtual assistant. We asked Jenn to find a flight for us through this request, “BOS to Seattle departing December 24 returning January 1.” Jenn did a lot of work to perform this normal activity. Her response was fast, accurate, and complete. We asked Jenn again in our preparation for this post. “She” prepared the “Available Flights” page for us. Once again, her answer was fast, accurate, and complete. All that’s left to do is select the flights. The illustration below shows our request and Jenn’s response.

Figure: Our flight request to Jenn and her “Available Flights” response.

Next IT Alme Provides Excellent Support for Normal Activities

Alme provides these excellent facilities for performing normal activities, facilities that are one of its key strengths and competitive differentiators:

  • Support for complex, multi-step interactions
  • Rules-based personalization
  • Integration with external applications

Let’s take a closer look at them.

Support for Complex, Multi-Step Interactions

For normal activities, complex, multi-step interactions help virtual assistants collect the information needed to complete an insurance or loan application, order a meal, or configure a mobile device and the telecommunications services to support it, for example. Alme supports complex, multi-step interactions with Directives and Goals.

Directives

Directives are hierarchical dialogs of prompt and response interactions between Alme virtual assistants and customers. They’re stored and managed in Alme’s knowledgebase, and Alme provides tools for building and maintaining them. Directives’ dialogs begin when Alme’s processing of a customer’s request matches the request to one of the nodes in a Directive. The node presents its prompt to the customer as a text box into which the customer enters a text response or as a list of links from which the customer makes a selection. Alme then processes the text response or the link selection. This processing moves the dialog:

  • To another node in the Directive
  • Out of the Directive
  • Into a different Directive.

That customers’ requests can enter, reenter, or leave Directives at any of their nodes is what makes Directives powerful, flexible, and very useful. Alme’s analysis and matching engine processes every customer request and response to Directive prompts the same way. When the request (re)triggers a Directive, Alme automatically (re)establishes the Directive’s context, including all previous text responses and link selections. For example, financial services companies might use Directives to implement retirement planning for their customers. The customer might leave the Directive to gather information from joint accounts at the bank with the customer’s spouse before returning to the Directive to continue the planning, opening, and funding of an Individual Retirement Account (IRA).
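Based on the description above, a Directive can be sketched as a tree of prompt/response nodes, with each selection moving the dialog to another node (or out of the Directive). The class and the IRA-planning nodes are our own illustrative assumptions, not Alme's actual constructs:

```python
# Minimal sketch of a Directive as a tree of prompt/response nodes.
# Names are illustrative, not Alme's actual API.

class DirectiveNode:
    def __init__(self, prompt, choices=None):
        self.prompt = prompt
        # choices maps a customer's link selection (or matched text)
        # to the next node; an unmatched selection returns None,
        # i.e., the dialog leaves the Directive.
        self.choices = choices or {}

    def respond(self, selection):
        """Process the customer's response and return the next node."""
        return self.choices.get(selection)

# Hypothetical IRA-planning Directive from the example above.
fund = DirectiveNode("How would you like to fund the IRA?")
open_ira = DirectiveNode("Open a traditional or Roth IRA?",
                         {"traditional": fund, "roth": fund})
start = DirectiveNode("Would you like to plan for retirement?",
                      {"yes": open_ira})

node = start.respond("yes")   # moves to the open_ira node
node = node.respond("roth")   # moves to the funding node
```

The power Alme adds on top of a structure like this is re-entry: a customer can leave mid-dialog and later re-trigger the Directive with its full context restored.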

Goals

Goals let virtual assistants collect a list of information from customers through prompt and response interactions to help perform and personalize their activities. Virtual assistants store the elements of the list that the customer provides within the virtual assistant’s session data for use anytime within a customer/virtual assistant session. Alme can also use its integration facilities to store elements of the list persistently in external apps.

Goals can respond to customers dynamically, based on the information the Goal has collected. For example, if the customer provides all of the Goal’s information in one interaction, then the Goal is complete, or fulfilled, and the Alme virtual assistant can perform the activity that is driven by the information. However, if the customer provides, say, two of four required information items, then the Goal can change its responses and request the missing information, leading the customer through a conversation. Goals are created by authors or analysts who specify a list of variables to store the information to be collected and the actions to be taken when customers do not provide all the information in the list. In addition, Goals can be nested, improving their power and flexibility as well as promoting their reuse.

Healthcare providers (Healthcare is one of Next IT’s target markets.) might use Goals to collect a list of information from patients prior to a first appointment. Retailers might use them to collect a set of preferences for a personal e-shopper virtual assistant.
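A Goal, as described above, can be sketched as a required-variable list plus a check for what is still missing. All names here are our own illustrations, not Alme's actual constructs:

```python
# Sketch of a Goal: a list of required variables, the session data
# collected so far, and a check for what is still missing.

class Goal:
    def __init__(self, required):
        self.required = required   # variable names to collect
        self.collected = {}        # session data for this Goal

    def provide(self, **values):
        """Record information items the customer has provided."""
        self.collected.update(values)

    def missing(self):
        """Return the required items not yet provided, in order."""
        return [v for v in self.required if v not in self.collected]

    def fulfilled(self):
        return not self.missing()

# A patient-intake Goal like the healthcare example above.
intake = Goal(["name", "date_of_birth", "insurer", "reason_for_visit"])
intake.provide(name="Pat Doe", insurer="Acme Health")
intake.missing()     # ["date_of_birth", "reason_for_visit"]
intake.fulfilled()   # False until all four items are collected
```

The dynamic behavior the text describes corresponds to prompting for `missing()` items until `fulfilled()` is true, then performing the activity.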

Rules-Based Personalization

Personalization is essential for any application supporting customers’ normal activities. Why? Because personalization is the use of customer information—profile attributes, demographics, preferences, shopping histories, order histories, service contracts, and account data—to tailor a customer experience for individual customers. Performing activities on customers’ behalves requires some level of personalization.

For example, virtual assistants use a customer’s login credentials to access external apps that manage account or order data and, then, use that order data to help customers process a refund or a return. Or, to complete an auto insurance application, virtual assistants need profile data and demographic data to price a policy.

Alme’s rules-based personalization facilities are Variables, Response Conditions, and AppCalls. They are implemented within the knowledgebase items that contain the responses to customers’ requests.

  • Variables provide personalization and context. They contain profile data, external application data, and session data, for example.
  • Response Conditions are expressions (rules) on Variables. Response Conditions select responses and/or set data values of their Variables.
  • AppCalls (Application Calls) pass parameters to and execute external applications. They use Alme’s integration facilities to access external apps through JavaScript and Web Services APIs. For example, Jenn, Alaska Airlines’ virtual assistant, uses AppCalls to process information extracted from the customer’s question—departure city, arrival city, departure date, and return date—and normalizes and formats the information for correct handling by the airline’s booking engine. This AppCall checks city pairs to ensure the flight is valid and formats and normalizes dates so that the booking engine can display appropriate choices. AppCalls also integrate Alme with backend systems. Ann, Aetna’s virtual assistant, uses AppCalls to collect more than 80 profile variables from Aetna’s backend systems to facilitate performing tasks and to personalize answers for Aetna’s customers after they log in and launch Ann. (See the screen shot of Ann, below.)
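The interplay of Variables and Response Conditions can be sketched as simple predicate rules over a dictionary of session and profile data. This is an illustrative sketch; Alme's real rule syntax and Variable storage are not shown:

```python
# Sketch of rules-based response selection: Variables hold session and
# profile data; Response Conditions (predicate rules) pick the response.

def select_response(variables, rules):
    """Return the first response whose condition matches the Variables."""
    for condition, response in rules:
        if condition(variables):
            return response
    return "How can I help you today?"   # default response

rules = [
    (lambda v: not v.get("logged_in"),
     "Please log in so I can look at your account."),
    (lambda v: v.get("open_orders", 0) > 0,
     "I see you have an open order. Do you want to check its status?"),
]

variables = {"logged_in": True, "open_orders": 2}
select_response(variables, rules)
# -> "I see you have an open order. Do you want to check its status?"
```

In a full implementation, an AppCall would populate Variables such as `open_orders` from an external system before the rules run.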

Integration with External Applications

The resources that virtual assistant applications “own” are typically a knowledgebase (of answers and solutions to expected customers’ questions and problems) and accounts on Facebook and Twitter to enable members of these social networks to ask questions and report problems. So, to perform normal activities, virtual assistants need to integrate with the external apps that own the data and services that support those activities.

Alme integrates with external customer service applications through JavaScript (front-end) and Web Services (back-end) interfaces. New in Alme 2.2, the current version, is a re-architected Alme platform that is more modular and more extensible. The new platform publishes JavaScript and Web Services interfaces to all Alme functionality and supports JavaScript and Web Services connections to external resources.

AppCalls use Alme’s integration facilities. To process an AppCall successfully, developers must have established a connection between Alme and an external application. Jenn integrates Alme with Alaska Airlines’ booking engine. Ann integrates Alme with Aetna’s backend systems. Here’s a screen shot.

Figure: Ann, Aetna’s virtual assistant.

Virtual Assistants Are Doing More of the Work of Live Agents

Next IT Alme was one of the first virtual assistant software products with the capabilities to perform normal activities. Its facilities are powerful and flexible. While integration with external applications will always require programming (and Next IT has simplified that programming), Alme’s facilities for supporting normal activities are built-in and designed for business analysts. They’re reasonably easy to learn, easy to use, and easy to manage.

By performing normal activities, virtual assistants are doing more of the work that live agents have been doing—quickly, accurately, consistently, and at a lower cost than live agents. That frees live agents to handle the stickiest, most complex customer requests, both requests to perform normal activities and requests to answer questions and resolve problems. It’s also a driver for your organization to consider adding virtual assistants to your customer service and customer experience portfolio.

Voices of Customers

With this week’s report, the 4Q2013 Customer Service Update, we complete our tenth year of quarterly updates on the leading suppliers and products in customer service. These updates have focused on the factors that are important in the evaluation, comparison, and selection of customer service products.

  • Customer Growth
  • Financial Performance
  • Product Activity
  • Company Activity

Drawing from the framework of our reports, for Company Activity we cover company-related announcements, press releases, and occurrences that are important to our analysis of quarterly performance. In 4Q2013, three of our suppliers, Creative Virtual, KANA, and Nuance, published the results of surveys that they had conducted or sponsored over the previous several months. All of the surveys were about customer service, and the answers to survey questions demonstrated customers’ approaches, behaviors, preferences, and issues in their attempts to get service from the companies with which they’ve chosen to do business. The responses to these surveys are the Voices of the Customers for and about customer service. This is wonderful stuff.

Now, to be sure, suppliers conduct surveys for market research and marketing purposes. Suppliers’ objectives for surveys are using the Voice of the Customer to prove/ disprove, validate, demonstrate, or even promote their products, services, or programs. Certainly, all of the surveys our suppliers published achieved those objectives. For this post, though, let’s focus on the broader value of the surveys, the Voice of the Customer for Customer Service.

Surveys

The objectives in many of the surveys represent the activities that customers perform, the steps that customers follow to get customer service from the companies with which they choose to do business. By getting customer service, we mean getting answers to their questions and (re)solutions to their problems. Ordering our examination and analysis of the surveys in customers’ typical sequence of these steps organizes them into a Customer Scenario. Remember that a Customer Scenario is the sequence of activities that customers follow to accomplish an objective that they want or need to perform. For a customer service Customer Scenario, customers typically:

  • Access Customer Service. Customers log in to their accounts or to the customer service section of their companies’ web sites, or call their companies’ contact centers and get authenticated to speak with customer service agents.
  • Find Answers and (Re)solutions. Customers use self-service, social-service, virtual-assisted-service, and/or assisted-service facilities to try to help themselves, to seek the help of their peers, or to seek the help of customer service agents for answers and (re)solutions.
  • Complain. If customers cannot get answers or (re)solutions using these facilities, they complain to their companies.

Here, in Table 1, below, are the surveys that examine how customers perform these activities and how companies support those activities. Note that these surveys are a subset of those surveys that were published by our suppliers. Not all of their surveys mapped directly to customer activities. Note that our analyses of survey results are based on the content of the press releases of the surveys. This content is a bit removed from the actual survey data.

Sponsor | Survey Objective | Activity | Respondents
Nuance | Privacy and security of telephone credentials | Access | Smartphone users
Nuance | Telephone authentication issues and preferences | Access | US consumers
KANA | Email response times for customer service | Find answers and (re)solutions | N/A
KANA | Twitter response times for customer service | Find answers and (re)solutions | N/A
Nuance | Resolving problems using web self-service | Find answers and (re)solutions | Web self-service users, 18–45 years old
Nuance | Issues with web self-service | Find answers and (re)solutions | Windstream Communications customers
KANA | Usage of email vs. telephone for complaints | Complain | N/A
KANA | Customer communication channels for complaints | Complain | UK consumers
KANA | Customer complaints | Complain | US consumers, 18 years old and older
Table 1. We list and describe customer service surveys published by KANA and Nuance during 4Q2013 in this Table.

Let’s listen closely to the Voices of the Customers as they perform the activities of the customer service Customer Scenario. For each of the surveys in the Table, we’ll present the published survey results, analyze them, and suggest what businesses might do to help customers perform the activities faster, more effectively, and more efficiently.

Access

If questions and problems are related to their accounts, before customers can ask questions or present problems, they have to be authenticated by the customer service system that handles and manages questions and problems. Authentication requires login credentials: usernames and passwords. In these times of rampant identity theft, the security of those credentials has become critically important.

Nuance’s surveys on privacy and security of telephone credentials and on telephone authentication shed some light on customers’ issues with authentication.

  • 83 percent of respondents are concerned or very concerned about the misuse of their personal information.
  • 85 percent of respondents are dissatisfied with current telephone authentication methods.
  • 49 percent of respondents stated that current telephone authentication processes are too time consuming.
  • 67 percent of respondents have more than eleven usernames and passwords.
  • 80 percent of respondents use the same login credentials across all of their accounts.
  • 67 percent of respondents reset their login credentials between one and five times per month.

Yikes! Consumers spend so much time and effort managing and then using their credentials. We've all experienced account registration pages that grade our new or reset passwords from "weak" to "strong" and reject the weakest ones. While strong passwords improve the security of our personal data, they're hard to remember and they increase the time we spend managing them.

With voice biometrics, Nuance offers technology to address many of these issues. On voice devices, after a bit of training, customers simply say, "My voice is my password," to authenticate account access based on voiceprints, which are unique to each individual.
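Nuance's actual voiceprint matching is proprietary, but the general idea behind this style of authentication can be sketched: extract a feature vector from the caller's speech and compare it to the enrolled voiceprint with a similarity measure. Everything below (the feature vectors, the cosine-similarity measure, and the acceptance threshold) is illustrative, not Nuance's implementation:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def authenticate(enrolled_voiceprint, live_sample, threshold=0.85):
    """Accept the caller if the live sample is close enough to the
    enrolled voiceprint. The threshold is invented for this sketch."""
    return cosine_similarity(enrolled_voiceprint, live_sample) >= threshold

# Toy vectors standing in for extracted voice features
enrolled = [0.9, 0.1, 0.4]
same_speaker = [0.88, 0.12, 0.41]
impostor = [0.1, 0.9, 0.2]

print(authenticate(enrolled, same_speaker))  # True
print(authenticate(enrolled, impostor))      # False
```

Real systems use far richer acoustic features and trained models, but the enroll-then-compare flow is the same.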

Find Answers and (Re)solutions

KANA’s surveys on email response times for customer service and Twitter response times for customer service examine response times for “inquiries.” When customers make inquiries, they’re looking for answers or (re)solutions. In the surveys, KANA found:

  • According to Call Centre Association members, response times to email inquiries were greater than eight hours for 59 percent of respondents and greater than 24 hours for 27 percent of respondents.
  • According to a survey by Simply Measured, a social analytics company, the average response time to Twitter inquiries was 5.1 hours and was less than one hour for 10 percent of respondents.

While it’s dangerous to make cross-survey analyses, it seems reasonable to conclude that customer service is better on Twitter than on email. That’s not surprising. Companies have become very sensitive to the public shaming by dissatisfied customers on Twitter. They’ll allocate extra resources to monitoring social channels to prevent the shame. Customers win.

However, remember that these are independent surveys. The companies that deliver excellent customer service on Twitter might also deliver excellent customer service on email and the companies that deliver not so excellent customer service on email might also deliver not so excellent customer service on Twitter. The surveys were not designed to gather this data. That’s the danger of cross-survey analysis.

If your customers make inquiries on both email and social channels, then you should deliver excellent customer service on both. Email management systems and social listening, analysis, and interaction systems, both widely used and well proven customer service applications, can help. These are systems that should be in every business’s customer service application portfolio.

Email management systems help businesses manage inquiries that customers make via email. These systems have been around for well more than ten years, helping businesses respond to customers' email inquiries. Businesses configure them to respond automatically to common and simple questions and problems and to assign stickier ones to customer service staff. Business policies are the critical factor in determining response times to customers' email inquiries.
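The auto-respond-or-assign configuration described above can be sketched as a simple rules table. The keywords, reply templates, and queue name here are invented for the example; production email management systems use much richer classification:

```python
# Illustrative routing rules: common, simple inquiries get an automatic
# reply; anything else is assigned to customer service staff.
AUTO_REPLY_RULES = {
    "password reset": "To reset your password, visit your account page.",
    "store hours": "Our stores are open 9am-9pm, Monday through Saturday.",
}

def route_email(subject, body):
    """Return ('auto_reply', text) for a recognized simple inquiry,
    or ('assign_agent', queue) for a stickier one."""
    text = (subject + " " + body).lower()
    for keyword, reply in AUTO_REPLY_RULES.items():
        if keyword in text:
            return ("auto_reply", reply)
    return ("assign_agent", "tier1_queue")

print(route_email("Help", "How do I do a password reset?"))
print(route_email("Billing dispute", "I was overcharged twice"))
```

Note that the response-time policy lives outside this logic: the business decides how quickly the `assign_agent` queue must be worked.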

Social listening, analysis, and interaction systems have been around for about five years. They help businesses filter the noise of the social web to identify Tweets and posts that contain questions and problems and the customers who Tweet and post them. These systems then include facilities to interact with Tweeters and posters or to send the Tweets and posts to contact center apps for that interaction.
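The noise-filtering step these systems perform can be illustrated with a trivial keyword filter. The trigger phrases below are invented for the example; real social listening systems use trained classifiers and much broader signals:

```python
# Illustrative filter of the kind a social listening system applies to
# separate service-relevant posts from general chatter.
TRIGGERS = ("how do i", "not working", "can't", "broken", "help")

def needs_customer_service(post):
    """Flag a post that looks like a question or a problem report."""
    text = post.lower()
    return any(t in text for t in TRIGGERS)

stream = [
    "Loving the new phone!",
    "My router is not working again. Help!",
    "How do I change my billing address?",
]
flagged = [p for p in stream if needs_customer_service(p)]
print(flagged)  # the second and third posts
```

Flagged posts would then flow to an interaction facility or a contact center app, as described above.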

Find Answers and (Re)solutions Using Web Self-Service

Nuance’s surveys about web self-service really show the struggles of customers trying to help themselves to answers and (re)solutions.

In the survey about consumers’ experiences with web self-service, the key findings were:

  • 58 percent of consumers do not resolve their issues.
  • 71 percent of consumers who do not resolve their issues spend more than 30 minutes trying.
  • 63 percent of consumers who do resolve their issues spend more than 10 minutes trying.

In Nuance’s survey of Windstream Communications’ customers about issues with web self-service, the key findings were:

  • 50 percent of customers who did not resolve their issues escalated to a live agent.
  • 71 percent of customers prefer a virtual assistant over static web self-service facilities.

The most surprising and telling finding of these surveys was the time and effort that customers expend trying to find answers and (re)solutions using web self-service facilities. Thirty minutes spent without finding an answer or a solution is a very long time. Customers really want to help themselves.

By the way, Windstream’s customers’ preference for a virtual assistant is not a surprise. Windstream Communications, a Little Rock, AR networking, cloud-computing, and managed services provider, has deployed Nina Web, Nuance’s virtual agent offering for the web. Wendy, Windstream’s virtual agent, uses Nina Web’s technology to help answer customers’ questions and solve their problems. The finding is a proof point for the value of virtual agents in delivering customer service. Companies in financial services, healthcare, and travel as well as in telecommunications have improved their customer service experiences with virtual agents. We cover the leading virtual agent suppliers (Creative Virtual, IntelliResponse, Next IT, and Nuance) in depth. Check out our Product Evaluation Reports to find the virtual agent technology best for your business.

Complain

Customers complain when they can’t get answers to their questions and (re)solutions to their problems. KANA’s surveys about complaints teach us so much about customers’ behavior, preferences, and experiences.

  • In KANA’s survey on usage of email or telephone channels for complaints, 42 percent of survey respondents most frequently use email for complaints and 36 percent use the telephone for complaints.
  • In KANA’s survey of UK consumers on communications channels for complaints, 25 percent of UK adults used multiple channels to make complaints. Fifteen percent of their complaints were made face-to-face.

The surprising finding in these surveys is the high percentage of UK consumers willing to take the time and make the effort to complain face-to-face. These customers must have had very significant issues, and they were very serious about getting those issues resolved.

The key results in KANA’s survey about customer complaints by US consumers were:

  • On average, US consumers spend 384 minutes (6.4 hours) per year lodging complaints.
  • In the most recent three years, 71 percent of US consumers have made a complaint. On average, they make complaints six times per year and spend one hour and four minutes resolving each complaint.
  • Thirty-nine percent of US consumers use the telephone channel to register their complaints. Thirty-three percent use email. Seven percent use social media.
  • Millennials complained most frequently (80 percent of 25-to-34-year-old respondents). Millennials are also most likely to complain on multiple channels (39 percent of them).
  • Survey respondents had to restate their complaints (retell their stories) 69 percent of the time as the responsibility to handle their complaints was reassigned. On average, consumers retold their stories three times before their issues were resolved, and 27 percent of consumers used multiple channels for the retelling.
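The per-complaint and per-year figures in this survey are internally consistent: six complaints per year at one hour and four minutes each works out to exactly the 384 minutes reported. A quick cross-check:

```python
# Cross-check of the KANA survey arithmetic quoted above.
complaints_per_year = 6
minutes_per_complaint = 64   # one hour and four minutes
total_minutes = complaints_per_year * minutes_per_complaint
print(total_minutes, total_minutes / 60)   # 384 minutes, 6.4 hours
```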

The surprising findings in this survey are the time, volume, and frequency of complaints. More than six hours a year complaining? Six complaints every year? Yikes!

No surprise about the low usage of social channels to register complaints. Customers may vent on the social web, but they bring their complaints directly to their sources, the companies that can resolve them.

Lastly and most significantly, it’s just so depressing to learn that businesses are still making customers retell their stories as their complaints cross channels and/or get reassigned or escalated. We’ve been hearing this issue from customers for more than 20 years. Customers hate it.

Come on, businesses. All the apps in your customer service portfolios package the facilities you need to eliminate this issue—transcripts of customers’ activities in self-service apps on the web and on mobile devices, threads of social posts, transcripts of customers’ conversations with virtual agents, and, most significantly, case notes. Use these facilities. You’ll shorten the time to solve problems and resolve customers’ complaints. Your customers will spend less time trying to get answers and (re)solutions (and more time using your products and services or buying new ones).

4Q2013 Was a Good Quarter for Customer Service

By the way, Customer Service had a good quarter in 4Q2013. Customer growth was up. Financial performance was up as a result. Product activity was very heavy. Nine of our ten suppliers made product announcements. Company activity was light. Five suppliers did not make any company announcements. Most significantly, KANA was acquired by Verint. And of course, three suppliers published customer service surveys.

2Q2013 Customer Service Stars

This week, continuing our tenth year of quarterly updates on the suppliers and products in customer service, we published our 2Q2013 Customer Service Update Report. These reports examine customer service suppliers and their products along the dimensions of customer growth, financial performance, product activity, and company activity. We currently cover eleven leading customer service suppliers. They lead in overall market influence and share, in market segment influence and share, and/or in product technology and innovation.

For 2Q2013, overall customer service performance was mixed but three of our suppliers—Clarabridge, IntelliResponse, and Salesforce.com—earned Customer Service Stars for the quarter. Very briefly, Clarabridge is a privately owned firm based in Reston, VA that was founded in 2005. Clarabridge offers a suite of VoC applications. IntelliResponse is a privately owned firm based in Toronto, ON that was founded in 2000. IntelliResponse offers a suite of virtual agent products. Salesforce.com is a public (NYSE: CRM) firm based in San Francisco, CA that was founded in 1999. The company has a broad product line that includes Salesforce Service Cloud, which provides case management, knowledge management, contact center, and web self-service applications.

So, what’s a Customer Service Star? Well, since 2009, we’ve been awarding Customer Service Stars for excellent quarterly performance balanced across those dimensions of customer growth, financial performance, products, and company activity. (Since 2010, we’ve also been awarding Customer Service Stars for the year—same criteria across four quarters.) It’s not easy to earn a Customer Service Star and we take awarding them pretty seriously. Here are the award criteria:

  • Customer growth. We examine significant quarter-over-quarter acquisition of new customers and additional business from existing customers.
  • Financial performance. We examine quarterly revenue improvement as reported for public companies or as we estimate for private companies based on customer growth, customer base, and pricing.
  • Products. We examine new products and new versions in a quarter.
  • Company activity. We examine M&A, partnerships, branding, patents, organization, and facilities in a quarter.

Typically, we award one Customer Service Star for a quarter. Frequently, we award none. Three in a quarter is a big deal, especially when many of our suppliers did not have a good quarter. Here’s how Clarabridge, IntelliResponse, and Salesforce.com earned their Customer Service Stars for 2Q2013:

Customer growth and financial performance

  • On a base of approximately 250 customer accounts, Clarabridge acquired 10 to 15 new customers and did additional business with 55 to 65 existing customers, driving excellent financial performance.
  • On a base of approximately 160 customer accounts, IntelliResponse acquired eight new customers and did additional business with six existing customers, driving very good financial performance.
  • On a base of approximately 165,000 customer accounts, growth in subscription and support revenue indicated that Salesforce.com acquired approximately 21,000 new customer accounts. We estimate that something around 20 percent of them licensed customer service products. Total revenue increased by more than seven percent to $957 million.
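The Salesforce.com figure supports a quick back-of-the-envelope check: growth of "more than seven percent" to $957 million implies prior-quarter revenue of at most roughly $894 million. This is a sketch using exactly seven percent; the precise growth rate was not published here:

```python
# Back-of-the-envelope check of the Salesforce.com revenue quoted above.
# Growth was "more than seven percent," so the implied prior-quarter
# revenue is an upper bound, not an exact number.
current_revenue_m = 957               # reported quarterly revenue, $ millions
growth = 0.07                         # lower bound on the growth rate
prior_revenue_m = current_revenue_m / (1 + growth)
print(round(prior_revenue_m, 1))      # roughly 894.4, i.e. about $894M
```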

Products

  • Clarabridge made one product announcement in 2Q2013: Clarabridge 6.0, a major new version of its VoC application suite.
  • IntelliResponse made two product announcements in 2Q2013: OFFERS, a marketing application that delivers targeted offers within a virtual agent’s answers, and VOICES, a Voice of the Customer analytic application. Both apps integrate with IntelliResponse Virtual Agent, “IR’s” virtual agent offering.
  • Salesforce.com made four product announcements: Salesforce Mobile Platform Services, mobile application development tools and programs for building and deploying Android, iOS, HTML5, and hybrid applications; Social.com, a new social advertising application; Salesforce Communities, a community application; and a suite of G2C (Government to Citizen) solutions for federal, state, and local agencies, all built on Salesforce.com general-purpose apps.

Company activity

  • Clarabridge made three company announcements: a new corporate logo, web site, and product brand; a new General Counsel; and a partnership with Brandwatch for collection and analysis of social data.
  • IntelliResponse was awarded a U.S. patent for its answer matching technology.
  • Salesforce.com made three company announcements: an agreement with NTT to build a cloud-computing data center in the UK, the acquisition of ExactTarget, a marketing automation/campaign management supplier, and the appointment of a new President and Vice Chairman.

Props to all three for an excellent quarter!

We know all three of the companies and their current customer service product offerings very well. During 2013, we published a product evaluation of Clarabridge Analyze, Clarabridge Collaborate, and Clarabridge Engage against our Framework for Customer Social-Service on March 28, 2013. We published a product evaluation of IntelliResponse Virtual Agent (VA) against our Framework for Customer Virtual Assisted-Service on May 9, 2013. We published product evaluations of Salesforce Service Cloud against our Framework for Customer Cross-Channel Customer Service on January 24, 2013 and our evaluation of Salesforce Marketing Cloud Radian6 against our Framework for Customer Social-Service on August 1, 2013.

The three suppliers also made it easy for us to do our research for these product evaluations. All three gave us trial versions of their products as well as access to product documentation. For Clarabridge and IntelliResponse, we also read their appropriate patents and patent applications.

We usually publish our Quarterly Customer Service Update reports early in the third and last month of calendar quarters. IntelliResponse and Salesforce.com run on fiscal years that end on January 31. Their fiscal quarters end a month later than calendar quarters.

In a few weeks, we’ll begin research on our 3Q2013 Customer Service Update Report. Third quarters are summer quarters, quarters when the software business (and many other businesses) typically, shall we say, relaxes. But, we hope that a Customer Service Star or two will shine.