Zendesk, Customer Service Software That’s Easy to Evaluate

Zendesk Product Evaluation

Zendesk is the customer service offering from Zendesk, Inc., a publicly held, San Francisco, CA-based software supplier with 1,000 employees that was founded in 2004. The product provides cloud-based, cross-channel case management, knowledge management, communities and collaboration, and social customer service capabilities across assisted-service, self-service, and social customer service channels.

We evaluated Zendesk against our Evaluation Framework for Customer Service and published our Product Evaluation Report on October 22. Zendesk earned a very good Report Card: Exceeds Requirements grades in Product History and Strategy, Case Management, and Customer Service Integration, and Meets Requirements grades for all other criteria but one, Social Customer Service. Its Needs Improvement grade in Social Customer Service is less an issue with packaged capabilities than a reflection of the requirement for a specialized external app designed and positioned for wide and deep monitoring of social networks.

Evaluation Framework

Our Evaluation Framework considers an offering’s functionality and implementation, what a product does and how it does it. It also considers the supplier and the supplier’s product marketing (positioning, target markets, packaging and pricing, competition) and product management (release history and cycle, development approach, strategy and plans) for the offering.

We rely on the supplier for product marketing and product management information. First we gather that info from the supplier’s website and press releases and, if the supplier is publicly held, from the supplier’s SEC filings. We speak directly with the supplier for anything else in these areas.

For functionality and implementation, the supplier typically gives us (frequently under NDA) access to the product’s user and developer documentation, the manuals and help files that licensees get. In this era of cloud computing, we’ve been getting access to the product itself more and more frequently through online trials. We also read the supplier’s patents and patent applications to learn about the technology foundation of functionality and implementation.

In addition, we entertain the supplier’s presentations and demonstrations. They’re useful for getting a feel for the style of the product and the supplier and for understanding future capabilities. However, to really understand the product, there’s no substitute for actual usage (where we drive) and/or documentation.

Our research process includes insisting that the supplier review and provide feedback on a draft of the Product Evaluation Report. This review ensures that we respect any NDA, improves the accuracy and usefulness of the information in the report, and prevents embarrassment for both the supplier and us.

Ease of Evaluation, a New Evaluation Criterion

Our frameworks have never had an Ease of Evaluation criterion. We’ve always figured that we’d do the work to make your evaluation and selection of products easier, faster, and less costly. Our evaluation of Zendesk has us rethinking that. We’ve learned that our Product Evaluation Reports can speed and shorten your evaluation and selection process, but that your process doesn’t end with our reports. You do additional evaluation, modifying and extending our criteria or adding criteria to represent requirements specific to your organization, your business, and/or your application for the product. Understanding Ease of Evaluation can further speed and shorten your evaluation and selection process.

So, beginning with our next Product Evaluation Report, you’ll find an Ease of Evaluation criterion in our framework.

Zendesk Was Very Easy to Evaluate

By the way, Zendesk would earn an Exceeds Requirements grade for Ease of Evaluation. We did a 30-day trial of the product. We signed up for the trial online, with no waiting. During the trial, we submitted cases to Zendesk Support and we used the Zendesk community forums. In addition, Zendesk.com provided a wealth of detailed information about the product, including technical specifications and a published RESTful API.

Scroll down to the bottom of Zendesk.com’s home page to see a list of UNDER THE HOOD links.

Looking at the UNDER THE HOOD links in a bit more detail:

  • Apps and integrations is a link to a marketplace for third-party apps. Currently there are more than 300 of them.
  • Developer API is a link to the documentation of Zendesk’s RESTful and JavaScript APIs. It lists and comprehensively describes more than 100 services (see the sketch after this list).
  • Mobile SDK is a link to documentation for Android and iOS SDKs and for the Web Widget API. (The Web Widget embeds Zendesk functionality such as ticketing and knowledgebase search in a website.)
  • Security is a link to descriptions of security-related features and to lists of Zendesk’s security compliance certifications and memberships.
  • Tech Specs is a link to a comprehensive collection of documents that describe Zendesk’s functionality and implementation.
  • What’s new is a link to high-level descriptions of recently added capabilities.
  • Uptime is a link to information and charts about the availability of Zendesk, Inc.’s cloud computing infrastructure.
  • Legal is a link to a description of the Terms of Service of the Zendesk offering.
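
To give a feel for what the Developer API documentation covers, here is a minimal sketch of creating a support ticket through Zendesk’s REST API. The subdomain, agent email, API token, and ticket fields below are placeholders, and while the endpoint and request shape reflect Zendesk’s documented Tickets API, you should check the current Developer API documentation before relying on the details.

```python
# Illustrative sketch: creating a support ticket through Zendesk's REST API.
# The subdomain, email, and API token below are placeholders, not real values.
import requests

ZENDESK_SUBDOMAIN = "yourcompany"          # hypothetical account subdomain
AGENT_EMAIL = "agent@yourcompany.com"      # hypothetical agent login
API_TOKEN = "your-api-token"               # hypothetical API token

url = f"https://{ZENDESK_SUBDOMAIN}.zendesk.com/api/v2/tickets.json"
payload = {
    "ticket": {
        "subject": "Password reset fails on mobile app",
        "comment": {"body": "Customer reports an error after tapping 'Reset'."},
        "priority": "normal",
    }
}

# Zendesk's token authentication uses "email/token" as the username.
response = requests.post(url, json=payload, auth=(f"{AGENT_EMAIL}/token", API_TOKEN))
response.raise_for_status()
print("Created ticket:", response.json()["ticket"]["id"])
```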

We spent considerable time in Tech Specs and Developer API. We found the content to be comprehensive, well organized, easy to access, and well written. The combination of the product trial and UNDER THE HOOD made Zendesk easy to evaluate. And we did not have to sign an NDA for access to any of this information.

Many suppliers make their offerings as easy to evaluate as Zendesk, Inc. made Zendesk for us. On the other hand, many suppliers are not quite so willing to share detailed information about their products and, especially, their underlying technologies. Products and technologies are, after all, software suppliers’ key IP. They have every right to protect this information, and they don’t feel that patent protection is enough. Their offerings are much harder to evaluate at the level of our Product Evaluation Reports.

Consider Products That Are Easy to Evaluate

We feel, as you should, that in-depth evaluations are essential to the selection of customer service products. You’ll be spending very significant time and money to deploy and maintain these products. You should never rely on supplier presentations and demonstrations to justify those expenditures. Certainly rely on our reports and use them as the basis for your further, deeper evaluation, including our new Ease of Evaluation criterion. Put the suppliers that facilitate these evaluations on your short lists.

Voices of Customers

With this week’s report, the 4Q2013 Customer Service Update, we complete our tenth year of quarterly updates on the leading suppliers and products in customer service. These updates have focused on the factors that are important in the evaluation, comparison, and selection of customer service products:

  • Customer Growth
  • Financial Performance
  • Product Activity
  • Company Activity

Taking from the framework of our reports, for Company Activity we cover company-related announcements, press releases, and occurrences that are important to our analysis of quarterly performance. In 4Q2013, three of our suppliers, Creative Virtual, KANA, and Nuance, published the results of surveys that they had conducted or sponsored over the previous several months. All of the surveys were about customer service, and the answers to the survey questions demonstrated customers’ approach, behavior, preferences, and issues in their attempts to get service from the companies with which they’ve chosen to do business. The responses to these surveys are the Voices of the Customers for and about customer service. This is wonderful stuff.

Now, to be sure, suppliers conduct surveys for market research and marketing purposes. Suppliers’ objectives for surveys are to use the Voice of the Customer to prove/disprove, validate, demonstrate, or even promote their products, services, or programs. Certainly, all of the surveys our suppliers published achieved those objectives. For this post, though, let’s focus on the broader value of the surveys, the Voice of the Customer for Customer Service.

Surveys

The objectives in many of the surveys represent the activities that customers perform, the steps that customers follow to get customer service from the companies with which they choose to do business. By getting customer service, we mean getting answers to their questions and (re)solutions to their problems. Ordering our examination and analysis of the surveys in customers’ typical sequence of these steps organizes them into a Customer Scenario. Remember that a Customer Scenario is the sequence of activities that customers follow to accomplish an objective that they want or need to accomplish. For a customer service Customer Scenario, customers typically:

  • Access Customer Service. Customers log in to their accounts or to the customer service section of their companies’ web sites, or call their companies’ contact centers and get authenticated to speak with customer service agents.
  • Find Answers and (Re)solutions. Customers use self-service, social-service, virtual-assisted-service, and/or assisted-service facilities to try to help themselves, to seek the help of their peers, or to seek the help of customer service agents for answers and (re)solutions.
  • Complain. If customers cannot get answers or (re)solutions using these facilities, they complain to their companies.

Here, in Table 1 below, are the surveys that examine how customers perform these activities and how companies support those activities. Note that these surveys are a subset of the surveys published by our suppliers; not all of their surveys mapped directly to customer activities. Note, too, that our analyses of survey results are based on the content of the press releases about the surveys, content that is a bit removed from the actual survey data.

| Sponsor | Survey Objective | Activity | Respondents |
| --- | --- | --- | --- |
| Nuance | Privacy and security of telephone credentials | Access | Smartphone users |
| Nuance | Telephone authentication issues and preferences | Access | US consumers |
| KANA | Email response times for customer service | Find answers and (re)solutions | N/A |
| KANA | Twitter response times for customer service | Find answers and (re)solutions | N/A |
| Nuance | Resolving problems using web self-service | Find answers and (re)solutions | Web self-service users, 18–45 years old |
| Nuance | Issues with web self-service | Find answers and (re)solutions | Windstream Communications customers |
| KANA | Usage of email vs. telephone for complaints | Complain | N/A |
| KANA | Customer communication channels for complaints | Complain | UK consumers |
| KANA | Customer complaints | Complain | US consumers, 18 years old and older |

Table 1. We list and describe the customer service surveys published by KANA and Nuance during 4Q2013 in this table.

Let’s listen closely to the Voices of the Customers as they perform the activities of the customer service Customer Scenario. For each of the surveys in the Table, we’ll present the published survey results, analyze them, and suggest what businesses might do to help customers perform the activities faster, more effectively, and more efficiently.

Access

If questions and problems are related to their accounts, then before customers can ask questions or present problems, they have to be authenticated on the customer service system that handles and manages those questions and problems. Authentication requires login credentials: usernames and passwords. In these times of rampant identity theft, the security of credentials has become critically important.

Nuance’s surveys on privacy and security of telephone credentials and on telephone authentication shed some light on customers’ issues with authentication.

  • 83 percent of respondents are concerned or very concerned about the misuse of their personal information.
  • 85 percent of respondents are dissatisfied with current telephone authentication methods.
  • 49 percent of respondents stated that current telephone authentication processes are too time consuming.
  • 67 percent of respondents have more than eleven usernames and passwords.
  • 80 percent of respondents use the same login credentials across all of their accounts.
  • 67 percent of respondents reset their login credentials between one and five times per month.

Yikes! Consumers spend so much time and effort managing and then using their credentials. We’ve all experienced the latest account registration pages that grade our new or reset passwords from “weak” to “strong” and reject our weakest passwords. While strong passwords improve the security of our personal data, they’re hard to remember and they increase the time we spend managing them.

In voice biometrics, Nuance offers technology to address many of these issues. On voice devices, after a bit of training, customers simply say, “My voice is my password,” to authenticate account access based on voiceprints, which are unique to an individual.

Find Answers and (Re)solutions

KANA’s surveys on email response times for customer service and Twitter response times for customer service examine response times for “inquiries.” When customers make inquiries, they’re looking for answers or (re)solutions. In the surveys, KANA found:

  • According to Call Centre Association members, response times to email inquiries were greater than eight hours for 59 percent of respondents and greater than 24 hours for 27 percent of respondents.
  • According to a survey by Simply Measured, a social analytics company, the average response time to Twitter inquiries was 5.1 hours, and response times were less than one hour for 10 percent of respondents.

While it’s dangerous to make cross-survey analyses, it seems reasonable to conclude that customer service is better on Twitter than on email. That’s not surprising. Companies have become very sensitive to the public shaming by dissatisfied customers on Twitter. They’ll allocate extra resources to monitoring social channels to prevent the shame. Customers win.

However, remember that these are independent surveys. The companies that deliver excellent customer service on Twitter might also deliver excellent customer service on email and the companies that deliver not so excellent customer service on email might also deliver not so excellent customer service on Twitter. The surveys were not designed to gather this data. That’s the danger of cross-survey analysis.

If your customers make inquiries on both email and social channels, then you should deliver excellent customer service on both. Email management systems and social listening, analysis, and interaction systems, both widely used and well proven customer service applications, can help. These are systems that should be in every business’s customer service application portfolio.

Email management systems help businesses manage the inquiries that customers make via email. These systems have been around for well more than ten years, helping businesses respond to customers’ email inquiries. Businesses configure them to respond to common and simple questions and problems automatically and to assign stickier questions and problems to customer service staff. Business policies are the critical factor in determining response times to customers’ email inquiries.
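
To make that configuration concrete, here is a minimal, hypothetical sketch, not modeled on any particular vendor’s product, of the kind of rules an email management system applies: answer common, simple inquiries automatically and assign everything else to an agent queue. The topics, canned replies, and queue name are all made up.

```python
# Hypothetical sketch of email-management triage rules: auto-respond to
# common, simple inquiries; assign everything else to an agent queue.
AUTO_RESPONSES = {
    "password reset": "To reset your password, follow the link on the login page.",
    "store hours":    "Our stores are open 9am-9pm, Monday through Saturday.",
}

def triage_email(subject: str, body: str) -> dict:
    text = f"{subject} {body}".lower()
    for topic, reply in AUTO_RESPONSES.items():
        if topic in text:
            # Simple, common inquiry: answer automatically.
            return {"action": "auto_respond", "reply": reply}
    # Stickier question or problem: route to customer service staff.
    return {"action": "assign_to_agent", "queue": "tier1"}

print(triage_email("Help with password reset", "I can't log in."))
```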

Social listening, analysis, and interaction systems have been around for about five years. They help businesses filter the noise of the social web to identify Tweets and posts that contain questions and problems and the customers who Tweet and post them. These systems then include facilities to interact with Tweeters and posters or to send the Tweets and posts to contact center apps for that interaction.
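
Again purely as an illustration (the signal keywords and routing target are our own inventions, not any product’s), here is a sketch of that filtering step: scan a stream of posts for customer service signals and capture the poster’s handle so an agent or a contact center app can follow up.

```python
# Hypothetical sketch of social listening: filter a stream of posts down to
# those that look like customer service questions or problems, and capture
# the poster's handle so an agent (or a contact center app) can follow up.
SERVICE_SIGNALS = ("help", "broken", "not working", "refund", "cancel")

def find_service_posts(posts):
    for post in posts:
        text = post["text"].lower()
        if any(signal in text for signal in SERVICE_SIGNALS):
            yield {"author": post["author"], "text": post["text"], "route_to": "contact_center"}

stream = [
    {"author": "@happyfan", "text": "Love the new release!"},
    {"author": "@frustrated_user", "text": "Your app is broken and I need help with a refund."},
]
for hit in find_service_posts(stream):
    print(hit)
```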

Find Answers and (Re)solutions Using Web Self-Service

Nuance’s surveys about web self-service really show the struggles of customers trying to help themselves to answers and (re)solutions.

In the survey about consumers’ experiences with web self-service, the key findings were:

  • 58 percent of consumers do not resolve their issues.
  • 71 percent of consumers who do not resolve their issues spend more than 30 minutes trying.
  • 63 percent of consumers who do resolve their issues spend more than 10 minutes trying.

In Nuance’s survey of Windstream Communications’ customers about issues with web self-service, the key findings were:

  • 50 percent of customers who did not resolve their issues escalated to a live agent.
  • 71 percent of customers prefer a virtual assistant over static web self-service facilities.

The most surprising and telling finding of these surveys was the time and effort that customers expend trying to find answers and (re)solutions using web self-service facilities. Thirty minutes without finding an answer or a solution seems like a very long time. Customers really want to help themselves.

By the way, Windstream’s customers’ preference for a virtual assistant is not a surprise. Windstream Communications, a Little Rock, AR, networking, cloud-computing, and managed services provider, has deployed Nina Web, Nuance’s virtual agent offering for the web. Wendy, Windstream’s virtual agent, uses Nina Web’s technology to help answer customers’ questions and solve their problems. The finding is a proof point for the value of virtual agents in delivering customer service. Companies in financial services, healthcare, and travel, as well as in telecommunications, have improved their customer service experiences with virtual agents. We cover the leading virtual agent suppliers (Creative Virtual, IntelliResponse, Next IT, and Nuance) in depth. Check out our Product Evaluation Reports to find the virtual agent technology best for your business.

Complain

Customers complain when they can’t get answers to their questions and (re)solutions to their problems. KANA’s surveys about complaints teach us so much about customers’ behavior, preferences, and experiences.

  • In KANA’s survey on usage of email or telephone channels for complaints, 42 percent of survey respondents most frequently use email for complaints and 36 percent use the telephone for complaints.
  • In KANA’s survey of UK consumers on communications channels for complaints, 25 percent of UK adults used multiple channels to make complaints. Fifteen percent of their complaints were made face-to-face.

The surprising finding in these surveys is the high percentage of UK consumers willing to take the time and make the effort to complain face-to-face. These customers must have had very significant issues, and they were very serious about getting those issues resolved.

The key results in KANA’s survey about customer complaints by US consumers were:

  • On average, US consumers spend 384 minutes (6.4 hours) per year lodging complaints
  • In the most recent three years, 71 percent of US consumers have made a complaint. On average, they make complaints six times per year and spend one hour and four minutes resolving each complaint.
  • Thirty-nine percent of US consumers use the telephone channel to register their complaints. Thirty-three percent use email. Seven percent use social media.
  • Millennials complained most frequently (80 percent of 25- to 34-year-old respondents). Millennials are also most likely to complain on multiple channels (39 percent of them).
  • Survey respondents had to restate their complaints (retell their stories) 69 percent of the time as the responsibility for handling their complaints was reassigned. On average, consumers retold their stories three times before their issues were resolved, and 27 percent of consumers used multiple channels for the retelling.

The surprising findings in this survey are the time, volume, and frequency of complaints. More than six hours a year complaining? Six complaints every year? Yikes!

No surprise about the low usage of social channels to register complaints. Customers want to bring their complaints directly to their sources. They may vent on the social web, but they bring their complaints directly to their sources, the companies that can resolve them.

Lastly and most significantly, it’s just so depressing to learn that businesses are still making customers retell their stories as their complaints cross channels and/or get reassigned or escalated. We’ve been hearing this issue from customers for more than 20 years. Customers hate it.

Come on, businesses. All the apps in your customer service portfolios package the facilities you need to eliminate this issue: transcripts of customers’ activities in self-service apps on the web and on mobile devices, threads of social posts, transcripts of customers’ conversations with virtual agents, and, most significantly, case notes. Use these facilities. You’ll shorten the time to solve problems and resolve customers’ complaints. Your customers will spend less time trying to get answers and (re)solutions (and more time using your products and services or buying new ones).
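
As an illustration of what “use these facilities” looks like in practice, here is a minimal, hypothetical sketch of a case record that carries its cross-channel history with it, so whoever picks up the case next reads the story instead of asking the customer to retell it. The channel names and fields are ours, not any specific product’s.

```python
# Hypothetical sketch: a case record that carries its full cross-channel
# history, so the next owner sees the story without the customer retelling it.
# Channels mirror the facilities named above: self-service transcripts,
# social threads, virtual agent transcripts, and case notes.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class Interaction:
    channel: str       # "web_self_service", "social", "virtual_agent", "agent_note", ...
    summary: str
    timestamp: datetime = field(default_factory=datetime.utcnow)

@dataclass
class Case:
    case_id: str
    customer: str
    history: list = field(default_factory=list)

    def add_interaction(self, channel: str, summary: str) -> None:
        self.history.append(Interaction(channel, summary))

    def briefing(self) -> str:
        # What the next owner reads instead of asking the customer to start over.
        return "\n".join(f"[{i.channel}] {i.summary}" for i in self.history)

case = Case("C-1042", "pat@example.com")
case.add_interaction("web_self_service", "Searched KB for 'router keeps rebooting'; no fix found.")
case.add_interaction("virtual_agent", "Virtual agent collected model number and error code.")
case.add_interaction("agent_note", "Escalated to tier 2; replacement unit suggested.")
print(case.briefing())
```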

4Q2013 Was a Good Quarter for Customer Service

By the way, Customer Service had a good quarter in 4Q2013. Customer growth was up. Financial performance was up as a result. Product activity was very heavy. Nine of our ten suppliers made product announcements. Company activity was light. Five suppliers did not make any company announcements. Most significantly, KANA was acquired by Verint. And of course, three suppliers published customer service surveys.

Product Evaluation: Oracle Service Cloud Social Experience

Our evaluation of the August 2013 Release of Oracle Service Cloud Social Experience is this week’s report. You may be more familiar with the product by its former names, RightNow CX Social Experience or Oracle RightNow Cloud Service Social Experience. Oracle acquired RightNow in January 2012 and, without a formal announcement, renamed the product sometime during 2Q2013. One other point about the acquisition: the former RightNow R&D team has continued to develop the product, has continued to work out of the former RightNow headquarters site in Bozeman, and has continued the regular, quarterly releases of the product.

Social Experience is one of three “Experiences” in Oracle Service Cloud. The other two are Agent Experience and Web Experience. Each is aptly named for the channel that it supports. The three share a base of common data (customers, accounts, cases, and knowledge items, for example) and services, including business rules, process management, user management, and reporting. Also, product packaging and pricing puts Social Experience “in the box” with Agent Experience and Web Experience. So, social customer service is really built into Oracle Service Cloud, and that’s its key strength and differentiator.

Social Experience has these three components:

  • Communities, which supports internal community capabilities of posts and responses on topic threads. Oracle Service Cloud Social Experience Communities is based on technology developed by HiveLive, a company that RightNow acquired in 2009.
  • Social Monitor, which provides capabilities to monitor posts on the social web (Facebook, Twitter, YouTube, and RSS feeds, as well as Communities), to analyze the content of monitored social posts, and to interact with social posters.
  • Self Service for Facebook, which lets organizations deploy Oracle Service Cloud Web Experience and Communities capabilities on their Facebook pages to help Facebook users access Oracle Service Cloud Social Experience Communities and the knowledgebase as well as to create cases.

Facebook, Twitter, YouTube, RSS, and Social Experience Communities are the social sources monitored by Social Experience. While these are certainly the key social networks, the product does not monitor some sources that are critical to customer service, particularly external communities, forums, and blogs. These are sources that customers very commonly use to get answers to questions and solutions to problems. That Social Experience doesn’t monitor them is a serious limitation. Oracle already has the technology to address this limitation, technology that came with its June 2012 acquisition of Collective Intellect. Collective Intellect’s IP was social monitoring and analysis technology. Oracle told us that it’s working on integrating this technology with Oracle Service Cloud.

Twitter for Customer Service

On the topic of Twitter: last week, Patty Seybold published “Four Reasons Why Customers Prefer Twitter for Customer Service,” a report about how businesses and their customers use Twitter as a key channel for customer service. Patty proposes seven best practices for Twitter-based customer service. Oracle Service Cloud Social Experience can help implement four of the seven: “Treat Twitter as an Integrated Customer Service Channel,” “If You Have Lots of Customers, Establish Customer Service Twitter Accounts,” “Defuse Anger Publicly; Take the Issue Private,” and “Gather Customers’ Ideas for Next-Gen Products.” You’ll implement the other three (“Set Customers’ Expectations Re: Times of Day You’ll Respond to Tweets in Real Time,” “Respond within Minutes,” and “Don’t Use Automated Responses!”) with customer service policies, standards, and procedures. Here are the four, with brief descriptions of how Oracle Service Cloud Social Experience helps implement them.

  • Treat Twitter as an Integrated Customer Service Channel

Social Experience Social Monitor searches Twitter for Tweets that are relevant to customer service. Agents and/or analysts specify search queries as strings of language-specific terms of 255 characters or fewer. Query strings may include exact-match (“”), AND, or OR operators. Analysts can save search queries for execution at a later time or for (regularly) scheduled execution.

Social Experience Social Monitor can automatically create customer service cases from the Tweets in search results and can automatically append the information in subsequent Tweets from the same Twitter account to those cases.

Social Experience captures customers’ Twitter account information within search results and includes it within Oracle Service Cloud customer data. (A sketch after this list illustrates the monitoring-to-case flow.)

  • If You Have Lots of Customers, Establish Customer Service Twitter Accounts

Social Experience supports multiple corporate Twitter accounts that it shares among its users. (It supports corporate Facebook accounts, too.) Businesses can create a hierarchy of corporate Twitter accounts for customer service, organizing them in any appropriate manner: by customer or customer company, by product, by customer service level, or by severity or priority, for example. And Social Experience’s corporate Twitter accounts can be set to follow customers’ Twitter accounts.

  • Defuse Anger Publicly; Take the Issue Private

Agents specify whether each of their Tweets on their corporate accounts is public or private.

  • Gather Customers’ Ideas for Next-Gen Products

Cases generated from Social Monitor search results can be ideas for next-gen products as well as the representation of questions and problems.
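
To summarize the monitoring-to-case flow described under “Treat Twitter as an Integrated Customer Service Channel,” here is a minimal, hypothetical sketch. It is not Oracle’s API; it simplifies the query language (exact match, AND, OR) to a plain list of OR’d terms, and the handles, query terms, and data structures are all illustrative.

```python
# Hypothetical sketch (not Oracle's API) of the Social Monitor behavior
# described above: match Tweets against a saved query, open a case from the
# first matching Tweet, append later Tweets from the same account to that
# case, and record the Twitter handle in the customer data.
SAVED_QUERY = ["late", "refund"]   # simplified stand-in for a saved search query (OR semantics)

cases_by_handle = {}      # Twitter handle -> case dict
customer_records = {}     # Twitter handle -> captured account info

def process_tweet(handle: str, text: str) -> None:
    if not any(term in text.lower() for term in SAVED_QUERY):
        return  # not relevant to customer service
    customer_records.setdefault(handle, {"twitter_handle": handle})
    case = cases_by_handle.get(handle)
    if case is None:
        cases_by_handle[handle] = {"handle": handle, "thread": [text]}   # new case
    else:
        case["thread"].append(text)   # subsequent Tweet from the same account

process_tweet("@jdoe", "My order is late and support won't answer.")
process_tweet("@jdoe", "Still no refund after a week!")
print(cases_by_handle["@jdoe"])
```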

Pretty good, although a bit of content-based alerting on search results could automate Twitter monitoring. Note that these Twitter-support capabilities of Social Experience are capabilities that we’ve seen in other social monitoring and analysis offerings, including Attensity Analyze and Respond; Clarabridge Analyze, Collaborate, and Engage; and KANA Experience Analytics. All of these offerings have been available for a few years. They’re widely used and well proven. Any of them can help make Twitter an integrated customer service channel.

Going forward, we’ll extend our framework for evaluating social customer service products to include Patty’s best practices as evaluation criteria.