JARVIS Pilot: End-to-end content strategy

Employer: Adobe Systems
Role: Lead strategist defining end-to-end content strategy; product design contributor

Situation

Even a cursory look at Adobe’s stock price trends corroborates the explosive growth the company’s business has seen since 2013. In early 2017, the company’s leadership prioritized an initiative to scale support operations without adding proportional headcount to the already 800-strong customer experience org. A new ML-powered support smart agent, internally codenamed JARVIS, was key to realizing this objective.

In the Pilot phase, the scope of this smart agent included addressing top customer issues/intents for Digital Media, Adobe’s largest business unit. Longer term, JARVIS would cover most Adobe products and be the preferred channel of support for Adobe customers.

Stakeholders

Stakeholders fell into four groups (Drivers, Approvers, Contributors, and Informed) and included:
  • Product managers within the customer experience org
  • Digital media BU design/content strategy (my team)
  • Customer experience org leadership
  • Digital media BU org leadership
  • Support tools engineering teams
  • Support tools data science
  • Chat agents in the participating support queues
  • Voice of Customer (VoC) content authors 
  • Support queue people managers

Product landscape

Tasks

  • Build the smart agent from scratch to address top customer intents for Digital Media products. From a content strategy perspective, this task involved:
    • Identifying the top data-driven customer intents in the Digital Media space
    • Identifying the necessary content types, and setting the tone & voice guidelines for them
    • Leading the ML training initiative to identify real-world customer queries that map to these top intents. Multiple customer-lingo queries usually map to the same intent (see the sketch following this list).
      Example: “What’s my plan?” and “What did you charge me for?” both map to the Clarify plan details intent
    • Content design for the answer bot (internally codenamed Logos)
    • Content design for the routing bot (decision tree)
    • Designing customer touchpoints, including embedded smart agent triggers in customer-facing content
    • Designing other required pieces of content, such as system messaging
  • Learn, iterate, and expand coverage to eventually serve all of Adobe’s 100+ products
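
As a rough illustration of the query-to-intent mapping described above, here is a minimal sketch of how the ML training data can be thought of: many customer-lingo utterances labeled with one canonical intent. The intent labels, utterances, and data format are hypothetical and for illustration only, not Adobe’s actual training pipeline.

  # Minimal sketch: many customer-lingo utterances map to one canonical intent.
  # Intent labels and utterances below are illustrative, not Adobe's real data.
  TRAINING_UTTERANCES = {
      "clarify_plan_details": [
          "what's my plan",
          "what did you charge me for",
          "why was I billed twice this month",
      ],
      "cancel_subscription": [
          "I want to cancel my membership",
          "stop charging me for Creative Cloud",
      ],
  }

  def to_training_rows(utterance_map):
      """Flatten the intent map into (utterance, intent) rows for a classifier."""
      return [(text, intent)
              for intent, texts in utterance_map.items()
              for text in texts]

  if __name__ == "__main__":
      for text, intent in to_training_rows(TRAINING_UTTERANCES):
          print(f"{intent:22s} <- {text}")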

Actions (content strategy)

  • Answer bot: In collaboration with support agents, source the data necessary to arrive at the list of top customer intents:
    • Call volumes across product queues in the Digital Media BU
    • Chat transcripts that would fuel Vox Populi, an ML system to surface top customer issues from raw chat data. In 2016, I’d had the privilege of contributing extensively to the development of Vox Populi as well.
  • Answer bot: In collaboration with Product partners, get Approver buy-in on top-level data-driven customer intents
  • Define a content persona from which the bots, especially the answer bot (Logos), would speak. The bot personality framework was adapted from a template by Amir Shevat.
  • Create a tone & voice framework for the necessary content types.
  • Answer bot: Write the answer bot scripts for the top customer intents. Steer the scripts through UATs and executive reviews to iterate on them.
  • Routing bot: Author and iterate on the decision tree (a simplified sketch follows this list). Work with UX research to gather real-world user feedback to inform iterations.
  • Agent experience: Author the scripts that customer support agents would use when the answer/routing bots transfer customer chats to them, and train the agents to use the scripts well. The end goal was a cohesive tone & voice across all customer-facing conversational content, from the answer bot’s first response to the moment agents start interacting with customers.
  • Be the primary point of contact for all launch activities. I literally got to press the “button” that took the smart agent live!  
  • Learn along the way and populate the backlog for JARVIS Next.
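
A simplified sketch of how the routing bot’s decision tree can be modeled: each node asks the customer a question, and each answer either leads to a follow-up node or resolves to a support queue. The questions, answers, and queue names below are hypothetical and for illustration only, not the production JARVIS tree.

  # Minimal sketch of a routing decision tree. Questions, answers, and queue
  # names are hypothetical; the real JARVIS tree was far more extensive.
  ROUTING_TREE = {
      "question": "What do you need help with?",
      "answers": {
          "Billing or plan changes": {"queue": "digital-media-billing"},
          "Installing or updating an app": {
              "question": "Which app are you having trouble with?",
              "answers": {
                  "Photoshop": {"queue": "photoshop-install"},
                  "Another app": {"queue": "creative-cloud-install"},
              },
          },
      },
  }

  def route(node, pick):
      """Walk the tree by asking pick(question, options) until a queue is reached."""
      while "queue" not in node:
          choice = pick(node["question"], list(node["answers"]))
          node = node["answers"][choice]
      return node["queue"]

  # Example: a customer who always picks the first option lands in the billing queue.
  print(route(ROUTING_TREE, lambda question, options: options[0]))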

Results (non-confidential)

JARVIS Pilot was well-received by Adobe customers:
  • Real-time feedback indicated that a good fraction of customers found the JARVIS Pilot experience an improvement over the legacy “human-only” chat experience.
  • Customers were particularly pleased with the low wait times, a marked improvement over the earlier experience. Even when a chat was transferred to a human agent, connections were quick because JARVIS routed the chat to the right queue. The routing bot, in particular, made Adobe customers’ lives easier.
Content metrics (confidential numbers omitted on purpose)
  • An encouraging number of customer queries were answered by the answer bot without agent intervention
  • JARVIS was often able to map customer lingo queries to the right intent, fulfilling a key learning objective for the Pilot
  • Considerable improvement in call propensity (i.e., fewer support calls) for the top tracked issues

JARVIS in action
