
Questions for Vendors of Personalization Solutions


The path to delivering personalization is well-worn, but still brings adventure. 

I am evaluating personalization solutions using a brief questionnaire that I think gets to the heart of solution value and differentiation. There are two inputs to my choice of questions: what leaders in personalization have demonstrated to be effective, and the requirements revealed by the top user scenarios. My evaluations of vendor solutions are published in this blog.

You might find the questions useful for your own evaluation.

General

  • Describe your target market.
  • What are the top challenges your target market struggles with?
  • Which aspects of those challenges does your solution address?
  • What are the strengths of your solution in addressing those challenges?

Collaboration

Because the effort to create, deliver, and manage personalized customer experiences engages people with many roles across many departments in a company, it is important that the personalization solution encourage collaboration. The top three requirements are support for many roles, immediately accessible learning, and enterprise controls.

  • What is the average number of users at an account?
  • What roles are represented?
  • How would new participants establish the skills and resources for their testing, targeting, or optimization projects? For example: learning what to do and how to do it, creating reports or dashboards, and so on.
  • Describe user access controls and workflow support.

Integration

Personalization is not achieved with a tool bolted onto your marketing environment. It is achieved with a broad range of capabilities embedded in most of what marketing does. Integration thus becomes fundamental to the value of any personalization solution. The top three requirements are customer data integration, app integration, and accessible services.

  • How, and in which of your apps, can third-party data and CRM data be used?
  • What are your solution’s capabilities for updating CRM databases?
  • How and when can your apps exchange audience/segment information and decisions? What data can apps share about audiences, and how is it shared? 
  • Are the core services of your solution extendable and usable in other channels?

AI and Automation

Leaders in delivering personalized customer experiences are developing the capability to personalize any part of any customer interaction. Personalization can’t be accomplished at scale unless AI is effective at predicting customer reactions to each step in the experience and automatically presenting the best next step. The top three requirements are control and automation; shared insights; and real time actions.

  • Which apps can take automated real-time actions based on predictions and recommendations, and in what circumstances?
  • What aspects of AI decisions can be constrained or controlled by users?
  • How frequently are models updated?
  • Under what circumstances can your AI services be extended to other apps or channels?
  • To what extent can AI decisions and actions be communicated in human terms? 

Understanding Requirements for a Personalization Solution


Fitting the customer experience to the customer — at scale — has been the long-running goal, and challenge, of personalization solutions. 

Lessons from Leaders in Personalization

A recent study, Forrester’s Business Technographics Global Data and Analytics Survey 2018, determined that only 7% of companies have figured out how to compete effectively on experiences. These companies are in hyper-growth mode, and Forrester estimates they will drive $1.8T in revenue by 2021. These are the leaders the rest of us (more than half of us, according to the survey) should emulate.

Our research into achievements in personalization has identified three broad lessons to be learned from today’s leading marketers.

  1. Leaders achieve broad engagement and collaboration among all stakeholders.
  2. Leaders are passionate for insights into customer behaviors and motivations.
  3. Leaders create a culture of experimentation.

We see three categories of requirements that are driven by how these leaders have made strides in personalization.

  • Collaboration
  • Integration
  • AI and Automation

Collaboration Requirements

Because the effort to create, deliver, and manage personalized customer experiences engages people with many roles across many departments in a company, it is important that the personalization solution encourage collaboration. The top three requirements are support for many roles, immediately accessible learning, and enterprise controls.

  1. Support for many roles. User interfaces should be oriented around roles, such as campaign manager, and marketing tasks, such as monitoring and optimizing campaigns. The supported roles should include CMO, customer success, customer loyalty, merchandising, channel marketing, ecommerce, IT, mobile app development, administrator, demand generation, lead generation, and product management, among others.   
  2. Immediately accessible learning. New participants in personalization, or those shifting to new areas, must quickly and independently establish the skills they need. The solution should offer online learning on how to use the solution to accomplish common tasks. Ideally, it will also offer higher-level guidance on how to be effective in personalizing the customer experience.
  3. Enterprise controls. A solution used by a number of people, especially across roles and departments, needs built-in controls to ensure they are not making conflicting changes. Workflow for changes, campaigns, and tests is a big help in this environment (a minimal sketch of such a control follows this list).
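To make the enterprise-controls idea concrete, here is a minimal sketch of an approval workflow for campaign changes. The states, roles, and permissions are assumptions invented for illustration, not features of any particular solution.

```python
# Sketch: a minimal approval workflow for campaign changes.
# States, roles, and permissions are hypothetical illustrations.
ALLOWED_TRANSITIONS = {
    "draft":     {"submitted"},
    "submitted": {"approved", "rejected"},
    "approved":  {"published"},
    "rejected":  {"draft"},
}

ROLE_PERMISSIONS = {
    "campaign_manager": {"draft", "submitted"},
    "marketing_lead":   {"approved", "rejected"},
    "administrator":    {"published"},
}

def transition(change: dict, new_state: str, role: str) -> dict:
    """Move a campaign change to a new state only if the workflow and the role both allow it."""
    if new_state not in ALLOWED_TRANSITIONS.get(change["state"], set()):
        raise ValueError(f"cannot move from {change['state']} to {new_state}")
    if new_state not in ROLE_PERMISSIONS.get(role, set()):
        raise PermissionError(f"role {role} may not set state {new_state}")
    return {**change, "state": new_state}

change = {"id": "homepage-hero-test", "state": "draft"}
change = transition(change, "submitted", "campaign_manager")
change = transition(change, "approved", "marketing_lead")  # a second, different role signs off
```

The point of even this toy version is that conflicting or unauthorized changes are blocked by the system rather than by convention.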

Integration Requirements

Personalization is not achieved with a tool bolted onto your marketing environment. It is achieved with a broad range of capabilities embedded in most of what marketing does. Integration thus becomes fundamental to the value of any personalization solution. The top three requirements are customer data integration, app integration, and accessible services.

  1. Customer data integration. Personalization requires a broad range of data: data generated during an interaction, the customer data that typically resides in several sources within a company, and data supplied by third parties. A personalization solution must be able to use all of this data in a systematic and consistent way, add to it, derive insights from it, and share those insights with other customer-touching systems.
  2. App Integration. The personalization solution should have published APIs that suit at least your immediate requirements; server-side and client-side integration; and pre-built connectors for common marketing and sales solutions such as Salesforce and Google Analytics.
  3. Accessible services. Ideally, a personalization solution allows its core services to be used in other apps and channels, via a consistent platform, API, and UI. These core services should allow marketers to request recommendation decisions, request and share audience definitions, request personalization decisions, use customer attributes, and add to customer attributes. A minimal sketch of such a request follows this list.
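To make the app-integration and accessible-services requirements concrete, here is a minimal sketch of what requesting a personalization decision from a core decision service might look like. The endpoint, payload fields, auth scheme, and response shape are assumptions invented for illustration, not any vendor’s actual API.

```python
# Sketch: requesting a decision from a hypothetical personalization service.
# The endpoint, fields, auth scheme, and response shape are illustrative assumptions.
import json
import urllib.request

DECISION_ENDPOINT = "https://personalization.example.com/v1/decisions"  # hypothetical URL

def request_decision(visitor_id: str, channel: str, attributes: dict) -> dict:
    """Ask the decision service which experience to show this visitor in this channel."""
    payload = json.dumps({
        "visitorId": visitor_id,    # identity resolved from first-party or CRM data
        "channel": channel,         # e.g. "web", "email", "mobile_app"
        "attributes": attributes,   # CRM or third-party attributes shared with the service
    }).encode("utf-8")
    request = urllib.request.Request(
        DECISION_ENDPOINT,
        data=payload,
        headers={"Content-Type": "application/json", "Authorization": "Bearer <token>"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)  # e.g. {"experienceId": "...", "audience": "..."}

# Example call (requires a real endpoint and token):
# decision = request_decision("visitor-123", "web", {"loyaltyTier": "gold"})
```

The same request pattern is what lets other apps and channels reuse the core services instead of re-implementing them.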

AI and Automation Requirements

Leaders in delivering personalized customer experiences are developing the capability to personalize any part of any customer interaction. Personalization can’t be accomplished at scale unless AI is effective at predicting customer reactions to each step in the experience and automatically presenting the best next step. The top three requirements are control and automation; shared insights; and real time actions.

  1. Control and Automation. At times you will want to control AI’s decisions and actions, so a personalization solution needs to offer constraints such as rules and filters (see the sketch after this list).
  2. Shared insights. Ideally, marketers should have insight into what AI has learned about their customers, turning AI internals into recognizable attributes communicated in human terms. AI decisions should be available to other apps in order to promote a more consistent customer experience, either via requests for decisions and predictions, or via data such as audiences or customer profiles.
  3. Real time actions. Automatically delivering the best possible experience within each interaction requires real-time decisions, predictions, and actions. AI models should be capable of real-time, or at least hourly, updates.
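The control requirement, rules and filters layered on top of model output, is easy to picture with a small sketch. The product records, scores, and rules below are invented for illustration; a real solution would expose this kind of constraint through its own UI or API.

```python
# Sketch: constraining model recommendations with marketer-defined rules and filters.
# Product records, scores, and rules are invented for illustration.
from typing import Callable

recommendations = [  # what a model might return, ranked by predicted response
    {"sku": "SKI-JACKET-01", "category": "ski", "in_stock": False, "score": 0.91},
    {"sku": "BIKE-HELMET-07", "category": "cycling", "in_stock": True, "score": 0.87},
    {"sku": "BIKE-LIGHT-03", "category": "cycling", "in_stock": True, "score": 0.74},
]

# Marketer-defined constraints: never show out-of-stock items, and suppress the
# "ski" category for this visitor.
rules: list[Callable[[dict], bool]] = [
    lambda item: item["in_stock"],
    lambda item: item["category"] != "ski",
]

def apply_rules(items: list[dict], rules: list[Callable[[dict], bool]]) -> list[dict]:
    """Keep only recommendations that satisfy every rule, preserving the model's order."""
    return [item for item in items if all(rule(item) for rule in rules)]

print(apply_rules(recommendations, rules))  # the two cycling items remain
```

The model still ranks; the marketer still decides what is off limits.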

Method: Scenario-based Requirements

Technology is always acquired to improve our processes and results. When we gather requirements for technology solutions, we look at how we have worked in the past, and at what we need now and in the near future to do the same work more effectively. But technology changes the way we do things, in ways that are not always possible to imagine. As a result, we prioritize requirements that will soon have little value while missing those that will soon seem critical. For example, rather than insist it be easier to cut and paste from this screen to that one, insist that the data transfer be automatic.

An inherent tension in the standard requirements process is simplification vs. context. Managing requirements almost demands reducing each to a bullet or headline entry in a checklist. But getting what you need demands that you retain the context. For example, “Salesforce integration” is useful shorthand, as long as you don’t lose sight of the real requirement, such as “Exchange customer profile data with Salesforce in real time”.

We use a scenario-based approach to requirements because, in our experience, it is your best hope for success with the requirements process: it focuses on what you need in the future rather than on past aggravations you’ve suffered, and it captures the context (the why, who, and when) of each requirement.

Here’s how you use the scenario approach: Your team talks through business activities and goals, with the aim of creating a narrative of ideal scenarios. For a moment, forget the constraints and limitations of today’s tools. What are the most frequent, and what are the most important, outcomes that you pursue? What roles are involved in pursuing those outcomes? In a perfect world, how would your team accomplish those goals? In the ideal, you are not wasting time making up for your tools’ shortcomings, such as trying to reconcile segment definitions or revenue reports, importing or exporting data, inputting results, or studying dashboards to identify anomalies and their causes.

Use your scenarios as the basis of requirements, and even more importantly, of vendor demo evaluation. Insist that the demo show you how you would accomplish your goals and do your jobs with the vendor’s tools. Don’t accept a canned demo that is organized around the features of the vendor’s tools.

We recommend starting with your top 3 scenarios. The table below presents a generalized scenario that you can use as a starting point for your own.

Scenario: Director of Marketing is overseeing the launch of a new product category

  • Task or Event: Test the idea with current customers to find the target audience
    What is Success: Identify the target market without negative impact on other results
    Requirements to Achieve Success: Testing optimization, i.e., MVT testing that also segments audiences, predicts responses, and routes traffic to the best-performing experience (see the sketch after this table)

  • Task or Event: Identify the attributes of the target segment
    What is Success: Begin to understand how the target market compares to other segments
    Requirements to Achieve Success: The machine-discovered segment is described with human-meaningful attributes; the machine-discovered segment definition is available and useful to analytics and other marketing tools

  • Task or Event: Seek more customers to add to the segment
    What is Success: Quantify the size of the target market; explore possible similar or related segments
    Requirements to Achieve Success: AI/ML analysis that identifies similarities within a group that are predictive of behavior

  • Task or Event: Based on analysis of customer response and the target market, decide to launch the category
    What is Success: Feel comfortable that the range and likelihood of outcomes is clear
    Requirements to Achieve Success: AI analysis that quantifies outcomes, probabilities, and confidence intervals

  • Task or Event: Prepare content for target customer journeys
    What is Success: Customers respond to content by taking the hoped-for actions
    Requirements to Achieve Success: Analysis identifies similar segments, which marketing uses to ideate content and paths for the target segment

  • Task or Event: Use the target segment definition in campaigns, both web and email
    What is Success: Meet campaign and launch goals
    Requirements to Achieve Success: Testing optimization predicts the most effective content and paths for visitors and routes them accordingly; campaigns can use both user-defined segments and observed segments in matching customers with paths and content

  • Task or Event: Deploy increasingly personalized campaigns to learn more about target customers and create richer profiles
    What is Success: Positive impact on business results; improved target segment definitions; understanding of how to motivate customer behaviors
    Requirements to Achieve Success: Apply third-party and company data in analysis, prediction, traffic routing, and segmentation; real-time prediction and analysis; automated deployment of the best experience
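The first row’s “testing optimization” requirement, routing traffic to the best-performing experience while continuing to learn, is easier to picture with a small sketch. The epsilon-greedy policy below is one simple way such routing could work; the experience names and conversion counts are invented for illustration.

```python
# Sketch: epsilon-greedy routing of visitors to the best-performing experience.
# Experience names and conversion counts are invented for illustration.
import random

EPSILON = 0.1  # fraction of traffic reserved for exploring other variants

# Running tallies per experience variant (in practice, fed by live analytics).
stats = {
    "hero_image_ski":  {"visitors": 1000, "conversions": 32},
    "hero_image_bike": {"visitors": 1000, "conversions": 51},
    "hero_image_hike": {"visitors": 1000, "conversions": 44},
}

def choose_experience(stats: dict, epsilon: float) -> str:
    """Mostly exploit the best conversion rate, occasionally explore another variant."""
    if random.random() < epsilon:
        return random.choice(list(stats))
    return max(stats, key=lambda name: stats[name]["conversions"] / stats[name]["visitors"])

def record_result(stats: dict, name: str, converted: bool) -> None:
    """Feed the outcome of each visit back into the tallies."""
    stats[name]["visitors"] += 1
    stats[name]["conversions"] += int(converted)

variant = choose_experience(stats, EPSILON)    # route the incoming visitor
record_result(stats, variant, converted=False)
```

A production solution would add statistical guardrails and per-segment models, but the loop of predict, route, observe, and update is the core of the requirement.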

Of course, your standard process for requirements gathering can’t be ignored. We recommend the scenario approach as an addition to your requirements process, a way to organize your evaluation, and the best way to focus on what is most important to your future success. 

Until AI Automates the World


We’re all on the AI bus careening rapidly toward the End of Human Work.

Until we get there, we all still have a lot of work to do.

We talk about the transformation as if the goal is autonomous AI, automatically doing all our work. But I think there is no question that the most valuable results will be achieved by the AI-assisted super-human, producing work never before possible – or imagined. The most valuable applications of AI will be human-machine collaboration, where AI augments human jobs and humans augment AI tasks.

We are in for a long period of working with and around intelligent machines.

To date, we have mostly experienced master-slave relationships with our machines. We feed in scads of data, and machines pour out the orders, invoices, payments, computations, and categorizations we rely on to keep business going. Or machines feed tasks to workers, such as warehouse pickers, who have no control over what to do next.

Today, for the most part it is humans who see the information, have the insights, make the decisions, and take the actions. Nevertheless, in a few arenas machines are producing better results than humans. These are limited cases where algorithms are tested enough, data is clean enough, systems are integrated enough, the problem is clearly bounded, and context is sufficiently indicated. Sadly, these conditions are rare: the majority of today’s business systems suffer greatly from poorly integrated, context-poor, messy data deployed in support of poorly articulated strategies and goals. Improving those conditions is a gargantuan effort, an effort already underway for decades.

For decades to come, then, AI will augment most of our jobs, and automate very few. We are not even on the cusp of understanding how to do that. We need to learn how to effectively collaborate. We need new design patterns, methods, and metaphors for this new shared work.

Technology may be ready for an AI-automated world in my lifetime, but corporations, systems, and people will be struggling to catch up.

A few questions we urgently need to answer:

  • How does an organization learn to assess opportunities to apply AI, experiment, and measure the resulting impact?
  • How do organizations acquire the skills needed to be effective users of AI?
  • What are the design methods for sharing work with a semi-autonomous agent?
  • What are the design patterns for collaborating with a machine?
  • How do we design interfaces that encourage trust? Given that ML won’t make perfect decisions in every case, how do we make people comfortable enough to use systems?
  • How do we design interfaces that involve people at the right time, in the right way?
  • How do experience designers develop sufficiently deep understanding of ML to know which behavior and context information is essential to improving ML?
  • How do we design interfaces that evolve as machines learn over time, and yet feel consistent and reliable?

Practically Personal

How personal should you make the customer experiences you deliver? How personal can you make customer experiences?

In a previous post we described how a guy who doesn’t ski reacts to an ecommerce site showing him images of a woman skiing. He believes he’d respond more if shown something he can relate to.

If we somehow knew (and that’s an issue for a future post) that this guy bicycles, we could show images of bicyclists. The question for today is, what images do we need in our library to satisfy our visitors and our customer experience goals? Do we have to address, say, 6 possible biking interests (mountain, commuting, BMX, racing, family recreation, camping); 2 genders; at least 3 age groups (child, young adult, senior); and perhaps 6 environments (urban, rural, forested, plains, mountains, coastal)? That’s 17 attribute values, and 216 images to satisfy all combinations. If all you sell is bikes, perhaps you can afford that. If you cover all sports, or if sports is but one of your categories, how can you possibly?
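The arithmetic behind those numbers is easy to check. The short sketch below enumerates the combinations for the attribute values assumed above.

```python
# Sketch: counting the image variants implied by the attribute values above.
from itertools import product

interests = ["mountain", "commuting", "BMX", "racing", "family recreation", "camping"]
genders = ["women", "men"]
age_groups = ["child", "young adult", "senior"]
environments = ["urban", "rural", "forested", "plains", "mountains", "coastal"]

attribute_values = len(interests) + len(genders) + len(age_groups) + len(environments)
combinations = list(product(interests, genders, age_groups, environments))

print(attribute_values)   # 17 attribute values to manage
print(len(combinations))  # 216 distinct images to cover every combination
```

Every attribute you add multiplies, rather than adds to, the content you need.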

Most likely, you don’t need 216 or even 17 images to be effective with this guy who bicycles. Maybe you only need 3 images. Which ones? How many? Who knows.

The only way to know is to “test” out the impact of having a few variations. I wish I could believe that there is one answer to the question of what will make our bicyclist happy. I fear that it depends on his current context, and therefore the customer experience must be variable as well.

In this realm, “test” bears no resemblance to A/B testing. Rather, it describes an automated, data-driven prediction of what will have the greatest impact at this moment in time. Machines can make the predictions and deliver the customer experience. The marketing team has to decide how much to invest in content variations, and which variations are most likely to be important to visitors. Automated customer experience delivery and content planning are two programs that most companies have yet to perfect, and many have yet to attempt.

The Right Questions for Personalization Success


“I’m a guy, and I don’t ski. Why are you showing me pictures of a woman skiing?”

I wish I could remember the name of the man who said this, because it is a great summary of the customer perspective of personalization. The implication is, he’d be more responsive to offers that featured guys doing his sport – whatever that might be.

His complaint surfaces what I call the 5 Introductory Personalization Questions:

  1. How can we know enough about our visitor?
  2. How do we use that knowledge to select the best experience for this moment?
  3. How do we have the right content on hand?
  4. What is the mechanism for retrieving and delivering the best content to this customer at this moment?
  5. How do we know we delivered the best experience possible?

These questions are signposts for your personalization journey, and during the journey you will ask and answer many more.

You have almost certainly talked to people who want to answer these questions with technology. Technology is unquestionably necessary, but in my experience the culture and process concerns are far more challenging. Every organization that is struggling to deliver personalized customer experience describes issues with strategy, commitment, alignment, and workflows. Any time you fool with customer experience, the ripples reach every part of the company. Somehow, that is a lesson that never gets old but must be learned and learned again.

People don’t anticipate the breadth of what they are taking on when they begin their personalization journey. As a result, they start in the middle without the provisions, collaboration, or roadmap they need. With a little more knowledge of what the journey entails, progress is more certain and less expensive. Here’s my [You Won’t Be] Lonely Planet Guide to help you anticipate and overcome the barriers.

For each program or task below, the questions fall into three groups: culture, process, and tooling.

Acquiring Customer Knowledge

  • Culture: What knowledge do we think is valuable? What are we willing to collect? What sources are acceptable? How much resource are we willing to devote to the process? Who owns the information?
  • Process: Who owns the policies and processes? Who will collect and analyze information to create knowledge? Who will establish, and who will manage, third-party relationships for data collection? Who is responsible for budget and planning? Who is responsible for distributing and protecting the information and knowledge?
  • Tooling: How is customer information captured, ingested, and stored? How is knowledge extracted from information? In what manner is the knowledge stored?

Applying Customer Knowledge to Creating Customer Experience

  • Culture: What is the strategy for customer experience? How does customer experience strategy align with business strategy? Who owns the customer experience strategy? What aspects of customer experience should be influenced by customer knowledge? Who decides? What degree of automation and what degree of explicit control is acceptable?
  • Process: Who or what can use the information, for what purposes, in what circumstances? How do various customer segmentation tactics apply to knowledge-driven customer experience? Who designs and manages the variable customer experience?
  • Tooling: What is the mechanism for knowing the customer? How is the best experience for each customer identified? How are the elements of the experience delivered during the experience?

Provisioning Customer Experience Content

  • Culture: How many variations of content are we willing to fund and manage? What sources are acceptable? What degree of quality and consistency are required? How do we reconcile variations with our brand?
  • Process: Who creates and tracks the content strategy and plans? How is the content tagged, stored, improved, and replaced? Who decides which variations will be shown in each customer experience? What is the time frame in which the decision is made?
  • Tooling: How is content tagged and formatted for use in various experiences, across various devices? How is content use and impact tracked and reported?

Evaluating Results

  • Culture: Who is responsible for the quality of customer experience? What is the goal for quality of customer experience?
  • Process: How is quality of customer experience measured and reported? How is the value of the experience to the customer measured? How is the quality of content measured? How do we measure the impact of customer experience improvements on business results by period? How do we measure the impact on value delivered to customers?
  • Tooling: What are the mechanisms for measuring, evaluating, and communicating the quality of customer experience, and the value to our business and to customers?

Optimizing Results

  • Culture: Who is responsible for improving customer experience? What are the goals for improvement? Are the goals differentiated by customer segments, product categories, or sales regions?
  • Process: How is improvement measured and tracked? How do we track progress toward goals? How do we identify and prioritize efforts to improve customer experience?
  • Tooling: What are the mechanisms for predicting, delivering, measuring, and evaluating what makes the best experience for each customer at each moment?

Use Cases in Cloud Infrastructure Management

Well, with a snappy title like that,  I expect I am now all alone in this room. 😉

My cloud research this year will be focused on use cases in two areas: consumption and service. I will delve into the tasks involved, and how commercially available tooling addresses those tasks.

Consumption and Cost Management

  • Application cost performance review
  • Cost and capacity planning

Service Management

  • Service level evaluation and planning
  • Availability and fault recovery planning
  • Operations automation

Evaluation Areas:

  • Desired outcome of the activity
  • Business and infrastructure impact of the activity
  • Roles involved
  • Tasks involved
  • What problems/aspects of the activity are addressed by a vendor solution
  • How a given solution contributes to the activity: what it does, how it works
  • Overview of tech/architecture/interfaces/data
  • Vendor’s target market, pricing, strategy

Remodeling Infrastructure Management

We are in the throes of rewriting what IT infrastructure is. The shift to the cloud changes what we pay for, how we budget and plan costs, what is costly, what can be managed, what can be predicted, how quickly systems are deployed, how easily systems are moved or replicated or recovered.

This means that we will soon be in the throes of rewriting what infrastructure management does, and how it works, and who uses it. 

We do have some inkling of what to expect. The last shift in the IT infrastructure paradigm, from mainframes in data centers to distributed computing dominated by client/server, happened only a quarter century ago, and the lessons are readily available. Client/server engendered entirely new development technologies, development methodologies, and operations technology, and it upended how IT was controlled, budgeted, and managed.

Tools that were terrific in the mainframe environment were still useful in small ways, some of the time, for parts of a few of the problems. In other words, woefully inadequate. The replacements came from new players (think Microsoft and BMC) while established players like IBM were slow to catch up. The established players thought they could bolt some distributed management onto their data center management. As it turns out, the new players eventually bolted a comparatively small bit of data center management onto their vast new tooling.

With cloud, we once again face a different paradigm, a different world, one that demands different tools and techniques and offers different opportunities. Fortunately, we can apply much more sophisticated technology today than was available 25 years ago. Machine and deep learning will save our bacon this time around.

The scale and complexity of the cloud environment will dwarf anything most of us have experienced or can imagine. Humans did OK with millions of events and objects to manage, using scripts and templates. When faced with billions and then trillions, tooling made it possible to handle bundles of objects and respond only to exceptional events. But we are on the frontier of zetta and yotta scale. We will be forced to automate almost all of infrastructure management. Machines will observe, analyze, optimize, and act. It will be our human job to observe, analyze, optimize, and act on the machines and the models they run.

A new wave of management tooling is already emerging to replace the soon-to-be-sidelined management platforms you currently rely on. A new wave of skills should be under development: you should now be spending your time building models instead of scripts.
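As one small illustration of models instead of scripts, the sketch below contrasts a hand-tuned threshold with a baseline learned from the data itself. The metric values are synthetic, and a simple z-score baseline is only one of many possible models.

```python
# Sketch: a fixed-threshold script vs. a model that learns "normal" from the data.
# The metric samples are synthetic; this is an illustration, not a production design.
import statistics

cpu_utilization = [42, 45, 44, 47, 43, 46, 44, 95, 45, 43]  # percent, per interval

# Script approach: a hand-tuned threshold baked into automation.
script_alerts = [i for i, value in enumerate(cpu_utilization) if value > 80]

# Model approach: flag samples that deviate sharply from the learned baseline.
mean = statistics.mean(cpu_utilization)
stdev = statistics.stdev(cpu_utilization)
model_alerts = [
    i for i, value in enumerate(cpu_utilization)
    if stdev > 0 and abs(value - mean) / stdev > 2.5
]

print(script_alerts)  # [7]
print(model_alerts)   # [7] here, but the baseline adapts as the data changes
```

The difference matters at scale: thresholds have to be re-tuned by hand for every workload, while a learned baseline follows the data.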
