5. The Analytics of Online Engagement

Eric T. Peterson

Engagement of customers with online resources is an important but elusive concept. Companies have struggled with the meaning and measurement of online engagement for years, in part because none of the common website metrics—page views, clicks, surveys, time spent, conversions, loyalty—is an acceptable proxy for engagement.

To make engagement more understandable and actionable, my firm, Web Analytics Demystified, Inc., has developed both a definition and a framework that reduces a wide variety of website metrics to a single visitor-engagement score. Once calculated, this score can be used to segment visitors for a variety of purposes, such as digital marketing, content improvement, keyword enhancement, and partnership development.

The Definition of Engagement

Because engagement can mean different things to different people, it has proven to be a frustrating metric for businesses to nail down. Some equate it with online activity, others focus on loyalty, and still others emphasize conversion rates. But each of these metrics fails to comprehensively support analytical decision-making. Based on observing various approaches, the following points seem clear:

Engagement isn’t conversion. Conversion (going from browsing to buying) looks at those who purchase, but it ignores the vast majority who don’t. Just because a person doesn’t buy online, does that mean he or she isn’t engaged? What characterizes the online behavior of the “unconverted”? How can a company better reach them for later conversion? Conversion is undoubtedly a type of engagement, but it’s not the whole story. For example, someone may spend hours on the Porsche website, but just because he doesn’t purchase a Porsche online (which is rare) doesn’t mean he isn’t engaged. There is also the need to measure engagement with sites that offer only content and no opportunity to purchase anything.

Engagement isn’t activity. Average time spent, page views, and clicks per visit certainly capture web activity, but to what end? Are more page views and clicks necessarily better? Could clicks be the result of frustration rather than interest? Who is “average,” anyway?

Engagement isn’t satisfaction. Although qualitative customer satisfaction (via tools from iPerceptions and ForeSee) and loyalty (embodied in metrics such as the Net Promoter Score) are both useful to track, neither says much about engagement. For example, a customer can simultaneously be highly dissatisfied and highly engaged in complaining about a brand on social networks. It is critical to measure both.

Because of all this confusion surrounding the meaning of the concept, no single metric of engagement has emerged.

Due to this uncertainty and lack of a consensus metric, approaches attempting to measure engagement have proliferated. In a recent poll, organizations reported using these approaches, among others:

• Traditional web analytics: 51%

• Online surveys: 34%

• Customer journey analytics: 30%

• Feedback from customer-facing staff: 28%

• Customer interviews: 27%

To be clear, all of these are worthwhile inputs, and companies should continue gathering and studying them. But none provides a complete picture of engagement.

Other organizations have attempted to define engagement, but not in a way that lends itself to analytical measurement and action:

Forrester Research: “Engagement is the level of involvement, interaction, intimacy, and influence an individual has with a brand over time.”

Advertising Research Foundation (ARF): “Engagement is turning on a prospect to a brand idea enhanced by the surrounding context.”

Both of these definitions have some appeal, but they would be difficult to translate into metrics in an online context.

In 2007, Web Analytics Demystified, Inc. took its first pass at a more useful definition:

Engagement is an estimate of the degree and depth of visitor interaction against a clearly defined set of goals.

The inclusion of goals in the definition gives it teeth and provides a linkage to action. To even begin to measure engagement, a business must identify what actions it wants visitors to take on its site. Download a paper? Watch a video? Join a forum? Subscribe to a newsletter? Buy something? Fill out a contact form? Prioritizing a particular task or task set implies data and measurement.

Feeling that our definition still lacked precision, in 2008 we released this revised version:

Engagement is the demonstration of Attention via psychomotor activity that serves to focus an individual’s Attention. Attention is a behavior that demonstrates that specific neural activity is taking place.

The stress on attention is intentional: In this definition, engagement is seen not as an external activity but as an internal mental state, something happening in the brain. Terms such as psychomotor and neural puzzled clients and other interested parties, however, so the third version blended the best of both:

Engagement is an estimate of the depth of visitor interaction against a clearly defined set of goals. Demonstrated Attention is measured via “visitor interaction.”

A Model to Measure Online Engagement

To put the definition to work, the company developed a robust but flexible framework that captures multiple aspects of visitor interaction, as shown in Figure 5.1. We have updated it over time to add missing components.

Figure 5.1. Measuring online engagement.

Before we get into the details of the model, it is important to remember that this is a general model, not a calculation optimized for every type of site. I agree with other analysts and bloggers who insightfully note that no single calculation of engagement is useful for all sites. But I do believe that this model is robust and useful, with only slight modification, across a wide range of sites. The modification comes in the thresholds for the individual indices, the qualitative component, and the measured events, as discussed next. Otherwise, any site capable of making this calculation can do so without having to rethink the entire model.

The calculation of engagement using this model needs to be made over the lifetime of visitor sessions to the site and needs to accommodate different time spans. This means that to calculate the percentage of sessions having more than five page views, you need to examine all of the visitor’s sessions during the time frame under examination and determine which had more than five page views. If the calculation is unbounded by time, you examine all of the visitor’s sessions in the available dataset; if it is bounded by the last 90 days, you examine only sessions from the past 90 days.
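
To make this session bookkeeping concrete, the following is a minimal sketch in Python. It assumes each session is available as a record with a timestamp and a page-view count; the function and field names are hypothetical, not part of the model itself. It computes the share of one visitor’s sessions having more than five page views, either over all available sessions or bounded to the past 90 days.

    from datetime import datetime, timedelta

    # Each session is assumed to be a dict such as:
    #   {"timestamp": datetime(2011, 3, 14, 9, 30), "page_views": 7}

    def deep_session_share(sessions, page_view_threshold=5, window_days=None):
        """Fraction of a visitor's sessions having more than page_view_threshold
        page views, optionally restricted to the last window_days days."""
        if window_days is not None:
            cutoff = datetime.now() - timedelta(days=window_days)
            sessions = [s for s in sessions if s["timestamp"] >= cutoff]
        if not sessions:
            return 0.0
        deep = sum(1 for s in sessions if s["page_views"] > page_view_threshold)
        return deep / len(sessions)

    # Unbounded by time: consider every session in the dataset.
    # share_lifetime = deep_session_share(visitor_sessions)
    # Bounded by the last 90 days:
    # share_90_days = deep_session_share(visitor_sessions, window_days=90)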

The individual session-based indices are defined as follows:

Click-Depth Index (Ci) is the number of sessions having more than n page views divided by all sessions. (The calculation of n is discussed in a moment.)

Recency Index (Ri) is the number of sessions having more than n page views that occurred in the past n weeks, divided by all sessions. The Recency Index captures recent sessions that are also deep enough to qualify for the Click-Depth Index.

Duration Index (Di) is the number of sessions longer than n minutes divided by all sessions.

Brand Index (Bi) is the number of sessions that either began directly (had no referring URL) or were initiated by an external search on a “branded” term, divided by all sessions.

Feedback Index (Fi) is the number of sessions in which the visitor gave direct feedback via a Voice of the Customer technology such as ForeSee Results or OpinionLab, divided by all sessions.

Interaction Index (Ii) is the number of sessions in which the visitor completed any one of a set of specific, tracked events, divided by all sessions.

In addition to the session-based indices, I have added two small, binary weighting factors based on visitor behavior:

Loyalty Index (Li) is scored as 1 if the visitor has come to the site more than n times during the time frame under examination. Otherwise, it is scored 0.

Subscription Index (Si) is scored as 1 if the visitor is a known content subscriber (for example, has subscribed to my blog) during the time frame under examination. Otherwise, it is scored 0.

In each component of the index, the n value is arbitrary, but new users are encouraged to use their particular site’s averages to represent n. In other words, if a site’s average number of page views is six, a session with eight page views is counted toward that index, and a session with three page views is not. (Later, more experienced users can weight each index separately.) (For a more complete description of the model, as well as free ebook and whitepaper downloads, visit www.webanalyticsdemystified.com.)

To create the overall engagement score, take the value of each component index, sum them, and divide by 8 (the total number of indices in my model) to get a clean value between 0 and 1 that is easily converted into a percentage.
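
As a minimal sketch of that arithmetic in Python, assuming the six session-based indices have already been computed as fractions between 0 and 1 and the two weighting factors as 0 or 1 (the function name and example values are illustrative):

    def engagement_score(ci, ri, di, bi, fi, ii, li, si):
        """Sum the eight index values, divide by 8, and return a percentage."""
        components = [ci, ri, di, bi, fi, ii, li, si]
        return sum(components) / 8 * 100

    # Example: six session-based indices as fractions, two binary factors as 0/1.
    # engagement_score(0.40, 0.30, 0.25, 0.50, 0.0, 0.10, 1, 0) is roughly 31.9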

The Value of Engagement Scores

Once visitors have been scored, it is a straightforward matter to segment them into categories such as “highly engaged,” “somewhat engaged,” and “poorly engaged.” These segments become useful key performance indicators (KPIs) when added to current site reports. Note that this metric doesn’t judge whether a particular visitor is happy or sad, satisfied or dissatisfied, or can find what he or she is looking for. It simply makes a reasonable assumption that the visitor is paying attention, which is Web Analytics Demystified, Inc.’s proxy for engagement.
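
A minimal sketch of that segmentation step follows; the cutoffs of 60% and 30% are illustrative assumptions, not values prescribed by the model.

    def engagement_segment(score_percent, high=60, low=30):
        """Bucket an engagement score (0-100) into a reporting segment.
        The 60/30 cutoffs are illustrative, not prescribed by the model."""
        if score_percent >= high:
            return "highly engaged"
        if score_percent >= low:
            return "somewhat engaged"
        return "poorly engaged"

    # e.g. engagement_segment(31.9) returns "somewhat engaged"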

Although high-end software solutions such as Adobe’s SiteCatalyst, IBM’s Coremetrics, and Webtrends’ Analytics are potent tools, even a free product such as Google Analytics can produce significant insights from this sort of enhanced data.

With a “poorly engaged” segment defined, for example, a site owner can examine reports to answer questions such as these:

• On which landing pages did this segment arrive?

• From which search engines did they come? What search terms did they use?

• What did they buy?

• To what digital marketing did they respond? Based on their subsequent purchases, was it cost-effective?

• From which countries are they coming? For business-to-business (B2B) entities, from which domains?

Also, from these insights come potential actions:

Changing site content. What else could we be doing for our highly engaged visitors? How should we be treating them to move the conversation toward a task or goal?

Learning from clickthrough referrers. Ask the referring sites: What do our engaged visitors click on at your site? How can we help you send more of this segment to us? And ask ourselves: should we provide the highly engaged visitors who click through with different content opportunities?

Analyzing keywords. An analysis of search phrases that a segment uses may suggest the purchase of new, nonobvious, and more cost-effective terms at search sites. The use of branded keywords is an explicit illustration of attention as a mental state.

Triggering alerts. When a visitor’s engagement score suddenly spikes—say, from 30% to 60%—this may signal an imminent purchase decision, particularly in a B2B environment. A triggered report could prompt a salesperson to pick up the phone and call the prospect.
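
A sketch of such a trigger, using the illustrative 30-point jump mentioned above; the prospect fields and the notify_sales hook are hypothetical placeholders for whatever CRM or notification system is in use.

    def should_alert(previous_score, current_score, jump_threshold=30):
        """Flag a visitor whose engagement score has spiked, e.g. from 30% to 60%."""
        return (current_score - previous_score) >= jump_threshold

    # A nightly job might scan known prospects and notify sales:
    # if should_alert(prospect.last_score, prospect.current_score):
    #     notify_sales(prospect)  # hypothetical CRM or notification hook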

Perhaps the best way to illustrate the use of the model is to describe its application in two organizations: PBS and Philly.com. Both organizations were able to make substantial improvements in their online results after switching to engagement-oriented metrics and analyses.

Engagement Analytics at PBS

PBS is a private, nonprofit corporation founded in 1969. Its members are America’s public television stations—noncommercial, educational licensees that operate nearly 360 PBS member stations and serve all 50 states, Puerto Rico, U.S. Virgin Islands, Guam, and American Samoa. The corporation has transformed itself from a broadcast-only model to a truly multiplatform leader that serves Americans through television, mobile TV, the Web, interactive classrooms, and more. PBS reaches almost 117 million people through television and nearly 20 million people online each month.

The shift online has created entirely new opportunities with the corporation’s consumer audience. One of these opportunities is applying new, digital analytical applications to create a more robust understanding of audience engagement with properties, shows, technologies, and campaigns. Given the relatively high volume of consumer data generated by the corporation’s multiple digital investments, conventional wisdom dictated that developing this understanding of the online consumer could positively impact PBS’s programming strategy, both online and offline.

In 2009, under the direction of Jason Seiken, Senior Vice President of PBS Interactive, Amy Sample, Director of Web Analytics, set out to implement a multifaceted measure of engagement that leveraged the corporation’s investments in web analytics technology. This measure was incorporated into PBS’s existing analytics efforts and ultimately was used to help the corporation grow audience and revenue and increase engagement and satisfaction. According to Sample, “Understanding and measuring user engagement with our content has advanced our use of analytics at PBS beyond just audience reach and page views. By focusing on engagement with our content, we are delivering better experiences for our users.”

PBS uses Google Analytics to measure and analyze its online audiences. Although it isn’t as powerful or sophisticated as other available tools, Google Analytics provided Sample and her team with two benefits that ultimately drove the effort’s success:

• A user interface to the data that was industry-leading in its simplicity

• A set of application programming interfaces (APIs) that allowed the engagement data to be pulled into the corporation’s wider reporting efforts

The engagement calculation that Sample ended up using leveraged measures of audience loyalty, recency of visit, visit duration, and depth of visit. It was largely based on work done by Web Analytics Demystified and Victor Acquah, a consultant working for PBS. Through experimentation, Sample and Acquah determined that different combinations of this data were required, depending on which site was being measured, primarily because of the diversity of audiences the corporation serves.

Since the measure of engagement was rolled out across PBS digital properties in 2010, the conversation has shifted. The debate used to be over “what engagement means” and “how engagement should be measured.” Now it is about “which of PBS’s digital efforts drive online engagement,” “how engagement can be increased,” and “what impact increasing engagement has on satisfaction, audience composition, and ultimately revenue.” Moreover, Sample can compare engagement profiles across multiple PBS digital properties and investments, focusing on which engagement efforts are working and which need additional attention.

Sample and Acquah continue to analyze the drivers of engagement on PBS properties. Through their analysis, they discovered that video is a key driver of user engagement on PBS.org. As a result of their analysis, video content was featured more prominently on the redesigned home page, leading to a 42% increase in monthly video views. This translated into hundreds of thousands of dollars in incremental sponsor-driven revenue for PBS. “We have substantially grown our video streams and overall traffic by being very metrics-driven,” says Seiken. “The numbers confirm that we are keeping consumers engaged longer and dispelling the myth that PBS is just for older generations. The combination of our engagement analysis and our unique, uncluttered environment increases our potential sponsorship revenue.”

For 2011, PBS established a goal of increasing the total number of engaged users visiting PBS websites by 8 to 10 percent. The logic is, of course, that your best customers are your existing customers, but if you cannot keep those customers engaged you may lose them. Further, as the advertising community continues to examine their investments (and as PBS further explores advertising-based revenue models), having a good story to tell about the quantity and the quality of the PBS audience only helps drive revenue.

Engagement Analytics at Philly.com

Philly.com is an award-winning news, sports, and commentary site. It is the online home of the Philadelphia Inquirer and Philadelphia Daily News. It also creates a significant amount of its own content and aggregates the work of quality partners in the Philadelphia area. The site is particularly strong in breaking news and sports, which make up 30% to 40% of overall site traffic. Philly.com has been aggressive in adding new user engagement features, including gaming in sports and reader chats. It operates a professional video unit that produces about eight video shows per week. In 2009, the site was named a Top 10 sports news site in the country by the Associated Press Sports Editors. In 2011, the site won second place in the prestigious national Headliner Awards among newspaper websites for online presentation of a special report.

In early 2010, at the suggestion of the company’s web analytics manager, Chris Meares, leadership at Philly.com began exploring the use of visitor engagement as an alternative to traditional, page view-based measures of success on the site. Meares was familiar with the indices described in this chapter, and he determined how each could best be used for sales, marketing, and content planning purposes. He presented his results to the then-President, Vice President of Content, Vice President of Product Development, and Vice President of Sales.

Under the direction of Kevin Stetter, Vice President of Advertising, Meares set about creating a single measure of visitor engagement using Omniture SiteCatalyst and Omniture Discover. This metric, a variant on the model described earlier, was hashed out through trial and error against the key segments that the company had already been tracking. Initially, when Meares presented the finished product to the Sales organization, some concern arose, primarily from the perceived complexity of the calculation when compared to the company’s relatively simple existing set of metrics. However, as soon as Meares and Stetter explained the work behind the measure of visitor engagement, Sales immediately warmed to the idea, especially when they realized they would be able to sell advertisers on more engaged audiences and more engaging sections of the site.

Product managers and other content owners started using the visitor engagement metric as well, applying the calculation to different site sections, referring traffic sources, and geographic targeting data. What’s more, insights derived from the measure emerged immediately and affected multiple areas of the site, including jobs and real estate. Meares also discovered a strong correlation between sports content and engagement across the entire site, which is now leveraged as an “early warning system” to predict when overall site traffic (and thus advertising-based revenue) is waxing or waning. Now, instead of page views and visits, visitor engagement is the measure of success against which the Vice Presidents of Content and Product hold their staffs accountable.

“Needing a new online metric that focused more on the content of our website and geared toward our loyal visitors, Philly.com moved to engagement as the measuring stick for the performance of our website,” said Meares. “Since we have become more focused on driving the engagement of our visitors, we have seen an overall increase in content-related page views of over 26%, which is our most valuable inventory for advertising sales.” What’s more, analysis predicts that incremental advertising sales will lift revenues by as much as 10%, an estimated $500,000 to $650,000 per year.

Additionally, the company’s visitor engagement efforts have positively impacted many of the company’s valuable partnerships. “Since we instituted the engagement metric at Philly.com and moved away from tracking just page views and visits, we have learned much more about our most loyal users,” said Stetter. “The engagement metric has proved invaluable when discussing new online strategic initiatives as well as evaluating the current partnerships on our site. We are now able to gauge the effectiveness of our current and future partnerships from an engaged audience standpoint, which then allows us to tell a unique story to our advertisers.”

Thanks to a particularly good use of Adobe’s Omniture technology, a motivated analyst in Chris Meares, and a group of forward-thinking executives including Kevin Stetter, Philly.com is well positioned from advertising sales, partnership, and editorial perspectives. Over time, the measure of visitor engagement is likely to evolve in both its calculation and use to deliver increasingly valuable insights.
