
The Benefits of Transparency

Evan Allgood
Writer

Editor’s Note: Data sharing and marketing. Does that combo scare you, conjuring dark images of Facebook, Cambridge Analytica and Ashley Madison? Or does that excite you, with hopes of easier shopping and customized preferences? Or do you think it doesn’t impact your life at all?

Whatever you think about data gathering, it’s happening, and the organizations collecting and using that data don’t just have an ethical responsibility to “do the right thing”; they also have a bottom-line responsibility to “do the best-business thing.” Sometimes those responsibilities compete, but sometimes they can work together. How can you tell which practices create which tensions and which opportunities? By using PeopleScience’s favorite tool for advancing the world: experimentation. Read on …

 

The proliferation of consumer data in the Digital Age presents a golden opportunity for companies, but it also poses some tough questions: How transparent should we be about data collection and use? Which data should we use to tailor consumer experiences?

“There is a greater demand for transparency from consumers and regulators,” says Tami Kim, a professor of marketing at the University of Virginia. “Companies are still trying to figure out how to manage all these demands while optimizing how they use consumer data.”

Kim and two other researchers, Leslie John of Harvard and Kate Barasz of the University of Navarra in Spain, partnered with the Maritz Field Research Collaborative to conduct a two-part study that aimed to measure the impact of transparency on rewards and loyalty programs. Specifically, they wanted to know if providing more transparency about product recommendations would increase those recommendations’ effectiveness.

Their findings have major implications for how rewards programs should pitch their offerings.

If you’re a loyalty program wondering whether or not to be more upfront about how you’re using customers’ data, this study says yes.

How did these transparency studies work?

In the spring of 2017, the researchers carried out two experiments involving several thousand participants in Maritz-powered rewards programs.

In the first study, 9,079 participants were randomly divided into a Control group and a Treatment group. The Control group saw a sidebar of recommended items that just read, “Recommended.” The Treatment group’s sidebar read, “Recommended based on your clicks on the site.”

The second study featured fewer participants (1,862) but a similar premise. Again the Control group’s sidebar just said, “Recommended.” But this time, the Treatment group’s sidebar read, “Recommended based on what you’ve shared with us.”

The researchers set out to measure 1. how likely each group was to click on a recommended item, 2. how much time they spent on the Recommended Products page and 3. how many points they spent on recommended items.

Maritz’s Senior Data Strategist, Sarah Ramrup, says that the researchers gathered enough data within just a few weeks of launching the experiment. "Running two tests, each with a control (as opposed to just testing 'Recommended based on your clicks' vs. 'Recommended based on what you’ve shared with us' within the same single group), allowed us to better measure the effect of the language."
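
To make the design concrete, here is a minimal sketch (in Python) of how an analyst might check whether the Treatment group’s click-through rate differs reliably from the Control group’s. The article doesn’t report raw counts or the researchers’ statistical method, so the click counts and the two-proportion z-test below are illustrative assumptions only, not the actual analysis.

    from math import erf, sqrt

    def two_proportion_ztest(clicks_a, n_a, clicks_b, n_b):
        """Two-sided z-test for the difference between two click-through rates."""
        p_a, p_b = clicks_a / n_a, clicks_b / n_b
        p_pool = (clicks_a + clicks_b) / (n_a + n_b)            # pooled rate under the null
        se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))  # standard error of the difference
        z = (p_b - p_a) / se
        p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))   # two-sided p-value via normal CDF
        return p_a, p_b, z, p_value

    # Hypothetical counts: ~4,540 participants per arm (9,079 total, as in Study 1),
    # with the Treatment group clicking roughly 11 percent more often, relatively, than Control.
    p_control, p_treatment, z, p = two_proportion_ztest(900, 4540, 1000, 4539)
    print(f"Control CTR: {p_control:.1%}  Treatment CTR: {p_treatment:.1%}")
    print(f"z = {z:.2f}, p = {p:.4f}")

Running two separate tests, each against its own control, is what lets a comparison like this isolate the effect of the wording rather than the effect of having any label at all.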

What were the results?

In the first study – “based on your clicks” – the Treatment group (which experienced more transparency) was 11 percent more likely to click on items that were recommended to them versus the Control group. They also spent 34 percent more time on the Recommended Products page and 38 percent more points on recommended items.

[Chart: Study 1 results]

In the second study – “based on what you’ve shared with us” – participants who experienced more transparency were 50 percent more likely to click on items that were recommended to them versus the Control group. They also spent 39 percent more time on the Recommended Products page.

[Chart: Study 2 results]

Here’s where things get interesting: In the second study, there was no significant difference in how many points the Treatment group redeemed on recommended items versus the Control group — even though they spent significantly more time viewing the recommendations.

 

What do these numbers mean?

There’s a lot to unpack here, but the bulk of these results suggests that when it comes to rewards and loyalty programs, transparency pays off. As the study’s authors put it in a recap for Harvard Business Review:

Such disclosure can be beneficial when targeting is performed in an acceptable manner — especially if the platform delivering the ad is otherwise trusted by its customers … We also found that when trust was high, disclosing acceptable flows actually boosted click-through rates.

By “acceptable flows,” they mean data collected from within the site (as opposed to data from a third party, which makes people squeamish).

A couple of caveats from the researchers:

  1. There is a lot of conflicting data when it comes to transparency, i.e., “There’s a fine line between creepy and delightful.”  
  2. In addition to consumer trust (which rewards and loyalty programs tend to have), two key factors with transparency are control — the ability to change one’s privacy settings, block ads, etc. — and justification — explaining why the consumer is seeing a recommendation or why you’re tracking their location. If you don’t have trust and you don’t offer control and justification, transparency may backfire, i.e., it’ll skeeve people out. (Editor’s note: We try not to get too technical here at PeopleScience, but it’s sometimes unavoidable, so apologies for using highly scientific terminology like “skeeve people out.”)

Still, if you’re a loyalty program wondering whether or not to be more upfront about how you’re using customers’ data, this study says yes.

Why didn’t the second study boost redemptions?

Remember that in both studies, participants who experienced more transparency were more likely to click on the recommended products and spent more time viewing them. But only the first study — “based on your clicks” — actually increased the number of points they redeemed for these items (by a whopping 38 percent).

The second study, in which the sidebar read, “Recommended based on what you’ve shared with us,” did not significantly boost point redemptions.

Tami Kim says that one reason for this is just that redemptions or purchases are really hard to get. “It’s a hard action to convince customers to take — much harder than just getting them to click on something.”

Sarah Ramrup wonders if the first experiment boosted redemptions because the wording was more explicit regarding data collection. “We were specific about what we were using, and how we were using it,” she says.

“More companies are trying to be transparent, but we still have a long way to go.” – Tami Kim

What’s next?

Ramrup says that Maritz has already started incorporating some of these findings into its offerings. “This speaks to how excited we are to incorporate behavioral science into what we do. Bringing data and science together through an on-site experiment is just one example of how we are trying to create valuable participant experiences.”

(Editor’s Note: Transparency Edition: A reminder that PeopleScience is supported by our parent company, Maritz. We love them, not just for their support, but because they do embrace behavioral science in a way we appreciate. We try to maintain editorial objectivity in everything we publish here, but acknowledge we may have a blind spot or two. This disclosure is brought to you by the content of this article and the letter “guilt.”)

For Kim and the research team, this study — while compelling — was just the tip of the iceberg. They want to explore other aspects of transparency, such as…

  • Mediators of effectiveness: “There may be instances in which one’s desire for privacy and for personalization are not in conflict. Thus, we leave open the possibility that other factors may also mediate the path from transparency to effectiveness.”
  • Additional implications for the firm: “Ad transparency may affect a range of consumer perceptions and behaviors, beyond the propensity to click on a given ad … it is worth investigating how ad transparency might affect consumers’ holistic view of the firm.”
  • Implications for consumer welfare: “It is conceivable that transparency could also have unforeseen adverse effects for consumers … Future research should examine the effects of ad transparency on consumer welfare.”
  • Ad transparency that is conspicuous versus merely available: “If left to their own devices, how likely are consumers to seek out (transparency) when it is made available to them? Are certain types of consumers more likely to seek such information than others?”
  • Trade-off between privacy and personalization: “How much personal information — be it demographic, stated preference, or behavioral — are consumers willing to divulge in exchange for better personalization?”
  • Dynamic attitudes and practices: “Consumer attitudes about the collection and use of personal information have changed over time, and will inevitably continue to change into the future … empirical research must keep pace.”

In the meantime, rewards programs would be wise to exhibit more transparency with consumers.

“More companies are trying to be transparent,” Tami Kim says, “but we still have a long way to go.”

