Data and Trust: 4 Pressing Risks for CMOs

Blog Post
April 13, 2018

No member of the C-Suite wants to be pressed into duty for an apology tour that includes giving televised testimony before US House and Senate committees. Yet Facebook CEO Mark Zuckerberg did just that. How many executives lose sleep worrying about walking in those shoes?

Customer data privacy is an area of intense public and governmental concern across the globe. Since customer data are often managed by marketers, companies need their CMOs to take the lead on this one. We see four major risks that must be managed immediately. If done well, the risks become opportunities.

Risk 1: Not Advocating for Data Privacy

While Facebook talked a good game, it was not advocating for users. It was leveraging user data for economic gain with advertisers. Privacy advocates had been shouting about the problem since 2010, even before Facebook went public in 2012. What we witnessed in user data practices since then was a race to the bottom, in which user data was merely a means to an end. Data practices did not serve users nearly so well as they served Facebook.

Marketers should know better. Customer data is part of a sacred, assumed and implicit agreement of trust: I give you data about me as part of an arrangement, and I trust you will protect it carefully. Trust is the universal essence of relationships. When trust is broken, relationships fall apart.

Avoiding a data apology tour starts with advocating for the customer and being a champion for their data interests. This is easily said but far more difficult to implement. And to turn this risk into an opportunity, the marketer needs to be scientific.

Everyone’s natural state is 100% privacy: people do not want their personal data revealed to strangers. From the starting point of privacy by default, any use of customer, prospect or user data must serve their interests. The tricky part is finding out which compromises to 100% privacy will be viewed by the customer as beneficial and which ones are likely to lead to an apology tour. This is where marketing science can help.

Customers are often unaware of what they really need. This is especially true when it comes to personal data. You cannot simply ask ordinary people questions about data privacy and get knowledgeable answers, especially when the value exchange between the customer's provision of personal data and the company's intended uses is described only in a privacy statement. And you certainly cannot expect a typical customer to read and fully comprehend a three-thousand-word, legalistic privacy statement and terms of use, much less have it engender their trust.

Customers often have no idea how a specific data privacy compromise could lead to various side effects. So, they either react by insisting on 100% privacy, or they naïvely consent to exposing their data without full awareness of the downstream side effects. Neither response is likely to serve the customer or the company in the long run.

A useful parallel is found in pharmaceuticals, where drug ads list possible side effects and physicians decide whether or not to write the prescription. Marketers need to play the doctor's role and protect their customers from the possible side effects of any compromise to 100% privacy. They should embrace the spirit of the Hippocratic oath and apply it to customer data: ensure that every use of customer data is designed to serve the customer and does no harm to the relationship.

Risk 2: Not Being Scientific About Data Privacy Needs

Marketing science can help marketers uphold a “do no harm with customer data” oath. It can show them which data privacy compromises indeed serve customers and which sets of privacy compromises correspond with the unique needs of specific customer groups. In the digital age, it is even possible to segment by individual customer. All of this is nothing more than applying marketing science to the issue of customer data privacy.

One market research tool that is especially useful when looking for innovative solutions to challenging marketing problems is the Jobs-to-be-Done methodology. When applied to data privacy, it tells marketers to ask questions like:

  • What jobs does a customer or prospect want to get done that are facilitated by the use of their personal data?
  • Do we ever use customer data in a way that does not help them get an important job done?

When customer data is used by a company to complete tasks that are important to customers, then the compromise of 100% privacy serves the customer. When the use of customer data is for a job that is not important to them, or worse, serves someone else at the customer’s expense, then you increase the risk of an apology tour.
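The Jobs-to-be-Done audit described above can be sketched as a simple inventory check. This is a minimal, illustrative sketch: the data uses, job descriptions, and function names are assumptions invented for the example, not a real audit framework.

```python
# Hypothetical inventory mapping each use of customer data to the
# customer job (if any) it helps get done. Entries are illustrative.
DATA_USES = {
    "browsing_history -> product_recommendations": "find a relevant product quickly",
    "email_address -> order_status_updates": "track my order",
    "browsing_history -> resale_to_third_parties": None,  # serves no customer job
}

def audit_data_uses(uses):
    """Return the data uses that serve no customer job: apology-tour risk."""
    return [use for use, job in uses.items() if job is None]

risky = audit_data_uses(DATA_USES)
print(risky)  # -> ['browsing_history -> resale_to_third_parties']
```

Any use that maps to no customer job is exactly the kind of imbalance the legitimate interests test is meant to surface.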

European regulators use a legitimate interests test. It is a systematic framework for balancing the needs of the organization with the needs of the customer. If a company ensures that all uses of customer data are aligned with tasks customers want done, then the company and customer are always in balance.

The essence of marketing is segmenting customers according to their differences. We can segment customers into groups according to the jobs they want done. The world of digital marketing often enables us to go even further and segment by individual customer.

In application to data privacy, we can segment customers according to the data-enabled jobs they want done during the buyer’s journey. For example, one group of customers may want digital content presented to them, which helps them complete their buyer’s journey. Others may want to pursue their buyer’s journey with absolutely no assistance whatsoever. For this second group, any use of the digital breadcrumbs they leave along their journey is not only unappreciated but is also perceived as a violation of their privacy. The opportunity for companies is to find ways to meet the needs of both personas.

The Privacy Dashboard concept lets customers define how their personal data is used by the company for marketing purposes. The dashboard helps them tailor their shopping experience for a given moment in time and for how their information can be used in the future. In effect, it is a one-stop shop. It engenders trust by providing transparency about how personal data is used by marketing. Such an approach is also consistent with EU guidance.
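A Privacy Dashboard could be modeled as a per-customer record of toggles that marketing checks before every use of personal data. The sketch below is an assumption-laden illustration: the permission names and class shape are invented for this example, not a standard schema.

```python
from dataclasses import dataclass, field

@dataclass
class PrivacyDashboard:
    """Illustrative per-customer consent record (field names are hypothetical)."""
    customer_id: str
    permissions: dict = field(default_factory=lambda: {
        "personalized_content": False,   # privacy by default: everything off
        "email_marketing": False,
        "behavioral_tracking": False,
    })

    def allow(self, use: str) -> None:
        self.permissions[use] = True

    def permits(self, use: str) -> bool:
        # Unknown uses default to denied, consistent with privacy by default.
        return self.permissions.get(use, False)

dash = PrivacyDashboard("cust-42")
dash.allow("personalized_content")
print(dash.permits("personalized_content"))  # True: customer opted in
print(dash.permits("behavioral_tracking"))   # False: never opted in
```

Starting every toggle at "off" mirrors the privacy-by-default posture described earlier: any compromise to 100% privacy is the customer's explicit choice.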

This is what marketing is all about: discovering customer needs and responding to them. It’s all about respecting the customer and driving your company to be customer-centric. This is marketing’s highest calling, and it needs to be applied, with renewed rigor, to the topic of customer data.

Risk 3: Facebook Data in Your Warehouse

If you buy or access consumer data from data brokers, there is a rapidly growing chance you already store data that was diverted or scraped from Facebook without user permission.

The data accessed by Cambridge Analytica was first estimated to come from 50 million users, but the estimate quickly jumped to 87 million. Then a week or so later Mark Zuckerberg told us that most of Facebook's 2.2 billion users had their personal data scraped by various bad actors. Recently, the original whistleblower, Christopher Wylie, cautioned that ill-gotten Facebook data may be stored in Russia, since the original provider, Aleksandr Kogan, is a Russian data scientist.

It doesn’t take an expert to conclude that ill-gotten Facebook data has now spread far and wide. It is like a digital cluster bomb that exploded through the internet into billions of sub-munitions. There is a fairly good chance one of these bomblets is in your data shop, that you have no way of finding it, and that you probably can’t even trace how it got there. That’s the bad news: you may have tainted Facebook user data in your data stores, and you can’t get rid of it.

The good news: if you do no harm with this data, then you abide by the privacy oath. The privacy oath assumes you have private data. What’s important is that you do no harm with it.

This does not mean you are free to actively seek data, either on your own or through a data broker, that was not willingly released. Knowingly acquiring ill-gotten data violates both ethical business standards and the Fair Information Practice Principles (FIPPs). It also violates the oath, as does turning a blind eye to the possibility that a data broker sells tainted personal data.

You need written assurances from your data brokers that the data you have acquired, or intend to acquire, does not, to the best of their knowledge, contain Facebook user data. But the Cambridge Analytica scandal also tells us written assurances are not enough. Marketers must perform due diligence on the broker's controls and measures for ensuring legitimate personal data collection practices.

Risk 4: Lax Security Measures

With the benefit of hindsight, we can see the laxity in Facebook’s approach to user data security. Contracting with developers to delete extensive Facebook user data files, as was the case with Cambridge Analytica, now looks more like a wink and a nod.

First, developers can convert raw Facebook data into “signals.” Mathematical operations are applied to multiple raw data fields to aggregate them into new, fewer, and more powerful descriptive or predictive data variables. If the original Facebook data is deleted but the descriptive and predictive signal variables are retained, the developer may conform with the letter of the contract, but the Facebook user is not protected at all.
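The raw-data-to-signal conversion can be shown in a toy sketch. The field names and weights below are invented for illustration; the point is only that once a derived score exists, deleting the raw fields no longer protects the user.

```python
# Hypothetical raw profile fields scraped from a social platform.
raw_profile = {"likes_politics": 12, "likes_sports": 3, "posts_per_week": 9}

def engagement_signal(profile):
    """Aggregate several raw fields into one derived 'signal' variable.
    The weights are invented for illustration."""
    return (0.5 * profile["likes_politics"]
            + 0.2 * profile["likes_sports"]
            + 0.3 * profile["posts_per_week"])

signal = engagement_signal(raw_profile)
del raw_profile  # "delete the raw data," per the letter of the contract
print(signal)    # the derived, predictive signal survives the deletion
```

Deleting `raw_profile` satisfies a contract that says "delete the data," yet the retained `signal` still carries the predictive value extracted from it.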

Second and far worse, even if developers delete all Facebook data and any associated signals generated from the Facebook dataset, the powerful predictive behavior models stay with the developer as their intellectual property.

The purpose of these models, as every marketer knows, is to predict how behavior will change when exposed to a particular type of message or content. Such predictive psychographic behavior models can be used to do harm to users, and society overall, much more so than the raw data itself. This is the real and lasting damage done by Facebook’s user data laxity.

For example, multiple Republican candidates hired Cambridge Analytica from 2014 through 2016 to manage their advertising campaigns. While those campaigns may have used Republican National Committee data as the data file of record, it is a safe bet that RNC data was poured into Cambridge Analytica's predictive behavior models to help manage ad spending, models originally developed using Facebook data.

Many now expect the EU to penalize Facebook for its lack of appropriate caution with user data. EU regulators have already fined Facebook for misleading statements about WhatsApp, and there may be mounting determination in the US to penalize Facebook under a 2011 Federal Trade Commission consent decree.


CMOs must take the lead in the C-Suite when it comes to customer data. The wild, wild west era of fast and loose data practices is gone. Now, the race is on to establish modern and more customer-centric data privacy policies. Lagging behind not only represents a legal and punitive damages risk. It also misses the chance to become more customer-centric—and to innovatively differentiate in a way that is top-of-mind for many if not most customers.