Facebook and Cambridge Analytica: Let’s discuss.
First, let’s start with the facts so we can at least be on the same page. Dr. Aleksandr Kogan, a Cambridge University researcher who had previously done academic research with Facebook on social dynamics, collected data from Facebook for an “academic study” and then turned around and sold some 50 million records (of which only about 250,000 came from users who had legitimately opted in) to Cambridge Analytica, under the auspices of being able to leverage algorithms to better understand the data.
Dr. Kogan, who remains confused as to why he is being singled out (in his words, there are “plenty of other Facebook apps that also collect data”), also seems confused about the terms and conditions of academic research and data collection on Facebook, particularly the little detail about not being allowed to sell the data. But apparently, that is just the fine print.
So now we have 50 million records, collected under questionable and potentially illegal circumstances, being used to power engagement programs for the Trump presidential election campaign. Everything up to this point is clearly wrong. The data was inappropriately (at best) or illegally (at worst) collected and then sold (the really big no-no here). Without question, all of this is horrible.
But what happened next is where I struggle: Cambridge Analytica used the collected data to brilliantly and accurately target and personalize campaigns in order to raise awareness and support for its client.
And it worked.
Cambridge Analytica leveraged the Facebook data to understand behavior, sentiment, attitudes and opinions, segmenting its audience to deliver highly personalized, hyper-targeted messages designed to sway voter opinion. Donald Trump won, and both Cambridge Analytica’s chief data officer and its CEO have said that data was at the core of the win.
Boil it all down: Cambridge Analytica used social data, targeted consumers and personalized engagements to drive better results, and changed perceptions to the benefit of their client. It worked…and worked well. Hooray for data and analytics, right?
So why do I feel so bad about it?
Similar to the infamous “Target knew my daughter was pregnant before I knew” headline from a few years back, Cambridge Analytica pulled off what many of us dream of. What makes me feel slightly dirty by association is that, if you remove the parts about stealing, selling and who they did it for, data did what data is supposed to do. It delivered insights about an audience, helped craft strategy and direction, and then powered personalized engagements to shift, sway or solidify intentions. Now, if I could just get my head around everything that went so horribly, horribly wrong…
All of this happened between 2014 and 2015. Facebook was informed of the misuse by a whistleblower in 2015 and sent a “sternly worded letter” telling Cambridge Analytica that the data needed to be deleted. Everyone said, “Sure, we will delete it,” but since no one was asked for proof, it shouldn’t shock anyone that reporters investigating the mess found gigabytes of the data still in existence.
Even after news of the debacle broke, Facebook stalled, delayed and gave partial statements, trying to return to an age when a statement of “but we are just the platform and can’t be responsible for what happens on it” would be accepted. When asked by a reporter what I thought Facebook needed to do in the wake of the newest mess, my answer was simple: Facebook needs to grow up.
Eventually, Mark Zuckerberg said that the privacy and security of the community was more important than profitability and that new measures would be set in place to ensure that nothing like this could or would ever happen again. The company is pulling out all of the stops to secure the trust of the community and stop an exodus of users who are terrified by the recent headlines and upset that their data was used against them.
Sadly, while Facebook seems to be going to great lengths to win back the trust of its users, little is being done to secure the trust and business of its customers (i.e., the brand marketers who were irreparably damaged by this whole mess). With every breaking tidbit of news, stories reminded consumers that there are bad actors out there who want to use their data. I heard one news anchor explain, “Everyone from Donald Trump to your local grocery store is out there mining your data to find out everything about you.” If the Edelman Trust Barometer placed advertising and marketing at the bottom of the barrel before, then next year it will need a new category called “well below zero” to sum up how consumers feel about marketers and data.
It could be argued that Facebook wasn’t the party at fault here, as Dr. Kogan and Cambridge Analytica were the bad actors defrauding the public and Facebook alike. In reality, the party that will pay the longest-lasting price for this betrayal of trust will be marketers who are attempting to leverage social data to create personalized experiences. We will need to act with care and operate with total transparency—something that Facebook itself finds hard to do with its clients. We will need to ensure that each act of marketing and data analytics is grounded in respect and empathy for the customers that the data actually represents. And first and foremost, we will need to work twice as hard to secure the trust of our customers, considering that the digital platforms we all rely on aren’t infallible and shouldn’t be blindly trusted.