Recent events involving Facebook and Cambridge Analytica serve as a powerful reminder of the new technological era we are entering. Because this age of ‘big data’ is unprecedented, security breaches and data misuse are often dealt with only after the fact, with companies facing severe backlash from consumers who feel their information is not properly protected.
To summarise briefly: Cambridge Analytica combines data analysis with behavioural science, giving businesses and political parties information they can use to target individuals with marketing material.
Christopher Wylie initially came up with the idea to combine data and psychology in this way, and oversaw the creation of these ‘psychological profiles’ at Cambridge Analytica. About a year ago, Wylie became a whistleblower and began sharing his story with reporters, but the story only started gaining widespread attention a few weeks ago.
Channel 4 aired an in-depth exposé covering Cambridge Analytica’s questionable practices, which included targeting US voters with specific political advertisements based on their psychological profiles, as well as using fake news to influence elections across the globe.
While all these activities are dubious to say the least, it was the way in which Cambridge Analytica obtained the data to create these profiles that drew the greatest public outcry. The firm used personal information gathered from millions of Facebook profiles to send users targeted ads based on their psychological profiles.
While some individuals may have permitted this, many did not. Cambridge Analytica obtained data both from individuals who had taken part in a personality quiz on Facebook and from those users’ friends, who had never agreed to their data being used in this way.
Facebook has come under major scrutiny as a result, with a recent poll suggesting that less than half of Americans trust Facebook to protect their data and follow US privacy laws.
Although Facebook was not responsible for creating these targeted advertisements, the company has come under attack for not having sufficient safeguards in place to prevent something like this from happening in the first place.
In addition, Facebook has apparently known about this breach since 2015, and although it asked Cambridge Analytica to destroy the data it had obtained, the firm never did. These revelations have triggered a larger debate about how much data we allow companies such as Facebook to access, and how little we know about what actually happens to it.
Although Mark Zuckerberg stated he was “really sorry that this happened,” it is highly unlikely the breach caught the company by surprise: using personal data for commercial purposes, whether or not the individual is aware of it, has always been part of the Facebook business model.
Following the release of the Cambridge Analytica story, many reports emerged describing how digital marketers and market researchers use tactics like this all the time. Within that group, it was common knowledge that Facebook’s API allowed a third-party app to access the friend data of the individuals who interacted with it.
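For readers unfamiliar with how that worked in practice, the sketch below is a rough, illustrative reconstruction of the kind of requests a third-party quiz app could make against Facebook’s since-retired Graph API v1.0 once a user had granted it friend-related permissions. The endpoint paths, field names and placeholder token are assumptions about that old API for illustration only; none of this works against today’s platform.

```python
# Illustrative sketch only: roughly how a pre-2015 third-party quiz app could
# read a quiz-taker's friends' data via the retired Graph API v1.0, assuming
# the quiz-taker had granted extended "friends_*" permissions at login.
# Endpoints, fields and the token below are placeholders, not working values.
import requests

GRAPH = "https://graph.facebook.com"          # v1.0-era base URL (assumption)
USER_TOKEN = "ACCESS_TOKEN_FROM_QUIZ_LOGIN"   # placeholder token from the quiz-taker's login

# 1. List the quiz-taker's friends using the quiz-taker's own access token.
friends = requests.get(
    f"{GRAPH}/me/friends",
    params={"access_token": USER_TOKEN},
).json().get("data", [])

# 2. For each friend, read profile fields that the friends_* permissions exposed,
#    even though those friends never interacted with the app themselves.
for friend in friends:
    profile = requests.get(
        f"{GRAPH}/{friend['id']}",
        params={"fields": "name,likes,location", "access_token": USER_TOKEN},
    ).json()
    print(profile.get("name"), profile.get("location"))
```

The key point the sketch illustrates is that a single user’s consent effectively unlocked data about people who had never used the app, which is how a quiz taken by hundreds of thousands of users could yield profiles on millions.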
Although access to big data like this could help businesses tailor services to individuals more efficiently, it must be handled in a way that keeps users aware of what is happening, so they do not feel as if their information is being sold to the highest bidder. Companies’ data collection and use should be transparent, so users understand what they receive in return for giving their information away.
Facebook’s recent apology may indicate that the company is making a conscious effort to fix these issues and prevent them from occurring again. Or it may be a last-ditch effort to save face after being caught out. Either way, it underlines that big tech companies must transform the way they have operated in the past in order to regain the trust and confidence of their users.
It’s clear there has been a shift in how individuals view big data and the companies that have access to it. It is equally clear that enormous benefits and major improvements are possible with the right access to data and new technology. Without the trust of users, however, those potential gains may never be realised.
Brands could be completely transformed by the right approach to big data. Access to this information allows for more personalised, predictive customer experiences, a win-win for businesses and consumers alike.
Earning and maintaining this trust, however, is an increasingly complex business, and brands must learn how to demonstrate trustworthiness and earn the loyalty of today’s demanding consumers in this new era of big data.
Our event, Future of Brands in a Post-Truth Era, will be discussing these topics in greater detail and is now open for registration.
Rocket fuel, our one-day summit, revolves around the idea that leveraging data is key to the efficiency of all new disruptive technologies. Register your interest today.