20.05.2019 | Health Strategy

Pharmaceutical social media after Cambridge Analytica

By Daniel Ghinn

This article was originally written for and published in the May 2019 edition of PMLiVE.

As UK MPs call for regulation of social media companies, what are the ethical and compliance implications for the pharmaceutical industry?

“Democracy is at risk from the relentless targeting of citizens with disinformation”, according to the House of Commons Digital, Culture, Media and Sport Committee, which published its final report on Disinformation and ‘fake news’ in February 2019, signalling the end of an inquiry that had begun in September 2017 and put some of the world’s most powerful technology companies under the spotlight. The Committee stated that “This is the Final Report in our inquiry, but it will not be the final word.” Indeed, the investigation is likely to have far-reaching legal and ethical repercussions.

In an age when many in the pharmaceutical industry are still cautious about digital marketing, how will the House of Commons’ report affect the industry? And what lessons can we learn about ethical and legal boundaries in data, to shape our thinking about pharmaceutical use of social media?

The Committee’s report followed unprecedented accusations that Facebook and a company called Cambridge Analytica had used data on individuals to manipulate public opinion and behaviour. But the story began years before most people had heard of Cambridge Analytica, the data analytics company associated with the Leave campaign in Britain’s EU membership referendum and with Donald Trump’s election campaign, or of Christopher Wylie, the ‘whistleblower’ and data scientist whose idea, at the age of 24, had led to the foundation of Cambridge Analytica.

Informed consent?

Dr Aleksandr Kogan is a Cambridge University psychologist who once worked with Cambridge Analytica through his company, Global Science Research. A Senior Research Associate at the university’s Department of Psychology, he led a study there in 2015 examining whether social class plays an important role in international friendships.

To conduct this study, Dr Kogan and his fellow researchers, who included two Facebook employees, took two approaches. On the one hand, Facebook provided the researchers with aggregate, national-level data on 57 billion friendships formed in 2011 across every country in the world, which was compared with data on GDP per capita. On the other, the team used their own Facebook app (a third-party application integrated with Facebook’s platform) to collect data on friendships directly: 857 Facebook users consented to the app automatically gathering certain information from their profiles, including their total number of friends and their friends’ current locations.

Data loophole

In the same year that his academic research was published, Kogan set up Global Science Research (GSR), a company that is not part of Cambridge University. The app was repurposed, rebranded and made independent of the university.

When 270,000 Facebook users signed up for GSR’s app, a personality profiling quiz called thisisyourdigitallife, they agreed to share their personal details with GSR. But a Facebook loophole also allowed GSR to access those users’ friends’ data, so as people signed up, GSR gained access to the details of around 50 million people. With this data it was able to develop ‘psychographic profiles’ of Facebook users, which could be used to predict how individuals were likely to vote based on their personalities. GSR shared this data with Cambridge Analytica.

Cambridge Analytica whistleblower Christopher Wylie has also claimed to have been a central figure in setting up AggregateIQ (AIQ), a digital advertising, web and software development company with which the ‘Vote Leave’ campaign spent 40% of its budget. Evidence presented to the House of Commons Committee showed a flow of data, via AIQ, between entities including Cambridge Analytica, the pro-Brexit referendum campaign and Donald Trump’s presidential campaign.

In its report on Disinformation and ‘fake news’, the House of Commons Committee stated that “The work of AIQ highlights the fact that data has been and is still being used extensively by private companies to target people, often in a political context, in order to influence their decisions.”

A call for regulation

MPs in the UK have now called for regulation of technology companies. Commenting on the House of Commons Disinformation and ‘fake news’ report, the Committee’s chair, Damian Collins MP, said that “The rights of the citizen need to be established in statute, by requiring the tech companies to adhere to a code of conduct written into law by Parliament, and overseen by an independent regulator.”

The House of Commons has called for new independent regulation to be funded by a levy on tech companies operating in the UK, in order “…to create a regulatory system for online content that is as effective as that for offline content.”

If the technology industry has been complicit in the manipulation of people through disinformation, this must surely raise questions for industry at large, which has always used the communication of information to shape people’s views and behaviours. And what are the implications for how the pharmaceutical industry should use social media? Let’s take a look at three areas that relate to the industry’s use of social media: data collection and analysis; social media engagement; and targeted advertising.

Data collection and analysis

When it comes to social listening – a growing source of intelligence for pharmaceutical market researchers – it is already essential that data collection is carried out in line with the platforms’ terms of use, which apply to both the social media user and the researcher.
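By way of illustration, the short sketch below (in Python) shows the kind of simple filter a compliance-minded social listening pipeline might apply before any analysis takes place. It is a minimal sketch only: the Post structure, its fields and the prepare_for_analysis function are hypothetical stand-ins for whatever a given platform’s terms actually permit a researcher to collect, not any platform’s real API.

    # Illustrative sketch only: screen social listening data before analysis.
    # The Post structure and its fields are hypothetical, not a real platform API.
    from dataclasses import dataclass
    from typing import Dict, List

    @dataclass
    class Post:
        text: str            # the message content
        is_public: bool      # visibility as reported by the platform
        author_handle: str   # the author's public username

    def prepare_for_analysis(posts: List[Post]) -> List[Dict[str, str]]:
        """Keep only publicly visible posts and drop direct identifiers."""
        cleaned = []
        for post in posts:
            if not post.is_public:
                continue  # never analyse content the user has restricted
            cleaned.append({"text": post.text})  # retain the content, not the identity
        return cleaned

    if __name__ == "__main__":
        sample = [
            Post("Started a new treatment today", True, "@patient1"),
            Post("A private update for friends", False, "@patient2"),
        ]
        print(prepare_for_analysis(sample))  # only the public post's text survives

The design point is simply that visibility and identity checks happen before analysis, not after; the specific fields any real pipeline would check depend on the platform’s terms and the researcher’s legal basis for processing.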

In the Cambridge Analytica case, however, Facebook contravened its own user agreement. “It is evident that Facebook intentionally and knowingly violated both data privacy and anti-competition laws”, reads the House of Commons report. Facebook was willing to override users’ privacy settings in order to transfer data to some app developers.

It is worth noting that the concerns over Facebook’s treatment of user data are not simply a reflection of a new world in which personal data is considered sacred and the EU General Data Protection Regulation (GDPR) has reset the rules on data processing. In fact, in 2018 Facebook was fined £500,000 – the maximum allowable under the law that preceded GDPR – by the Information Commissioner’s Office (ICO), which stated that “companies are responsible for proactively protecting personal information and that’s been the case in the UK for thirty years.”

As far as regulatory compliance is concerned, this changes nothing for the pharmaceutical industry, which was regulated on the collection of data about individuals well before recent data protection laws were introduced. Permission-based digital communication is already a fundamental concept in pharmaceutical marketing.

Social media engagement

The pharmaceutical industry is arguably more cautious than most other industries when it comes to engaging customers via social media, and perhaps with good reason: direct-to-consumer marketing of prescription-only medicines is illegal outside the United States and New Zealand, and some pharmaceutical marketers still choose to avoid any risk of perceived product promotion by locking down social media engagement.

But well over a decade into the social media age, most major pharmaceutical companies have engaged on social media in some form. While many in the industry feel that progress is painfully slow, the Facebook–Cambridge Analytica case is unlikely to hinder those companies that have already navigated a course to compliant social media engagement.

Targeted advertising

Perhaps the most significant ethical issue emerging from the Cambridge Analytica story is the use of data to spread what the House of Commons Committee calls ‘disinformation’ in order to manipulate people’s behaviour. On this point, along with the issues of foreign influence and the sharing of data, the Committee has called for independent inquiries into past elections so that changes to the law can be made.

There are, of course, always important ethical considerations when it comes to using marketing and communications tactics – whether on social media or in any other environment – to influence people’s views and behaviours. In any industry, the use of false or misleading information would clearly be a step too far, ethically and possibly legally.

So again, the Cambridge Analytica case is unlikely to change the rules of engagement for the pharmaceutical industry, although it is possible that, with calls for tighter regulation of technology companies, the options for targeting messages at individuals may become more restricted.

Take a lead on ethical innovation

In the complex world of disinformation, ‘fake news’ and security breaches, the key for the pharmaceutical industry must surely be to take a lead on the ethical use of data. In an industry that is already heavily regulated, the opportunity for innovators is to consider how data shared with informed consent by social media users can shape tactics that openly give value back to those users, positively influence health behaviours and achieve great outcomes.
