Political advertising is as old as democracy itself. In the current digital age, however, political parties have begun to exploit the data economy to a manipulative extent. One such practice is micro-targeting: a form of advertising that involves analysing individuals’ personal data and weaponising that data to tailor advertisements to an individual’s susceptibilities. Over the past decades, political parties and campaigns have increasingly used micro-targeting to influence election outcomes by targeting opposing and indifferent voter bases. The danger to the democratic process is evident, as micro-targeting manipulates voters’ behaviour by interfering with their capacity to choose their vote freely. Secretive data analytics companies have played a leading role in collecting and analysing data, as well as in tailoring advertisements based on that data.
Instances of micro-targeting that have allegedly made a major impact on politics have occurred across the globe, including the 2016 Trump campaign, the Brexit campaign, and numerous elections in Germany, Italy, Nigeria, Malaysia, Ukraine and elsewhere. Despite the EU’s theoretically strong personal data protection mechanisms, a lack of enforcement and effectiveness has prevented any actual litigation. The European Commission has put forward ambitious proposals to hold micro-targeting entities accountable and protect personal data, such as the European Democracy Action Plan and the Digital Services Act. Bearing in mind the legality of political advertising, the lack of enforcement of EU data protection and the European Commission’s ambition to make sharing of data among Member States easier: how can the European Union tackle micro-targeted political advertising while respecting the right to political advertising?
‘Personal data is the new oil of the internet and the new currency of the digital world’, said Meglena Kuneva, the former European Consumer Commissioner, in 2009. Less than a decade later, in 2017, data surpassed oil to become the world’s most valuable commodity. In the current age of internet giants such as Facebook, Google and Amazon, it is becoming increasingly difficult to regulate these companies, and by extension the use of the data they collect.
Now, policymakers are faced with a challenge: protecting their democracies from the manipulation of elections. Over the past decade, several scandals have come to light concerning the misuse of personal data during election campaigns. What these scandals have in common is the involvement of social networks, with Facebook as the most notable example, and secretive data analytics companies, such as Cambridge Analytica and Aristotle. In these times of data-driven campaigning, what should the European Union (EU) do to protect the integrity of elections, while upholding the right of political parties to advertise their campaigns? As Carole Cadwalladr, a journalist for The Guardian, put it: “It’s about whether it’s actually possible to have a free and fair election ever again.”
The European Commission’s Joint Research Centre (JRC) has researched the threats that digital platforms pose to democracy. In the digital age, two aspects form a threat to democracy:
1. Data is a commodity
The JRC found that a crucial threat to democracy is the commodification of personal data. Users of social networks such as Facebook are often not aware that their data is monitored, saved, analysed or even sold. The issue is that this data is used to micro-target political advertisements, without the consent or knowledge of the affected voters.
2. Online advertisements affect voter behaviour
In addition, the JRC found that these micro-targeted advertisements, and exposure to information on digital platforms in general, have a significant impact on voting behaviour. The more personal data a company has, the more precisely it can tailor political advertisements to sway an individual’s vote.
The JRC affirms that a functioning democracy relies on open discussion, as long as freedom of choice can be guaranteed; in the digital information sphere, however, that discussion is controlled by a few actors with too little oversight.
Furthermore, the JRC notes that policymakers should focus their attention on three key factors: actors, content and behaviour. Despite the consideration of policy on actors and content regulation, the JRC fears that policymakers are falling behind on the aspect of behaviour, and urges them to catch up with technology.
Data Analytics companies
Since these companies are difficult to define, they are best identified by the manner in which they operate. Firstly, their work relies on techniques driven by personal data: collecting, selling or analysing data, or relying on data collection or analysis for their strategies, tools or techniques. These data analytics firms excel at data-driven political campaigning. Despite the secrecy of these projects, Cambridge Analytica’s process has been uncovered and can serve as a reference.
Secondly, these firms have political clients. Some data analytics firms are transparent about their clients, for instance on their websites, whereas most are more secretive.
A notorious example of the data analytics companies with political clients is Cambridge Analytica, although it has many lesser-known counterparts, such as AggregateIQ, Harris Media and Aristotle. Our Data Our Selves, a research project by Tactical Tech, estimates that there are over 300 companies across the globe that provide similar data-driven campaigning for political campaigns. Secrecy is something most of these firms work hard to establish and maintain. “We’ve just used a different organisation to run a very, very successful project in an Eastern European country where… no-one even knew they were there”, said Mark Turnbull, a former executive at Cambridge Analytica, the BBC reported in 2018.
An important part of understanding how these data analytics companies operate is to take a deeper look at the few cases of micro-targeted political advertising that are known to the public. The most covered and elaborate case study is Cambridge Analytica’s work during the 2016 Trump election campaign, at whose height the campaign spent one million US dollars a day on political advertisements on Facebook.
Micro-targeting happens in three stages:
1. The collection of personal data.
2. Processing and analysing personal data.
This often involves the creation of a psychographic profile, which gives extremely detailed insight into voters’ political opinions and orientation, digital behaviour and other personal data. Cambridge Analytica claimed to have had 4,000 to 5,000 data points per person on 240 million American voters.
3. Sending political advertisements to individual voters, tailored to their susceptibilities.
Political micro-targeting is a type of social engineering: it aims to manipulate and sway an individual’s vote in favour of a certain political party. Two aspects of micro-targeting are important to note. Firstly, not everyone is targeted by these ads. Specific vulnerable or crucial groups of voters are targeted, as theirs are the votes that decide election outcomes. Among data analytics companies, they are known as ‘the persuadables’. During the 2016 US elections, only 70,000 voters across three states tilted the scales in favour of Donald Trump.
Secondly, political parties do not only ask these data analytics companies to persuade voters in their favour. In Trinidad and Tobago, young Afro-Caribbean voters supporting the People’s National Movement party were micro-targeted by advertisements pushing them not to vote at all. This eventually won the elections for the majority-Indian United National Congress party.
Numerous political parties have hired data analytics companies to analyse and weaponise data for micro-targeted advertisements (see map). These parties are often either expecting a close election, or a loss. Most data analytics companies are secretive about their clients, but paradoxically often promote themselves by showing which elections they worked on.
Political parties known to have hired data analytics companies for their campaigns include the Trump campaign in the US and Alternative für Deutschland in Germany, alongside elections in Italy, the Czech Republic and Ukraine, and re-election campaigns in Malaysia, Nigeria and many more globally (see the map for an overview of the global involvement of data analytics firms in elections). Crucially, political parties often deny cooperation with such companies and reiterate that the allegations are unproven, although they are often well substantiated.
National Data Protection Authorities
Data Protection Authorities within Member States are charged with enforcing EU regulations on the protection of personal data, especially the General Data Protection Regulation (GDPR). Most notably, the Irish Data Protection Commission and Luxembourg’s National Commission for Data Protection are key actors in enforcing the GDPR on large (mostly American) tech companies, such as Facebook, Amazon and Twitter, as most of these companies’ European headquarters are situated in one of those two countries. This also means that complaints on the basis of the GDPR from other data protection authorities have to be forwarded to the Irish and Luxembourgish authorities, as the French, German, Belgian and Austrian authorities have done.
Technology and social networking companies
The largest, mostly American, technology and social networking companies are the platforms where most data is to be found. Social media profiles, likes and behaviour are where most personal data can be found, and subsequently processed, analysed or sold. Explicit social networks such as Facebook and Twitter are not the only places where data can be harvested and advertisements can be published; other platforms, such as Google and Amazon, are also full of personal data and advertising opportunities.
The voter bases are the silent victims of micro-targeted political advertising. The most crucial group consists of voters who are not yet certain which political party to vote for, or whether to vote at all; they are referred to as the ‘persuadables’ and are the most heavily micro-targeted group. These voters are also the group whose personal data is extracted and used the most to the advantage of political entities.
European Commission is the executive body of the EU. The Commission executes EU policy, initiates legislative proposals for the European Parliament and enforces EU legislation. Two specific members of the Commission are important to note:
– Vera Jourova, Vice-President for Values and Transparency, is in charge of the execution of the European Democracy Action Plan.
– Margrethe Vestager, Executive Vice-President, leads the European Commission’s policy on data regulation and protection.
European Cooperation Network on Elections was founded by the European Commission in 2019, and consists of representatives from Member States’ authorities with competence over elections. The network facilitates practical exchanges of knowledge on the topic of free and fair elections, including data protection during elections.
European Data Protection Board (EDPB) enforces the General Data Protection Regulation across the entire EU. The EDPB consists of the heads of the national Data Protection Authorities of each Member State, as well as the European Data Protection Supervisor (who is appointed jointly by the European Parliament and the Council). The EDPB ensures that data protection laws are applied consistently throughout the EU by issuing guidelines on the interpretation of core principles of the GDPR and by ruling on cross-border disputes through binding decisions. The EDPB also includes the Irish and Luxembourgish data protection regulators.
General Data Protection Regulation (GDPR)
The GDPR is the EU’s legislation on the protection of personal data, and went into effect on May 25th 2018. The GDPR is EU legislation, but regulates companies globally as long as they interact with personal data from EU citizens.
The Regulation mainly governs how companies can obtain EU citizens’ consent to use their personal data. A request for consent from a company has to be ‘clearly distinguishable from other matters’, as opposed to being buried in paperwork, as was often done before the implementation of the GDPR. The Regulation also gives all EU citizens rights over their personal data: EU citizens can request that a company show, alter or delete any personal data it holds about them. If companies do not comply, they must provide a justified reason for their decision. Moreover, the GDPR requires personal data to be stored securely, in order to prevent it from being stolen. In case of a data breach, companies must alert their national regulator within 72 hours where possible. Companies that fail to comply with the GDPR can be fined up to 20 million euros or 4% of global revenue, whichever is higher.
Political opinions are among the most heavily regulated categories of personal data under the GDPR. Unless mandated by law, the processing of political opinions, along with genetic data, biometric data and some other categories, is strictly prohibited.
Given that political advertising is legal, where do you draw the line?
Brittany Kaiser, former director of business development at Cambridge Analytica, pointed out when discussing Cambridge Analytica’s strategy of micro-targeting: “What this strategy is mostly meant to do, is to identify people who are still considering many different options, and educate them on some of the options that are out there, and if they’re on the fence, then they can be persuaded to go one way or the other. Again, that is their own choice”. Despite regulations on the use of personal data, no legislation or guidelines outline when political advertisements are legitimate, or when they have become manipulative. Currently, political advertisements that use micro-targeting are often protected by freedom of expression.
GDPR lacks enforcement
Despite the GDPR’s renowned approach, the Regulation lacks enforcement, as Vera Jourova, the European Commission’s Vice-President for Values and Transparency, has conceded. Apart from the 50 million euro fine imposed on Google in France, the GDPR has prompted no significant fines or measures yet. National data protection regulators and data protection advocates have voiced numerous complaints about the GDPR’s enforcement, including:
– Excessive bureaucracy around filing complaints under the GDPR, such as those concerning Google’s location tracking and the lack of compliance from Amazon, Apple, Facebook and Twitter.
– National data protection authorities are too hesitant to undertake legal action on the basis of the GDPR, and prefer engagement over investigations.
– The EDPB members, Member States’ national data protection authorities, lack transparency towards each other, impeding effective cooperation. This is the case due to cultural differences, divergent legal systems and an outdated knowledge exchange system.
– Differences in the interpretation of the GDPR. This means enforcement cannot proceed until the European Court of Justice applies the GDPR and thereby sets a precedent for its application.
– The Irish Data Protection Commission has been unable to pursue most legal challenges forwarded by authorities from other Member States, due to an overwhelming number of cases and a lack of funding.
Data ownership vs. Data mobility
Several initiatives, including #OwnYourData, have advocated for users of social platforms and other data-harvesting entities to become the legal owners of their data. Their vision entails that users should decide and see what their data is used for, take their data with them when they leave a platform, and even get paid for the data they give away. In addition, data ownership would grant users property rights, and the legal privileges and accountability those offer. In the status quo, political advertisements are untraceable after they are seen, so accountability is hard to pin on a single actor. The European Commission, however, has put forward the European data strategy, which should make the sharing of data between Member States and the EU easier in sectors such as health, agriculture and public administration. How can the EU ensure the safety of personal data, while ensuring easy exchanges of data for Member States and the EU?
The European Democracy Action Plan (EDAP)
The European Democracy Action Plan outlines the approach the European Commission will take to protect the democracies of Member States during the upcoming years. The EDAP consists of three pillars, the first one being the promotion of free and fair elections. Most notably, legislation on the transparency of political advertisements will be introduced by the European Commission. Moreover, the Commission will revise the rules on the financing of political parties in Member States. The European Cooperation Network on Elections will more closely cooperate with Member States’ authorities to ensure data protection during elections. In addition, the lack of enforcement of the GDPR will be addressed in the EDAP as well. The specifications of the EDAP are yet to be decided, and will be released in late 2021.
Digital Services Act (DSA)
In December 2020, the European Commission announced a new proposal to regulate large technology companies in the EU: the Digital Services Act (DSA). In general, the DSA aims to force online platforms to police their platforms for illegal content more thoroughly than in the status quo. More relevantly, a specific section of the DSA forces the largest tech companies to provide greater transparency. The DSA complements the GDPR by adding regulations, on top of the existing ones, specifically concerning targeted advertising. Under the DSA, users will be informed about who targeted them with advertisements, why, how, and who paid for the advertisement. Notice-and-action obligations apply to illegally targeted advertisements.
The DSA does not define what is illegal online, as it solely gives national regulators more competencies to enforce national laws on online platforms. Enforcement of the DSA is a shared task of Member States and the EU.
Each Member State appoints a Digital Services Coordinator, who will supervise the specialised national authority on illegal content and advertisements on online platforms. The Coordinators of the Member States will cooperate in an independent advisory board, called the European Board for Digital Services. This board can produce analyses and reports and coordinate joint investigations. Member States will clearly outline penalties for breaches of national laws in line with the DSA, ensuring proportionality and compliance. Furthermore, for the largest online platforms, the European Commission has direct supervision powers. Firstly, the Commission can fine platforms up to 6% of their global turnover. Moreover, the Commission and the Digital Services Coordinators can require immediate action when necessary. If platforms refuse to comply, a court can be asked to temporarily suspend their service.