Influence Campaigns

Cultivating a Holistic Understanding of the Digital Political Influence Space

Abstract: The use of data analytics to design, deploy, and finesse political influence campaigns is widespread and well-established. Throughout 2020, a team of Moonshot researchers executed a research program to develop a comprehensive overview of the techniques used to identify, extract, and utilize forms of data to target and refine political influence campaigns. Our research comprised deep desk research, multi-pronged comparative case studies, and interviews and focus groups with subject matter experts across disciplines and sectors.

Bottom-line-up-front: We found that the essential considerations in the analysis of political influence campaigns – no matter how simplistic or technologically advanced the methods of influence used – are the social, cultural, and political contexts in which they unfold, as well as the affordances of the platforms on which they take place.

Problem statement: How can researchers and practitioners provide more accurate and actionable insights into digital political influence techniques?

So what?: We propose the crafting of an evidence-based, two-pronged framework rooted in answers to the following questions. Crafting a framework for what works where, when, and why: 1) What digital political influence techniques exist? 2) How and where have those techniques been used? 3) What evidence, if any, exists of the efficacy of existing techniques? 4) What vulnerabilities exist, and what techniques are emerging to exploit them? 5) What contextual factors determine the efficacy of existing and emerging techniques? Crafting a platform-centric framework: 1) What platforms are used for political influence campaigns in the online space? 2) What techniques have been used on those platforms previously? 3) What techniques are being used on understudied platforms?


A Threat to Democracy

The use of data analytics to design, deploy, and finesse political influence campaigns is widespread and well-established. The synthesis of human intelligence, data insights, and automation enables not only scale but also rapid, highly responsive iterative development through increasingly precise feedback loops. The past five years have seen an alarming growth in political influence campaigns deployed by both state and non-state actors. With increasing access to data analytics, groups seeking to achieve political influence have the capacity to identify, target, and exploit societal divisions at an unprecedented scale. The Oxford Internet Institute reported “[e]vidence of organized social media manipulation campaigns which have taken place in 70 countries.”[1] Understanding how data analytics are used to serve political interests is critical to shoring up democracy and to defending it within and beyond national borders.


Throughout 2020, a team of Moonshot researchers executed a research program to develop a comprehensive overview of the techniques used to identify, extract, and utilize forms of data to target and refine political influence campaigns. This research adopted a broad interpretation of the term “political influence” that went beyond activities typically deemed to be malign. In particular, we evaluated campaigns run by state and non-state actors both for normative purposes and for more nefarious ends. The methods used to paint such a holistic picture of political influence techniques included a review of the relevant literature, compiling multi-pronged comparative case studies, and conducting interviews and focus groups with subject matter experts across disciplines and sectors. This essay focuses on a theme that underpins many of the research gaps and methodological flaws we came across in our research. We found that the essential considerations in the analysis of political influence campaigns – no matter how simplistic or technologically advanced the methods of influence used – are the social, cultural, and political contexts in which they unfold.

Key Contextual Factors

In an interview with our team, Professor Steven Livingston, the founding director of the Institute for Data, Democracy, and Politics (IDDP) at George Washington University, highlighted a key gap in studies of political influence campaigns in the online space. He noted that researchers tend to focus on the quantitative, data-heavy aspects of such campaigns and thereby mischaracterize, misunderstand, or simply overlook the crucial political aspects of political influence efforts. The following contextual factors are crucial qualitative considerations required for mixed methods research of political influence campaigns and the information ecosystems in which they unfold.[2]

Declining Trust

Livingston argued in favor of increased emphasis on interrogating the diminished trust and credibility of authoritative institutions:

“To the degree to which we allow all of our energy, attention, money, resources, and young PhD students to spend all of their time looking at another clever mapping exercise and identifying another batch of bots—that’s great, but it’s not going to answer the fundamental questions that have to be addressed in this day and age of crumbling liberal democratic institutions […] My greatest fear is that we end up spending too much time looking at the bells and whistles and not at the foundation. And I think there are really important questions to ask about what happened to those institutions.”[3]

At present, 36% of Democrats and Democratic-leaning independents say they trust the government, whereas only 9% of Republicans and Republican-leaning independents feel the same.[4] The US government is not the only institution that has suffered such a decline in trust. At the onset of the COVID-19 pandemic, a time in which the public should have been relying upon guidance from the Centers for Disease Control and Prevention (CDC), trust in the institution actually declined.[5] The UK has experienced a similar decline in public trust in politicians and the political process.[6] Such trends speak to a broader waning of faith in institutions and authorities, which has undeniably impacted trusted voices and opinion leaders in the political influence space.


The Undersold Role of Identity and Active User Participation

According to Shannon McGregor, assistant professor at the UNC Hussman School of Journalism and Media, “If we want to understand how any type of campaign—political campaign, malign campaign, state campaign, disinformation campaign—works, if we only look at the quantitative metrics, I do believe that we’ll be missing part of it because that’s not the only way that people are using it.”[7] When selecting the targets of a campaign, whether malign or benign in nature, actors appear to go beyond emphasizing demographics and focus on how people view themselves in the context of the world in which they live.

Several of our interviewees commented on the field’s heavy emphasis on the information model of politics, namely the idea that if people are provided correct information, they can be informed to make the right political decisions (e.g. voting) for their best interests.[8] Therefore, the sphere of political influence is often viewed “through the lens of epistemology and misinformation and disinformation,” which is fundamentally at odds with what recent studies have shown plays a key role in influence: identity.[9] According to McGregor, voters don’t make decisions based on information; rather, they do so based on how they identify as individuals.[10] Daniel Kreiss, assistant professor at the UNC Hussman School of Journalism and Media confirmed, “When it comes to things like policy or even facts about the world, most of [these campaigns] are designed to play off partisan or identity appeals: appeals to values, appeals to religion, appeals to race and ethnicity.” He later said, “I think it’s about using digital social media and data and analytics to craft and target messages to people who are part of the identity in groups that you’re looking to reach, and speak and appeal to, in ways that harden and make salient certain social divisions and mute others.”[11]

The role of identity lays the groundwork for the concept of participatory disinformation, introduced by Kate Starbird, cofounder of the University of Washington’s Center for an Informed Public and associate professor of Human Centered Design & Engineering. According to Starbird, “Trump didn’t just prime his audience to be receptive to false narratives of voter fraud, he inspired them to produce those narratives (e.g. Sharpiegate, suitcases of votes, observers denied, etc.) and then echoed those false claims back to them.”[12] As we undertook case studies across diverse geographical, legal, and cultural contexts, our team found that participatory disinformation is not unique to the Trump-era US. The collective sense of grievance fostered by reactionaries across the globe enables the framing of weaponized victimhood, often centered around a shared identity. In the case of his November 2020 loss, Trump weaponized the perceived victimhood of his supporters to further chip away at trust in liberal democracy and its processes, justifying extrajudicial violence. This culminated in the January 6th insurrection as well as the enduring belief that the election was rigged. Indeed, in a May 2021 poll, 66% of Republicans said they believed that President Biden’s victory was illegitimate.[13]


Limits of Microtargeting

In past marketing efforts, audiences were targeted to persuade individuals to buy a product. With the advancement of digital political influence and microtargeting, however, this once marketing-specific practice has expanded in scope and use. Perhaps the most well-known use of microtargeting has been to persuade the electorate to vote for a specific candidate or cause, but its possible applications are boundless. A few other examples of outcomes pursued through microtargeting and segmentation include:

  1. Influencing a target audience to stay home from the polls (i.e. voter suppression);
  2. Mobilizing people offline for a specific cause (e.g. protests);
  3. Persuading parents to refuse to have their kids vaccinated (e.g. the anti-vaxx movement).

One of the most advanced applications of user data is the development of psychographic profiles used to drive microtargeting efforts. Third-party firms gather data from a variety of sources before processing it in ways that produce statistical models to profile social media users by their ideological stance, voting intentions, sexuality, and mental health, among other seemingly private forms of user data. While such modeling is highly probabilistic, it has become apparent that the breadth of available data bears the opportunity to make inferences about a person based on their social media activity, commercial habits, and web history. Beyond merely collecting, cleaning, and augmenting this user data in one space as a product in and of itself, as in traditional data brokering, psychographic analytics go further, applying such data to produce entirely new products. These products often include statistical models and target profiles, which are attractive to political influencers seeking to segment audiences and microtarget messages in a highly competitive information environment.

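The kind of probabilistic trait inference and audience segmentation described above can be illustrated with a minimal sketch. Everything here is invented for illustration: the feature names, weights, and threshold are hand-set stand-ins for what a real firm would fit to large behavioral datasets, and the output is a probability estimate, not a fact about a person.

```python
import math

# Illustrative only: feature names and weights are invented for this
# sketch; real psychographic models are fit to behavioral data at scale.
WEIGHTS = {
    "follows_partisan_pages": 1.2,
    "shares_political_posts": 0.8,
    "engages_with_local_news": -0.4,
}
BIAS = -1.0


def trait_probability(features):
    """Logistic model: P(trait) = 1 / (1 + e^-(w.x + b))."""
    z = BIAS + sum(WEIGHTS[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))


def segment(users, threshold=0.5):
    """Return the user IDs whose predicted probability clears the threshold."""
    return [uid for uid, feats in users.items()
            if trait_probability(feats) >= threshold]


# Two hypothetical users described by counts of observed behaviors.
users = {
    "user_a": {"follows_partisan_pages": 3, "shares_political_posts": 2,
               "engages_with_local_news": 0},
    "user_b": {"follows_partisan_pages": 0, "shares_political_posts": 0,
               "engages_with_local_news": 5},
}

print(segment(users))  # only user_a clears the 0.5 threshold
```

The sketch shows why such profiling is both powerful and fallible: segmentation decisions hinge on a probabilistic score, so anyone whose behavior superficially resembles the target pattern gets swept into the audience.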

However, Jens Koed Madsen, assistant professor with the Department of Psychological and Behavioural Science at the London School of Economics, asserted that people might be exaggerating the effectiveness of microtargeting tactics. He clarified: “Don’t get me wrong, I think they’re really effective […] But if you have a really bad candidate—read Jeb Bush—like no amount of money is gonna get you through that […] 2016 primary season because he’s an uninspiring candidate, and no one really gives a damn about [him] … All things being equal, the person with the micro-targeted campaign probably will win. But candidates aren’t equal.”[14]

According to a Pew Research Center survey conducted between October 29 and November 11, 2019, 18% of US adults reported that social media was their most common means of accessing political and election news. By comparison, news websites and apps were the most common pathway at 25%, while local TV and cable TV news each trailed social media slightly at 16%.[15] Further, few US adults trusted social media as a place to get political and election news; members of both parties were more likely to distrust than trust social media sites as sources for political news. These survey results qualify the impact of microtargeting in digital spaces. If voters receive their news from various non-digital sources and deem the news found on social media untrustworthy, the impact of political microtargeting in those spaces should be considered through that lens.

Moreover, political microtargeting has a more constrained reach in non-US contexts due to legal restrictions on data protection as well as different electoral systems.

The Solutions

We recommend that the following frameworks be developed and implemented to provide more accurate and actionable insights into digital political influence techniques.

Crafting a framework for what works where, when, and why

Little evidence exists in support of the efficacy of political influence techniques. This suggests that techniques are neither objectively effective nor ineffective, but rather that their potential impact is reliant on different social, political, and technological factors in any given context. To date, only one evidence-based framework on the efficacy of techniques exists.[16] This dearth of empirically proven and widely applicable frameworks remains perhaps the most significant gap in research on, and practical application of, influence techniques and the defense against them. Therefore, we propose the crafting of an evidence-based framework rooted in answers to the following questions:

● What digital political influence techniques exist?

● How and where have those techniques been used?

● What evidence, if any, exists of the efficacy of existing techniques?

● What vulnerabilities exist, and what techniques are emerging to exploit them?

● What contextual factors determine the efficacy of existing and emerging techniques?

Crafting a platform-centric framework

Most of the published research reviewed for the case studies Moonshot compiled throughout 2020 involved analysis of primary data collected from Twitter. This is likely attributable to the relative ease of extracting data from the platform. Unfortunately, this results in a significant over-representation of Twitter among the platforms analyzed for influence campaigns. It also risks implying that greater volumes of research equate to greater impact.

However, throughout our research project, Moonshot identified a much greater number and variety of social media and networking platforms that have been used or are emerging as mechanisms for the delivery of political influence. These platforms are less accessible than Twitter, making primary research on them more difficult and resource-intensive. As a result, many platforms remain understudied, and we do not know the full extent of the influence campaigns unfolding within and across their ecosystems. These platforms include WhatsApp, Facebook (apart from what Facebook is willing to share in self-published transparency reports), Telegram, Discord, and TikTok, among others.


It is highly probable that influence techniques remain undetected by the research community and tech industry. Only by building a more complete understanding of which techniques are deployed on which platforms will we be able to protect against malicious threats and vulnerabilities, as well as to conceptualize the impact of influence efforts. This framework should address the following considerations:

● What platforms are used for political influence campaigns in the online space?

● What techniques have been used on the platforms previously?

● What techniques are being used on understudied platforms?

This primary research would aim to uncover the techniques and strategies that are being used on understudied platforms and what vulnerabilities exist for emerging techniques to exploit.

Conclusion

Moonshot’s year-long project embraced mixed methods to elucidate the techniques and contexts of political influence in the online setting. We reviewed the relevant literature, compiled multi-pronged comparative case studies, and conducted interviews and focus groups with subject matter experts across disciplines and sectors. While we gleaned a solid understanding of the breadth and depth of political influence techniques online, we also identified the lasting research gaps noted in this essay. These gaps should be essential considerations in analyzing political influence campaigns, whether malign or benign in nature.


Kieren Aris, Manager, Moonshot; Meghan Conroy, Analyst, Moonshot and PhD Candidate, Loughborough University; Joost S., Analyst, Moonshot; Liam Monsell; and Ari Abelson, Co-Founder, Flocal. The views contained in this article are the authors’ alone and do not represent the views of the authors’ respective employers.


[1] Samantha Bradshaw and Philip N. Howard, “The Global Disinformation Order: 2019 Global Inventory of Organised Social Media Manipulation,” Oxford Internet Institute, Working Paper (2019).

[2] Steven Livingston, Interview with authors, May 27, 2020.

[3] Steven Livingston, Interview with authors, May 27, 2020.

[4] “Public Trust in Government 1958-2021,” Pew Research Center, Published May 17, 2021, https://www.pewresearch.org/politics/2021/05/17/public-trust-in-government-1958-2021/.

[5] Michael S. Pollard and Lois M. Davis, “Decline in Trust in the Centers for Disease Control and Prevention During the COVID-19 Pandemic,” RAND Corporation (2021), https://doi.org/10.7249/RRA308-12.

[6] A. Park, C. Bryson, E. Curtice, and M. Philips, “Key findings: How and why Britain’s attitudes and values are changing,” British Social Attitudes: the 30th Report (2013): ii-xxi, https://www.bsa.natcen.ac.uk/latest-report/british-social-attitudes-30/key-findings/trust-politics-and-institutions.aspx.

[7] Shannon McGregor, Interview with authors, May 19, 2020.

[8] Shannon McGregor, Interview with authors, May 19, 2020.

[9] Daniel Kreiss, Interview with authors, June 10, 2020.

[10] Shannon McGregor, Interview with authors, May 19, 2020.

[11] Daniel Kreiss, Interview with authors, June 10, 2020.

[12] Kate Starbird (@katestarbird), “Participatory disinformation,” Twitter, December 1, 2020, https://twitter.com/katestarbird/status/1333791131771969537; more on participatory disinformation can be found here: https://faculty.washington.edu/kstarbi/StarbirdArifWilson_DisinformationasCollaborativeWork-CameraReady-Preprint.pdf.

[13] Rachel Bucchino, “Poll: 66% of Republicans Don’t Think Biden’s Election Win Was Legitimate,” The National Interest, May 27, 2021, https://nationalinterest.org/blog/politics/poll-66-republicans-don%E2%80%99t-think-biden%E2%80%99s-election-win-was-legitimate-186254.

[14] Jens Koed Madsen, Interview with authors, May 14, 2020.

[15] Amy Mitchell, Mark Jurkowitz, J. Baxter Oliphant, and Elisa Shearer, “Americans who mainly got news via social media know less about politics and current events, heard more about unproven stories,” Pew Research Center (February 22, 2021), https://www.pewresearch.org/journalism/2021/02/22/americans-who-mainly-got-news-via-social-media-knew-less-about-politics-and-current-events-heard-more-about-some-unproven-stories/.

[16] Ben Nimmo, “The Breakout Scale: Measuring the Impact of Influence Operations,” Brookings (September 2020), https://www.brookings.edu/research/the-breakout-scale-measuring-the-impact-of-influence-operations/.
