Citation: Gruzd, A., Mai, P., & Soares, F. B. (2022). How coordinated link sharing behavior and partisans’ narrative framing fan the spread of COVID-19 misinformation and conspiracy theories. Social Network Analysis and Mining, 12(1), 1-12. DOI: https://doi.org/10.1007/s13278-022-00948-y
In the world of COVID-19 misinformation, coordinated link sharing among US-based pro-Trump, QAnon, and anti-vaccination Facebook pages and groups drives the spread of unproven cures and conspiracies about the pandemic. This is one of the main findings from our new study, “How coordinated link sharing behavior and partisans’ narrative framing fan the spread of COVID‑19 misinformation and conspiracy theories,” published in the journal Social Network Analysis and Mining.
For this study, we used a combination of network and content analysis techniques to detect signs of so-called Coordinated Link Sharing Behavior (CLSB) on Facebook around the highly publicized “White Coat Summit” press conference, which was organized by “America’s Frontline Doctors” in the summer of 2020. (CLSB refers to the practice of multiple Facebook entities sharing the same link within a very short period of time, often within seconds of one another.)
Using CrowdTangle, a public content discovery and analysis tool owned by Meta, the parent company of Facebook, we collected 7,737 public Facebook posts mentioning Stella Immanuel, one of the doctors who took part in the press conference. Dr. Stella Immanuel gained notoriety after her statements in support of hydroxychloroquine received praise from then-President Donald Trump, and after it was reported that she had a history of attributing medical conditions to non-scientific causes such as witches and demons.
To detect signs of CLSB among Facebook public pages and groups that mentioned Dr. Stella Immanuel, we used a specialized program called CooRnet, developed by researchers Fabio Giglietto, Nicola Righetti, and Luca Rossi at the University of Urbino. As described by the developers, CooRnet identifies Facebook public entities engaged in CLSB by: “(1) estimating a time threshold that identifies URLs shares performed by multiple distinguished entities within an unusually short period of time (as compared to the entire dataset), and (2) grouping the entities that repeatedly shared the same news story within this coordination interval.” The rationale that underpins this algorithm is that while it is possible for several accounts to share the same URL(s), it is highly unlikely that this can occur repeatedly within an extremely short time period without some form of coordination (be it implicit or explicit).
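The two-step logic described above can be sketched in a few lines of Python. This is a minimal illustration of the idea, not CooRnet’s actual implementation: CooRnet is an R package that estimates the coordination threshold from the data itself, whereas here the threshold is passed in as a fixed parameter, and the function name and inputs are hypothetical.

```python
from collections import defaultdict


def detect_clsb(shares, threshold_s=71, min_repeats=2):
    """Toy CLSB detector (illustrative only, not CooRnet).

    `shares` is a list of (entity, url, timestamp_seconds) tuples.
    Flags pairs of entities that shared the same URL within
    `threshold_s` seconds of each other at least `min_repeats` times.
    """
    # Step 1: group all shares of the same URL together
    by_url = defaultdict(list)
    for entity, url, ts in shares:
        by_url[url].append((ts, entity))

    # Step 2: count near-simultaneous co-shares for each entity pair
    pair_counts = defaultdict(int)
    for posts in by_url.values():
        posts.sort()  # order shares of this URL by time
        for i, (ts_i, e_i) in enumerate(posts):
            for ts_j, e_j in posts[i + 1:]:
                if ts_j - ts_i > threshold_s:
                    break  # sorted by time, so later shares are too far apart
                if e_i != e_j:
                    pair_counts[frozenset((e_i, e_j))] += 1

    # Keep only pairs that co-shared repeatedly within the interval
    return {pair: n for pair, n in pair_counts.items() if n >= min_repeats}


# Pages A and B co-share two URLs within 71 s; page C shares much later.
shares = [
    ("PageA", "url1", 0), ("PageB", "url1", 10),
    ("PageA", "url2", 100), ("PageB", "url2", 150),
    ("PageC", "url1", 1000),
]
print(detect_clsb(shares))  # only the A–B pair is flagged
```

A single near-simultaneous co-share is not treated as evidence of coordination; it is the repetition across multiple URLs within the interval that matters, which is why the `min_repeats` filter is applied last.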
In total, we identified a network of 1,390 Facebook entities engaged in coordinated link sharing (Figure 1). Out of the 7,737 public posts shared by these entities as captured in our dataset, 2,484 (36.8%) were shared in a coordinated manner, i.e., they met the 71-second threshold for “unusually” fast sharing. In addition to pro-Trump, QAnon, and anti-vaccination accounts, we identified Facebook accounts based in other countries, such as Brazil and France, that also engaged in CLSB, as well as some accounts in African countries that criticized their governments’ pandemic responses.
While some groups and pages on the left side of the political spectrum also engaged in CLSB (see Figure 1, clusters 1 & 4), the two right-wing clusters in the network (clusters 2 & 6) had the highest percentage of pages and groups that were subsequently removed or suspended by the platform. Among the U.S.-based clusters, these right-wing entities also shared the highest percentage of links to social media posts that are no longer available on the platform. Furthermore, many of the links shared by the right-wing pages and groups that were subsequently taken down pointed to claims about unproven COVID-19 treatments such as hydroxychloroquine.

In sum, even though connections between Facebook entities in the network are not necessarily a sign of explicit coordination, the discovered linkages reveal clusters of entities that hold similar views and thus share and discuss similar links; in other words, birds of a feather do tend to flock together. This finding is an example of one of the basic patterns inherent in human relationships, homophily: the sociological principle that individuals with similar preferences, traits, or behaviors are more likely to interact with one another and to form tight relational bonds.
The result also reminds us that in the age of social media, misinformation knows no borders, and that despite their best efforts, social media platforms are not yet equipped to handle the viral spread of misinformation in highly partisan, internationalized, and decentralized information environments. As a result, the “Whac-A-Mole” style approach to combating misinformation is likely to continue, especially in highly polarized countries.
The paper is open access. You can download the full paper here.