“Friend of a Friend” Stories as a Vehicle for Misinformation
Authors: Kolina Koltai, University of Washington Center for an Informed Public, Jack Nassetta, Graphika, Kate Starbird, University of Washington Center for an Informed Public
Contributors: Carly Miller, Dylan Junkin, Kevin Lin, Malika Mehrotra, Stanford Internet Observatory
Key Takeaways
There have been multiple viral false claims on social media about the election based on “friend of a friend” stories.
“Friend of a friend” claims are often unverified and spread quickly on platforms (and in private networks) because the text is easy to copy and paste.
Social media platforms have been inconsistent in responding to this type of misinformation.
In the upcoming weeks, we expect to see an increasing number of viral misinformation incidents about the election that use the “friend of a friend” format.
Introduction
With about a week until the U.S. general election, there has already been an unprecedented amount of misinformation about voting and mail-in ballots. As we’ve covered in our recent piece, in the next few weeks we expect to see a lot of misleading information spread on social media related to voting in person, dropping off ballots, and mail-in voting. One way this misleading information may rapidly gain traction is through “friend of a friend” narratives.
The “friend of a friend” story is a common “rumor” type, and one that predates the Internet. But these second- and third-hand stories have taken on new forms and new vigor in the Internet era. The essence of these stories is that they relate someone else’s experience: my brother’s friend, a friend’s coworker, my mom’s doctor. Offline, these stories provide a social context for information that can give it credibility and relevance. Online, they likely play a similar role, even as the specific social context collapses as they spread. Once someone’s “friend” is someone they know on the Internet, a single story can be mapped onto a wide range of different relationships.
And these stories are easily spread, copied and pasted across multiple platforms and fora. They can propagate widely and give the impression of many independent origins (so many different people sharing the same story), while remaining difficult or impossible to fact-check. Recently this format has enabled the viral spread of coronavirus misinformation. Additionally, as voting has begun, we have seen it used to rapidly propagate false narratives about the upcoming election.
There are several features of these “friend of a friend” posts that make the format a powerful tool for spreading misinformation.
They often include unsourced claims and rely on an unnamed “friend” as the source.
They thrive on “copy-pasta” spread. These posts encourage people to copy and paste the exact same or very similar text across their networks, leading to rapid spread.
They can be shared in multiple ways. Because of the textual nature of the “friend of a friend” post, this type of misinformation spreads not just on social media platforms like Twitter or Facebook. People often send it via email, text message, or private group chats (like on WhatsApp), making it even more difficult to track its spread or provide corrections.
What We Have Seen This Cycle So Far
In the reports that the Election Integrity Partnership has evaluated, we’ve seen at least 12 significant incidents of misinformation about the integrity of the election spread using the “friend of a friend” claim. We suspect there are many more we did not capture, and we anticipate seeing more on and around Election Day.
One example that went viral on social media is the story about a “friend” who recently completed poll manager training and claimed that putting a mark on a ballot would make it invalid. This is not true. The “friend of a friend” claim provided legitimacy to an otherwise unsourced piece of information, allowing it to spread with no basis in fact. The post used a copy-and-paste tactic to further its spread under the guise of “just passing on” a helpful tip. Analysis using CrowdTangle showed that many users posted the exact same text in quick succession (see below). The post also included a call to action to report suspicious behavior at the polls. Although many posters may have been well-intentioned in spreading this narrative, genuinely believing that they were helping other voters, they were unwittingly assisting in election delegitimization.
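To illustrate how this kind of copy-and-paste spread can be surfaced, here is a minimal sketch of a duplicate-text check over a CSV export of posts. It is not a description of the Partnership’s actual pipeline or of CrowdTangle’s tooling; the column names ("message", "created_at"), the file name, and the cluster-size threshold are all illustrative assumptions.

```python
# Minimal sketch: flag clusters of near-identical post text in a CSV export.
# Column names ("message", "created_at"), the file name, and the threshold
# are hypothetical, chosen for illustration only.
import csv
import re
from collections import defaultdict


def normalize(text: str) -> str:
    """Lowercase, strip URLs and punctuation, collapse whitespace so that
    trivially edited copies of the same post map to the same key."""
    text = text.lower()
    text = re.sub(r"https?://\S+", "", text)   # drop links
    text = re.sub(r"[^a-z0-9\s]", "", text)    # drop punctuation/emoji
    return re.sub(r"\s+", " ", text).strip()


def find_copypasta(path: str, min_cluster: int = 20) -> dict:
    """Group posts by normalized text and return clusters shared verbatim
    (or nearly so) by at least `min_cluster` posts."""
    clusters = defaultdict(list)
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            key = normalize(row.get("message", "") or "")
            if key:
                clusters[key].append(row.get("created_at", ""))
    return {k: v for k, v in clusters.items() if len(v) >= min_cluster}


if __name__ == "__main__":
    for text, timestamps in find_copypasta("posts_export.csv").items():
        print(f"{len(timestamps)} copies: {text[:80]}...")
```

A burst of identical normalized text from many accounts in a short window is the signature of copy-pasta spread, as opposed to organic paraphrasing of a shared experience.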
An additional narrative we have seen spread via the “friend of a friend” claim concerns the reliability of electronic voting machines. In this case, a series of copied-and-pasted posts claimed that a friend had their vote switched from Biden to Trump when using an electronic voting machine, and that an election judge had ordered the machine shut down. This narrative is persistent, having appeared in previous election cycles, including in 2016, when it was promoted by inauthentic Internet Research Agency (IRA) accounts.
Platform Responses
This type of misinformation is difficult to stop. It can spread rapidly on platforms and often moves to private channels with ease due to its copy-and-paste nature. So far we have seen inconsistent responses to this type of misinformation, both within and across platforms.
Taking the unsourced poll worker story above as an example, we saw Facebook adopt a policy of removing the posts, but only in some cases. As of this writing, Facebook has removed some of the posts containing the text, added labels to others, and left some untouched. Twitter, on the other hand, has not taken action on any of the posts.
In the case of the viral story regarding polling machines in Texas, we saw Twitter take action against both the original viral post and copied versions of it, removing them across the board. Facebook took no action on these posts, creating inconsistency in its overall response to unsourced information that casts doubt on the voting process.
While one may expect responses to unsourced and debunked posts to vary across platforms, inconsistency within a single platform only creates confusion while leaving misinformation untouched.
Moving Forward
As we approach the election and tensions run high about potential voter fraud, it will be critical for users to be wary of unsourced and dubious information that uses the “friend of a friend” claim. While it may not be possible to fact-check every piece of information you see online, we recommend being hesitant to share or copy-paste posts that use an unknown friend as a source. One way to check whether a post falls into this category is to copy bits of the text into an internet search engine to see if it has already been verified or debunked. You can also be on the lookout for verifiable evidence supporting any claims made in the post.
Additionally, platforms must set consistent policy and abide by it when it comes to vaguely sourced material that casts doubt on the electoral process. As Election Day approaches and more people vote in person, we expect more misinformation using this tactic to appear and spread rapidly. There will be no time for ad hoc policy decisions when it comes to voting misinformation.