How Twitter, Facebook and YouTube are handling election misinformation
In the months leading up to the election, the major social platforms issued seemingly endless updates on how they would address election-related misinformation.
Now that the election is underway, there have been notable differences in how the major tech platforms are moderating misinformation and limiting its spread.
Twitter (TWTR) has been the most aggressive in labeling and addressing false and misleading content, while Facebook and YouTube have applied a lighter touch.
Twitter has gone as far as reducing users' ability to share misleading posts, while Facebook is slapping labels on misinformation but not hindering sharing. YouTube is taking arguably the least aggressive action, relying on a single label -- which reminds people that US election results may not be final -- on all content related to the election.
Twitter has been labeling tweets and restricting how they can be shared, including several from President Trump. For example, Twitter placed a label on a tweet from the President in which he baselessly claimed, "We are up BIG, but they are trying to STEAL the Election."
"Some
or all of the content shared in this Tweet is disputed and might be
misleading about how to participate in an election or another civic
process," the label on that Trump tweet read
Twitter has also restricted how such tweets can be shared, including disabling replies and likes and only allowing users to quote tweet -- sharing a tweet with their own comments attached -- rather than retweet it. (Twitter is also applying other labels to tweets that, by its standards, prematurely call election results for either candidate. One of those labels reads: "Official sources may not have called the race when this was Tweeted.")
On the exact same post from Trump on its platform, Facebook (FB) used vague language in its label and, unlike Twitter, didn't restrict how the post could be viewed or shared. Facebook's label reads: "Final results may be different from initial vote counts, as ballot counting will continue for days or weeks." By Wednesday morning it was one of the most highly engaged-with posts on Facebook, according to data from CrowdTangle, an analytics company that Facebook itself owns.
In a statement, a Facebook spokesperson said: "Once President Trump began making premature claims of victory, we started running top-of-feed notifications on Facebook and Instagram so that everyone knows votes are still being counted and the winner has not been projected. We also started applying labels to both candidates' posts automatically with this information."
Facebook also labeled a Tuesday night post from Donald Trump in which he said, "I will be making a statement tonight. A big WIN!"
Facebook's label cautioned: "Votes are still being counted. The winner of the 2020 US Presidential Election has not been projected."
Late Wednesday afternoon, Facebook said it is expanding its premature-declaration-of-victory labels to apply to people beyond the presidential candidates, including at the state level. The change will also apply to Instagram, which Facebook owns.
Jennifer Grygiel, a social media professor at Syracuse University's Newhouse School, criticized the last-minute decision. "That should have been obvious. They should have been prepared to do this sooner," Grygiel said. "[Social platforms] are writing the rules of the road as they go along."
Instagram previously announced it would temporarily hide the "Recent" tab on all hashtag pages, whether they're related to politics or not. The company said it hopes the move will help prevent the spread of misinformation and harmful content related to the election.
Meanwhile, YouTube has placed an information panel at the top of search results related to the election, as well as below videos that discuss the election. The panel notes that election "results may not be final" and directs users to parent company Google's feature that tracks election results in real time. YouTube is also taking measures similar to those it has used in past elections, such as promoting content from authoritative news sources in search results.
However, YouTube is letting a video containing misinformation about the election stay up on its platform without a fact-check or a label noting that it is misinformation -- exposing the limits of what the social media platforms are doing to counter the spread of potentially dangerous false claims about election results.
In a YouTube video posted Wednesday by the far-right news organization One America News Network, an anchor says, "President Trump won four more years in office last night." The video also includes baseless claims that Democrats are "tossing Republican ballots, harvesting fake ballots, and delaying the results to create confusion." The video has been viewed more than 350,000 times. CNBC was the first to report on the video.
While the video -- like others related to the election -- has a label on it saying US election results may not be final, YouTube said the video does not violate its rules and would not be removed.
"Our
Community Guidelines prohibit content misleading viewers about voting,
for example content aiming to mislead voters about the time, place,
means or eligibility requirements for voting, or false claims that could
materially discourage voting. The content of this video doesn't rise to
that level," said Ivy Choi, a YouTube spokesperson.
However, YouTube said it has stopped running ads on the video -- while admitting the video has false information. "We remove ads from videos that contain content that is demonstrably false about election results, like this video," Choi said.
The company said it did remove several livestreams on Election Day that violated its spam, deceptive practices and scams policies.
The OAN anchor in the video shared the YouTube link to her personal Twitter account with the comment, "Trump won. MSM hopes you don't believe your eyes." Twitter said that, under its Civic Integrity policy, the tweet isn't eligible for a label indicating it might contain a premature call of election results, because the original account has fewer than 100,000 followers and the tweet has not hit levels of engagement that would otherwise make it eligible. However, it was retweeted by OAN's official account, which has 1.1 million followers.