Facebook, Twitter and YouTube continue to battle misinformation on their services as vote counts proceed in a tight and bitterly contested presidential election.
Among the challenges the social networks have faced: fake accounts posing as credible news organizations that falsely called the election, fake reports of Sharpies being used to suppress votes, and live streams broadcasting bogus results. President Donald Trump has also used social media to baselessly claim the election was being stolen from him, creating another problem for the companies.
The three big social networks have all established policies for dealing with misinformation. The measures range from applying a label to questionable information to deleting posts and banning users. The election has kept them busy.
All the social networks label questionable posts and remove falsehoods if the content has the potential to incite violence. Facebook has a more hands-off approach to posts from politicians than Twitter, which has limited the reach of Trump's tweets. On all election-related videos and search results, YouTube applies a warning that results may not be final.
Here are some of the most significant cases of misinformation and how the social networks responded.
Facebook
- By far the biggest challenge the social network faced was a late-night post by Trump alleging without evidence that he was "up BIG" and his political opponents were "trying to steal the election." Facebook labeled the post with information that the vote count was ongoing and directed users to an election information center.
- Facebook added a label under a video shared on Trump's page in which the president falsely claims, "Frankly, we did win this election." Trump made the comment during a late address on election night. As of Thursday, the video had more than 12 million views.
- The social network shut down a massive group called "STOP THE STEAL" that was spreading false claims that Democrats were trying to swipe the election. The group had more than 364,000 members.
- Facebook was recommending live videos to users with election misinformation and Russian state-controlled media content, according to BuzzFeed News. The company pulled down some of the videos.
Twitter
- Twitter labeled Trump's false allegation of election tampering as "disputed" and potentially "misleading." It also obscured the president's tweet, forcing users to click through to see it. Users can't like the tweet and can share it only if they weigh in with their own comment.
- Twitter subsequently labeled and obscured several Trump tweets and retweets, including one calling to "STOP THE FRAUD!" There isn't any evidence of election fraud.
- Twitter allowed a clip of Trump's false victory claim during an election night television appearance to remain on the service without a label. Twitter said the video, which was shared by media outlets, didn't violate its policies. The video, shared by the Trump campaign account, has more than 22 million views.
- Twitter suspended a group of accounts that posed as legitimate news organizations. Some of the accounts, which mimicked the Associated Press and CNN, spread false reports that Democrat Joe Biden had won the election. The accounts appeared to be working together.
YouTube
- The Google-owned video-sharing service added a label under the video of Trump falsely claiming victory on election night. The video has more than 414,000 views on Trump's channel. (The label, though, isn't intended only for misinformation. It appears under all election-related videos and search results.)
- YouTube took down multiple videos livestreaming fake election results hours before polls closed anywhere in the country. The video streams, some of which ran ads that made the account holders money, were viewed by thousands of people before being removed. One of the channels carrying a stream appeared to have almost 1.5 million subscribers.
- YouTube has been criticized for refusing to take down two videos by One America News, a far-right news organization, that falsely declare victory for Trump. Despite the false claims in the videos, YouTube said they don't violate the platform's rules, which focus more narrowly on voter suppression. The platform, though, will no longer show ads on the videos, depriving the network of revenue.
from CNET https://ift.tt/3k6BoCK