Facebook just basically said social media is bad for democracy
Ah, Facebook. It’s our go-to site for posting dog photos, sharing memes, and, sometimes, climbing onto our political soapbox. Now the people behind Facebook admit that the social network could have negative effects on the way our government works.
Today, January 22nd, Facebook Inc. said that while the company is working hard to ensure that foreign governments like Russia can’t influence U.S. elections, Facebook still might not be that great for democracy. In a blog post, Product Manager Samidh Chakrabarti wrote that while social media does help facilitate political conversation, it can also spread misinformation and make it easier for hackers to meddle in politics.
"If there’s one fundamental truth about social media’s impact on democracy it’s that it amplifies human intent — both good and bad," Chakrabarti wrote. "At its best, it allows us to express ourselves and take action. At its worst, it allows people to spread misinformation and corrode democracy."
Chakrabarti wrote that he was most concerned about the spread of fake news on social media and the tendency for especially vocal political minorities to drown out majority opinion. He also expressed worries about political harassment.
Facebook admits that "at its worst, [social media] allows people to spread misinformation and corrode democracy" https://t.co/yiuRtzECYm
— CNN International (@cnni) January 22, 2018
Facebook’s Global Politics and Outreach Director Katie Harbath also acknowledged that Facebook had been too slow to act against false news stories and “echo chambers.”
Facebook Hard Questions: Social Media and Democracy: https://t.co/1Ioz0bOVWU
— Katie Harbath (@katieharbath) January 22, 2018
In November, the House Intelligence Committee revealed several Facebook ads that had been funded by Russian organizations in an attempt to swing the 2016 presidential election in favor of Donald Trump. More than 11 million people were exposed to these Russian-backed ads between 2015 and 2017.
But Chakrabarti also discussed the social network’s efforts to combat these issues. He wrote that, to stop the spread of fake news, the company is working with third-party fact-checkers. His post also stated that users will be able to visit any advertiser’s page and view all of the ads it’s running, making it clearer who is funding what they see. These transparency and security efforts aren’t the only big changes coming in 2018; on January 11th, Facebook announced that it will make major changes to its News Feed so that users see more posts from friends and family.
Since social media is such a large source of information for so many people, it’s crucial that social media companies take some responsibility for misinformation on their sites. We’re glad to see Facebook acknowledging the negative effects social media can have and working to do better.