The dust is slowly trying to settle from 2016’s exceptionally tumultuous Presidential Election, and everyone from pundits to voters is trying to figure out what exactly happened.
Whether you lean to the right or the left, one of the most common answers is the sharing of fake news on social media. To combat this trend moving forward, Mark Zuckerberg recently said Facebook is going to make some serious changes.
“We take misinformation seriously,” the Facebook Founder and CEO wrote in a post. “Our goal is to connect people with the stories they find most meaningful, and we know people want accurate information. We’ve been working on this problem for a long time, and we take this responsibility seriously. We’ve made significant progress, but there is more work to be done.”
According to a recent BuzzFeed analysis, in the months leading up to election night, fake news stories regularly outperformed legitimate news articles. In fact, the 20 top-performing false election stories from hoax sites and hyper-partisan blogs generated more than 8.7 million shares, reactions, and comments on Facebook alone.
To flag fake posts, Facebook could start labeling them as false when they appear in news feeds, rather than eliminating them altogether, giving users the power to sort fact from fiction.
Do you think fake news on social media is that big of a deal? We want to hear from you!