• On Monday, Facebook removed a shared post from Amy Lyon, a mayoral candidate in Wichita, Kansas
  • Zephyr York shared Lyon’s post, only to be told within minutes that it had been marked as spam
  • After York requested a review, the post was restored roughly 40 minutes later

Facebook has raised concerns among users around the world over its community standards and how exactly it enforces them. While conservatives have been the most outspoken about censorship on the world’s largest social media platform, they have not been its only victims.

This Post Goes Against Our Community Standards On Spam

On Monday, Wichita, Kansas resident Zephyr York was a bit shocked to see that her post had been removed for violating Facebook’s “Community Standards on spam.” What did York post that Facebook considered spam? She had shared a post from one of her local mayoral candidates; the original came from the “Amy Lyon for Wichita” Facebook page.

York said that within ten minutes of sharing the post, Facebook had marked it as spam. After she requested a review, the post was restored to her page roughly 40 minutes later with no explanation. York then posted screenshots of the share and the alleged community standards violation to her Facebook page.

Following the brief wrongful removal, York said, “For whatever reason this happened, I’m more inclined now than before to become more proactive in helping get the word out about the candidates that hold my values.” It is an important lesson, as most political campaigns rely heavily on volunteers to raise awareness in the communities where they run. At the end of the day, a face-to-face conversation leaves a bigger impact than a Facebook post.

But that does not mean Facebook does not hold a terrifying influence over American politics. The social media giant is still receiving backlash over Russian influence in the 2016 election. Days after the election, Facebook CEO Mark Zuckerberg dismissed the notion that fake news had pushed the election in favor of Donald Trump as a “pretty crazy idea.”

Just two months later, in January 2017, the U.S. Director of National Intelligence released a declassified report stating, “Russian President Vladimir Putin ordered an influence campaign in 2016 aimed at the US presidential election. Russia’s goals were to undermine faith in the US democratic process, denigrate Secretary Clinton, and harm her electability and potential presidency. We further assess Putin and the Russian Government developed a clear preference for President-elect Trump.”

When faced with overwhelming evidence of the Russian cyber campaign on its platform, the social media giant backtracked. In April 2017, a Facebook release publicly acknowledged that Russia had manipulated the platform for election meddling. In the release, the company noted that its “data does not contradict” the January 2017 report from US intelligence agencies. The fallout over the Russian election meddling is said to have caused dissent within Facebook’s upper management. The internal conflict was confirmed last year when Alex Stamos, Facebook’s chief security officer, announced he was leaving the company. Stamos was the first high-ranking employee to depart over the 2016 election controversy.

Since the backlash from the 2016 election, Facebook has taken drastic measures against its users in what have become known as “page purges.” Regardless of political affiliation, over the past year Facebook has unpublished thousands of pages with no explanation beyond a vague notice about violating community standards. After Alex Jones was banned from nearly every major social media platform, pages such as The Free Thought Project and Cop Block lost millions of followers in these unexplained purges. Some believe Jones’ ban opened the door for Facebook and other social media giants to run roughshod over publishers on their platforms.

Along with the 2016 election, other incidents have prompted Facebook to tighten its grip on content published on its platform, such as 28-year-old Brenton Tarrant killing 51 people and injuring more than three dozen others in a mass shooting in Christchurch, New Zealand. Tarrant streamed the shooting on Facebook Live, drawing global backlash and raising questions about how much control the company actually has over its platform. Facebook also tightened its community standards in a failed attempt to ban white supremacist groups. There has been speculation that the algorithms Facebook uses to detect violations are beyond broken. Despite the obvious issues with its current practices, Facebook appears to be pushing forward while users face the repercussions.

A perfect example of how out of control Facebook censorship can get came in an incident last year. The Liberty County Vindicator, a small-town Texas newspaper, was posting the Declaration of Independence in the days leading up to Independence Day, having broken the text into twelve pieces. One portion was flagged as hate speech by Facebook’s algorithms and removed. After the incident gained media attention, Facebook apologized to the Vindicator and restored the post. “The post was removed by mistake and restored as soon as we looked into it,” a spokesperson said. “We process millions of reports each week, and sometimes we get things wrong.”

Facebook’s admission that it sometimes gets things wrong could very well be an understatement. With virtually no human representatives for users to speak to, wrongful bans often go unaddressed by anyone at the company. Concerns are growing over how bad user suppression could become during the 2020 elections. Monday’s removal of the Lyon post in a local race could be a dark precursor of what is to come at the national level.

If you have screenshots of Facebook wrongfully censoring you, please email them to us!
