A Social Dilemma

Andre M. Wang March/April 2022

One morning a few months ago I scrolled through my Facebook feed and came across a post by a good friend regarding the origin of the COVID-19 virus. It was not a politically charged post, but a scholarly article on the virus, its purported connection to China, and the pandemic that has held the world captive for more than two years.

Having a shameless—and perhaps shameful—sense of humor, I tapped out the following comment on my phone: “As one of Chinese heritage, I rather enjoyed the kung pao bat I had for breakfast.” I clicked send and kept perusing my feed.

Within 10 seconds an alert message appeared declaring that my comment violated Facebook’s community standards and that my account was subject to a seven-day suspension. A link in the message explained the process for appealing the decision and noted that my account was suspended immediately, even while any appeal was pending. Conspicuously missing was any explanation of my infraction and why the penalty was warranted. Was it my mention of being Chinese on a COVID-related post—or implying that the virus came from China? Was it the mention of a bat—or that I ate and enjoyed it?

Not knowing what to appeal, I decided pursuing it would be futile and more aggravating than spending one week in Facebook jail. I further reasoned that the social media hiatus would be good for me anyway.

After seven days my account was reactivated. However, emblazoned on my account settings page is this ominous warning: “People who repeatedly post things that aren’t allowed on Facebook may have their accounts permanently disabled.” The scarlet message (it’s actually more orange) remains there today.

This experience has raised questions for me about corporate ethics, social responsibility, and civility in public discourse. What I learned made me realize what it really means to be connected on the world’s most popular social media platform.

The Rules

While in exile, I combed through Facebook’s user agreement to understand what is allowed and what isn’t. The agreement references the now-famous community standards, the list of official rules written in verbose legalese that outlines the types of posts that can get a user banned from the platform. It also identifies the types of users that are not allowed to post.

The guidelines define the six categories of unacceptable posts and content:

Violence and Criminal Behavior: Facebook bans threats and the advocacy of violence. The standards also note that efforts are made to distinguish between “casual statements” and “credible threats to public or personal safety.”

Safety: Facebook will remove content where there is a “genuine risk of physical harm or direct threats to public safety,” including cyberbullying and posts involving suicide and self-harm. Interestingly, the anti-bullying policies “do not apply to public figures, because we want to allow discourse, which often includes critical discussion of people who are featured in the news or who have a large public audience.” However, content that constitutes hate speech or advocates violence against a public figure will be removed.

Objectionable Content: Specifically mentioned in this category are hate speech, graphic violence, pornography, and cruel and insensitive content.

Integrity and Authenticity: This category covers content that falls outside the others, including spam and misrepresentation (i.e., users must be real and verifiable). Of note, Facebook states that it tries to reduce “false news,” yet satire is allowed. “For these reasons, we don’t remove false news but instead significantly reduce its distribution by showing it lower in the news feed,” the standards state.

Respecting Intellectual Property: Users may not post content that is owned by someone else, including anything protected by “copyrights, trademarks, and other legal rights.” And contrary to widespread misconception, the community standards state that users own everything they post. For example, when you post a picture that you took, Facebook cannot and does not claim any rights to it.

Content-related Requests: Facebook will remove an account upon the request of an authorized representative, such as an immediate family member of a user who is deceased or incapacitated. It adds that, for the protection of minors, it will remove the accounts of users under 13 years old, either on its own initiative or at the request of their parents, legal guardians, or the government.

Regarding the enforcement of these standards, both the user agreement and the community guidelines suggest that violations are adjudicated by sentient, thinking human beings at Facebook, with references to “our team.” But my experience indicates otherwise.

The Algorithm 

When I posted my ill-fated comment on my friend’s post, the notification of the community standards violation appeared almost instantly—definitely too fast to have been done by human hands. For a human to have made that call, a user would have had to read my comment (and been offended by it) and report it to Facebook. The comment would then have had to be reviewed by a Facebook employee, who would have rendered a community standards judgment on it and, if there was a violation, determined the length of my penalty. It is impossible for that sequence of events to occur within 10 seconds.

When Mark Zuckerberg testified before Congress in April 2018, he disclosed that Facebook was developing artificial intelligence (“AI”) to address the platform’s security, privacy, and user issues. In short, user activity and content would be monitored by robots.

People can report anything to Facebook. According to Guy Rosen, vice president for product development, Facebook receives tens of millions of reports per week about potentially objectionable content. These reports are employed as a data set to train Facebook’s AI systems to automatically detect such content. Rosen says, “The objective is how to automate the process so we can get to content faster, and get to more content. It’s about learning by examples. And the most important thing is to have more examples to teach the system.”

The AI is taught to identify low-hanging fruit: nudity, graphic violence, terrorism, spam, and hate speech. But what robots cannot be taught is nuance. While my comment could be interpreted as a slur on its face, in context it was self-referential and self-deprecating. If the AI were really on the ball, it would have picked up on my obvious Chinese surname.
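
For readers curious what “learning by examples” looks like in practice, here is a minimal, purely illustrative sketch: a toy text classifier trained on a handful of invented, labeled posts. It is not Facebook’s code; the library, the example posts, and the labels are all my own assumptions. But it shows how a model trained only on words can score a comment in milliseconds while knowing nothing about who wrote it or why.

    # A toy "learning by examples" moderator -- an illustration, not Facebook's system.
    # All posts and labels below are invented for the sketch.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Hypothetical reported posts, labeled the way human reviewers might label them:
    # 1 = violates the standards, 0 = allowed.
    posts = [
        "I'm going to hurt you if you show up",       # credible threat
        "buy followers now, click this link",         # spam
        "great game last night, what a finish",       # benign
        "happy birthday, hope it's a wonderful day",  # benign
    ]
    labels = [1, 1, 0, 0]

    # Turn words into features and fit a simple classifier: the "robot."
    model = make_pipeline(TfidfVectorizer(), LogisticRegression())
    model.fit(posts, labels)

    # The trained model scores a new comment instantly -- far faster than any
    # human review queue -- but it sees only the words, never the speaker,
    # the surname, or the self-deprecating intent behind them.
    new_comment = "I rather enjoyed the kung pao bat I had for breakfast"
    print(model.predict_proba([new_comment]))  # [probability allowed, probability violation]

With only four made-up examples the probabilities mean nothing, of course. The point is the shape of the pipeline: labeled reports go in, instant automated judgments come out, and nowhere is there room for context.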

The Sandbox

Despite my experience, my philosophy toward the Facebook behemoth remains unchanged: As a private business, Facebook has the right to regulate, ban, or censor whatever content or users it hosts on its site and, further, to legally use user data however it chooses. If Facebook doesn’t want me making comments about eating bats, that’s its absolute prerogative. It’s Facebook’s sandbox and Facebook’s rules; I just play in it.

On the other hand, a bigger social ethic now percolates. Many point to Facebook’s domineering role in public discourse and, by extension, the public trust, arguing that users are exploited by the Orwellian algorithm to intensify their doubts, fears, and insecurities. The polarization of virtually everything—from political candidates to vaccines to pineapple on pizza—gains breakneck momentum on social media.

In October 2021 ex-Facebook data scientist Frances Haugen testified before Congress, presenting internal research showing that her former employer knowingly engaged in practices that harmed children, sowed division, and undermined democracy in pursuit of “astronomical profits.” She revealed that the platform’s algorithms reward engagement, which boosts sensational content such as posts that feature rage, hate, or misinformation.

There is currently a proposal to regulate social media companies like public utilities: they would operate as private companies but under government oversight. But when it comes to a platform so integral to public discourse, the “I’m from the government and I’m here to help” approach is a dangerous proposition. Oversight can slide seamlessly into overreach. So while Facebook executives and the government quarrel over what Facebook is and what it means to our democracy, today’s reality is that any user can be penalized because a turn of phrase offended a robot.

A Word on Civility

Civility in discourse is a lost attribute. Somewhere along the way our culture abandoned civility, and today people are demeaned, derided, and ridiculed for who they are or what they believe. People have gotten bitter and angry—and not just bitter and angry with those who don’t agree with them. They get bitter and angry with those who aren’t as bitter and angry as they are.

Civility is defined as “politeness and courtesy in behavior or speech.” It has its etymology in the Latin civilis, “of or relating to citizens,” from civis, “citizen”; hence the term civilization. By its very origin, civility recognizes the inherent dignity of the individual, and it is from that recognition that we derive the basic code of social interaction.

Civility in discourse requires immense humility. It is not only an acknowledgment that there is another perspective but also an admission that we could be wrong. But it goes further than that: humility mandates that we view our counterparts as our moral and intellectual equals.

In his letter to the church in Ephesus, Paul sought to quell a political conflict raging among the citizenry. In his plea for civility in discourse, he wrote:

“I urge you to live a life worthy of the calling you have received. Be completely humble and gentle; be patient, bearing with one another in love. Make every effort to keep the unity of the Spirit through the bond of peace. There is one body and one Spirit, just as you were called to one hope when you were called. One Lord, one faith, one baptism; one God and Father of all, who is over all and through all and in all” (Ephesians 4:1-6, NIV).*

Whether my bat-eating comment warranted punishment will never be settled. But it doesn’t matter. A private company can make the rules under which its consumers must play. What does matter is that in every position that I espouse and in every interaction that I have, civility is what binds us and keeps our democracy healthy.  May we all “live a life worthy of that calling.”

*Bible texts credited to NIV are from the Holy Bible, New International Version. Copyright © 1973, 1978, 1984, 2011 by Biblica, Inc. Used by permission. All rights reserved worldwide. 


Article Author: Andre M. Wang

Andre M. Wang serves as general counsel and director of public affairs and religious liberty for the North Pacific Union Conference of Seventh-day Adventists. He continues to post musings on Facebook, Twitter, and Instagram.