
Community Standards

I want to publicly thank Professor Jonathan Zittrain (JZ) for his wonderfully informative and absolutely riveting discussion on the topic of Internet governance. It’s not an easy topic to get your arms around. It’s a mix of individual actors, corporate entities, government agencies, and open communities. There is nothing straightforward about this conglomeration of actors, and I’ve always struggled to know where to start. Luckily, Jim and I know JZ, and it turns out that that’s always a great place to start. So, thank you, JZ!

Much of his presentation — on how jurisdiction and regulation happened as the Internet evolved — was told through the stories of a few key people. It was a great way to give all of us a narrative foundation on which we could anchor further discussion. And that’s what I’d like to try to do here!

While most of Monday’s discussion looked at the past, this issue remains important as the Internet continues to evolve, and some of the most interesting pieces of the current evolution take place on our social media platforms. This got me thinking, “How does Facebook handle this ongoing evolution? Or more specifically, how has Facebook’s own regulation of its platform evolved?”

While I could call up former students who currently work at Facebook, I took a different approach: I decided to look at how the Community Standards page on facebook.com has changed over time. A great way to do this is to take advantage of the Internet Archive’s Wayback Machine.
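If you’d like to try the same thing, here’s a short sketch (assuming Python with the third-party requests library installed) that lists the archived captures of the page via the Wayback Machine’s public CDX API. The URL and date range below are just what suited my question; adjust them to taste.

```python
# List Wayback Machine captures of Facebook's Community Standards page
# using the Internet Archive's public CDX API.
import requests

CDX = "http://web.archive.org/cdx/search/cdx"
params = {
    "url": "facebook.com/communitystandards",  # page of interest
    "output": "json",            # first row of the response is the field names
    "from": "2011", "to": "2017",
    "filter": "statuscode:200",  # keep only successful captures
    "collapse": "timestamp:8",   # at most one capture per day
}
rows = requests.get(CDX, params=params, timeout=30).json()
if rows:
    header, captures = rows[0], rows[1:]
    ts, orig = header.index("timestamp"), header.index("original")
    for row in captures:
        # Each printed URL replays one archived capture of the page.
        print(f"https://web.archive.org/web/{row[ts]}/{row[orig]}")
```

Each printed URL replays one archived capture, which makes it easy to skim the page’s history day by day.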

The first question I investigated was simply, “How much did the Community Standards page change over the nearly seven years captured by the Wayback Machine?” Instead of looking at every minor change in the page, I focused on the point where the look of the page changed dramatically to the format it has today (see the page as it looked on March 14, 2015 and then its new, basically current look on March 18, 2015). Then I asked, “How different is the content of the page today as compared to the first captured day of its current look?”
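For the comparison itself, a minimal sketch (again in Python; it assumes the requests and beautifulsoup4 packages are installed) is to pull the two captures, strip the HTML, and diff the visible text. The timestamps below are the dates mentioned above; the Wayback Machine resolves a partial timestamp to the nearest capture it holds.

```python
# Diff the visible text of two archived captures of the same page.
import difflib
import requests
from bs4 import BeautifulSoup

URL = "https://www.facebook.com/communitystandards"

def snapshot_text(timestamp):
    """Return the non-empty visible text lines of an archived capture."""
    archived = f"https://web.archive.org/web/{timestamp}/{URL}"
    html = requests.get(archived, timeout=30).text
    text = BeautifulSoup(html, "html.parser").get_text("\n")
    return [line.strip() for line in text.splitlines() if line.strip()]

old = snapshot_text("20150318")  # first capture of the page's current look
new = snapshot_text("20171101")  # a capture from November 2017
for line in difflib.unified_diff(old, new, "2015-03-18", "2017-11-01", lineterm=""):
    print(line)
```

The raw diff is noisy (navigation chrome changes too), but the substantive wording changes I describe below stand out once you scan past that.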

I was surprised by the answer: Very little. In my mind, a lot has happened in the 32 months between March 2015 and November 2017. This doesn’t mean that a lot didn’t happen behind the scenes (e.g., in the code that automates some of the process, and in the policies followed by the “dedicated teams working around the world to review things you [the user] report to help make sure Facebook remains safe”). In a moment, I’ll dig into this behind-the-scenes question, but first I’ll summarize the differences between the content of the Facebook Community Standards page in March 2015 and November 2017.

Briefly, there’s now a video link to help users “learn more about how it works,” or in particular how Facebook decides to remove (or not remove) content; the process is described at a high level, with an emphasis on why rather than how. Facebook’s stated mission has also changed slightly, from “Our mission is to give people the power to share and make the world more open and connected” to “Our mission is to give people the power to build community and bring the world closer together.”

There was also an important addition in the second paragraph of the page stating, “Sometimes we will allow content if newsworthy, significant or important to the public interest – even if it might otherwise violate our standards.” Very little else changed; the only other difference is that the category titled “Nudity” under “Encouraging respectful behavior” became “Adult Nudity & Sexual Activity.” So, the biggest change according to these differences is the power Facebook now grants itself to overrule its own Community Standards. Probably a lot more could be said about this change.

But what actually happens behind the scenes? In the early days of the Internet, these sorts of questions were debated in open forums like the IETF community meetings. The best I could find about Facebook’s Community Standards work (I will admit that I didn’t spend more than an afternoon looking) were two pieces: a post by Bickert and an article by Morse.

Bickert talks about how hard it is to draw the line and how daunting the task turns out to be on a social network as large as Facebook’s. The key sentence in the post for me was, “We don’t always share the details of our policies, because we don’t want to encourage people to find workarounds – but we do publish our Community Standards, which set out what is and isn’t allowed on Facebook, and why.” I encourage you to think about whether this is an acceptable answer to you.

I’m not 100% decided, but I lean toward more transparency. I’d like to know how Facebook filters what I see, especially if I am using Facebook to “see the world through the eyes of others,” as it states in the first paragraph of its Community Standards page. I may not want to see what Facebook filters, but I want to know exactly what it filters, in more detail than the published standards provide. As Bickert says, what is art to one person might be pornography to another.

Finally, in the Morse article, I’m glad to read that Facebook hasn’t replaced the team of humans who do this messy work with AI. As we’ve discussed in this seminar, AI can be even less transparent than people about the decisions it makes. But that’s my take. Yours might be different.

2 Comments

  1. Jakob Gilbert

    November 9, 2017 @ 4:53 am


    Great post, Prof. Smith, and an intriguing use of the Wayback Machine that has gotten me thinking about other comparisons of web pages over time that would be of interest. I definitely agree with you; I lean far toward transparency. Forgive me if I’m wrong, but isn’t a post that has found a “workaround” to the rules for unacceptable content simply…acceptable content? I would assume Facebook is large enough to have strict and thorough behind-the-scenes community guidelines, and, as such, I would expect that any content that follows all of the rules should be considered fine, right? With human monitors checking every post, common sense could dictate when something blatantly disregards the rules yet somehow counts as a “workaround” (though I’m not convinced that a “workaround” to the rules isn’t simply a post that follows the rules). It seems the “workarounds” explanation from Bickert is either untrue or very unclear; then again, I don’t know what ulterior motive Facebook would have to hide its true guidelines (perhaps the real ones have a political bias?). What do you think?

  2. profsmith

    November 9, 2017 @ 1:45 pm


    Good question. It’s not clear to me which “workarounds” really concern Facebook. I assume the concerns are real, and it would have helped the reader if Bickert had given an example where a rule turns into a game of nuanced changes that continue to violate the standard but pass the current, specific rule. That game is much easier to play when the rule has to be coded in software; I can’t come up with a good example where a human is interpreting the rule.
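    To make the coded case concrete, here is a toy illustration (entirely made up; it has nothing to do with Facebook’s actual rules or systems) of how a literal, coded rule invites exactly this game:

    ```python
    # A toy, hypothetical content rule: flag posts matching an exact phrase.
    import re

    BANNED = re.compile(r"\bbuy illegal goods\b", re.IGNORECASE)

    def violates(post):
        """Return True if the post matches the coded rule."""
        return bool(BANNED.search(post))

    print(violates("Buy illegal goods here!"))       # True: the rule fires
    print(violates("Buy 1ll3gal g00ds here!"))       # False: leetspeak slips past
    print(violates("Purchase unlawful goods here"))  # False: synonyms slip past
    ```

    Both “workarounds” still violate the standard behind the rule, yet a human reviewer would catch them immediately. That asymmetry may be part of what Bickert is worried about.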
