
Oh Facebook.

October 29th, 2016

A couple of weeks ago I attended a talk given by a Quora data scientist. He described how the website generates suggested content a user might be interested in. For instance, if user A and user B both like books and poems but dislike movies and music, and Quora knows that user A dislikes videos, it is then less likely to suggest videos to user B. This algorithm seemed incredibly efficient at the time. After all, wouldn’t our feeds be so much nicer if everything were tailored to our interests? However, after our last meeting, I realized that this process skews us in a certain direction and accelerates our path along it. To put this in the context of politics: if someone starts as a moderate leaning left, then as he is exposed to more and more recommended articles arguing in favor of liberal platforms, he may become more and more liberal, which in turn induces more suggested content from the left and creates a snowball effect in the long run. While this may not be the intent of websites like Quora or Facebook when they show a user similar articles, it is nonetheless a byproduct of the continual domino effect of such automated suggestions.
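To make the idea concrete, here is a minimal sketch of the user-similarity logic described above. The function names, the preference data, and the scoring scheme are all my own hypothetical simplifications for illustration; this is not Quora's actual algorithm.

```python
def similarity(prefs_a, prefs_b):
    """Score how alike two users are: +1 per shared topic where they
    agree (both like or both dislike), -1 where they disagree,
    normalized by the number of shared topics."""
    shared = set(prefs_a) & set(prefs_b)
    if not shared:
        return 0.0
    agree = sum(1 if prefs_a[t] == prefs_b[t] else -1 for t in shared)
    return agree / len(shared)

def predict(target, others, topic):
    """Predict whether `target` will like `topic` by averaging the
    opinions of similar users, weighted by their similarity."""
    num = den = 0.0
    for prefs in others:
        if topic not in prefs:
            continue
        w = similarity(target, prefs)
        if w <= 0:
            continue  # ignore dissimilar users
        num += w * (1 if prefs[topic] else -1)
        den += w
    return num / den if den else 0.0

# Hypothetical users matching the example in the text.
user_a = {"books": True, "poems": True, "movies": False,
          "music": False, "videos": False}
user_b = {"books": True, "poems": True, "movies": False,
          "music": False}

score = predict(user_b, [user_a], "videos")
# A and B agree on every topic they share, so A's dislike of videos
# pulls B's predicted score negative: videos get suggested less often.
```

The snowball effect falls out of the same loop: every time B acts on a suggestion, B's preference dictionary grows more similar to the users who generated it, which strengthens those users' weight in the next round of predictions.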

However, something that troubled me more than the suggested content placed by Facebook was its research on the “I Voted” button. As a user of Facebook, I would expect my social media experience to be the same as any other user’s. That is, Facebook usage inherently implies to me an assumption of equality across the user experience: I would have access to the same features as John Smith, and I could interact with my friends on Facebook in the same ways that John Smith can interact with his. However, the research on the “I Voted” button took away this equality. While some people could see whether their friends had voted, or were shown a differently phrased version of the button, others, specifically those in the control group, had no access to, or awareness of, those features. As this was done for the purpose of Facebook’s data collection, it makes me uncomfortable that I could be denied certain parts of the social media experience in order to prove a statistic.

This ties in nicely with another experiment mentioned in the articles, in which Facebook randomly altered the emotional content shown to some 700,000 users. Here, it is even clearer that these research experiments disrupt the equality of the user experience. Furthermore, this shows how much control Facebook has over its users, something that perhaps comes as a surprise to users who often assume that they are the more powerful party in the relationship.

One Response to “Oh Facebook.”

  1. Mike Smith said:

    Any thoughts about how you might mitigate the activities that you’ve mentioned and concern you? They concern me too.
