{"id":1003,"date":"2014-06-26T11:41:46","date_gmt":"2014-06-26T15:41:46","guid":{"rendered":"http:\/\/blogs.law.harvard.edu\/niftyc\/?p=1003"},"modified":"2014-06-26T16:48:46","modified_gmt":"2014-06-26T20:48:46","slug":"1003","status":"publish","type":"post","link":"https:\/\/archive.blogs.harvard.edu\/niftyc\/archives\/1003","title":{"rendered":"Corrupt Personalization"},"content":{"rendered":"<p>(&#8220;<em>And also Bud Light<\/em>.&#8221;)<\/p>\n<p>In my last two posts\u00a0I&#8217;ve been writing about my attempt to <a href=\"https:\/\/blogs.law.harvard.edu\/niftyc\/archives\/952\">convince a group of sophomores with no background in my field<\/a>\u00a0that there has been a shift to\u00a0<a href=\"http:\/\/socialmediacollective.org\/2014\/03\/25\/show-and-tell-algorithmic-culture\/\">the algorithmic allocation of attention<\/a>\u00a0&#8212; and\u00a0that this is important. In this post I&#8217;ll respond to a student question. My favorite: &#8220;Sandvig says that <strong>algorithms are dangerous<\/strong>, but what are the the most serious repercussions that he envisions?&#8221; What is the coming social media apocalypse\u00a0we should be worried about?<\/p>\n<p><a href=\"http:\/\/www.huffingtonpost.co.uk\/2012\/09\/04\/google-down-brief-search-_n_1853379.html\"><img loading=\"lazy\" decoding=\"async\" class=\"size-medium wp-image-2069 aligncenter\" src=\"http:\/\/socialmediacollective.files.wordpress.com\/2014\/06\/google-flames.jpg?w=300\" alt=\"google flames\" width=\"300\" height=\"125\" \/><\/a><\/p>\n<p>This is an important question because people who study this stuff are <strong>NOT\u00a0as interested <\/strong>in this student question\u00a0as they should be. 
Frankly, we are specialists who study media and computers and things &#8212; therefore we care about how algorithms allocate attention among cultural products almost\u00a0<em>for its own sake.\u00a0<\/em>Because this is the central thing that we study, we don&#8217;t spend a lot of time justifying it.<\/p>\n<p>And our field&#8217;s most common response to the query &#8220;what are the dangers?&#8221; often lacks\u00a0the required\u00a0sense of <em><strong>danger<\/strong><\/em>.\u00a0The most frequent\u00a0response is: &#8220;extensive\u00a0personalization is bad for democracy.&#8221; (a.k.a. Pariser&#8217;s &#8220;<a href=\"http:\/\/www.amazon.com\/The-Filter-Bubble-Personalized-Changing\/dp\/0143121235\">filter bubble<\/a>,&#8221; <a href=\"http:\/\/press.princeton.edu\/titles\/7014.html\">Sunstein&#8217;s<\/a>\u00a0&#8220;egocentric&#8221; Internet, and so on). This framing lacks a certain house-on-fire urgency, doesn&#8217;t it?<\/p>\n<blockquote><p>(sarcastic tone:) &#8220;Oh, no! <strong>I&#8217;m getting to watch, hear, and read exactly what I want<\/strong>. Help me! Somebody do something!&#8221;<\/p><\/blockquote>\n<p>Sometimes\u00a0(as <a href=\"http:\/\/press.princeton.edu\/chapters\/s8781.html\">Hindman <\/a>points out) the contention is the opposite, that Internet-based concentration is bad for democracy. \u00a0But remember that I&#8217;m\u00a0not speaking to political science majors here. The average person may not be as moved by <strong>an abstract, long-term peril to democracy<\/strong> as the average political science professor. As <a href=\"http:\/\/evident.com\/\">David Weinberger<\/a> once said after I warned\u00a0about the increasing reliance on recommendation algorithms, &#8220;<strong>So what?<\/strong>&#8221; Personalization sounds like a good thing.<\/p>\n<p>As a side note, the second most frequent response I see is that <a href=\"http:\/\/io9.com\/the-10-algorithms-that-dominate-our-world-1580110464\">algorithms are now everywhere<\/a>. 
And they work differently than what came before. This also lacks a required sense of danger! Yes, they&#8217;re everywhere, but <strong>if they are a <em>good<\/em> thing<\/strong>&#8230;<\/p>\n<p>So I really like this question &#8220;what are the most serious repercussions?&#8221; because I think there <em>are<\/em> some elements of the shift to attention-sorting algorithms\u00a0that are <strong>genuinely &#8220;dangerous<\/strong>.&#8221; I can think of at least two, probably more, and they don&#8217;t get enough attention. In the rest of this post I&#8217;ll spell out the first one, which I&#8217;ll call &#8220;<strong>corrupt personalization<\/strong>.&#8221;<\/p>\n<p>Here we go.<\/p>\n<p>Common-sense reasoning about algorithms and culture tells us that the purveyors of personalized content have the same interests we do. That is, if Netflix started recommending only\u00a0movies we hate\u00a0or Google started returning only useless search results we would stop using them. However: <strong>Common sense is wrong in this case<\/strong>. Our interests are often not the same as those of\u00a0the providers of these selection algorithms. 
\u00a0As in my last post, let&#8217;s work through a few concrete examples to make the case.<\/p>\n<p>In this post I&#8217;ll use <strong>Facebook<\/strong> examples, but the general problem of corrupt personalization is present on <strong>all of our widely used media platforms<\/strong> that employ the algorithmic selection of content.<\/p>\n<p><strong>(1) Facebook &#8220;Like&#8221; Recycling<\/strong><\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter wp-image-2076 size-full\" src=\"http:\/\/socialmediacollective.files.wordpress.com\/2014\/06\/screen-shot-2012-12-10-at-12-44-34-pm.png\" alt=\"Screen Shot 2012-12-10 at 12.44.34 PM\" width=\"397\" height=\"43\" \/><\/p>\n<p>(Image from ReadWriteWeb.)<\/p>\n<p>On Facebook, in addition to advertisements along the side of the interface, perhaps you&#8217;ve noticed &#8220;featured,&#8221; &#8220;sponsored,&#8221; or &#8220;suggested&#8221; stories that appear inside your news feed, intermingled with status updates from your friends. It could be argued <strong>that this is not in your interest<\/strong>\u00a0as a user (did you ever say, &#8220;gee, I&#8217;d like ads to look just like messages from my friends&#8221;?), but I have bigger fish to fry.<\/p>\n<p>Many ads on Facebook resemble status updates in that there can be <strong>messages endorsing the ads with &#8220;likes.&#8221;<\/strong> For instance, here is an older screenshot from ReadWriteWeb:<\/p>\n<p><a href=\"https:\/\/socialmediacollective.files.wordpress.com\/2014\/06\/pages-you-may-like-on-facebook.png\"><img loading=\"lazy\" decoding=\"async\" class=\"size-medium wp-image-2059 aligncenter\" src=\"http:\/\/socialmediacollective.files.wordpress.com\/2014\/06\/pages-you-may-like-on-facebook.png?w=282\" alt=\"pages you may like on facebook\" width=\"282\" height=\"300\" \/><\/a><\/p>\n<p>Another example: a &#8220;suggested&#8221; post was mixed into my news feed just this morning, recommending World Cup coverage on Facebook itself. 
It&#8217;s a Facebook ad for Facebook, in other words. \u00a0It had this <strong>intriguing addendum<\/strong>:<\/p>\n<p><a href=\"https:\/\/socialmediacollective.files.wordpress.com\/2014\/06\/censored-likes-facebook1.png\"><img loading=\"lazy\" decoding=\"async\" class=\"size-medium wp-image-2061 aligncenter\" src=\"http:\/\/socialmediacollective.files.wordpress.com\/2014\/06\/censored-likes-facebook1.png?w=300\" alt=\"CENSORED likes facebook\" width=\"300\" height=\"30\" \/><\/a><\/p>\n<p>So, wait&#8230; I have hundreds of friends and <strong>eleven of them &#8220;like&#8221; Facebook<\/strong>? \u00a0Did they go to <a href=\"http:\/\/www.facebook.com\/\">http:\/\/www.facebook.com<\/a> and click on a button like this:<\/p>\n<p><a href=\"https:\/\/socialmediacollective.files.wordpress.com\/2014\/06\/facebook-like-button-magnified.png\"><img loading=\"lazy\" decoding=\"async\" class=\"size-full wp-image-2064 aligncenter\" src=\"http:\/\/socialmediacollective.files.wordpress.com\/2014\/06\/facebook-like-button-magnified.png\" alt=\"Facebook like button magnified\" width=\"224\" height=\"108\" \/><\/a><\/p>\n<p>But facebook.com doesn&#8217;t even have a &#8220;Like&#8221; button! \u00a0Did they <strong>go to Facebook&#8217;s own Facebook page<\/strong> (yes, <a href=\"https:\/\/www.facebook.com\/facebook\">there is one<\/a>)\u00a0and click &#8220;Like&#8221;? I know these people and that seems unlikely.\u00a0And does Nicolala really <strong><em>like<\/em> <\/strong>Walmart? Hmmm&#8230;<\/p>\n<p>What does this &#8220;like&#8221; statement mean? Welcome to <strong>the strange world of &#8220;like&#8221; recycling<\/strong>. Facebook has defined &#8220;like&#8221; in ways that depart from English usage. 
\u00a0For instance, in the past Facebook has determined that:<\/p>\n<ol>\n<li><strong>Anyone who clicks on a &#8220;like&#8221; button is considered to have &#8220;liked&#8221; all future content from that source.<\/strong> So if you\u00a0clicked a &#8220;like&#8221; button because someone shared a &#8220;Fashion Don&#8217;t&#8221; from Vice magazine, you may be surprised when your dad\u00a0logs into Facebook three years later and is shown a current\u00a0sponsored story from Vice.com like &#8220;Happy Masturbation Month!&#8221; or &#8220;How to Make it in Porn&#8221; with the endorsement that you like it. (Vice.com example is from <a href=\"http:\/\/bureauofminds.tumblr.com\/post\/41028512430\/facebook-is-impersonating-people-without-their\">Craig Condon<\/a>\u00a0[NSFW].)<\/li>\n<li><strong>Anyone who &#8220;likes&#8221; a comment on a shared link is considered to &#8220;like&#8221; wherever that link points to. \u00a0<\/strong>a.k.a.\u00a0&#8220;liking a share.&#8221; So if you see\u00a0a (real) FB status update from a (real) friend and it says: &#8220;Yuck! 
The McLobster is a disgusting product idea!&#8221; and your (real) friend includes a (real) link like <a href=\"http:\/\/www.mcdonalds.ca\/ca\/en\/menu\/full_menu\/sandwiches\/mclobster.html\"><span style=\"text-decoration: underline\">this one<\/span><\/a>\u00a0&#8212; that means if you clicked &#8220;like,&#8221; your friends may see McDonald&#8217;s ads in the future that include the phrase &#8220;(Your Name) likes McDonalds.&#8221; (This example is from ReadWriteWeb.)<\/li>\n<\/ol>\n<p><a href=\"https:\/\/socialmediacollective.files.wordpress.com\/2014\/06\/fauxlike_mcdonalds.png\"><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter wp-image-2065 size-full\" src=\"http:\/\/socialmediacollective.files.wordpress.com\/2014\/06\/fauxlike_mcdonalds.png\" alt=\"fauxLike_mcdonalds\" width=\"364\" height=\"256\" \/><\/a><\/p>\n<p>This has led to some interesting results, like <a href=\"http:\/\/readwrite.com\/2012\/12\/11\/why-are-dead-people-liking-stuff-on-facebook#awesm=~oId1UMU1aKHzRW\">dead people &#8220;liking&#8221; current news stories<\/a>\u00a0on Facebook.<\/p>\n<p>There is already controversy about advertiser &#8220;like&#8221; inflation, &#8220;like&#8221; spam, and fake &#8220;likes&#8221; &#8212; and these things may be a problem too, but that&#8217;s not what we are talking about here. \u00a0In the examples above <strong>the system is working as Facebook designed it to<\/strong>. A further caveat: note that\u00a0the definition of &#8220;like&#8221; in Facebook&#8217;s software changes periodically and when they are sued. Facebook now has <a href=\"https:\/\/www.facebook.com\/settings?tab=ads&amp;section=social&amp;view\">an opt-out setting<\/a> for the above two &#8220;features.&#8221;<\/p>\n<p>But these incendiary examples are exceptional fiascoes &#8212; on the whole the system probably works well. 
You likely\u00a0didn&#8217;t know that your &#8220;like&#8221; clicks <strong>are\u00a0merrily producing ads on your friends&#8217; pages and in your name<\/strong> because you cannot see them. \u00a0These &#8220;stories&#8221; do not appear on your news feed and cannot be individually deleted.<\/p>\n<p>Unlike the examples from\u00a0<a href=\"http:\/\/socialmediacollective.org\/2014\/03\/25\/show-and-tell-algorithmic-culture\/\">my last post<\/a>,\u00a0you can&#8217;t quickly\u00a0reproduce these results with certainty on your own account. Still, if you want to try, make a new Facebook account under a fake name (warning! <a href=\"https:\/\/www.facebook.com\/help\/249092175207621\">dangerous!<\/a>) and friend\u00a0your real account. Then use the new account to watch your status updates.<\/p>\n<p>Why would Facebook do this? Obviously it is a controversial practice that is not going to be popular with users. Yet Facebook&#8217;s business model is <strong>to produce attention for advertisers, not to help you &#8212; silly rabbit<\/strong>. So they must have felt\u00a0that using your reputation to produce more ad traffic from your friends was worth the risk of irritating you. Or perhaps they thought that the practice could be successfully hidden from users &#8212; that strategy has mostly worked!<\/p>\n<p>In sum, this is a personalization scheme that does not serve your goals; it serves Facebook&#8217;s goals at your expense.<\/p>\n<p><strong>(2) &#8220;Organic&#8221;\u00a0Content<\/strong><\/p>\n<p>This second group of examples concerns content that we consider to be &#8220;not advertising,&#8221; a.k.a. 
<strong>&#8220;organic&#8221; content.<\/strong> Funnily enough, algorithmic culture has produced this\u00a0new\u00a0use of the word &#8220;organic&#8221; &#8212; but has also made the boundary between &#8220;advertising&#8221; and &#8220;not advertising&#8221; very blurry.<\/p>\n<p><a href=\"http:\/\/themetapicture.com\/try-organic-food\/\"><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter wp-image-2110 size-medium\" src=\"http:\/\/socialmediacollective.files.wordpress.com\/2014\/06\/funny-organic-food-ad.jpg?w=300\" alt=\"funny-organic-food-ad\" width=\"300\" height=\"207\" \/><\/a><\/p>\n<p>&nbsp;<\/p>\n<p>The general problem is that there are many ways in which algorithms act as <strong>mixing valves between things that can be easily valued with money <\/strong>(like ads)<strong> and things that can&#8217;t<\/strong>. And this kind of mixing is a <em>normative<\/em> problem (what should we do) and not a <em>technical<\/em> problem (how do we do it).<\/p>\n<p>For instance, for years Facebook has encouraged nonprofits, community-based organizations, student clubs, other groups, and really anyone to host content on facebook.com. \u00a0If an organization creates a Facebook page for itself, the managers can update the page as though it were a profile.<\/p>\n<p>Most <strong>page managers expect that people who &#8220;like&#8221; that page get to see the updates<\/strong>&#8230; which was true until January of this year. At that time Facebook modified its algorithm so that <a href=\"http:\/\/www.zoeticamedia.com\/more-changes-for-facebook-admins-text-updates-wont-work-as-well-anymore\">text updates from organizations were not widely shared<\/a>. 
This is interesting for our purposes because Facebook clearly states that <strong>it wants page operators to run Facebook ad campaigns<\/strong>, and not to count on getting traffic from &#8220;organic&#8221; status updates, as it will no longer distribute as many of them.<\/p>\n<p>This change likely has a very differential effect on, say, <a href=\"https:\/\/www.facebook.com\/nike\">Nike<\/a>&#8216;s Facebook page, <a href=\"https:\/\/www.facebook.com\/Biercamp\">a small local business<\/a>&#8216;s Facebook page, <a href=\"https:\/\/www.facebook.com\/greenpeace.international\">Greenpeace International<\/a>&#8216;s Facebook page, and <a href=\"https:\/\/www.facebook.com\/pages\/Unitarian-Universalist-Church-of-Urbana-Champaign\/156831712457\">a small local church congregation<\/a>&#8216;s Facebook page.\u00a0If you start a Facebook page for a school club, you might be surprised that you are spending your labor writing status updates that are never shown to anyone. <strong>Maybe you should buy an ad.\u00a0<\/strong>Here&#8217;s an analytic for a page I manage:<\/p>\n<p><a href=\"https:\/\/socialmediacollective.files.wordpress.com\/2014\/06\/this-week-page-likes-facebook.png\"><img loading=\"lazy\" decoding=\"async\" class=\"size-full wp-image-2067 aligncenter\" src=\"http:\/\/socialmediacollective.files.wordpress.com\/2014\/06\/this-week-page-likes-facebook.png\" alt=\"this week page likes facebook\" width=\"147\" height=\"170\" \/><\/a><\/p>\n<p>&nbsp;<\/p>\n<p>The impact isn&#8217;t just about size &#8212; at some level businesses might expect to have to insert themselves into conversations via persuasive advertising that they pay for, but it is not as clear that people expect Facebook to work this way for their local church or other domains of their lives. It&#8217;s as if on Facebook, people were using\u00a0the yellow pages but they thought they were using the white pages. \u00a0And also <strong>there are no white pages<\/strong>.<\/p>\n<p>(Oh, wait. 
<a href=\"https:\/\/twitter.com\/AsaTait\/status\/446723785341677568\">No one knows what yellow pages and white pages are anymore<\/a>. Scratch that reference, then.)<\/p>\n<p>No need to stop here, in the future perhaps Facebook can monetize my family relationships. It could suggest that if I really want anyone to know about the birth of my child, or<strong> I really want my &#8220;insightful&#8221; status updates to reach anyone<\/strong>, I should\u00a0turn to Facebook advertising.<\/p>\n<p>Let me also emphasize that this mixing problem extends to the <em>content<\/em> of our personal social media conversations as well. A few months back, I posted a Facebook status update that I thought was humorous. I shared a link highlighting the <a href=\"http:\/\/www.amazon.com\/BIC-Cristal-1-0mm-Black-MSLP16-Blk\/product-reviews\/B004F9QBE6\/ref=dpx_acr_txt?showViewpoints=1\">hilarious product reviews<\/a> for the Bic &#8220;<strong>Cristal For Her<\/strong>&#8221; ballpoint pen on Amazon. It&#8217;s a pen designed just for women.<\/p>\n<p><a href=\"https:\/\/socialmediacollective.files.wordpress.com\/2014\/06\/bic-crystal-for-her.jpg\"><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter wp-image-2083 size-medium\" src=\"http:\/\/socialmediacollective.files.wordpress.com\/2014\/06\/bic-crystal-for-her.jpg?w=300\" alt=\"bic crystal for her\" width=\"300\" height=\"300\" \/><\/a><\/p>\n<p>The funny thing is that I happened to look at a friend of mine&#8217;s Facebook feed over their shoulder, and <strong>my status update didn&#8217;t go away<\/strong>. It remained, pegged at the top of my friend&#8217;s news feed, for as long as 14 days in one instance. What great exposure for my humor, right? 
But it did seem a little odd&#8230; I queried my other friends on Facebook and some confirmed that the post was also pegged at the top of their news feed.<\/p>\n<p>I was unknowingly participating in another Facebook program that <a href=\"http:\/\/mashable.com\/2011\/01\/25\/facebook-sponsored-stories\/\">converts organic status updates into ads<\/a>. It does this by changing their order in the news feed and adding the text &#8220;Sponsored&#8221; in light gray, which is very hard to see. Otherwise, at least some of these updates are not changed. I suspect <strong>Facebook&#8217;s algorithm thought I was advertising Amazon<\/strong> (since that&#8217;s where the link pointed), but I am not sure.<\/p>\n<p>This is similar to Twitter&#8217;s &#8220;Promoted Tweets,&#8221; but there is one big difference. \u00a0In the Facebook case\u00a0<strong>the advertiser promotes content &#8212; my content &#8212; that they did not write<\/strong>. In effect Facebook is re-ordering your conversations with your friends and family on the basis of whether or not someone mentioned\u00a0Coke, Levi&#8217;s, and Anheuser-Busch (confirmed advertisers\u00a0in the program).<\/p>\n<p>Sounds like a great <strong>personal social media strategy<\/strong> there&#8211;if you really want people to know about your forthcoming wedding, maybe just drop a few names? Luckily the algorithms aren&#8217;t too clever about this yet, so you can mix up the word order for humorous effect.<\/p>\n<blockquote><p>(Facebook status update:) &#8220;I am so delighted to be engaged to this wonderful woman that I am sitting here in my\u00a0Michelob drinking a Docker&#8217;s Khaki Collection. And also Coke.&#8221;<\/p><\/blockquote>\n<p>Be sure to use links. 
The interesting thing about this mixing of the commercial and non-commercial is that it\u00a0sounds to my ears like some sort of <strong>corny, unrealistic science fiction scenario<\/strong>, and yet with the current Facebook platform I believe the above example would work. We are living in the future.<\/p>\n<p>So to recap, if Nike makes a Facebook page and posts status updates to it, that&#8217;s &#8220;organic&#8221; content because they did not pay Facebook to distribute it. Although\u00a0any rational human being would see it as an ad. If my school group does the same thing, that&#8217;s also organic content, but they are encouraged to buy distribution &#8212; <strong>which would make it inorganic<\/strong>. If I post a status update or click &#8220;like&#8221; in reaction to something that happens in my life and that happens to involve a commercial product, my action\u00a0<strong>starts out as organic, but then it becomes inorganic<\/strong> (paid for) because a company can buy my words and likes and show them to other people without telling me. Got it? This paragraph feels like we are rethinking\u00a0<a href=\"http:\/\/www.lsa.umich.edu\/cg\/cg_detail.aspx?content=2010CHEM402100&amp;termArray=x_xx_2010\">CHEM 402<\/a>.<\/p>\n<p>The upshot is that control of the content selection algorithm is used by Facebook to get people <strong>to pay for things they wouldn&#8217;t expect to pay for<\/strong>, and to show people personalized things that they don&#8217;t think are paid for. But these things were in fact paid for. \u00a0In sum, this is again a scheme that does not serve your goals; it serves Facebook&#8217;s goals at your expense.<\/p>\n<p><strong>The Danger: Corrupt Personalization<\/strong><\/p>\n<p>With these concrete examples behind us, I can now more clearly answer this student question. 
What are <strong>the most serious repercussions<\/strong> of the algorithmic allocation of attention?<\/p>\n<p>I&#8217;ll call this first repercussion &#8220;<strong>corrupt personalization<\/strong>&#8221; after <a href=\"http:\/\/en.wikipedia.org\/wiki\/C._Edwin_Baker\">C. Edwin Baker<\/a>. (Baker, a distinguished legal philosopher, coined the phrase &#8220;corrupt segmentation&#8221; in 1998 as\u00a0an extension of the theories of philosopher\u00a0<a href=\"http:\/\/en.wikipedia.org\/wiki\/J%C3%BCrgen_Habermas\">J\u00fcrgen Habermas<\/a>.)<\/p>\n<p>Here&#8217;s how it works: <strong>You have legitimate interests that we&#8217;ll call &#8220;authentic.&#8221;<\/strong> These interests arise from your values, your community, your work, your family, how you spend your time, and so on. A good example might be that as a person who is enrolled in college\u00a0you might identify with the\u00a0category &#8220;student,&#8221; among your many other affiliations. As a student, you might be authentically interested in <a href=\"http:\/\/www.freep.com\/article\/20140619\/NEWS06\/306190176\/U-M-tuition-increase\">an upcoming tuition increase<\/a> or, more broadly, about\u00a0the contention that &#8220;<a href=\"http:\/\/chronicle.com\/article\/The-Miseducation-of-America\/147227\/?cid=at&amp;utm_source=at&amp;utm_medium=en\">there are powerful forces at work in our society that are actively hostile to the college ideal<\/a>.&#8221;<\/p>\n<p>However, you might also be authentically interested in the fact that your cousin is getting married. 
Or in <strong>pictures of kittens<\/strong>.<\/p>\n<p><a href=\"https:\/\/socialmediacollective.files.wordpress.com\/2014\/06\/grumpy-cat-meme-610x405.png\"><img loading=\"lazy\" decoding=\"async\" class=\"size-medium wp-image-2095 aligncenter\" src=\"http:\/\/socialmediacollective.files.wordpress.com\/2014\/06\/grumpy-cat-meme-610x405.png?w=300\" alt=\"Grumpy-Cat-meme-610x405\" width=\"300\" height=\"199\" \/><\/a><\/p>\n<p>Corrupt personalization is the process by which <strong>your attention is drawn to interests that are not your own<\/strong>. This is a little tricky because it is impossible to clearly define an &#8220;authentic&#8221; interest. However, let&#8217;s put that off for the moment.<\/p>\n<p>In the prior examples we saw\u00a0some (I hope) obvious <strong>places where my interests diverged from\u00a0that of algorithmic social\u00a0media systems<\/strong>. Highlights for me were:<\/p>\n<ul>\n<li>When I express my opinion about something to my friends and family, <strong>I do not want that\u00a0opinion re-sold<\/strong> without my knowledge or consent.<\/li>\n<li>When I explicitly endorse something, <strong>I don&#8217;t want that endorsement applied to other things<\/strong> that I did not endorse.<\/li>\n<li>If I want to read a list of personalized status updates about my friends and family, <strong>I do not want my friends and family sorted by how often they mention advertisers<\/strong>.<\/li>\n<li>If a list of things is chosen for me, I want the results organized by some measure of goodness for me, <strong>not by how much money<\/strong>\u00a0someone has paid.<\/li>\n<li>I want<strong>\u00a0paid content<\/strong> to be clearly identified.<\/li>\n<li>I do not want my information technology to sort my life into commercial and non-commercial content and\u00a0<strong>systematically de-emphasize the noncommercial things that I do<\/strong>, or turn these things\u00a0toward commercial purposes.<\/li>\n<\/ul>\n<p>More generally, I think the danger of 
corrupt personalization is manifest in <strong>three ways<\/strong>.<\/p>\n<ol>\n<li><strong>Things that are not necessarily commercial become commercial\u00a0<\/strong>because of the organization of the system.\u00a0(Merton called this &#8220;<em>pseudo-gemeinschaft<\/em>&#8221;; Habermas called it &#8220;colonization of the lifeworld.&#8221;)<\/li>\n<li><strong>Money is used as a proxy for &#8220;best&#8221;<\/strong> and it does not work. That is, those with the most money to spend can prevail over those with the most useful information. The creation of a salable audience takes priority over your\u00a0authentic interests. (Smythe called this the &#8220;audience commodity&#8221;; it is Baker&#8217;s &#8220;market filter.&#8221;)<\/li>\n<li>Over time, if people are offered things that are not aligned with their interests\u00a0often enough, <strong>they can be taught what to want<\/strong>. That is, they may come to wrongly believe that these are their authentic interests, and it may be difficult to see the world any other way. (Similar to\u00a0Chomsky and Herman&#8217;s [not Lippmann&#8217;s] arguments about &#8220;manufacturing consent.&#8221;)<\/li>\n<\/ol>\n<p>There is nothing inherent in the technologies of algorithmic allocation that is doing this to us; instead, <strong>the economic organization of the system<\/strong> is producing these pressures. In fact, we could design a system to support our authentic interests, but we would then need to fund it. (Thanks, late capitalism!)<\/p>\n<p>To conclude, let&#8217;s get some historical perspective. What are the other options, anyway? If cultural selection is governed by computer algorithms now, you might answer, &#8220;<strong>who cares?&#8221;<\/strong> It&#8217;s always going to be governed somehow. 
If I said\u00a0in a talk about &#8220;algorithmic culture&#8221; that I don&#8217;t like <strong>the Netflix recommender algorithm<\/strong>, what is supposed to replace it?<\/p>\n<p>This all sounds pretty bad, so you might think I am\u00a0asking for a return to &#8220;pre-algorithmic&#8221; culture: Let&#8217;s <strong>reanimate the corpse<\/strong> of <a href=\"http:\/\/en.wikipedia.org\/wiki\/Louis_B._Mayer\">Louis B. Mayer<\/a>\u00a0and he can decide what I watch. That doesn&#8217;t seem good either\u00a0and I&#8217;m not recommending it. We&#8217;ve always had selection systems and we could even call some of the earlier ones &#8220;algorithms&#8221; if we want to. \u00a0However, we are constructing\u00a0something new and largely unprecedented here and it isn&#8217;t ideal. It isn&#8217;t that I think algorithms are inherently dangerous, or bad &#8212; quite the contrary. To me this seems like <strong>a case of squandered potential<\/strong>.<\/p>\n<p>With algorithmic culture, computers and algorithms are allowing a new level of real-time personalization and content selection on an individual basis that just wasn&#8217;t possible before. But rather than use these tools to serve our authentic interests, we have built a system that often <strong>serves a commercial interest that is often at odds with our interests<\/strong>\u00a0&#8212; that&#8217;s\u00a0<strong>corrupt personalization<\/strong>.<\/p>\n<p>If I use the dominant forms of communication online today (Facebook, Google, Twitter, YouTube, etc.) I can expect content customized for others to use my name and my words without my consent, in ways I wouldn&#8217;t approve of. 
<strong>Content &#8220;personalized&#8221;\u00a0for me includes material I don&#8217;t want, and obscures material that I do want.\u00a0<\/strong>And it does so in a way that I may not be aware of.<\/p>\n<p>This isn&#8217;t an abstract problem\u00a0like a long-term threat to democracy;<strong> it&#8217;s more like a mugging<\/strong> &#8212; or at least a confidence game or a fraud. It&#8217;s violence being done to you right now, under your nose. Just click &#8220;like.&#8221;<\/p>\n<p>In answer to your question, dear student, that&#8217;s <strong>my first danger<\/strong>.<\/p>\n<p>* * *<\/p>\n<p><strong>ADDENDUM:<\/strong><\/p>\n<p>This blog post is already too long, but here is a TL;DR addendum for people who already know about all this stuff.<\/p>\n<p>I&#8217;m calling this corrupt personalization because<strong> I can&#8217;t just apply\u00a0Baker&#8217;s excellent ideas about corrupt segments<\/strong>\u00a0&#8212; the world has changed since he wrote them. Although this post&#8217;s\u00a0reasoning is an extension of Baker, it is not a straightforward extension.<\/p>\n<p>Algorithmic attention is a big deal because we used to think about media and identity using <em><strong>categories<\/strong>,<\/em> but the algorithms in wide use are not natively organized that way. Baker&#8217;s ideas were premised on the difference between authentic and inauthentic categories (&#8220;segments&#8221;), yet <strong>segments are just not that important anymore<\/strong>.\u00a0<a href=\"http:\/\/www.ie.edu\/university\/studies\/faculty\/fernando-bermejo\">Bermejo <\/a>calls this the era of\u00a0<em>post-demographics<\/em>.<\/p>\n<p>Advertisers used to\u00a0group demographics together to make audiences comprehensible, but it may no longer be necessary\u00a0to buy and sell demographics or categories as <strong>they are a crude proxy for purchasing behavior<\/strong>. 
If I want to sell a Subaru, why buy access to &#8220;<a href=\"http:\/\/www.claritas.com\/MyBestSegments\/Default.jsp?ID=37&amp;id1=1027&amp;id2=12\">Brite Lights, Li&#8217;l City<\/a>&#8221; (My PRIZM marketing demographic from the 1990s) when I can\u00a0directly\u00a0detect &#8220;intent to purchase a station wagon&#8221; or &#8220;shopping for a Subaru right now&#8221;? This complicates Baker&#8217;s idea of authentic segments quite a bit. See also Gillespie&#8217;s concept of <a href=\"http:\/\/www.tarletongillespie.org\/essays\/Gillespie%20-%20The%20Relevance%20of%20Algorithms.pdf\">calculated publics<\/a>.<\/p>\n<p>Also Baker was writing in an era where content was inextricably linked to advertising because it was not feasible to decouple them. But today <strong>algorithmic attention sorting has often completely decoupled advertising from content<\/strong>. Online we see ads from networks that are based on user behavior over time, rather than what content the user is looking at right now. The relationship between advertising support and content is therefore more subtle than in the previous era, and this bears more\u00a0investigation.<\/p>\n<p>Okay, okay <strong>I&#8217;ll stop<\/strong> now.<\/p>\n<p><em>(This post was cross-posted to <a href=\"http:\/\/socialmediacollective.org\/2014\/06\/26\/corrupt-personalization\/\">The Social Media Collective<\/a>.)<\/em><\/p>\n","protected":false},"excerpt":{"rendered":"<p>(&#8220;And also Bud Light.&#8221;) In my last two posts\u00a0I&#8217;ve been writing about my attempt to convince a group of sophomores with no background in my field\u00a0that there has been a shift to\u00a0the algorithmic allocation of attention\u00a0&#8212; and\u00a0that this is important. In this post I&#8217;ll respond to a student question. 
My favorite: &#8220;Sandvig says that algorithms [&hellip;]<\/p>\n","protected":false},"author":2132,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_jetpack_newsletter_access":"","_jetpack_dont_email_post_to_subs":false,"_jetpack_newsletter_tier_id":0,"_jetpack_memberships_contains_paywalled_content":false,"_jetpack_memberships_contains_paid_content":false,"footnotes":"","jetpack_publicize_message":"","jetpack_publicize_feature_enabled":true,"jetpack_social_post_already_shared":false,"jetpack_social_options":{"image_generator_settings":{"template":"highway","default_image_id":0,"font":"","enabled":false},"version":2},"jetpack_post_was_ever_published":false},"categories":[1321,261],"tags":[],"class_list":["post-1003","post","type-post","status-publish","format-standard","hentry","category-research","category-teaching"],"jetpack_publicize_connections":[],"jetpack_featured_media_url":"","jetpack_sharing_enabled":true,"jetpack_shortlink":"https:\/\/wp.me\/s4M7Bm-1003","_links":{"self":[{"href":"https:\/\/archive.blogs.harvard.edu\/niftyc\/wp-json\/wp\/v2\/posts\/1003","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/archive.blogs.harvard.edu\/niftyc\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/archive.blogs.harvard.edu\/niftyc\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/archive.blogs.harvard.edu\/niftyc\/wp-json\/wp\/v2\/users\/2132"}],"replies":[{"embeddable":true,"href":"https:\/\/archive.blogs.harvard.edu\/niftyc\/wp-json\/wp\/v2\/comments?post=1003"}],"version-history":[{"count":4,"href":"https:\/\/archive.blogs.harvard.edu\/niftyc\/wp-json\/wp\/v2\/posts\/1003\/revisions"}],"predecessor-version":[{"id":1009,"href":"https:\/\/archive.blogs.harvard.edu\/niftyc\/wp-json\/wp\/v2\/posts\/1003\/revisions\/1009"}],"wp:attachment":[{"href":"https:\/\/archive.blogs.harvard.edu\/niftyc\/wp-json\/wp\/v2\/media?parent=1003"}],"wp:term":[{"t
axonomy":"category","embeddable":true,"href":"https:\/\/archive.blogs.harvard.edu\/niftyc\/wp-json\/wp\/v2\/categories?post=1003"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/archive.blogs.harvard.edu\/niftyc\/wp-json\/wp\/v2\/tags?post=1003"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}