{"id":196,"date":"2007-02-24T19:57:00","date_gmt":"2007-02-24T23:57:00","guid":{"rendered":"http:\/\/blogs.law.harvard.edu\/ugasser\/2007\/02\/24\/social-signaling-theory-and-cyberspac"},"modified":"2007-02-25T12:27:07","modified_gmt":"2007-02-25T16:27:07","slug":"social-signaling-theory-and-cyberspace","status":"publish","type":"post","link":"https:\/\/archive.blogs.harvard.edu\/ugasser\/2007\/02\/24\/social-signaling-theory-and-cyberspace\/","title":{"rendered":"Social Signaling Theory and Cyberspace"},"content":{"rendered":"<p>Yesterday, I attended a Berkman workshop on \u201cAuthority and Authentication in Physical and Digital Worlds: An Evolutionary Social Signaling Perspective. <a href=\"http:\/\/smg.media.mit.edu\/people\/Judith\/\">Professor Judith Donath<\/a> from the MIT Media Lab and Berkman Center\u2019s Senior Fellow <a href=\"http:\/\/cyber.law.harvard.edu\/home\/john_clippinger\">Dr. John Clippinger<\/a> were presenting fascinating research on trust, reputation, and digital identities from the perspective of signaling theory \u2013 a theory that has been developed in evolutionary biology and has also played an important role in economics. I had the pleasure to serve as a respondent. Here are the three points I tried to make (building upon fruitful prior exchanges with <a href=\"http:\/\/www.fir.unisg.ch\/org\/fir\/web.nsf\/wwwPubInhalteEng\/Nadine+Blaettler?opendocument\">Nadine Blaettler<\/a> at our <a href=\"http:\/\/www.fir.unisg.ch\/org\/fir\/web.nsf\/wwwPubhomepage\/webhomepageeng?opendocument\">Center<\/a>).<\/p>\n<p>The starting point is the observation that social signals \u2013 aimed at (a) indicating a certain hidden quality (e.g. \u201cwould you be a good friend?,\u201d \u201care you smart?\u201d) and (b) changing the believes or actions of its recipient \u2013 are playing a vital role in defining social relations and structuring societies. 
Viewed from that angle, social signals are dynamic building blocks of what we might call a \u201cgovernance system\u201d of social spaces, both offline and online. (In the context of evolutionary biology, Peter Kappeler provides an excellent overview of this topic in his recent <a href=\"http:\/\/www.amazon.com\/Verhaltensbiologie-Springer-Lehrbuch-Peter-M-Kappeler\/dp\/354024056X\">book<\/a> [in German].)<\/p>\n<p>Among the central questions of signaling theory is the puzzle of what keeps social signals reliable. What mechanisms have we developed to ensure the \u201chonesty\u201d of signals? These questions are obviously highly relevant from an Internet governance perspective \u2013 especially (but not only) in view of the enormous scale of online fraud and identity theft in cyberspace. However, when applying insights from social signaling theory to cyberspace governance issues, it is <em>important to sort out in which contexts we have an interest in signal reliability and honest signaling<\/em>, and in which we do not. This question is somewhat counterintuitive because we tend to assume that honest signals are always desirable from a societal viewpoint. But take the example of virtual worlds like <a href=\"http:\/\/www.secondlife.com\/\">Second Life<\/a>. Isn\u2019t it one of the great advantages of such worlds that we can experiment with our online representations \u2013 e.g., that I as a male player can engage in role-play and experience my (second) life as a girl (female avatar)? In fact, we might have a normative interest in low signal reliability if it serves goals such as equal participation and non-discrimination. 
So, my first point is that we face an important normative question when applying insights from social signaling theory to cyberspace: What degree of signal reliability is desirable in contexts as diverse as dating sites, social networking sites, virtual worlds, auction web sites, blogs, tagging sites, online stores, online banking, health, etc.? Where do we as stakeholders (users, social networks, business partners, intermediaries) and as a society at large care about reliability, and where do we not?<\/p>\n<p>My second point: Once we have defined contexts in which we have an interest in high degrees of signal reliability, we should consider the <span style=\"font-style: italic\">full range of strategies and approaches to increase reliability<\/span>. Here, much more research needs to be done. Mapping different approaches, one might start with the basic distinction between assessment signals and conventional signals. One strategy might be to design spaces and tools that allow for the expression of assessment signals, i.e. signals whose underlying quality can be assessed simply by observing the signal itself. User-generated content in virtual worlds might be an example of a context where assessment signals play an increasingly important role (e.g. the richness of virtual items produced by a player as a signal of the user\u2019s skills, wealth, and available time.)<br \/>\nHowever, cyberspace is certainly an environment dominated by conventional signals \u2013 a type of signal that lacks an inherent connection between the signal and the quality it represents and is therefore much less reliable than an assessment signal. Here, social signaling theory suggests that the reliability of conventional signals can be increased by making dishonest signaling more expensive (e.g. by increasing the sender\u2019s production costs and\/or minimizing the rewards for dishonest signaling, or \u2013 conversely \u2013 lowering the recipient\u2019s policing\/monitoring costs.) 
In order to map different strategies, Lessig\u2019s model of <a href=\"http:\/\/codev2.cc\/\">four modes of regulation<\/a> might be helpful. Arguably, each ideal-type approach \u2013 technology, social norms, markets, and law \u2013 could be used to shape the cost\/benefit equilibrium of a dishonest signaler. A few <em>examples<\/em> to illustrate this point:<\/p>\n<ul>\n<li><strong>Technology\/code\/design<\/strong>: Increasing the sender\u2019s punishment costs by building efficient reputation systems based on persistent digital identities; using aggregation and syndication tools to collect and \u201cpool\u201d experiences among many users to lower policing costs; lowering the transaction costs of match-making between a user who provides a certain level of reliability and a transaction partner who seeks that level of reliability (see, e.g., Clippinger\u2019s idea of an ID Rights Engine, or consider search engines on social networking sites that allow users to search for \u201ccommon ground\u201d signals whose reliability is often easier to assess; see also <a href=\"http:\/\/www.msu.edu\/~lampecli\/papers\/chi2007_slashdot.pdf\">here<\/a>, p. 9.)<\/li>\n<\/ul>\n<ul>\n<li><strong>Market-based approach<\/strong>: Certification might be a successful signaling strategy \u2013 see, e.g., <a href=\"http:\/\/papers.ssrn.com\/sol3\/papers.cfm?abstract_id=351261\">this study<\/a> on the online comic book market. Registration costs, e.g. for social networking or online dating sites (see <a href=\"http:\/\/smg.media.mit.edu\/papers\/Donath\/PublicDisplays.pdf\">here<\/a>, p. 74, for an example), might be another market-based approach to increasing signal reliability (a variation on it: the creation of economic incentives for new intermediaries \u2013 \u201cYouTrust\u201d \u2013 that would guarantee certain degrees of signal reliability.) 
[During the discussion, Judith made the excellent point that registration costs might not signal what we would hope for when introducing them \u2013 e.g. they might signal \u201cI can afford this\u201d as opposed to the desired signal of \u201cI\u2019m willing to pay for the service because I have honest intentions\u201d.]<\/li>\n<\/ul>\n<ul>\n<li><strong>Law-based approach<\/strong>: Law can also have an impact on the cost\/benefit equilibrium of the interacting parties. Consider, e.g., disclosure rules such as requiring the online provider of goods to provide test results, product specifications, financial statements, etc.; warranties and liability rules; or trademark laws in the case of online identity (see <a href=\"http:\/\/www.nyls.edu\/pages\/591.asp\">Professor Beth Noveck\u2019s<\/a> <a href=\"http:\/\/papers.ssrn.com\/sol3\/papers.cfm?abstract_id=835924\">paper<\/a> on this topic.) Similarly, the legal system might change the incentives of platform providers (e.g. <a href=\"http:\/\/www.myspace.com\/\">MySpace<\/a>, <a href=\"http:\/\/www.youtube.com\/\">YouTube<\/a>) to ensure a certain degree of signal reliability. [<a href=\"http:\/\/cyber.law.harvard.edu\/people\/jpalfrey.html\">Professor John Palfrey<\/a> pointed to this <a href=\"http:\/\/www.siliconvalley.com\/mld\/siliconvalley\/news\/editorial\/16703654.htm\">decision<\/a> as a good illustration of this question of intermediary liability.]<\/li>\n<\/ul>\n<p>In sum, my second point is that we should start mapping the different strategies, approaches, and tools and discuss their characteristics (pros\/cons), feasibility, and interplay when thinking about practical ways to increase signal reliability in cyberspace.<\/p>\n<p>Finally, a third point in brief: Who will make the decisions about the degrees of signal reliability required in cyberspace? Who will choose among the different reliability-enhancing mechanisms outlined above? 
Is it the platform designer, the <a href=\"http:\/\/lindenlab.com\/about\">Linden Labs<\/a> of this world? If so, what is their legitimacy to make such design choices? Are users in power, voting with their feet \u2013 assuming that we\u2019ll see the emergence of competition among different governance regimes, as KSG <a href=\"http:\/\/ksgfaculty.harvard.edu\/viktor_mayer-schoenberger\">Professor Viktor Mayer-Schoenberger<\/a> <a href=\"http:\/\/ksgnotes1.harvard.edu\/Research\/wpaper.nsf\/rwp\/RWP05-052\">has argued<\/a> in the context of virtual world platform providers? What\u2019s the role of governments, of law and regulation?<\/p>\n<p>As always, comments appreciated.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Yesterday, I attended a Berkman workshop on \u201cAuthority and Authentication in Physical and Digital Worlds: An Evolutionary Social Signaling Perspective.\u201d Professor Judith Donath from the MIT Media Lab and Berkman Center\u2019s Senior Fellow Dr. John Clippinger presented fascinating research on trust, reputation, and digital identities from the perspective of signaling theory \u2013 a theory 
[&hellip;]<\/p>\n","protected":false},"author":202,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[1265,1185,1197,904,1332,662,1057],"tags":[],"class_list":["post-196","post","type-post","status-publish","format-standard","hentry","category-digital-id","category-digital-institutions","category-futurology","category-interet-governance","category-signaling-theory","category-virtual-worlds","category-web-20"],"jetpack_featured_media_url":"","_links":{"self":[{"href":"https:\/\/archive.blogs.harvard.edu\/ugasser\/wp-json\/wp\/v2\/posts\/196","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/archive.blogs.harvard.edu\/ugasser\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/archive.blogs.harvard.edu\/ugasser\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/archive.blogs.harvard.edu\/ugasser\/wp-json\/wp\/v2\/users\/202"}],"replies":[{"embeddable":true,"href":"https:\/\/archive.blogs.harvard.edu\/ugasser\/wp-json\/wp\/v2\/comments?post=196"}],"version-history":[{"count":0,"href":"https:\/\/archive.blogs.harvard.edu\/ugasser\/wp-json\/wp\/v2\/posts\/196\/revisions"}],"wp:attachment":[{"href":"https:\/\/archive.blogs.harvard.edu\/ugasser\/wp-json\/wp\/v2\/media?parent=196"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/archive.blogs.harvard.edu\/ugasser\/wp-json\/wp\/v2\/categories?post=196"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/archive.blogs.harvard.edu\/ugasser\/wp-json\/wp\/v2\/tags?post=196"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}