The Longest Now

Forging Social Proof: the Networked Turing Test Rules the First AI War
Friday September 25th 2020, 2:51 pm
Filed under: citation needed,fly-by-wire,Uncategorized

A few years ago I wrote about how our civilization was forfeiting the zeroth AI war — allowing individual attention hacks, deployed at scale, to diminish and replace our natural innovation and productivity in every society.  We gained efficiency in every area of life, and then let our new wealth and spare time get absorbed by newly-efficient addictive spirals.

Exploit culture

This war for attention affects what sort of society we can hope to live in. Channeling so much wealth to attention-hackers, and the networks of crude AI tools and gambling analogs that support them, has strengthened an entire industry of exploiters, allowing a subculture of engineers and dealmakers to flourish.  That industry touches on fraud, propaganda, manipulation of elections and regulation, and more, all of which influence what social equilibria are stable.

The first real AI war

Now we are facing the first real artificial-intelligence war — dominated by entities that appear as avatars of independent, intelligent people, but are artificial, scripted, automated.  

What is new in this? Earlier low-tech versions of this required no machine learning or programming: they used the veil of pseudonymity to fake authorship, votes, and small-scale consensus.  In response, we developed layers of law and regulation around those earlier attacks — fraud, impersonation, and scams are illegal.  AI can smoothly scale this to millions of comments on public bills, and to forging microtargeted social proof in millions of smaller group interactions online. And these scaled attacks are often still legal, or lightly penalized and weakly enforced.

Scale and speed

Fabricating an alternate fantastical reality, exaggerated and unfalsifiable, is a common tactic in socializing broad reallocations of agency and power. {cn}  This realfabrik often cannibalizes the knowledge, cooperation, and stability of its surrounding society, in order to build something incompatible with it.

This was once limited by the human capacity and commitment of one’s followers. Now, however, you can build such a movement in your (or someone else’s) backyard.  This is thanks to a confluence of developments in spam, propaganda, AI, and delegated decision-making, including:

  • New sources of funding for such tools (weapons to defeat unwanted realities) {c}
  • Tools that pass the Distributed Turing Test – networks that pass as human + fantasy that passes as fact in shared discourse space {c}
  • A cache of unaddressed vulnerabilities in every sense-making intermediary: polls, press, politics, law, regulation, corporations, even scientific + military bodies. (Where is the 0-day database for this?) {c}
  • A rising cohort of exploiters, willing to use these vulnerabilities
  • Societies that continue to assume good faith and accurate information, allowing that cohort to implement changes that cement their control {counter-c}

Impact and funding

How much fungible power and control is at stake here, and how much is being invested into this war already?  The battle for attention felt like a $T of investment for some proportional profit, and an opportunity cost perhaps ten times that.  This feels a few magnitudes larger: a war that will dominate how we perceive and respond to a cascade of crises that will not end for some time.  In the US alone, the course of the current pandemic has been strongly influenced by this war.  $10T has been spent on explicit remediation and emergency funding, with minimal oversight, amid manipulation of supplies and insider trading of associated stocks.  By the end of the pandemic, taking the externalities and aftermath into account, this information war may account for a quad — four years of global product — roughly the net cost of WWII.  This will be a mix of the opportunity cost of global disorder, the replacement cost of systems lost, and the positive cost of directable streams of resources (often nominally to counteract those destructive effects).

Much more than this could be gained from generative uses of reality distortion, but the result would be harder to centralize, control, and hide.  A perverse incentive for short-term leaders to sustain {war, pandemic, cataclysm} is that it keeps people busy with daily crises and provides a catch-all smokescreen for decay or failure.

Phases of the war: state of play

Forging identity and reputation

The first phase of this war was populating digital spaces with fake identities, organizations, newsrooms, and reports.

DDOSing sensemaking

Recently, a US ecosystem has become the world’s top source of funding and amplification for grassroots conspiracy theories. This ecosystem includes government spokespeople, the country’s most influential news network, and political marketing organizations, as well as niche media and platforms. These are not limited to cranks or any single subgroup, but span religious and secular, conservative and libertarian and liberal.

This ecosystem completes a positive feedback loop: forging social proof in a way that makes all sectors of society respond to fabricated realities, treating them as real and wasting the time of [real] people in responding to them.  In combination with the unequal distribution of tools mentioned above, this can exploit and defeat preexisting norms, regulations, and laws, by subverting the sensemaking needed to implement them.

What we do next

I don’t know what will come next. But to have a chance of making sense of it, we must change how we parse our environment: what is at stake, and the forces at play.

~ We must acknowledge the global froth of battles to control how we understand our shared present, with combatants who collaborate publicly and privately.  We must name this an information war, not as an analogy, but as a Named War with sides and objectives.

~ We must quantify where and how communities lose shared understanding of what is going on.  Where the average person’s understanding of the state of the world is no longer governed primarily by the independent observations of reporters, scientists, educators, ministers, and civil servants. Where even the most reliable of sources can no longer draw from pools of knowledge untainted by manipulation. {c} And where reliable human sources trained in those backgrounds are drowned out by others that look similar, but are fully fabricated or scripted by agents of the war. {c}

~ We should recognize that compact networks directly influence the flow of quads, with a speed that outstrips legal and political adaptation, and is accelerated by crisis. {c}  And for the first time, there are a wide range of automatable 0-day exploits for current laws and norms, affecting most parts of society, alongside tools sufficient for a small group to exploit them at scale. {c}

~ Finally, we must concede that communities of exploiters willing to implement those social vulnerabilities at scale are winning major battles in this war — including keeping it from being declared openly while solidifying into alliances of mutual aid, with structural, financial, and memetic power. {c}

