Tuesday, July 13, 2021

Toward the Digital Constitution of Knowledge [a teaser*]

"How to Destroy Truth," David Brooks's 7/1/21 column, offers insight into the problems and opportunities of social media, drawing on Jonathan Rauch's important new book, "The Constitution of Knowledge: A Defense of Truth." Brooks summarizes Rauch's account of empirical and propositional knowledge (and how it complements the emotional and moral knowledge that derives from the collective wisdom of shared stories):

…the acquisition of this kind of knowledge is also a collective process. It’s not just a group of people commenting on each other’s internet posts. It’s a network of institutions — universities, courts, publishers, professional societies, media outlets — that have set up an interlocking set of procedures to hunt for error, weigh evidence and determine which propositions pass muster.

My work on the future of networks for human collaboration has been in tune with this and suggests some urgent further directions, as detailed most recently in my Tech Policy Press article, The Internet Beyond Social Media Thought-Robber Barons. Having just read Rauch's book (with close attention to his chapter on "Disinformation Technology: The Challenge of Digital Media"), I have two initial takeaways that I preview here:

Extending Rauch’s work: I was struck that Rauch might enhance his ideas by drawing on proposals for unbundling aspects of digital media, as I and others (including Jack Dorsey and Francis Fukuyama) have advocated. Rauch’s chapter on media is very resonant, but the final section stopped me short. He seems uncritically supportive of Big Tech efforts at quasi-independent outsourcing of controls like the Facebook Oversight Board and fact-checking authorities. I see that as ineffective and, more importantly, as a fig leaf over the overcentralized, authoritarian control of these essential network utilities, and as counter to the more open emergence needed to seek effective consensus on truth.

Extending my work: I have built on similar ideas (notably Renee DiResta’s Mediating Consent), but Rauch convinces me to give more attention to the role of institutional participants in that process, beyond the emergent, bottom-up reliance on such institutions that I have been emphasizing as the driving force.

As Rauch explains, the “constitution of knowledge” is a collective process based on rules and governance principles. As he says, the dominant social media companies have hijacked this process to serve their own business objectives of selling ads, rather than the objectives of their users and society to support the constitution of knowledge. It is now clear to everyone whose salary does not depend on the selling of ads that these two objectives are incompatible, and we are suffering the consequences.

But, to the extent it is the platforms that address this, directly or via surrogates, it devolves into undemocratic “platform law,” which, as Molly Land explains, lacks legitimacy and is “overbroad and underinclusive.” Rauch makes a similar point: the Web has become a “company town.”

To address that, we need to unbundle key functions of the social network platforms. As all discourse moves to the digital domain, the core function of posting and access seeks to be universal and is thus highly subject to network effects that favor a degree of concentration. But the function essential to the constitution of knowledge is the selection of what each of us sees in our newsfeeds. In a free society, that must be largely a matter of individual choice; it can be decentralized and has limited network effects.

The solution this leads to is a functional unbundling: create an open market in filtering services that each of us can select from, and mix and match, to customize a feed of what we wish to view from the platform at any given time. That might be voluntary (if Dorsey has his way) or mandated (if Zuckerberg continues to overreach).
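As a rough illustration of that split (my own sketch, not any platform's actual API, with hypothetical names throughout), the unbundling amounts to separating two service interfaces: a concentrated posting-and-access utility, and interchangeable filtering services that users choose in an open market.

```python
from typing import Iterable, List, Protocol

# Hypothetical interfaces, purely to illustrate the proposed unbundling.

class PostingService(Protocol):
    """The concentrated, utility-like layer: universal posting and access."""
    def post(self, author_id: str, text: str) -> str: ...
    def fetch_candidates(self, user_id: str) -> Iterable[str]: ...

class FilteringService(Protocol):
    """The decentralized layer, chosen by each user from an open market:
    decides what, from the candidate pool, actually reaches a feed."""
    def rank(self, user_id: str, candidate_ids: List[str]) -> List[str]: ...
```

The point of the separation is that the first interface can remain concentrated where network effects demand it, while any number of independent providers can implement the second.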

My article and the works of other proponents of such an unbundling explain how these feed filtering services can be offered by service providers that may include the kinds of traditional institutional gatekeepers Rauch refers to. We have argued that such decentralization breaks up the “platform law” we are stumbling into. Instead, it returns individual agency to our open marketplace of ideas, supporting it with an open marketplace of filters. We, not the platform, should decide when we want filters from a given source, and with what weight. Those sources can include all the kinds of institutions grounded in the professionalism and institutionalism that Rauch refers to, but we should generally be the ones to decide. Rauch quotes Frederick Douglass on “the rights of the hearer.” Democracy and truth require that we free our feeds to protect “the rights of the hearer as well as those of the speaker.”

As one of Rauch’s chapter subtitles says, “Outsourcing reality to a social network is humankind’s greatest innovation.” Translating that to the digital domain, the core idea is that multiple filtering services can uprank and downrank items for possible inclusion in our feeds. Each filtering service should be able to assign weights to those up or down rankings, and users should be able to use knobs or sliders to give higher or lower weightings to each filtering service they want applied. Rauch's emphasis on institutions suggests that more official and authoritative gatekeepers might have special overweightings or other privileged ways to adjust what we see and how we see it (such as to provide warnings or introduce friction into viral sharing).
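To make those mechanics concrete, here is a minimal sketch in Python under simplified assumptions of my own (the names and interfaces are hypothetical, not any platform's API): each filtering service scores an item up or down, and the user's slider weights determine how those scores combine into a feed.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Item:
    item_id: str
    text: str

# A filtering service maps an item to a score in [-1.0, +1.0]:
# positive means uprank, negative means downrank.
FilterService = Callable[[Item], float]

def compose_feed(candidates: List[Item],
                 services: Dict[str, FilterService],
                 user_weights: Dict[str, float],
                 limit: int = 20) -> List[Item]:
    """Rank candidate items by the weighted sum of each chosen
    filtering service's score, using the user's slider weights."""
    def combined_score(item: Item) -> float:
        return sum(user_weights.get(name, 0.0) * service(item)
                   for name, service in services.items())
    return sorted(candidates, key=combined_score, reverse=True)[:limit]

# Example: a user mixes a fact-checking filter with a local-news filter,
# trusting the fact-checker twice as much:
# feed = compose_feed(candidates,
#                     services={"factcheck": factcheck_filter,
#                               "local_news": local_filter},
#                     user_weights={"factcheck": 1.0, "local_news": 0.5})
```

A privileged institutional gatekeeper of the kind Rauch describes could be given a floor weight, or a separate channel for warnings and friction, rather than a place in the user-adjustable mix.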

My own work on filtering algorithms aims at designing for truth in a man-machine partnership based on distilling human judgments of reputation. This generalizes how Google’s PageRank distills the human judgments of “webmasters,” as encoded in the emergent linking of the Web, adapting that to the likes, shares, and comments of social media. Rauch seems to suggest a similar direction: “giving users an epistemic credit score.” (Social media already track users’ reputational credit scores, but they score for engagement, not for truth.) As Rauch observes, there can be “no comprehensive solutions to the disinformation threat,” but this massive crowdsourcing of judgments offers what can become a robust “cognitive immune system.”
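As a hedged illustration of that direction, the sketch below is my own toy model, not Rauch's proposal or any deployed algorithm: endorsements confer score on items in proportion to the endorser's reputation, and an author's reputation rises or falls with the scores of what they post, iterating in the same spirit in which PageRank lets links confer authority.

```python
from typing import Dict, List, Tuple

def reputation_weighted_scores(
        endorsements: List[Tuple[str, str]],   # (user_id, item_id) likes/shares
        item_authors: Dict[str, str],          # item_id -> author's user_id
        iterations: int = 20) -> Tuple[Dict[str, float], Dict[str, float]]:
    """Iteratively estimate item scores and user reputations:
    an item's score is the reputation-weighted count of its endorsements,
    and a user's reputation is the average score of the items they authored."""
    users = {u for u, _ in endorsements} | set(item_authors.values())
    reputation = {u: 1.0 for u in users}   # start everyone with equal standing
    item_score: Dict[str, float] = {}

    for _ in range(iterations):
        # Items earn score from the reputations of those who endorse them.
        item_score = {i: 0.0 for i in item_authors}
        for user, item in endorsements:
            item_score[item] = item_score.get(item, 0.0) + reputation[user]
        # Normalize so scores stay comparable across iterations.
        top = max(item_score.values(), default=1.0) or 1.0
        item_score = {i: s / top for i, s in item_score.items()}
        # Authors earn reputation from the scores of their items.
        authored: Dict[str, List[float]] = {}
        for item, author in item_authors.items():
            authored.setdefault(author, []).append(item_score.get(item, 0.0))
        for user, scores in authored.items():
            reputation[user] = sum(scores) / len(scores)
    return item_score, reputation
```

A real system would of course need to score the quality of judgments, not just their volume, and to resist coordinated gaming; the point here is only the shape of the feedback loop between item scores and user reputations.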

I would be very interested to learn how Rauch might build on these two proposals, for 1) open filtering and 2) reputation-based truth-seeking algorithms, to develop his vision of the constitution of knowledge into a more dynamically adaptive and emergent future, one that moves toward a more flexible structure of “community law.” (Similar issues of platform versus community law also apply to the softer emotional and moral knowledge that Brooks refers to.)

Pursuant to that, I plan to revisit ideas from my early work on how this digitally augmented constitution of knowledge can effectively combine 1) the open emergence of preferred filtering services from individual users with 2) the contingent establishment of more official and authoritative gatekeepers. My original 2003 design document (paragraph 0288) outlined a vision that extended this kind of decentralized selection of filtering services in a way that I hope Rauch might relate to:

Checks and balances could provide for multiple bodies with distinct responsibilities, such as executive, legislative, and judicial, and could draw on representatives to oversee critical decisions and methods. Such representatives may be elected by democratic methods, or through reputation-based methods, or some combination. Expert panels could also have key roles, again, possibly given limited charters and oversight by elected representatives to avoid abuse by a technocracy. External communities and governmental bodies may also have oversight roles in order to ensure broadly based input and sensitivity to the overall welfare. The use of multiple confederated and cooperative marketplaces, as described above, may also provide a level of checks and balances as well.

It seems most of our thinking about social media is currently reactive and rooted in the present, looking only to the very near future. But we are already far down a wrong path and need a deep rethinking and reformation. We need a new driving vision of how our increasingly digital society can reposition itself to deal with the constitution of knowledge for coming decades. That future must be flexible and emergent, able to deal with unimaginable scale, speed, and scope. If we do not set a course for that future now, we may well find ourselves in a dark age that will be increasingly hard to escape. That window may already be closing.

---

*I refer to this as "a teaser" because it is a preliminary draft that I hope to refine and expand based on further thought and feedback.

-----

Links to my related work in Tech Policy Press, my blog, and other publications can be found in the Selected Items tab above.
