Friday, September 17, 2021

Unbundling Social Media Filtering Services – Toward an Ecosystem Architecture for the Future [Working Draft]

Abstract

Raging issues concerning moderation of harmful speech on social media are most often viewed from the perspective of combating current harms. But the broader context is that society has evolved a social information ecosystem that mediates discourse and understanding through a rich interplay of people, publishers, and institutions. Now that ecosystem is being disintermediated; as digitization progresses, mediating institutions will reinvent themselves to leverage this new infrastructure. The urgent task for regulation is to facilitate that. Current proposals for unbundling social media filtering services are just a first and most urgent step in that evolution. That can transform moderation remedies -- including ranking/recommenders, bans/takedowns, and flow controls -- to support epistemic health instead of treating epistemic disease.

An unbundling proposal with growing, but still narrow, support was the subject of an important debate about social media platform power among scholars in the Journal of Democracy (as I summarized in Tech Policy Press). On further reflection, the case for unbundling as a way to resolve the dilemmas of today becomes stronger by looking farther ahead. Speaking as a systems architect, I offer here some thoughts on these first steps along a long path – one that limits current harms and finesses current dilemmas in a way that builds toward a flexible digital social media ecosystem architecture. What are currently seen as the failings of individual systems should be viewed as the birth pangs of a digital transformation of the entire ecosystem for the social construction of truth and value.

That debate was on proposals to unbundle and decentralize moderation decisions now made by the platforms -- to limit platform power and empower users. The argument is that the platforms have gained too much power, and that, in a democracy, we the people should each have control over what information is fed to us (directly or through chosen agents). Those decisions should serve users -- subject neither to undemocratically arbitrary “platform law” nor to improper government control (which the First Amendment constrains far more than many would-be reformers recognize). Common arguments against such unbundling are that shifting control to users would do too little to combat current harms and might even worsen them.* Meanwhile, some other observers favor a far more radical decentralization, but that seems beyond reach.

Here I suggest how we might finesse some concerns about the unbundling proposals -- to position that as a first step that can help limit harms and facilitate other remedies -- while also beginning a path toward meeting the greater challenges of the future. That future should be neither centralized nor totally decentralized, but a constantly evolving hybrid of distributed services, authority, and control. Unbundling filters is a start.

The moderation dilemma

The trigger for this debate was Francis Fukuyama’s article on proposals that the ranking and recommender decisions now made within the platforms be spun out to an open market of filtering services from which users can select. The platforms should not control decisions about what individuals choose to hear, and an open market would spur competition and innovation. Responding to debate comments, Fukuyama recognized concerns that some moderation of toxic content might be too complex and costly to decentralize. He also observed that we face a two-sided issue: not only the promotion of toxic content, but also bans or takedowns that censor some speakers or items of their speech. He suggested that perhaps clearly toxic content – except for the sensitive case of political speech -- should remain under centralized control.
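To make the shape of that proposal concrete, here is a minimal sketch (in Python, using hypothetical names such as FilteringService, Item, and UserContext -- not any platform's actual API) of the kind of interface an unbundled filtering service might expose: the platform supplies candidate items, and the service the user has chosen returns the ranking for that user's feed.

```python
# Illustrative sketch only -- hypothetical types and interface, not any platform's real API.
from dataclasses import dataclass
from typing import Protocol


@dataclass
class Item:
    item_id: str
    author: str
    text: str


@dataclass
class UserContext:
    user_id: str
    interests: list[str]    # signals the user chooses to share with the service
    chosen_service: str     # the filtering service this user has selected


class FilteringService(Protocol):
    """Interface a user-chosen filtering service might expose to a platform."""

    def rank(self, candidates: list[Item], user: UserContext) -> list[Item]:
        """Return candidates ordered for this user's feed; items the user opts to filter may be dropped."""
        ...


def build_feed(candidates: list[Item], user: UserContext,
               services: dict[str, FilteringService]) -> list[Item]:
    # The platform hands the candidate items to whichever service the user selected,
    # rather than applying its own proprietary ranking.
    return services[user.chosen_service].rank(candidates, user)
```

The design point is that the platform keeps hosting and delivery, while the ranking decision moves behind an interface that any competing service could implement.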

That highlights the distinction between two mechanisms of moderation that are often confounded -- each having fundamentally different technical/operational profiles:

Blocking moderation in the form of bans/takedowns that block speakers or their speech from being accessible to any user. Such items are entirely removed from the ecosystem. To the extent this censorship of speech (expression) is to be done, that cannot be a matter of listener choice.

Promotional moderation in the form of ranking/recommenders that decide at the level of each individual listener what they should hear. Items are not removed, but merely downranked in priority for inclusion in an individual’s newsfeed so they are unlikely to be heard. Because this management of reach (impression) is listener-specific, democratic principles require this to be a matter of listener rights (at least for the most part). Users need not manage this directly but should be able to choose filtering services that fit their desires.

The decision options map out as in this table:


                      | OK for personal taste | Lawful but awful? | Illegal
Ranking/recommenders  | OK – listener control | Boundary???       | n/a
Bans/takedowns        | n/a                   | Boundary???       | Block to all


As a first approximation, setting aside those lawful but awful boundary cases, this suggests that OK content should be managed by filtering services that are chosen by listeners in an open market -- as has been proposed -- but that blocking of illegal content should remain relatively centralized. That narrows the tricky questions to the boundary zone: how wide and how clearly bounded is that zone? Should the draconian censorship of bans/takedowns apply there? How much can be trusted to the lighter hand of decentralized ranking/recommenders to moderate well enough? And who decides on bans/takedowns (platforms, government, independent boards)?

Unbundling is a natural solution for ranking/recommenders, but a different kind of unbundling might also prove useful for bans/takedowns. That censorship authority might be delegated to specialized services that operate at different levels to address issues that vary by nation, region, community, language, and subject domain.
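As a thought experiment only, the division of labor suggested above might be expressed as simple routing logic: clearly OK content goes to listener-chosen ranking, clearly illegal content goes to a (possibly delegated) takedown authority, and the lawful-but-awful boundary zone is where explicit policy is still needed. The categories and the takedown_authorities registry below are illustrative assumptions, not a worked-out design.

```python
# Hypothetical routing of moderation decisions, following the table above.
from enum import Enum, auto


class ContentClass(Enum):
    OK = auto()                 # matters of personal taste
    LAWFUL_BUT_AWFUL = auto()   # the contested boundary zone
    ILLEGAL = auto()


def route_moderation(content_class: ContentClass, jurisdiction: str,
                     takedown_authorities: dict[str, str]) -> str:
    if content_class is ContentClass.OK:
        # Reach is a listener right: handled by the user's chosen ranking/recommender service.
        return "listener-chosen filtering service"
    if content_class is ContentClass.ILLEGAL:
        # Removal of speech stays relatively centralized, but the authority could be
        # delegated by nation, region, community, language, or subject domain.
        return takedown_authorities.get(jurisdiction, "default takedown authority")
    # Boundary zone: who decides (platforms, government, independent boards) is the open question.
    return "explicit boundary-zone policy required"
```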

Sensible answers to these issues will shake out over time – if we look ahead enough to assure the flexibility, adaptability, and freedom for competition and innovation to enable that. The answers may never be purely black or white, but instead, pragmatic matters of nuance and context.

An ecosystem architecture for social media – from pre-digital to digital transformation

How can we design for where the puck will be? It is natural to think of “social media” in terms of what Facebook and its ilk do now. The FTC defined that as “personal social networking services…built on a social graph that maps the connections between users and their friends, family, and other personal connections.” But it is already evident that this definition blurs in many ways: with more media-centered services like YouTube and Pinterest; as users increasingly access much of their news via social media sharing and promotion rather than directly from publishers; and as institutions interact with their members via these digital social media channels.

Meanwhile, it is easy to forget that long before our digital age, humanity evolved a highly effective social media ecology in which communities, publishers, and institutions mediated our discourse as gatekeepers and discovery services. These pre-digital social networks are now being disintermediated by new digital social network systems. But as digitization progresses, such mediating institutions will reinvent themselves and rebuild for this new infrastructure.

Digital platforms seized dominance over this epistemic ecosystem by exploiting network effects – moving fast and breaking it. Now we are digitizing our culture – not just the information, but the human communication flows, processes, and tools – in a way that will determine human destiny.

From this perspective, it is apparent that neither full centralization nor full decentralization can cope with this complexity. It requires an architecture that is distributed and coordinated in how it operates, how it is controlled, and by whom. It must be open to many social network services, media services, and intermediary services to serve many diverse communities and institutions. It must evolve organically and emergently, with varying forms of loose or tight interconnection, with semipermeable boundaries -- just as our existing epistemic ecosystem has. Adapting the metaphor of Jonathan Rauch, it will take a “constitution of discourse” that blends bottom-up, top-down, and distributed authority – in ways that will embed freedom and democracy in software code. Open interoperability, competition, innovation, and adaptability -- overseen with smart governance and human intervention -- will be the only way to serve this complex need.+

Network effects will continue to drive toward universality of network connectivity. Dominant platforms may resist change, but oligopoly control of this utility infrastructure will become untenable. Once the first step of unbundling filters is achieved, at least some elements of the decentralized and interoperable visions of Mike Masnick, Cory Doctorow, Ethan Zuckerman, and Twitter CEO Jack Dorsey that now may seem impractical will become more attainable and compelling. Just as electronic mail can be sent from one mail system (Gmail, Apple, Outlook, ...) through a universal backbone network that interoperates with every other mail system, postings in one social media service should interoperate with every other social media service. This will be a long, complex evolution, with many business, technical, and governance issues to be resolved. But what alternative is viable?
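By analogy with email's store-and-forward interoperation, one can imagine postings carried in a portable envelope that any cooperating service accepts and relays. The sketch below is purely illustrative -- it is not ActivityPub or any existing standard, and the field names are assumptions -- but it shows how little is needed, in principle, for a post created on one service to be readable on another.

```python
# Illustrative cross-service post envelope -- not an existing protocol or standard.
import json
from dataclasses import dataclass, asdict


@dataclass
class PostEnvelope:
    post_id: str
    origin_service: str    # service where the post was created
    author: str            # globally addressable, like an email address
    created_at: str        # ISO 8601 timestamp
    content: str


def to_wire(post: PostEnvelope) -> str:
    """Serialize a post for relay to any other cooperating service."""
    return json.dumps(asdict(post))


def from_wire(payload: str) -> PostEnvelope:
    """Accept a post relayed from another service."""
    return PostEnvelope(**json.loads(payload))


# A post created on one service can be read, unchanged, on any other.
post = PostEnvelope("123", "service-a.example", "alice@service-a.example",
                    "2021-09-17T00:00:00Z", "Hello from service A")
assert from_wire(to_wire(post)) == post
```

The hard parts, of course, are not the envelope but the business, governance, and moderation agreements that would surround it.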

Facebook Groups and Pages are primitive community and institutional layers that complement our personal social network graphs. Filtering serves a complementary organizing function. Both can interact and grow in richness to support the reinvention of traditional intermediaries and institutions to ride this universal social media utility infrastructure and reestablish context that has “collapsed.”** Twitter’s Bluesky initiative envisions that this ecosystem will grow with a scope and vibrance that makes attempts at domination by monolithic services counterproductive -- and that this broader service domain presents a far larger and more sustainable business opportunity. The Web enabled rich interoperability in information services -- why not build similar interoperability into our epistemic ecosystem?

Coping with ecosystem-level moderation challenges

The ecosystem perspective is already becoming inescapable. Zeve Sanderson and colleagues point to “cross-platform diffusion of misinformation, emphasizing the need to consider content moderation at an ecosystem level.” That suggests that filtering services should be cross-platform. Eric Goldman provides a rich taxonomy of diverse moderation remedies, many of which beg for cross-platform scope. No one filtering service can apply all of those remedies (nor can one platform), but an ecosystem of coordinating and interoperable tools can grow and evolve to meet the challenges -- even as bad actors keep trying to outwit whatever systems are used.

Still-broader presumptions remain unstated – they seem to underlie key disagreements in the Journal of Democracy debate, especially as they relate to tolerance for lawful but awful speech. Natali Helberger explores four different models of democracy -- liberal, participative, deliberative, and critical -- and how each leads to very different ideas for what news recommenders should do for citizens. My interpretation is that American democracy is primarily of the liberal model (high user freedom), but with significant deliberative elements (encouraging consideration of alternative views). This too argues for a distributed-control architecture for our social media ecosystem that can support different aspects of democracy suited to different contexts. Sorting this out rises to the level of Rauch’s “constitution” and warrants a wisdom of debate and design for diversity and adaptability not unlike that of the Federalist Papers and of the Enlightenment philosophers who informed them.***
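One way to see why the model of democracy matters technically: a recommender's scoring function embeds a weighting among objectives such as personal relevance, viewpoint diversity, and civic value, and different weightings correspond to different democratic ideals. The sketch below is not Helberger's formalization; the signals and weights are invented purely for illustration.

```python
# Illustrative only: different models of democracy imply different recommender objectives.
# (The fourth, "critical" model is omitted from this sketch.)
from dataclasses import dataclass


@dataclass
class ItemSignals:
    relevance: float   # predicted personal interest (emphasized in the liberal model)
    diversity: float   # exposure to alternative viewpoints (deliberative emphasis)
    civic: float       # civic-information or participation value (participative emphasis)


# Hypothetical objective weightings -- the numbers are invented for illustration.
MODEL_WEIGHTS = {
    "liberal":       {"relevance": 0.8, "diversity": 0.1, "civic": 0.1},
    "deliberative":  {"relevance": 0.4, "diversity": 0.4, "civic": 0.2},
    "participative": {"relevance": 0.3, "diversity": 0.2, "civic": 0.5},
}


def score(item: ItemSignals, model: str) -> float:
    """Score an item under the objective weighting of a given democracy model."""
    w = MODEL_WEIGHTS[model]
    return (w["relevance"] * item.relevance
            + w["diversity"] * item.diversity
            + w["civic"] * item.civic)
```

A distributed-control architecture would let different communities and contexts choose among such weightings rather than baking one into a monolithic platform.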

Designing a constitution for human social discourse is now a crisis discipline. Simple point solutions that lack grounding in a broader ecological vision are likely to fail or create ever deeper crises. Given the rise of authoritarianism, information warfare, and nihilism now exploiting social media, we need to marshal multi-disciplinary thinking, democratic processes, innovation, and resolve.

Ecosystem-level management of speech and reach

Returning to filters, the user level of ranking/recommenders blurs into a cross-ecosystem layer of controls, including many suggested by Goldman – especially flow controls affecting virality, velocity, friction, and circuit-breakers. As Sanderson and colleagues show, these may require distributed coordination across filtering services and platforms (at least those beyond some threshold scale). As suggested previously in Tech Policy Press, financial markets provide a very relevant and highly developed model of a networked global ecosystem with real-time risks of systemic instability, managed under a richly distributed and constantly evolving regulatory regime.****
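To illustrate the flow-control idea -- again only as a sketch, with invented thresholds and action names -- a virality circuit breaker might watch the velocity of resharing and add friction or pause amplification when it trips, much as market circuit breakers pause trading during extreme swings.

```python
# Sketch of a virality circuit breaker -- thresholds and actions are invented for illustration.
import time
from collections import deque
from typing import Optional


class ViralityCircuitBreaker:
    def __init__(self, max_shares_per_minute: int = 1000, cooldown_seconds: int = 600):
        self.max_rate = max_shares_per_minute
        self.cooldown = cooldown_seconds
        self.events: deque = deque()   # timestamps of recent reshares of one item
        self.tripped_until = 0.0

    def record_share(self, now: Optional[float] = None) -> str:
        """Record one reshare of the item; return the flow-control action to apply."""
        now = time.time() if now is None else now
        self.events.append(now)
        # Keep only the last minute of reshare events.
        while self.events and self.events[0] < now - 60:
            self.events.popleft()
        if now < self.tripped_until:
            return "paused"            # amplification paused during cooldown
        if len(self.events) > self.max_rate:
            self.tripped_until = now + self.cooldown
            return "paused"            # breaker trips, like a trading halt
        if len(self.events) > self.max_rate // 2:
            return "add_friction"      # e.g. prompt the user before allowing a reshare
        return "allow"
```

In an unbundled ecosystem such a control would likely have to be coordinated across filtering services and platforms, since no single party sees all of the flow.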

These are primarily issues of reach (ranking/recommenders at the listener level) -- but some may be issues of speech (bans/takedowns). As platforms decentralize and interoperate to create a distributed social network that pools speech inputs to feed to listeners through a diversity of channels, the question arises whether bans/takedowns are always ecosystem-wide or may be effected differently in different contexts. There is a case for differential effect – such as to provide for granularity in national- and community-level control. A related question is whether there should be free-speech zones that are protected from censorship by authoritative regimes and communities.

Fukuyama seems most concerned about authoritarian control of political speech, but important questions of wrongful constraint can apply to pornography, violence, incitement, and even science. Current debates over sex worker rights and authoritarian crackdowns on “incitement” by dissidents, as well as the Inquisition-era ban on Galileo’s statement that the Earth moves, show that there are no simple bright lines or infallible authorities. And, to protect against overenforcement, it seems there must be provision for appeals and for some form of retention (with more or less strictly controlled access) for even very objectionable speech.

Which future?

The unbundling of filtering systems becomes more compelling from this broader perspective. It is no panacea, and not without its own complications – but it is a first step toward distributed development of a new layer of social mediation functionality that can enable next-generation democracy. Most urgent to preserve democracy is a moderation regime that leverages decentralization to be very open and empowering on ranking/recommenders, while applying a light and context-sensitive hand on the censorship of bans/takedowns. This openness can restore our epistemic ecology to actively supporting health, not just reactively treating disease.

Which future do we want? Private platform law acting on its own -- or as the servant of a potentially authoritarian government -- to control the seen reality and passions of an unstructured mob? A multitude of separate, uncoordinated platforms, some tightly managed as walled gardens of civility, but surrounded by nests of vipers? Or a flexible hybrid, empowering a digitally augmented upgrade of the richly distributed ecology of mediators of consent on truth and value that, despite occasional lapses and excesses, has given us a vibrant and robust marketplace of ideas -- an epistemic ecology that liberates, empowers, and enlightens us in ways we can only begin to imagine?

---

This article expands on articles in Tech Policy Press and Reisman’s blog, listed here.

________________________

*Some, notably Facebook, argue for just a limited opening of filter parameters to user control. That is worth doing as a stopgap, but the private corporate control that remains is too inflexible, and simply not consistent with democracy. Others, like Nathalie Marechal, argue that the prime issue is the ad-targeting business model, and that fixing that is the priority. That is badly needed, too, but would still leave authoritarian platform law under corporate control, free to subvert democracy.

A more direct criticism, by Robert Faris and Joan Donovan, is that unbundling filters is “fragmentation by design” that works against the notion of a “unified public sphere.” I praise that instead as “functional modularity and diversity by design,” and suggest that apparent unity results from a dialectic of constantly boiling diversity. Designing for diversity is the only way to accommodate the context dependence and contingency required in humanity’s global epistemic ecosystem. Complex systems of all kinds thrive on an emergent, distributed order (whether designed or evolved) built on divisions of function and power. This applies to software systems (where it is designed, learning from the failures of monolithic systems) and to economic, political, and epistemic systems (where it evolves from a stew of collaboration).

**Twitter recently announced similar community features explicitly intended to restore the context that has collapsed.

***A recent paper by Bridget Barrett, Katherine Dommet, and Daniel Kreiss (interviewed by Tech Policy Press) offers relevant research suggesting a need to be more explicit about what we are solving for and that conflicting objectives must be balanced. Maybe what regulation policy should solve for is not the balanced solution itself, but an ecosystem for seeking balanced solutions in a balanced way.

****Note the parallels in the new challenges in regulating decentralized finance and digital currency.

+[Added 9/20] Of course the even more fundamental metaphor comes from nature. Claire Evans likens the "context collapse" on social media to the disruption of monocultures and clear-cutting in forests, ignoring “the wood wide web” -- “just how sustainable, interdependent, life-giving systems work."
