Sunday, July 20, 2025

Tech and Democracy: Busy Times in Tech Policy Press

[Working notes in progress, as I try to drink from and reflect on this firehose...]

Tech Policy Press has become essential reading, revealing growing energy in "issues and ideas at the intersection of tech & democracy."

I begin this working draft post today because, in addition to being pleased to see my own new piece on AI and Democracy (7/16/25) join ten others I have done since 2021, I saw stimulating connections among other TPP pieces through the week that deserve comment. Today I found still more in editor Justin Hendrix's weekly newsletter recap, even though I can keep up with only a fraction of TPP's output as it continues to grow into a key focal point for the community Justin is catalyzing.

This post serves as my point of connection, linking some of those pieces (and a few from other sources) to complementary ideas from my work. [This may expand.]

Two pieces in particular triggered the idea for this connection point:

Reviglio's insightful Pluralism piece struck me first, as a variant perspective on issues in my new piece and throughout my work -- the challenge of balancing bottom-up individual agency against democratically legitimate top-down group influence on discourse. The former can drive us into silos, filter bubbles, echo chambers, and the madness of crowds. The latter can lead to sterile, conformist groupthink, or to authoritarian Huxwell dystopias.

Reviglio suggests that "plurality typically denotes market diversity and anti-monopoly safeguards, while pluralism generally refers to ensuring broad access and visibility of diverse voices and perspectives." I read his "algorithmic plurality" as a purely bottom-up influence, and his "algorithmic pluralism" as a lateral influence (supporting both serendipity and "prosocial" "bridging" of diverse viewpoints) that emerges either from bottom-up seeking or from positive nudging, whether top-down or side-to-side among peers. (That contrasts with more negative top-down nudging toward conformity or subservience.)

My latest piece emphasizes the need for attention agents to serve their users, and my Three Pillars piece broadens the idea of algorithmic choice to factor in strong levels of social influence. Pieces out of, and building on, the Stanford symposium on middleware that I helped organize address these debates, and show that this is not an issue of technology for user control, but the sociotechnical question of how society chooses to support and influence the use of tools that can cut in whatever direction we shape them. Our urgent task is to build the sociotechnical infrastructure to support the development and use of more prosocial algorithmic services.

Algorithmic pluralism is also central to my earlier Delegation series with Chris Riley, which takes a broader look at these issues, especially in the last two installments: on Community roles in moderation of a Digital Public Hypersquare, and on Contending as agonistic versus antagonistic.

Marechal's piece on AI Slop presents a thought-provoking summary of how algorithms and broader factors have driven us toward an "optimized" culture of "fast food" for the mind -- the opposite of what the vision of "bicycles for the mind" was meant to offer us. He laments the downranking of outliers: "...a deeply illiberal optimization ethic that rejects “outlier” perspectives. Rather than seeing deviations from the “algorithmic models in our heads” as opportunities to grow, we increasingly see outliers as dangerous anomalies to be ignored or ridiculed."

That is where we have let ourselves be taken, but my half-century vision of bicycles for the mind has always been to enable the opposite, as my latest piece notes. In 2012 I wrote Filtering for Serendipity -- Extremism, 'Filter Bubbles' and 'Surprising Validators', on how to optimize for serendipity and challenging ideas (drawing on my 2003 system design), a forerunner of recent work by others on "bridging systems."
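
To make the serendipity idea concrete, here is a minimal sketch in TypeScript of how a re-ranker might blend relevance with a serendipity bonus, boosting challenging items that are endorsed by sources the user already trusts (the "surprising validators" idea). This is an illustration only, not my 2003 design or anyone's production system; all names, types, and weights are hypothetical.

```typescript
// Illustrative sketch only: a feed re-ranker that blends relevance with a
// "serendipity" bonus, and boosts challenging items vouched for by sources
// the user already trusts ("surprising validators").

interface FeedItem {
  id: string;
  relevance: number;          // 0..1: similarity to the user's known interests
  novelty: number;            // 0..1: distance from the user's usual viewpoint
  endorsedByTrusted: boolean; // a source the user trusts vouches for this item
}

// A challenging item is worth more when a trusted source validates it,
// since the user is likelier to engage with it than dismiss it.
function serendipityScore(item: FeedItem, explorationWeight = 0.3): number {
  const validatorBoost = item.endorsedByTrusted ? 1.5 : 1.0;
  return (
    (1 - explorationWeight) * item.relevance +
    explorationWeight * item.novelty * validatorBoost
  );
}

function rankForSerendipity(
  items: FeedItem[],
  explorationWeight = 0.3
): FeedItem[] {
  return [...items].sort(
    (a, b) =>
      serendipityScore(b, explorationWeight) -
      serendipityScore(a, explorationWeight)
  );
}
```

Raising the exploration weight shifts the feed from pure relevance toward challenge and discovery; the point is that this dial could sit in the user's hands rather than the platform's.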

Here again, the problem is not in the tech or in algorithms in general, but in how we have let them be hijacked to serve platforms rather than users and communities (my Three Pillars). Marechal suggests "becoming an algorithmic problem" by insisting on better algorithms. That is exactly why I advocate middleware to "Free Our Feeds" -- not just for individual agency run wild, but within a "social mediation ecosystem" that creates more enlightened and challenging algorithms to be mixed in with the junk food. People are beginning to realize that we are "amusing ourselves to death." What we need is a whole-of-society effort to change that, and middleware and algorithmic choice are the only technologies that can enable it. It is up to us to use them wisely.*

One older piece that I finally read today (not from Tech Policy Press) also ties in with these issues of how algorithms work for or against us.

Berjon points out that "digital sovereignty had a bad reputation...[but] is a real problem that matters to real people and real businesses in the real world, it can be explained in concrete terms, and we can devise pragmatic strategies to improve it." The visions that many are now working toward for open infrastructure and communities, especially the "semi-permeable" open "hyper-communities" referred to above, can enable the positive forms of digital sovereignty that Berjon delineates, and he provides an excellent overview of a wide range of strategies for building on such an open infrastructure. My Delegation series (the Contending installment) and many other works have emphasized the need for "subsidiarity" as the basis for true federalism.

-----------------------

Comments? I invite them, and have posted about this on LinkedIn as a vehicle to facilitate discussion -- please make any comments there.

*Apropos of this issue, I happened to just watch the 2005 movie Good Night, and Good Luck, including the very on-point "Wires and Lights in a Box" speech delivered by Edward R. Murrow in 1958. I highly recommend the full speech, and this shortened rendition from the movie.

Wednesday, July 02, 2025

How to Reclaim Social Media from Big Tech (As published in Francis Fukuyama's Persuasion)

“Middleware” is an idea whose time has come.

By Renée DiResta and Richard Reisman

This article was published July 1 by American Purpose, the magazine and community founded by Francis Fukuyama in 2020, which is proudly part of the Persuasion family.


Social media platforms have long influenced global politics, but today their entanglement with power is deeper and more fraught than ever. Major tech CEOs, who once endeavored to appear apolitical, have taken increasingly partisan stances; Elon Musk, for example, served as a campaign surrogate in the 2024 U.S. presidential election, and spoke out in favor of specific political parties in the German election. Immediately following Trump’s re-election, Meta made radical shifts to align its content moderation policies with changing political winds, and TikTok’s CEO issued public statements flattering Trump and praising him for his assistance in deferring enforcement of the law that would ban the app. Both Meta and X chose to settle lawsuits that had been widely seen as easy wins for them in the courts, with their CEOs making donations to Trump’s presidential library, in presumptive apology for their fights over his post-January 6 deplatforming. Outside of the United States, there is growing tension between platforms and EU regulatory bodies, which Vice President JD Vance has opportunistically framed as concern about “free speech” amid increased European calls for “digital sovereignty.”

While companies have always sought to maintain favorable relationships with those in power—and while those in power have always sought to “work the referees”—the current dynamics are much more pronounced and consequential. Users’ feeds have long been at the mercy of opaque corporate whims (as underlined when Musk bought Twitter), but now it is clearer than ever that the pendulum of content moderation and curation can swing hard in response to political pressures.

It is users, regardless of where they live or their political leanings, who bear the brunt of such volatility. Exiting a platform comes at a high cost: we use social media for entertainment, community, and connection, and abandoning an app often means severing ties with online friends, or seeing less of our favorite creators. Yet when users try to push back against policies they don’t like—if they attempt to “work the referees” themselves—they are often hindered both by a lack of relative power and the lack of transparency about the internal workings of platform algorithms. Without collective action significant enough to inflict economic consequences, user concerns rarely outweigh the expediencies of CEOs or governments. Unaccountable private platforms continue to wield disproportionate control over public attention and social norms.

We need to shift this paradigm and find alternatives that empower users to take more control over their social media experience. But what would that look like?

As Francis Fukuyama and others at Stanford University argued in 2020—and as we expanded upon in a recent report coauthored with Fukuyama and others—one promising solution is middleware: independent software-enabled services that sit between the user and the platform, managing the information that flows between them. For example, a user might choose a middleware service that filters out spammy clickbait headlines from their feed, or one that highlights posts from trusted sources in a specific domain, like public health or local news. Middleware can help rebalance the scales, empowering users while limiting platforms’ ability to dictate the terms of online discourse.
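
To make that concrete, here is a minimal sketch of the shape such a service might take: middleware as a function that receives candidate posts from the platform and returns a transformed feed for the user. The Post type, the Middleware signature, and the two example services below are purely hypothetical illustrations, not any platform's actual API.

```typescript
// Illustrative sketch: middleware as a transformation layer between the
// platform's candidate posts and the feed the user actually sees.
// The Post type and both example services are hypothetical placeholders.

interface Post {
  id: string;
  author: string;
  text: string;
  source?: string; // e.g., a news domain such as "example-news.com"
}

type Middleware = (feed: Post[]) => Post[];

// Example service: filter out spammy clickbait headlines.
const clickbaitFilter: Middleware = (feed) =>
  feed.filter((post) => !/you won't believe|will shock you/i.test(post.text));

// Example service: surface posts from sources the user trusts in a
// specific domain, like public health or local news.
const trustedSourceBooster =
  (trusted: Set<string>): Middleware =>
  (feed) =>
    [...feed].sort(
      (a, b) =>
        Number(trusted.has(b.source ?? "")) -
        Number(trusted.has(a.source ?? ""))
    );
```

The essential property is that the transformation runs on the user's behalf, chosen by the user, rather than being fixed inside the platform.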

Putting users in control of their attention

Middleware has the potential to transform two of the most contentious functions of social media: curation and moderation. Curation determines how content is ranked in users’ feeds, shaping which voices are amplified. Moderation governs what is allowed, labeled, demoted, or removed. Both functions have become politicized battlegrounds, with critics on all sides accusing platforms of bias, censorship, or failure to address harms.

Middleware cuts through this dynamic of overly-centralized control by offering users and communities control that is more direct and context-specific. An open market (think “app store”) of middleware software and services would allow users to freely choose from a variety of algorithms and/or human-in-the-loop moderation services to compose their feeds for them. For instance, one user might prefer to subscribe to a feed optimized for civil discourse, another might choose one that highlights breaking news, while a third wants cat pictures. On the moderation front, some users may want to see profanity and nudity; others may want to subscribe to a tool that hides or labels such posts in their feed. Flexibility allows people to tailor their online environment to their needs (which shift depending on task, mood, or context) or to their political orientation or membership in different communities. This supports a greater diversity of online experience in terms of politics, values, and norms, enabling users and communities to select for their desired “vibe”—not one imposed by platform overlords or a tyranny of some majority.
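
In code terms, the "app store" choice can be pictured as each user composing a personal pipeline from independently provided services. Reusing the hypothetical Post and Middleware types from the sketch above, with two more toy services standing in for real curation and moderation offerings:

```typescript
// Illustrative composition: a user's feed is produced by chaining whichever
// middleware services they have subscribed to, in their chosen order.
// (Types reused from the sketch above; both services are toy stand-ins.)

function composeFeedPipeline(services: Middleware[]): Middleware {
  return (feed) => services.reduce((current, service) => service(current), feed);
}

// Toy curation service: push all-caps "shouting" posts to the bottom.
const civilDiscourseRanker: Middleware = (feed) =>
  [...feed].sort(
    (a, b) =>
      Number(a.text === a.text.toUpperCase()) -
      Number(b.text === b.text.toUpperCase())
  );

// Toy moderation service: hide posts matching a (very) short profanity list.
const profanityHider: Middleware = (feed) =>
  feed.filter((post) => !/damn|hell/i.test(post.text));

// One user's choices; another user might subscribe to entirely different
// services, or reorder them, without asking the platform's permission.
const myFeed = composeFeedPipeline([civilDiscourseRanker, profanityHider]);
```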

Middleware can also reduce the risk of political capture, making it more difficult for incumbent platforms, or governments, to exert undue pressure or outright manipulation over online discourse. It fosters competition and innovation by enabling a robust market of providers, which improves both transparency and responsiveness to user and community needs. Importantly, middleware replaces the binary choice between centralized control and total anarchy with an adaptive middle ground that empowers individuals, communities, and institutions to shape their own social experiences.

Retaking control

So how does increased user choice become a reality? Where Facebook, X, and the other incumbent giants are concerned, middleware’s success depends on their cooperation. Third-party tools need the ability to interoperate through open protocols or interfaces. So far, platforms have shown very limited interest in enabling this. However, as moderation becomes more politically fraught, they may decide that devolving more control to users—selectively opening their “walled gardens”—really is a smart choice. Meta’s Threads app is experimenting with a limited degree of such openness.

Whatever the centralized providers do, an alternative path is already emerging. Decentralized platforms based on open protocols, such as Mastodon and Bluesky, have been designed from the ground up to prioritize user choice and agency—without needing permission from a corporate gatekeeper. This is most apparent on Bluesky, which now serves well over 30 million users, some of whom already subscribe to alternative feeds for curation and independent content labeler services that flag porn or hate speech. Newly formed non-profit foundations that serve as custodians for the Bluesky and Mastodon protocols (one using the very apt #FreeOurFeeds hashtag) promise to ensure that these infrastructures can remain “billionaire-proof” and open to competition, as public goods.
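
For a flavor of how this already works on Bluesky: custom feeds are served by third-party "feed generators" that implement the AT Protocol's app.bsky.feed.getFeedSkeleton method, returning an ordered list of post URIs that the user's client then hydrates into a rendered feed. The sketch below shows only the response shape and a placeholder pagination scheme; the selection logic and surrounding scaffolding are hypothetical, not production code.

```typescript
// Illustrative sketch of a Bluesky feed generator response. Real generators
// implement the app.bsky.feed.getFeedSkeleton XRPC method; here the candidate
// selection and cursor scheme are hypothetical placeholders.

interface SkeletonItem {
  post: string; // AT URI of a post, e.g. "at://did:plc:.../app.bsky.feed.post/..."
}

interface FeedSkeleton {
  cursor?: string; // opaque pagination token returned to the client
  feed: SkeletonItem[];
}

function getFeedSkeleton(
  candidatePostUris: string[], // post URIs already ranked by our own logic
  limit = 30,
  cursor?: string
): FeedSkeleton {
  // Resume from the position encoded in the cursor, if any.
  const start = cursor ? parseInt(cursor, 10) : 0;
  const page = candidatePostUris.slice(start, start + limit);
  const next = start + limit;
  return {
    feed: page.map((uri) => ({ post: uri })),
    cursor: next < candidatePostUris.length ? String(next) : undefined,
  };
}
```

Because the generator only returns references, the heavy lifting of fetching and displaying posts stays with the client, keeping the barrier to offering a competing feed low.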

This open infrastructure model is not anti-commercial. On the contrary, it opens space for innovation, extensibility, and entrepreneurship. Just as Apple’s App Store created a flourishing ecosystem of third-party tools, middleware could spur new markets for feed curation, trust labeling, moderation filters, and more. News outlets might create branded options: the “Fox News Feed,” or the “New York Times Feed.” Trusted intermediaries—civil society groups, perhaps—might offer labels grounded in shared community values. Interoperable services can compete and cooperate across an ecosystem of distinct but connected communities. The goal is not to overwhelm users with technical choices, but to create options—similar to how users can now easily choose an email service or an add-on extension for a browser.

Policy support

Policymakers can help promote user choice by removing barriers that entrench the status quo. On the regulatory front, lawmakers should reimagine outdated statutes like the Digital Millennium Copyright Act (DMCA) and Computer Fraud and Abuse Act (CFAA)—laws that, while originally designed to protect creators and national security, have too often become tools for corporate suppression of competition. By reforming these laws, barriers that favor entrenched monopolies can be dismantled, promoting a more open internet, and ensuring that the interests of users, communities, and innovators come before exploitative profit. There are also worthwhile legislative efforts like the proposed Senate ACCESS Act, which would require “the largest companies make user data portable – and their services interoperable – with other platforms, and to allow users to designate a trusted third-party service to manage their privacy and account settings.”

Middleware empowers communities to decide how they wish to balance competing democratic values—free speech, protection from harm, pluralism—even in a time of high polarization. It offers a path toward a more democratic and resilient information ecosystem, where users have more agency over their attention. The question is no longer whether such alternatives are necessary or feasible—it’s whether they can be scaled, enhanced, and sustained to meet the moment.

Renée DiResta is an Associate Research Professor at the McCourt School of Public Policy at Georgetown and author of Invisible Rulers: The People Who Turn Lies Into Reality.

Richard Reisman is Nonresident Senior Fellow at the Foundation for American Innovation.