Wednesday, December 09, 2020

“How to Save Democracy From Technology” (Fukuyama et al.)

An important new article in Foreign Affairs by Francis Fukuyama and others makes a compelling case: few realize that the real harm from the Big Tech platforms is not just their bigness or their failure to self-regulate, but that they threaten democratic society itself. As they say:
“Fewer still have considered a practical way forward: taking away the platforms’ role as gatekeepers of content …inviting a new group of competitive ‘middleware’ companies to enable users to choose how information is presented to them. And it would likely be more effective than a quixotic effort to break these companies up.”

[This is a quick preliminary alert -- a fuller commentary is planned.]

The article makes a strong case that the remedy is to give people the power to control the “middleware” that filters their view of the information flowing through the platforms, so that each user sees it in the way they desire. Controlling that at a fine-grained level is beyond the skill or patience of most users, so the best approach is to create a diverse, open market of interoperable middleware services that users can select from. The authors point out that this middleware could be funded with a revenue share from the platforms, and that, rather than reducing platform revenue, it might actually increase it by providing better service that brings in more users and more activity.
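To make that architecture concrete, here is a minimal sketch, assuming a hypothetical registry of competing third-party filtering services from which each user picks one. None of the names (MiddlewareRegistry, feed_for_user, and so on) come from the article or the white paper; a real system would also need APIs, privacy controls, and the revenue-sharing plumbing the authors describe.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Post:
    author: str
    text: str
    engagement: float  # platform-supplied engagement score

# At its simplest, a middleware service is a function that reorders or
# filters a raw feed before the user sees it.
FilterService = Callable[[List[Post]], List[Post]]

class MiddlewareRegistry:
    """A stand-in for an open market of competing filtering services."""

    def __init__(self) -> None:
        self._services: Dict[str, FilterService] = {}

    def register(self, name: str, service: FilterService) -> None:
        self._services[name] = service

    def feed_for_user(self, chosen_service: str, raw_feed: List[Post]) -> List[Post]:
        # The platform keeps its core service, but hands presentation
        # over to whichever middleware the user has selected.
        return self._services[chosen_service](raw_feed)

# Two competing providers with different ranking philosophies.
def engagement_first(feed: List[Post]) -> List[Post]:
    return sorted(feed, key=lambda p: p.engagement, reverse=True)

def chronological_no_sponsored(feed: List[Post]) -> List[Post]:
    return [p for p in feed if p.author != "sponsored"]

registry = MiddlewareRegistry()
registry.register("engagement-first", engagement_first)
registry.register("chronological-no-ads", chronological_no_sponsored)

raw_feed = [Post("alice", "city council vote tonight", 0.4),
            Post("sponsored", "miracle supplement!", 9.9)]
print(registry.feed_for_user("chronological-no-ads", raw_feed))
```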

This resonates with similar proposals I have made over the past decade or two, as described in Architecting Our Platforms to Better Serve Us -- Augmenting and Modularizing the Algorithm. Here is part of the relevant section:

Filtering rules

Filters are central to the function of Facebook, Google, and Twitter. As Ferguson observes, there are issues of homophily, filter bubbles, echo chambers, fake news, and spoofing that are core to whether these networks make us smart or stupid, and whether we are easily manipulated to think in certain ways. Why do we not mandate that platforms be opened to user-selectable filtering algorithms (and/or human curators)? The major platforms can control their core services, but could allow users to select separate filters that interoperate with the platform. Let users control their filters, whether just by setting key parameters, or by substituting pluggable alternative filter algorithms. (This would work much like third party analytics in financial market data systems.) Greater competition and transparency would allow users to compare alternative filters and decide what kinds of content they do or do not want. It would stimulate innovation to create new kinds of filters that might be far more useful and smart...
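To picture the "setting key parameters" end of that spectrum, here is a small, hypothetical sketch of a filter whose behavior is driven entirely by user-chosen settings. The field names and thresholds are illustrative assumptions, not anything specified in my earlier post or in the Fukuyama proposal.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Set

@dataclass
class Post:
    author: str
    text: str
    verified: bool = True

@dataclass
class UserFilterSettings:
    """Key parameters a user could tune without replacing the whole algorithm."""
    muted_terms: Set[str] = field(default_factory=set)
    max_posts_per_author: int = 3          # crude brake on echo-chamber repetition
    allow_unverified_sources: bool = True

def apply_user_filter(feed: List[Post], settings: UserFilterSettings) -> List[Post]:
    """Filter a feed according to one user's settings."""
    per_author: Dict[str, int] = {}
    kept: List[Post] = []
    for post in feed:
        text = post.text.lower()
        if any(term in text for term in settings.muted_terms):
            continue
        if not settings.allow_unverified_sources and not post.verified:
            continue
        per_author[post.author] = per_author.get(post.author, 0) + 1
        if per_author[post.author] > settings.max_posts_per_author:
            continue
        kept.append(post)
    return kept

# Example: a user who mutes a spoofed hashtag and caps repetition from any one account.
settings = UserFilterSettings(muted_terms={"#fakecure"}, max_posts_per_author=2)
feed = [Post("bob", "news update"), Post("bob", "#fakecure works!", verified=False)]
print(apply_user_filter(feed, settings))
```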

I have also proposed complementary changes in algorithms and business models that would make such an open market in filtering middleware even more effective.  

The Foreign Affairs article is backed up by an excellent white paper that provides much more detail.

(The article is entitled “How to Save Democracy From Technology,” but the problem is really one of bad technology -- bad algorithms and architectures, driven by bad business models.)
