Thursday, August 05, 2021

Unbundling Social Media Filtering Services – Updates on Debate and Development

This informal work in progress updates and expands on my 4/22/21 article in Tech Policy Press, The Internet Beyond Social Media Thought-Robber Barons.

The focus is on how to manage social media -- specifically the similar proposals by a number of prominent experts to unbundle the filtering services that curate the news feeds and recommendations served to users, so that they work as "middleware" selected by users. 

These updates are best understood after reading that article or the more recent articles listed in the Selected Items tab (above). 

RUNNING UPDATES (most recent first):

  • [1/17/22]
    For insights into the dangers of simplistic privacy regulation that ignores the non-binary, non-personal nature of privacy, see Networked privacy: How teenagers negotiate context in social media by Alice E. Marwick and danah boyd. 

  • [12/19/21]
    Concerned about what Facebook and other platforms know about you and use to manipulate you now? Consider The Ghost of Surveillance Capitalism Future, which builds on Heller's article just below. Now is the time for policy planners to look to the future -- not just to next year, but to the next decade -- and that piece offers some suggestions.
    (Two additional references: Facebook Patent Shows How You May Be Exploited in the Metaverse and Why Facebook, Snapchat, and Google want smart glasses to be a thing.)

  • [12/14/21]
    A valuable perspective on where surveillance capitalism abuses are going, with lessons for social media reforms that can head that off is in Watching Androids Dream of Electric Sheep: Immersive Technology, Biometric Psychography, and the Law, by Brittan Heller. If you think the manipulation is bad now, just read this. (I am writing about the lessons I see.)

    Further evidence that crowdsourcing can be serviceably effective and far more scalable than professional fact-checking is in Moderating with the Mob: Evaluating the Efficacy of Real-Time Crowdsourced Fact-Checking by William Godel, et al. (summarized in Tech Policy Press). As I have suggested, instead of explicit crowdsourcing, implicit signals can crowdsource far more people, far faster (much as Google already does, and as a Facebook data scientist has recognized).

  • [12/1/21]
    Reporting and building on the Stanford HAI Conference (noted, just below):
    Progress Toward Re-Architecting Social Media to Serve Society (Tech Policy Press, 12/1/21) reports on the conference session with panelist suggestions for bolder variations. Directions Toward Re-Architecting Social Media to Serve Society (11/29/21) is a companion piece with my suggestions for bolder, longer-term directions, including infomediary/data cooperative strategies.

  • [11/10/21]
    The Stanford HAI Conference provided another productive session on the filtering middleware unbundling proposals, which drew connections to the separate session on data cooperatives (infomediaries) to enable filtering services to consider privacy-sensitive data on interaction and flow dynamics -- aligning with ideas I had explored in my post, Resolving Speech, Biz Model, and Privacy Issues – An Infomediary Infrastructure for Social Media? (More to follow.)

  • [11/3/21]
    Resolving Speech, Biz Model, and Privacy Issues – An Infomediary Infrastructure for Social Media? -- a quick sketch suggesting how open concerns about the unbundling of filtering services via middleware might be simplified by using infomediaries (data cooperatives) to serve users as fiduciary agents that control their data and negotiate its proper use and compensation. Posted in anticipation of the Stanford HAI Conference, with separate sessions on middleware (11/9) and on data cooperatives (11/10).

  • [10/29/21]
    A brief counter to a frequent concern about unbundling proposals: if filtering services are funded by a revenue share from the engagement-driven platforms, how do they not just perpetuate the problem of being engagement-driven? Counter: the revenue share need not be based on engagement, but can relate to the number of active users (such as MAUs). That significantly decouples the filtering services from engagement, and motivates them to attract users based on quality of service and value, not addiction. And -- if there are many filtering services, their individual actions will have little effect on total engagement, and thus on their share of total revenue.

  • [10/26/21]
    The Best Idea From Facebook Staffers for Fixing Facebook: Learn From Google -- A leaked Facebook paper from 2018 reinforced the potential value of filtering using a reputation/authority-based strategy similar to Google's seminal PageRank algorithm -- essentially what I have been proposing.

  • [10/12/21]
    It Will Take a Moonshot to Save Democracy From Social Media -- that is my quick take on the 10/7 mini-symposium (see previous item). There was general agreement by most speakers that there is no silver bullet...but that shifting power from the platforms is important. (more...)

  • [10/7/21]
    The Tech Policy Press mini-symposium, Reconciling Social Media & Democracy, that I helped organize and moderate was held with eleven speakers prominent in this space. It was very gratifying to catalyze this excellent discussion reflecting diverse perspectives, which I hope will help further a movement toward productive reforms. [10/14:] Recordings and transcripts are now available for my session with Francis Fukuyama, Nathalie Marechal, and Daphne Keller. Other sessions are being posted at TechPolicy.Press.

  • [9/20/21]
    "The wood wide web" provides a very relevant analog to the ecosystem perspective in my article preprint -- as explained in The word for web is forest, by Claire Evans (9/19/21). She likens the "context collapse" on social media to monocultures and clear-cutting in forests, which ignores "just how sustainable, interdependent, life-giving systems work." 

  • [9/10/21]
    "Context collapse" is a critical factor in creating conflict in social media, as explained in The day context came back to Twitter (9/8/21), by Casey Newton. As he explains, Facebook Groups and the new Twitter Communities are a way to address the problem of "taking multiple audiences with different norms, standards, and levels of knowledge, and herding them all into a single digital space." Filters are a complementary tool for seeking context, especially when user controlled and applied with intentionality. Social media should offer both.

  • [8/25/21]
    The importance of a cross-platform view of the social media ecosystem is highlighted in one of the articles briefly reviewed in Tech Policy Press this week. The article by Zeve Sanderson et al. on the off-platform spread of Twitter-flagged tweets (8/24/21) argues for “ecosystem-level solutions,” including such options as 1) multi-platform expansion of the Oversight Board, 2) unbundling of filters/recommenders as discussed here (citing the Francis Fukuyama et al. middleware proposal), and 3) “standards for value-driven algorithmic design” (as outlined in the following paper by Helberger).

    A conceptual framework, On the Democratic Role of News Recommenders by Natali Helberger (6/12/19, cited by Sanderson), provides a very thought-provoking perspective on how we might want social media to serve society. This is the kind of thinking about what to regulate for, not just against, that I have suggested is badly needed. It suggests four very different (but in some ways complementary) sets of objectives to design for. This perspective -- especially the liberal and deliberative models -- can be read to make a strong case for unbundling of filters/recommenders in a way that offers user choice (plus perhaps some default or even required ones as well).

    I hope to do a future piece expanding on the Helberger and Goldman (cited in my 8/15 update below) frameworks and how they combine with some of the ideas in my Looking Ahead post about the need to rebuild the social mediation ecosystems that we built over centuries -- and that digital social media are now abruptly disintermediating with no replacement.
  • [8/17/21]
    Progress on Twitter's @BlueSky unbundling initiative: Jay Graber announces "I’ll be leading @bluesky, an initiative started by @Twitter to decentralize social media. Follow updates on Twitter and at blueskyweb.org" (8/16). Mike Masnick comments: "there has been a lot going on behind the scenes, and now they've announced that Jay will be leading the project, which is FANTASTIC news." Masnick expands: "There are, of course, many, many challenges to making this a reality. And there remains a high likelihood of failure. But one of the key opportunities for making a protocol future a reality -- short of some sort of major catastrophe -- is for a large enough player in the space to embrace the concept and bring millions of users with them. Twitter can do that. And Jay is exactly the right person to both present the vision and to lead the team to make it a reality. ...This really is an amazing opportunity to shape the future and move us towards a more open web, rather than one controlled by a few dominant companies."

    Helpful perspectives on improving and diversifying filtering services are in some articles by Jonathan Stray: Designing Recommender Systems to Depolarize (7/11/21) and Beyond Engagement: Aligning Algorithmic Recommendations With Prosocial Goals (1/21/21). One promising conflict transformation ranking strategy that has been neglected is “surprising validators,” suggested by Cass Sunstein, as I expanded on in 2012 (and since). All of these deserve research and testing -- and an open market in filtering services is the best way to make that happen.

  • [8/15/21]
    Additional rationales for demanding diversity in filtering services, and an understanding of some of the forms it may take, are nicely surveyed in Content Moderation Remedies by Eric Goldman. He suggests "...moving past the binary remove-or-not remedy framework that dominates the current discourse about content moderation" and provides an extensive taxonomy of remedy options. He explains how expanded non-removal remedies can provide a possible workaround to the dilemmas of remedies that are not proportional to different levels of harm. Diverse filtering services can have not only different content selection criteria, but different strategies for discouraging abuse. And, as he points out, "user-controlled filters have a venerable tradition in online spaces." (Thanks to Daphne Keller for suggesting this article to me as relevant to my Looking Ahead piece, and for her other helpful comments.)
  • [8/9/21]
    My review and synthesis of the Journal of Democracy debate mentioned in my 7/21 update are now published in Tech Policy Press.
    + I expand on those two articles in Tech Policy Press in The Need to Unbundle Social Media – Looking Ahead: We need a multidisciplinary view of how tech can move democratic society into its digital future. Democracy is always messy, but that is why it works – truth and value are messy. Our task is to leverage tech to help us manage that messiness to be ever more productive. 

Older updates -- carried over from the page of updates to my 4/22/21 Tech Policy Press article

  • [7/21/21]
    A very interesting five-article debate on these unbundling/middleware proposals, all headed The Future of Platform Power, is in the Journal of Democracy, responding to Fukuyama's April article there. Fukuyama responds to the other four commentaries (which include a reference to my Tech Policy Press article). The one by Daphne Keller, consistent with her items noted just below, is generally supportive of this proposal, while providing a very constructive critique that identifies four important concerns. As I tweeted in response: "'The best minds of my generation are thinking about how to make people click ads' – get our best minds to think about empowering us in whatever ways fulfill us! @daphnehk problem list is a good place to start, not to end." I plan to post further comments on this debate soon [now linked above, 8/9/21].

  • [6/15/21]
    A very insightful survey analysis of First Amendment issues relating to proposed measures for limiting harmful content on social media -- and how most run into serious challenges -- is in Amplification and Its Discontents, by Daphne Keller (a former Google Associate General Counsel, now at Stanford, 6/8/21). It wraps up with a discussion of proposals for "unbundling" of filtering services: "An undertaking like this would be very, very complicated. It would require lawmakers and technologists to unsnarl many knots.... But unlike many of the First Amendment snarls described above, these ones might actually be possible to untangle." Keller provides a very balanced analysis, but I read this as encouraging support on the legal merits of what I have proposed: the way to preserve freedom of expression is to protect users' freedom of impression -- not easy, but the only option that can work. Keller's use of the term "unbundling" is also helpful in highlighting how this kind of remedy has precedent in antitrust law.
    Interview with Keller on this article by Justin Hendrix of Tech Policy Press, Hard Problems: Regulating Algorithms & Antitrust Legislation (6/20/21).
    + Added detail on the unbundling issues is in Keller's 9/9/20 article, If Lawmakers Don't Like Platforms' Speech Rules, Here's What They Can Do About It. Spoiler: The Options Aren't Great.
  • Another perspective on how moderation conflicts with freedom is in On Social Media, American-Style Free Speech Is Dead (Gilad Edelman, Wired 4/27/21), which reports on Evelyn Douek's more international perspective. Key ideas are to question the feasibility of American-style binary free speech absolutism and to shift from categorical limits to more proportionality in balancing societal interests. I would counter that the decentralization of filtering to user choice enables proportionality and balance to emerge from the bottom up, where it has democratic validity as "community law," rather than being imposed from the top down as "platform law." The Internet is all about decentralized control -- why should we sacrifice freedom of speech to a failure of imagination in managing a technology that should enhance freedom? Customized filtering can provide a receiver-specific richness of proportionality that better balances rights of impression with nuanced freedom of expression. Douek rightly argues that we must accept an error rate in moderation -- why not expect a bottom-up, user-driven error rate to be more open and responsive to evolving wisdom and diverse community standards than one applied across the board?
  • [5/18/21]
    Clear insights on the new dynamics of social media - plus new strategies for controlling disinformation with friction, circuit-breakers, and crowdsourced validation in How to Stop Misinformation Before It Gets Shared, by Renee DiResta and Tobias Rose-Stockwell (Wired 3/26/21). Very aligned with my article (but stops short of the contention that democracy cannot depend on the platforms to do what is needed).
  • [5/17/21]
    Important support and suggestions related to Twitter's Bluesky initiative from eleven members of the Harvard Berkman Klein community are in A meta-proposal for Twitter's bluesky project (3/31/21). They are generally aligned with the directions suggested in my article.
  • [4/22/21]
    Another piece by Francis Fukuyama that addresses his Stanford group's proposal is in the Journal of Democracy: Making the Internet Safe for Democracy, April 2021.
    (+See 7/21/21 update, above, for follow-ups.)
---

Grandfather clause: Many people have contributed to the idea of unbundling social media filters, but I believe I was the first, dating to 2002-3 -- see The Roots of My Thinking on Tech Policy.

============================================
============================================
[Original opening section of this updates post, as posted 8/5/21]

This is an informal work in progress updating and expanding on my two articles in Tech Policy Press (8/9/21) that relate to an important debate in the Journal of Democracy on The Future of Platform Power.

The focus is on how to manage social media and specifically the similar proposals by a number of prominent experts to unbundle the filtering services that curate the news feeds and recommendations served to users. The updates are best understood after reading those articles.

UPDATE: A further round of notable discussion occurred around the Reconciling Social Media & Democracy mini-symposium hosted by Tech Policy Press on 10/7/21, which I helped organize, representing all of the Journal of Democracy debate authors, plus other notables. Recordings and transcripts are available for my session with Francis Fukuyama, Nathalie Marechal, and Daphne Keller.

Also relevant to this debate:

This visualization from my 4/22/21 Tech Policy Press article may also be helpful:
