
Creators & The Question of Algorithm Neutrality

Section 230, which shields online platforms from liability for user-published content, was enacted in 1996. Since then, lawmakers have faced mounting pressure to narrow the law’s protections. Companies like Twitter, Meta’s Facebook and Instagram, and Google’s YouTube have been central to this debate given the role they play (or don’t) in moderating content and users, and the reach of their recommendation algorithms.

Last week, in response to cases brought against Twitter and Google, the Supreme Court delivered two rulings that preserve tech companies’ immunity from liability for the content they host. Justice Clarence Thomas likened social platforms to other digital technologies, such as cell phones or email, that should not “incur culpability merely for providing their services to the public writ large.”

In yesterday’s Stratechery, Ben Thompson highlighted notable language in Justice Thomas’ opinion. Thomas contends that recommendation algorithms should be considered part of a platform’s core infrastructure, and are “agnostic to the nature of content.” Thompson draws an important parallel to the First Amendment, underscoring the central issue of neutrality:

“That bit about algorithms being part of the infrastructure is important; it calls back to an important First Amendment principle known as “content neutrality”. Content neutral laws regulate speech without regard to what the speech is actually about, and are generally allowed under the First Amendment; as an example, you can have laws that require a permit for a protest, but those laws cannot be based on what the protest is about.”

This callout is important for two reasons. First, in the near term, it solidifies Section 230 (any narrowing of its protections would now have to come from Congress), preserving (at least for now) the control tech platforms have over creator discovery and growth via their recommendation algorithms. Second, in the longer term, it opens the door to potential new legislation outside of Section 230. By Thompson’s logic above, algorithm authors are writing neutral code to optimize audience entertainment; in practice, however, the code itself favors specific types of content (and creators) based on data and corporate priorities.

The public discussion around Thomas’ opinion has only reinforced a reality creators have long understood: their strategies for discovery, growth, and monetization must continue to adapt to an evolving technological and regulatory landscape. More than ever, they have to prioritize diversification across platforms and first-party ownership of audience and revenue. Look no further than YouTube’s concerted push into Shorts, which inevitably buoyed creators native to, or intentionally prioritizing, that format. While these platforms are and will continue to be foundational for creators, many of whom have leveraged them to build meaningful businesses, they may not be fundamentally dependable. In response to the unchecked, unpredictable influence platforms have over success, creators should manage their platform relationships actively, strategically, and conservatively.