Yes, we can have better algorithms
A new report from recommender system experts highlights practical alternatives to addictive and engagement-driven feeds
We’ve been told that social media feeds can either be engagement-maximizing or chronological—and that those are the only two options. But this is a false choice. A new report by the Knight-Georgetown Institute, Better Feeds: Algorithms That Put People First, makes it clear: platforms could offer far better feeds—ones that serve users' interests without the distortions of engagement-driven design.
Written by leading technologists and researchers in algorithmic design and social cohesion—including Jonathan Stray, Jeff Allen, Ravi Iyer, Julia Kamin, Leif Sigerson and Aviv Ovadya—the report states:
“Suggesting that users must choose between today’s engagement-optimized feeds and chronological or non-personalized feeds creates a false choice. In reality, the design space for recommender system optimization is vast.”
Yet despite the range of possible alternatives, tech companies haven’t built them. Not because it’s technically difficult, but because they haven’t been required to. That’s a policy failure, not a technology limitation.
What could a better feed look like?
The Better Feeds report lays out several ways to design feeds that don’t default to clickbait-heavy engagement or bare-bones chronological order. Drawing on a comprehensive bibliography of research into the effects of diverse recommender systems, the report highlights where evidence can inform designs that serve users better, and how we can develop metrics to evaluate future choices. These alternative approaches aren’t speculative—they’re feasible today. The only thing stopping them is a lack of incentive.
1. Let users choose
Right now, platforms lock users into a feed designed to serve corporate goals. But users could have more control over how their feeds work, for example:
A feed that prioritizes expertise and verified information.
A well-being-oriented feed that reduces inflammatory, anxiety-inducing content.
A “quiet mode” that slows feeds down at night, reducing compulsive scrolling.
These changes aren’t complex to build. But platforms don’t offer them, because they are seen as contrary to the platforms’ financial interests.
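To make this concrete, here is a minimal sketch of what user-selectable feed modes could look like. All field names, signal values, and thresholds are invented for illustration; real systems would use model-derived scores.

```python
from datetime import datetime

# Hypothetical posts with precomputed signals (illustrative values).
POSTS = [
    {"id": 1, "engagement": 0.9, "verified": False, "inflammatory": 0.8},
    {"id": 2, "engagement": 0.4, "verified": True,  "inflammatory": 0.1},
    {"id": 3, "engagement": 0.6, "verified": True,  "inflammatory": 0.5},
]

def rank(posts, mode, now=None):
    """Order posts according to a user-chosen feed mode."""
    now = now or datetime.now()
    if mode == "expertise":
        # Verified sources first; engagement only breaks ties.
        key = lambda p: (p["verified"], p["engagement"])
    elif mode == "well_being":
        # Down-rank inflammatory, anxiety-inducing content.
        key = lambda p: p["engagement"] - p["inflammatory"]
    elif mode == "quiet" and now.hour >= 22:
        # At night, show only calm items, least stimulating first.
        posts = [p for p in posts if p["inflammatory"] < 0.3]
        key = lambda p: -p["engagement"]
    else:
        # Default: today's engagement-maximizing ranking.
        key = lambda p: p["engagement"]
    return sorted(posts, key=key, reverse=True)
```

The point is not the specific formulas but that each mode is a few lines of ranking logic over signals platforms already compute.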
2. Look long-term, beyond today’s clicks
Most feeds optimize for short-term engagement—what hooks users in the moment, even if they regret it later. But platforms could prioritize content people find meaningful weeks or months later.
“Optimizing recommender systems to maximize predicted short-term engagement does not typically promote long-term value… Predictions of long-term value must be supported by evidence of explicit, expressed desires held by users,” according to the authors.
Research shows that users prefer substantive content when they reflect on their feed later. But platforms don’t measure this—they measure what gets immediate reactions.
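One way to picture the shift the authors describe is a score that blends immediate engagement with a delayed signal of expressed value, such as whether users still rate an item worthwhile in a follow-up survey. This is a sketch under assumed field names, not the report's own metric.

```python
def long_term_score(item, alpha=0.3):
    """Blend immediate engagement with a delayed-value signal.

    alpha weights the click signal; (1 - alpha) weights an explicit,
    expressed-preference signal collected later (field names illustrative).
    """
    return alpha * item["click_rate"] + (1 - alpha) * item["week_later_rating"]

# Hypothetical items: one hooks users instantly, one they value on reflection.
clickbait = {"click_rate": 0.95, "week_later_rating": 0.10}
substantive = {"click_rate": 0.30, "week_later_rating": 0.85}
```

With the delayed signal dominating, the substantive item outranks the clickbait item, reversing what a pure click-rate ranking would do.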
3. Feeds for civic and public interest
Platforms could actively support democracy and public interest by redesigning feeds to:
Prioritize high-quality election information over misinformation.
Promote local news, community initiatives, and mutual aid networks.
Offer education-focused feeds that highlight trusted sources over sensationalism.
None of this requires reinventing social media—just a shift in priorities. Last year, the Panoptykon Foundation unpacked these ideas in Safe by Default: Moving away from engagement-based rankings towards safe, rights-respecting, and human-centric recommender systems. The report proposes, for example, profiling off by default: instead of behavioral profiling based on passive data collection, platforms would use only explicit user-provided data (e.g., declared interests, "show me more/show me less" signals) to shape recommendations.
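A profiling-off-by-default recommender can be sketched in a few lines: the only inputs are interests the user declared and their explicit "show me more/show me less" feedback, with no passive behavioral data. The class and weighting scheme below are illustrative assumptions, not Panoptykon's specification.

```python
from collections import Counter

class ExplicitFeed:
    """Rank items using only signals the user knowingly provided:
    declared interests plus explicit 'more'/'less' feedback.
    No watch time, hovers, or other passive behavioral profiling."""

    def __init__(self, declared_interests):
        # Each declared interest starts with weight 1.0.
        self.weights = Counter({topic: 1.0 for topic in declared_interests})

    def feedback(self, topic, more=True):
        # "Show me more" raises a topic's weight; "show me less" lowers it.
        self.weights[topic] += 0.5 if more else -0.5

    def score(self, item):
        # An item's score is the sum of its topics' user-set weights.
        return sum(self.weights.get(t, 0.0) for t in item["topics"])
```

Because every weight traces back to an explicit user action, the ranking is fully explainable to the user who shaped it.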
It’s not new tech
None of these solutions require new technology. They require new incentives.
Right now, platforms:
❌ Aren’t required to report how their algorithms work.
❌ Aren’t incentivized to prioritize long-term user well-being.
❌ Face no consequences when their feeds amplify harm.
Regulators are starting to act, creating an opportunity to push for real changes in algorithmic design.
In the EU, the Digital Services Act (DSA) already mandates greater user autonomy over recommender systems.
Article 27 requires online platforms to disclose the main parameters of their recommender systems.
Article 38 requires very large platforms to offer users at least one feed option that is not based on profiling.
In the U.S., states are moving ahead on algorithmic regulation. Minnesota's Prohibiting Social Media Manipulation Act, passed into law in 2024, requires platforms to disclose whether and how they assess content quality and explicit user preferences, and how those signals are weighted relative to engagement signals in their algorithmic systems. Other global regulators are watching closely—momentum for algorithmic transparency is growing beyond the EU and U.S.
The report outlines policy interventions that could complement these efforts globally:
Transparency requirements – Platforms should disclose how feeds are ranked, what content they amplify, and how they measure success.
Algorithmic plurality mandates – Users should have more feed options, instead of being forced into a single system designed for ad revenue.
What product teams can do
The Better Feeds report provides concrete guidance for product teams designing recommender systems. It urges platforms to offer clear, user-friendly customization tools. For example, platforms could experiment with opt-in ranking systems that let users weigh different priorities—whether they want a more deliberative newsfeed, a balanced engagement model, or a lower-emotion, utility-focused stream. The goal is not to eliminate engagement, but to reshape it in ways that build trust and social cohesion.
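An opt-in ranking system of this kind could be as simple as letting the user set the weights that combine the platform's signals. The signal names below (deliberation, engagement, utility) are invented to mirror the examples above; they are not from the report.

```python
def blended_score(item, weights):
    """Combine ranking signals using weights the user sets themselves."""
    return sum(weights[signal] * item[signal] for signal in weights)

# A hypothetical item scored on three signals (illustrative values).
item = {"deliberation": 0.7, "engagement": 0.2, "utility": 0.5}

# A user who opts into a more deliberative newsfeed:
deliberative = {"deliberation": 0.6, "engagement": 0.1, "utility": 0.3}

# A user who keeps a balanced engagement model:
balanced = {"deliberation": 0.2, "engagement": 0.6, "utility": 0.2}
```

The same item gets a different score per user, so "customization" here means control over the objective itself, not just filters on top of a fixed ranking.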
Engagement isn’t going away, but it can be redirected toward better outcomes. The report reminds us that we can demand that regulators require platforms to give us options for better feeds – ones that help us think, learn, and connect in meaningful ways.
Lena Slachmuijlder Co-Chairs the Council on Tech and Social Cohesion.

