The internet we could still build
Small shifts in design are already changing the internet — and it’s not too late to scale what works
Sometimes the tiniest shift in a product’s design makes the biggest difference.
Trisha Prabhu was in middle school when she had a simple idea: what if, before posting something cruel, users were simply asked, "Are you sure?"
This gentle nudge — a moment of intentional friction — reduced harmful messages from 71% to just 4% in her early tests. She later founded Rethink Citizens, where she now helps young people design similarly powerful interventions. This idea — of using design to prompt reflection, not just reaction — is now at the center of a broader movement for prosocial technology.
At last month’s Responsible Tech Summit convened by All Tech is Human, in a session titled “Thinking Beyond Harms: Designing for Prosocial Outcomes in Digital Spaces”, Prabhu joined industry leaders from Snap and Pinterest to spotlight what prosocial design looks like in practice — and how even modest product changes can shift outcomes at scale.
Good friction
Viraj Doshi, Platform Safety Lead at Snap, shared how a single feature — a prompt asking teens to reconsider accepting suspicious friend requests — led to over 12 million accounts being blocked within six months. It’s a small speed bump that gives users more agency, especially when no mutual connections exist or when accounts have been reported.
Pinterest is also rethinking its design assumptions. Ally Millar, Head of Product Marketing for Wellbeing and Safety, highlighted personalization tools that let users filter content by skin tone, hair type, and body shape — increasing representation, agency, and engagement. Users who apply these filters pin 75% more often — challenging the idea that ethical design must mean slower growth.
Pinterest is a co-founder of the Inspired Internet Pledge, signaling a broader commitment to embedding well-being into its product ethos and sharing its research insights along the way.
Prevention by design
These principles — intentional friction, user-centered defaults — aren’t limited to youth or everyday content. The Prevention by Design: Tackling TFGBV at the Source report, developed by the Council on Tech and Social Cohesion, Search for Common Ground and the Integrity Institute, applies them to the urgent challenge of tech-facilitated gender-based violence. Context-aware prompts, greater user agency over the content they see, and early interruption cues can reduce grooming, sextortion, and abuse — not reactively, but before they escalate.
“The language of norms is set by the design affordances of a platform,” said Tobias Rose-Stockwell, author of The Outrage Machine, speaking at the Thinking Beyond Harms session.
What a platform enables — and how it nudges users — becomes the culture. Design doesn’t just reflect values; it shapes them.
He also praised the uptake of features like bridge-ranking algorithms, used in tools like Community Notes. Within five years, these ideas have gone from edge-case experiments to infrastructure serving billions.
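To make the bridging idea concrete, here is a minimal, hypothetical sketch in Python. This is not the actual Community Notes algorithm (which uses matrix factorization over rater data); the `bridge_score` function and the simple two-camp model are illustrative assumptions only. The intuition: content ranks higher when it is endorsed by users who usually disagree with each other, and a one-sided pile-on earns nothing.

```python
def bridge_score(votes):
    """Toy bridging-based ranking score.

    votes: list of (user_side, approved) pairs, where user_side is
    'A' or 'B' (two opposing camps) and approved is a bool.
    Returns a score that is high only when BOTH camps approve.
    """
    a = sum(1 for side, ok in votes if ok and side == 'A')  # approvals from camp A
    b = sum(1 for side, ok in votes if ok and side == 'B')  # approvals from camp B
    total = len(votes)
    if total == 0 or (a + b) == 0:
        return 0.0
    # Harmonic-mean-style combination: zero if either camp withholds approval,
    # normalized by total votes so pile-ons don't inflate the score.
    return (2 * a * b / (a + b)) / total

# A post endorsed by only one camp scores zero...
partisan = [('A', True)] * 8 + [('B', False)] * 8
# ...while a post endorsed across the divide scores well.
bridging = [('A', True)] * 5 + [('B', True)] * 5 + [('A', False)] * 6

print(bridge_score(partisan))  # 0.0
print(bridge_score(bridging))  # 0.3125
```

The design choice worth noticing is the multiplication of `a` and `b`: raw popularity can never compensate for consensus that comes from one side alone, which is the core shift away from engagement-maximizing ranking.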
It’s 2012: what should we have done?
In a recent episode of Your Undivided Attention, Tristan Harris and Aza Raskin imagined an alternate 2012 — where we had acted quickly on digital harms.
“We replaced the division-seeking algorithms of social media with ones that rewarded unlikely consensus using Audrey Tang’s bridge-ranking for political content,” said Harris. “You were suddenly seeing optimistic examples of unlikely consensus from everyone around the world. And that started to turn the psychology of the world around.”
They envision a future where we’ve implemented dopamine emission standards, where feeds spotlight unlikely consensus, and where devices help us reconnect with each other — not just content. It’s not science fiction. It’s a plausible fork in the road.
“The story feels bleak only because we haven’t yet articulated the alternative,” Raskin added. But we can — and must — build that alternative now.
Audrey Tang, former Digital Minister of Taiwan and a leader in the Collective Intelligence Project and Plurality Institute, spoke at this month’s MozFest about the urgency of addressing today’s dominant platforms’ “PPM — polarization per minute.” The challenge, she argued, is to redirect that capacity — not to disengage from tech, but to harness tech to upgrade democracy itself.
Dr. Ruha Benjamin, in her MozFest keynote, pushed us deeper still, urging us to unlearn the notions of ‘technology as self-propelled, intelligence as smartness, and human nature as selfish’ — patterns that hold us back from embracing our creativity and imagination to build a better future. Unlocking market enablers, including recent legislation mandating interoperability such as Utah’s, can spark the building of new social platforms.
New platforms such as WeAre8, Tribela, Sparkable, Filament and Monnet are reimagining what social media can look like when designed for belonging, trust and cohesion, not just engagement.
The Blueprint on Pro-Social Tech Design Governance, published earlier this year by the Council on Tech and Social Cohesion, offers a systems framework for advancing these ideas and gaining wider adoption. Our forthcoming Pro‑Social Tech Regulation: A Practical Guide outlines practical ways for regulators to encourage healthier upstream design — from incentive structures to multi‑stakeholder reviews of user experience.
It can feel like we’re trapped in feeds we didn’t choose, shaped by incentives we didn’t create. But design isn’t destiny. Once we understand how these systems work, we can start to shift them.
The internet is still being shaped. Through small nudges, better defaults, and thoughtful policy, we can steer it toward more prosocial outcomes.
Lena Slachmuijlder is Senior Advisor, Digital Peacebuilding at Search for Common Ground, Senior Practitioner Fellow at the USC Neely Center, and Co-Chair of the Council on Tech and Social Cohesion.