From Harm to Health
The new Blueprint for Prosocial Tech Design Governance lays out systems that incentivize prosocial design—and goes beyond just removing harm.
We know what we want to stop: disinformation, polarization, manipulation, surveillance, addiction. From whistleblowers to researchers to users, the warning signs are clear.
But stopping harm isn’t enough.
What would it mean to start asking for what we actually want from our digital spaces? To demand not just the end of harm, but the beginning of something better?
That’s the bold invitation behind the newly released Blueprint for Prosocial Tech Design Governance from the Council on Technology and Social Cohesion, with the University of Notre Dame and the Toda Peace Institute. Authored by Dr. Lisa Schirch, it’s more than a report—it’s a practical, systems-level map of the internet we could build together.
It doesn’t just warn—it proposes. It doesn’t only diagnose the crisis—it seeds a future.
Tackling root causes, not symptoms
Tech governance conversations are often dominated by damage control: taking down harmful content, chasing bots, suspending user accounts, patching platforms after scandals break. Downstream fixes. In many contexts, misinformation or hate speech is seen primarily as a problem of ‘bad users,’ leading to responses anchored in law enforcement or censorship.
This Blueprint looks upstream—at the design choices and market incentives that make those harms so likely in the first place. And instead of treating this as an inevitable reality, it lays out what we can—and should—build in its place.
“Most ‘Trust and Safety’ efforts inside large tech platforms downplay their design choices and steer regulators to focus on content moderation or ‘downstream’ removal of harmful content,” writes Schirch. “A systems approach to digital harms focuses on root causes rather than symptoms.”
The Blueprint offers a comprehensive set of entry points and avenues for change, structured as tiers.
Tier 1: Basic building standards for privacy, safety, and agency.
Tier 2: Low-barrier UX features (like nudges, reaction cues, friction) that support healthier norms.
Tier 3: Algorithms and recommender systems designed to bridge, not divide.
Tier 4: Platforms that enable respectful dialogue, informational integrity and collective problem-solving.
Tier 5: Middleware infrastructure that empowers users through data sovereignty, portability and interoperability.
These tiers offer entry points for designers, developers, regulators, and funders to align around shared benchmarks for ethical, prosocial technology. They let us measure what platforms enable, not just what they allow or prevent.
Governance, research and markets are interconnected
Design matters—but it doesn’t operate in isolation. “Platform design decisions are policy choices with social consequences,” writes Schirch.
The Blueprint underscores that changing technology means changing the systems around it: who governs it, how it's researched, and what the market rewards. That’s why it’s built around three interconnected strategies:
1. Advance Prosocial Tech Design
This section is the creative core: the tiered design framework. It invites technologists, UX designers, and platform architects to imagine tech that promotes social trust, not just engagement. It moves design from a neutral or profit-driven act into a civic practice. And it reminds us: design is policy.
2. Transparency, research and metrics
Here the Blueprint turns to the conditions that make meaningful oversight possible. It calls for transparency mandates, the development and use of standardized metrics for prosocial outcomes, and legal protections for independent researchers. Without the freedom and data to study platforms, accountability will remain elusive—as will the potential for rewarding prosocial outcomes.
3. Shift market forces to support prosocial innovation
Finally, it tackles the economic engine beneath it all. The Blueprint proposes enforcing antitrust laws, redefining liability for social harms, and creating public and private investment incentives for ethical platforms. By reshaping incentive structures, the Blueprint supports a more competitive and pluralistic tech ecosystem—where users have real alternatives and ethical platforms can gain a foothold.
Building on global collective insights
The Blueprint is the product of 12 workshops held between 2023 and 2025 with more than 450 participants—technologists, peacebuilders, investors, policymakers, academics, and civil society leaders across the globe. It embodies the Council’s model of cross-sector collaboration, demonstrating what can emerge when we break through silos and start designing together.
With more than 100 references and links, it integrates and builds on work from the Council’s founding members including the Prosocial Design Network, Build Up, New_Public, Integrity Institute, Center for Humane Technology, Toda Peace Institute, University of Notre Dame, Search for Common Ground, Exygy, the Alliance for Peacebuilding, the Center for Human-Compatible AI, GoodBot, and the University of Southern California’s Neely Center. It draws on ongoing work from the Plurality Institute, Audrey Tang and Glen Weyl, Google’s Jigsaw, and the founders of the deliberative platforms Pol.is and Remesh.
The result is a vision that is not only systemic but global—drawing as much on regulatory innovations in the Global South, civic tech experiments, and grassroots movements as on academic research.
Renewal, not reckoning
We’re in a moment of profound reckoning with technology. Trust is eroding between regulators, platforms, and civil society. Transparency and researcher access are shrinking. Litigation is rising. But reckoning is not the same as renewal.
The Blueprint for Prosocial Tech Design Governance offers a plan not just to reduce harm, but to build health—socially, civically, and digitally.
Whether we’re confronting extremism, misinformation, or intergroup distrust, our ability to manage conflict online depends on the upstream choices shaping these digital spaces. Let’s start asking for what we want: systems that respect agency, platforms that build trust, tools that incentivize our best values—not our worst impulses.
We welcome your feedback on the Blueprint—and your ideas on how to advance its recommendations.
Lena Slachmuijlder co-chairs the Council on Technology and Social Cohesion.