Content moderation has never been more controversial, drawing criticism for perceived censorship, bias, and ineffectiveness, as well as intense political scrutiny. Amid this charged landscape, tech companies, civil society, and regulators are increasingly exploring "prosocial" design—platform features that proactively promote positive interactions rather than merely responding after harm occurs. But is this promising alternative gaining traction in the Global South?
‘Mapping Tech Design Regulation in the Global South’, by technology policy analyst Devika Malik, supported by the Council on Tech and Social Cohesion and the Toda Peace Institute, explores precisely this issue in eight Global South countries: India, Brazil, Kenya, Nigeria, Sri Lanka, Pakistan, Indonesia, and the Philippines.
Malik underscores the limitations of traditional moderation:
“Content governance suffers fundamentally on account of being limited to reactive crisis management and does not address inequitable design choices that platforms are built on.”
This observation aligns with views shared just this week by Trust & Safety professionals at the popular "Everything in Moderation" blog:
“With significant pushback on the reactive/enforcement side of Trust & Safety, there’s a stronger case than ever to lean into proactive/prosocial practices that prevent toxicity from happening in the first place.”
Prosocial examples from the Global South
Malik includes an annex outlining various legislative initiatives in the eight countries, and highlights specific prosocial initiatives already emerging:
India: The IT Rules (2021) mandate proactive content identification and removal, and the Digital Personal Data Protection Act strongly emphasizes child safety through design. In addition, India's proposed Digital Competition Bill promotes interoperability to ensure fair competition by mandating standardized protocols and treating deceptive design as a significant competitive concern.
Brazil: Its proposed Fake News Bill requires proactive moderation and tracing of misinformation, similar to the EU’s Digital Services Act (DSA).
Pakistan: The Prevention of Electronic Crimes Act explicitly requires proactive systems to combat online harassment and hate speech.
Sri Lanka: The Online Safety Bill proposes a commission dedicated to proactive content moderation, though critics raise concerns around potential censorship.
Kenya and Nigeria: Both have emphasized data portability and interoperability, enabling users to control their data more freely and fostering platform competition—both essential prosocial principles.
Indonesia and the Philippines: While still developing explicit prosocial policies, civil society groups and policymakers increasingly focus on algorithmic transparency and accountability.
Barriers to adoption
However, Malik’s research also identifies obstacles that could slow the uptake of prosocial design:
Regulatory capacity and enforcement: Many countries in the Global South face resource constraints and limited technical expertise, complicating enforcement of sophisticated regulations. Political environments also raise concerns about regulatory overreach, potentially fueling censorship.
Influence of Big Tech: The interests of Big Tech significantly shape policy discussions. Regulatory efforts often face substantial lobbying, complicating progress toward prosocial outcomes.
Digital inclusion concerns: Stringent prosocial regulations may inadvertently hinder digital adoption in regions with significant digital divides, highlighting the delicate balance regulators must strike.
Civil society interest, but wary of overreach
Malik observes that civil society across these regions often prioritizes immediate digital rights concerns such as internet access, censorship, and surveillance.
“In many Global South contexts, emerging from decades of anaemic growth or colonial regulatory legacies, there is caution against overregulation that could deter investment and hinder beneficial emerging technologies.”
Yet, innovative prosocial initiatives are taking root:
India’s Advertising Standards Council leads the "Conscious Design" initiative to encourage ethical and user-friendly app designs.
Brazilian civil society actively engages in regulatory debates, pushing for greater transparency in algorithms and moderation practices. Despite the archiving of the Fake News Bill, there remains a strong civil society constituency working towards strengthening Big Tech accountability for digital harms, including via prosocial design approaches.
Malik emphasizes that effective prosocial design must be context-specific, reflecting local priorities such as consumer protection, economic growth, and digital literacy.
One promising approach is the "Design from the Margins" framework advocated by researcher Afsaneh Rigot. This approach focuses on designing platforms initially with marginalized communities in mind, creating broadly inclusive digital spaces beneficial for all users.
Prosocial design presents an appealing alternative at a time when traditional moderation faces intense scrutiny. The EU’s Digital Services Act and the UK’s Online Safety Act both embed upstream design principles—targeting dark patterns, persuasive and addictive design, and requiring platforms to mitigate risks before harm occurs.
Malik’s research shows that prosocial design holds great potential globally—but success in the Global South hinges on tailored strategies sensitive to local realities. Prosocial design is gaining momentum, yet thoughtful and localized implementation will determine its ultimate impact.
Lena Slachmuijlder Co-Chairs the Council on Tech and Social Cohesion.