Why the 1666 Fire of London matters
Our digital homes need a building code to ensure safety and trust, argues a Canadian policy institute
London, 1666. Fires rage, tearing through the city. Homes, shops, entire neighborhoods turn to ash. The Great Fire of London reveals a grim reality: cities are dangerous. Buildings are packed too close, made of flammable materials, with no safety standards in place.
Afterward, things change. New building codes emerge, enforcing fire-resistant materials and safer designs. Cities become safer. Centuries later, as skyscrapers rise, building codes adapt again to protect residents from new risks.
Imagine if our digital world had a similar code—a "building code" for our online spaces.
That’s exactly what the Centre for International Governance Innovation (CIGI) and others are calling for: a set of digital safety standards. In their recent policy brief, Michel Girard of CIGI warns that “standards bodies and regulatory authorities are not keeping up with successive waves of new digital products, platforms, devices, and services.” Without standards, we face unchecked harms like misinformation, privacy violations, and increased polarization. What if a “building code” for digital spaces could protect us as effectively as physical safety codes do?
Competing approaches stymie regulatory coherence
While there is consensus on the need to regulate digital technologies, global coordination efforts are hampered by competing approaches. The American market-driven model, the Chinese state-driven model, and the European rights-driven regulatory model represent divergent philosophies of governance, competing for dominance in the global digital economy. This lack of alignment has left digital governance fragmented and reactive.
One emerging solution is the concept of a model code for digital safety, a framework designed to proactively address digital harms while adapting to the rapid pace of technological innovation. In 2024, the Digital Governance Council (DGC), a Canadian organization representing public, private, and non-profit sectors, began exploring the feasibility of developing such a code. The envisioned code would aim to preserve health, safety, privacy, and security in the digital realm, safeguarding individuals and organizations from risks and harms.
Model codes like this include two key components:
Values and objectives: These outline what needs to be achieved, such as safety, transparency, and equity.
Technical standards and compliance: Annexes detail the specific standards and mechanisms that guide implementation and ensure compliance.
These codes are dynamic, updated regularly to address new challenges and advances, much like those in physical infrastructure governance. For example, Canada's physical building codes reference thousands of standards to ensure safety and performance—a precedent that a digital safety code could emulate.
What a digital code could look like
A digital code would aim to establish core standards for online safety, promoting transparency and accountability. Here are three principles it might include:
Transparency in algorithms: Platforms would be required to explain how their algorithms work, especially how they recommend content. This would help users understand how information is prioritized, countering hidden drivers of polarization and misinformation.
Improved content moderation: Just as buildings are inspected for safety, a digital code could require platforms to uphold standards for addressing harmful content. Human oversight combined with third-party audits could help maintain safe and respectful online spaces.
Data privacy and security: Platforms would follow baseline security and privacy standards, protecting users’ data and minimizing vulnerabilities. Safeguards could prevent unauthorized data access and help users maintain control over their information.
Building on existing frameworks
Governance efforts such as the European Digital Services Act (DSA) offer a roadmap for addressing platform accountability, algorithmic transparency, and systemic risks. The model code proposed by CIGI could complement these initiatives, extending their principles globally. For instance, the code could adopt the DSA’s algorithmic transparency requirements while offering flexible compliance paths for regions with less developed regulatory infrastructure.
Other governance approaches offer further inspiration. Casey Mock, a policy lead at the Center for Humane Technology, advocates for a consumer protection model that holds tech companies accountable for the social impacts of their platforms. “We’re not determining the outcome here,” Mock says, “but rather encouraging developers to innovate around safety, however they see fit.”
Much like automakers are liable for their vehicles’ safety, tech companies would bear responsibility for the social impacts of their products.
Similarly, Helena Puig Larrauri of Build Up proposes viewing harms like polarization as negative externalities, comparable to carbon emissions in environmental policy. A "polarization tax" could financially incentivize platforms to curb engagement-driven harm and design more balanced, ethical user experiences.
Multistakeholder and measurable
The development of a digital safety code would bring together stakeholders—civil society, industry leaders, and regulators—to articulate shared values and create adaptable, sector-specific guidelines. This collaborative effort could mirror the success of physical infrastructure codes, which have evolved over decades to improve safety and functionality.
How would success be measured? Companies could track metrics such as the spread of harmful content and levels of community trust and cohesion. Success could also be gauged through systematic surveys of user experience, such as the Neely Social Media Index. Coupled with voluntary adherence to a digital code, these metrics could help reshape the tech landscape, promoting responsible innovation.
Physical building codes arose from the collective need to protect us from physical dangers. Similarly, embedding core values like transparency, responsibility, and security into our digital infrastructure could prioritize public safety and trust, paving the way for a healthier online world. Just as physical structures evolved to meet modern standards, it’s time our digital spaces followed suit.
Lena Slachmuijlder is Executive Director of Digital Peacebuilding at Search for Common Ground and Co-Chair of the Council on Tech and Social Cohesion.