Will the peace deal hold? Ask the digital twins
Digital twins are transforming industries. Now they’re being used to understand—and help resolve—real-world conflicts.
What if we could forecast how armed factions—and the communities around them—might respond to a draft peace deal before it’s signed? What if we could test, virtually, whether a public apology would calm tensions… or make things worse?
That’s the provocative promise behind the growing use of digital twins in peacemaking: AI-powered simulations of complex social systems, designed to help us understand conflict—and imagine pathways out of it.
We already use digital twins in other domains. Engineers create virtual replicas of jet engines to test them under stress. Cities use them to optimize traffic or prepare for floods. These twins aren’t science fiction—they’re live, dynamic tools that mirror the systems they represent.
Now, researchers and conflict mediators are starting to adapt and apply computer modeling and simulation technology to create digital twins of divided societies.
One leading example is CulturePulse.ai, a platform that builds population-scale digital twins of entire nations or regions—capturing how groups think, feel, and respond under stress, using multi-agent artificial intelligence (MAAI). These models are already being explored in fragile contexts around the world, from Northern Ireland and South Sudan to Bosnia-Herzegovina and Israel-Palestine.
Modeling divided societies
These social digital twins use MAAI to simulate how individuals and groups behave under specific historical, psychological, and cultural conditions. People in the model aren’t just data points—they’re agents shaped by attitudes toward fairness, purity, authority, religiosity, fear, and more. Each agent is influenced by—and reacts to—events in its network and social environment, just like real people do.
What emerges is a system where analysts and peacebuilders can run different scenarios: How would the population react to a new ceasefire? What if religious leaders issued conflicting messages? Would an influx of migrants increase social cohesion—or strain it?
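To make this concrete, here is a deliberately simplified sketch, written in Python, of how such an agent-based model can be structured. It is not CulturePulse's implementation; the traits, update rules, constants, and scenarios are illustrative placeholders. Each agent carries a few psychological attributes, reacts to events in proportion to those attributes, is pulled toward the mood of its social network, and the same population can then be run through different scenarios.

```python
import copy
import random
from dataclasses import dataclass, field

@dataclass
class Agent:
    """One simulated person, described by a handful of psychological traits (all 0..1)."""
    fairness: float      # sensitivity to perceived injustice
    authority: float     # trust in leaders and institutions
    religiosity: float   # salience of religious identity
    fear: float          # current level of threat perception
    neighbors: list = field(default_factory=list)  # indices of socially connected agents

    def react(self, threat: float, legitimacy: float) -> None:
        """Update fear after an event, weighted by the agent's own traits."""
        # Identity-charged threats land harder on justice-sensitive and highly religious agents.
        delta = threat * (0.4 + 0.3 * self.fairness + 0.3 * self.religiosity)
        # Trusted institutions dampen the reaction.
        delta -= legitimacy * self.authority * 0.5
        self.fear = min(1.0, max(0.0, self.fear + delta))

def social_influence(agents: list) -> None:
    """Each agent's fear drifts toward the average fear of its neighbors."""
    snapshot = [a.fear for a in agents]
    for a in agents:
        if a.neighbors:
            local = sum(snapshot[i] for i in a.neighbors) / len(a.neighbors)
            a.fear += 0.2 * (local - a.fear)

def run_scenario(agents: list, events: dict, steps: int = 10) -> float:
    """Apply a timeline of (threat, legitimacy) events and return the average fear at the end."""
    for step in range(steps):
        threat, legitimacy = events.get(step, (0.0, 0.0))
        for a in agents:
            a.react(threat, legitimacy)
        social_influence(agents)
    return sum(a.fear for a in agents) / len(agents)

# Build a toy population on a random social network.
random.seed(1)
population = [
    Agent(fairness=random.random(), authority=random.random(),
          religiosity=random.random(), fear=0.2)
    for _ in range(500)
]
for i, agent in enumerate(population):
    agent.neighbors = random.sample([j for j in range(len(population)) if j != i], 5)

# Two hypothetical scenarios: a credible ceasefire announcement vs. a repeated inflammatory rumor.
ceasefire = {0: (0.1, 0.8)}             # low threat, high perceived legitimacy
rumor = {0: (0.6, 0.1), 3: (0.4, 0.1)}  # high threat, low legitimacy, repeated

print("average fear after ceasefire:", round(run_scenario(copy.deepcopy(population), ceasefire), 2))
print("average fear after rumor:   ", round(run_scenario(copy.deepcopy(population), rumor), 2))
```

In a real deployment, the traits and network would be estimated from survey, linguistic, and demographic data rather than drawn at random, and the dynamics would be validated against historical events before any scenario result is trusted.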
Simulation matched real conflict in Northern Ireland
In Northern Ireland, researchers employed the MERV model—Mutually Escalating Religious Violence—to simulate four decades of sectarian conflict between Catholic and Protestant communities. The model integrated theories like Terror Management and Identity Fusion and was tested against historical data with over 95% accuracy.
In Gujarat, India, the same modeling framework examined the 2002 Hindu-Muslim riots. By adjusting factors such as in-group threat perception and population density, researchers were able to simulate how collective fear escalated into mass violence.
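That kind of "what tips a population over?" question can be illustrated with an even simpler toy model, again hypothetical rather than the researchers' actual code: sweep threat perception and density and watch whether collective fear fades out or becomes self-reinforcing.

```python
def escalation_curve(threat_perception: float, density: float, steps: int = 200) -> float:
    """Toy mean-field model: fear spreads faster when threat perception and density are high."""
    fear = 0.05  # small initial level of collective fear
    for _ in range(steps):
        growth = threat_perception * density * fear * (1.0 - fear)  # self-reinforcing spread
        decay = 0.1 * fear                                          # natural calming over time
        fear = min(1.0, max(0.0, fear + growth - decay))
    return fear

# Sweep the two parameters: below a threshold fear fades, above it fear escalates.
for threat in (0.2, 0.5, 0.8):
    for density in (0.3, 0.9):
        final = escalation_curve(threat, density)
        print(f"threat={threat:.1f}, density={density:.1f} -> final fear {final:.2f}")
```

The published models are agent-based and empirically calibrated rather than a single equation, but the underlying question is the same: which combinations of conditions push a society past the point where fear feeds on itself?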
In collaboration with the United Nations Development Programme, CulturePulse developed the Palestine-Israel Virtual Outlook Tool (PIVOT). Built from linguistic data, news media, and moral psychology, the models were used to test the effects of different policy strategies and leader messaging on diverse population segments.
LeRon Shults, one of the researchers behind CulturePulse, recently elaborated on this at a Digital Peacebuilding Community of Practice session, stating:
“We’ve created models of entire countries… each agent within these models has psychological and cultural dimensions—things like attitudes toward authority, harm, liberty, finance, and health—and they interact in ways that reflect how real humans think and respond.”
Practicing the future
How might digital twins be a relevant tool for peacebuilding? They could:
Help policymakers anticipate the social consequences of their choices.
Allow negotiators to trial versions of peace agreements in virtual space.
Reveal unexpected obstacles—or unlikely allies—within polarized systems.
Provide shared reference points for multisectoral dialogue.
“Once the digital twin is in place,” Shults noted in the Digital Peacebuilding session, “you can practice implementing your strategy for peacebuilding before you do it in the real world—and see the likely outcomes under different scenarios.”
The necessity of ethics
As with any emerging technology, digital twins raise concerns about bias, misuse, and exploitation. Could these simulations be used to manipulate rather than mediate? Shults addresses these concerns head-on in his 2023 article “Simulation, Science, and Stakeholders,” stressing that simulations must not be built in isolation: they must be co-designed with communities, subject-matter experts, and policymakers.
Shults goes even further in his 2025 paper “Sustainable Altruism: Simulating the Future of Prosociality and Peacebuilding—A Conceptual Review,” warning that:
“Like nuclear power, genetic engineering, or any other powerful technology, MAAI models could be used by ‘bad actors’ for their own nefarious purposes. Such tools could also be used to discover the conditions under which—and the mechanisms by which—conflict could be increased in a population.”
Digital twins won’t bring peace on their own. But they offer a new kind of foresight, one that could become more applicable and relevant through further piloting and testing in partnership with peacebuilding practitioners and researchers. For more insights into this evolving work, visit CulturePulse.ai and LeRon Shults’s blog.
Lena Slachmuijlder co-chairs the Council on Tech and Social Cohesion.