Beyond Military Responsibility: Corporate and Digital Complicity in the Rohingya Crisis

Deekshita Mathe

Introduction: The Multi-Layered Architecture of Atrocity

The persecution of the Rohingya in Myanmar has been widely recognised by the international community as one of the most severe human rights crises of the twenty-first century. Investigations conducted by the United Nations Independent International Fact-Finding Mission on Myanmar (FFM) documented mass killings, forced displacement, widespread sexual violence, and the systematic destruction of Rohingya villages, concluding that there were reasonable grounds to believe that acts amounting to genocide had been committed [1].

Despite these findings, responsibility for the atrocities is often framed as resting solely with the Myanmar military, the Tatmadaw. While the Tatmadaw remains the primary perpetrator, such a narrow focus obscures the broader structures that enabled the violence. Genocide is not executed in isolation; it is sustained by financial systems, corporate relationships, and information infrastructures that collectively facilitate mass harm. Recent UN investigations demonstrate that private enterprises and digital platforms were not passive observers, but active enablers of violence through economic support, technological amplification, and deliberate inaction [2].

The Financial Engine: MEHL, MEC, and Corporate Supply Chains

The Tatmadaw maintains economic autonomy from civilian oversight through an extensive network of military-owned enterprises. Central to this system are Myanma Economic Holdings Limited (MEHL) and Myanmar Economic Corporation (MEC), two conglomerates with dominant interests in sectors including banking, manufacturing, tourism, and extractive industries such as jade mining and timber [3].

According to the FFM’s 2019 report on the economic interests of the Myanmar military, revenue generated through MEHL and MEC directly supported military operations, including those conducted during the 2016–2017 “clearance operations” in Rakhine State [2]. The military’s control over confiscated land, coupled with forced labour practices, enabled military-linked businesses to profit from displacement while simultaneously financing further abuses.

Civil society investigations, such as those conducted by Justice For Myanmar, have identified numerous domestic and international companies that continued commercial relationships with MEHL and MEC despite credible evidence linking these entities to international crimes [4]. While such relationships are often framed as neutral business dealings, international legal standards suggest otherwise.

Under the UN Guiding Principles on Business and Human Rights, corporations are expected to conduct human rights due diligence and to address adverse impacts linked to their operations and business relationships [5]. Because the Guiding Principles are non-binding, the Rohingya case illustrates a persistent gap between these normative expectations and effective enforcement, one that allows corporate actors to operate with impunity.

Digital Platforms as “Mass Influence Digital Weapons”

In Myanmar, Facebook functioned as the primary gateway to the internet, rendering its influence uniquely pervasive. Amnesty International found that Facebook’s platform played a “determining role” in the spread of anti-Rohingya hate speech and incitement to violence, facilitated by algorithmic systems that prioritised engagement over safety [6].

Internal company documents reported by Reuters revealed that Meta was aware its engagement-based ranking algorithms amplified inflammatory and polarising content, yet failed to take adequate action even as violence escalated [7]. Rather than isolated moderation failures, the harm stemmed from structural design choices that predictably rewarded divisive and hateful content.
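
The structural argument can be made concrete with a deliberately simplified sketch. The scoring weights, example posts, and “inflammatory” label below are hypothetical and bear no relation to Meta’s proprietary systems, whose internals are not public; the sketch only illustrates how a ranking objective that rewards raw engagement will predictably surface the most provocative content unless a countervailing safety signal is built into the score.

# Hypothetical, deliberately simplified ranking sketch (Python); not Meta's actual system.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    reactions: int
    comments: int
    reshares: int
    inflammatory: bool  # assumed label from an upstream classifier (hypothetical)

def engagement_score(p: Post) -> float:
    # Pure engagement objective: every interaction raises the rank,
    # regardless of why people are interacting.
    return 1.0 * p.reactions + 5.0 * p.comments + 10.0 * p.reshares

def safety_adjusted_score(p: Post, penalty: float = 0.9) -> float:
    # Same objective, but content flagged as inflammatory is sharply demoted.
    score = engagement_score(p)
    return score * (1.0 - penalty) if p.inflammatory else score

feed = [
    Post("Community news update", 120, 10, 5, False),
    Post("Dehumanising rumour targeting a minority", 300, 90, 60, True),
    Post("Local business announcement", 80, 4, 2, False),
]

print("Engagement-only ranking:")
for p in sorted(feed, key=engagement_score, reverse=True):
    print(f"  {engagement_score(p):7.1f}  {p.text}")

print("Safety-adjusted ranking:")
for p in sorted(feed, key=safety_adjusted_score, reverse=True):
    print(f"  {safety_adjusted_score(p):7.1f}  {p.text}")

In the engagement-only ordering, the rumour post dominates the feed precisely because divisive content tends to attract the most interaction; the adjusted ordering shows that this outcome is a design choice rather than an inevitability.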

Journalist Steve Stecklow has described this phenomenon as the creation of a “mass influence digital weapon,” whereby Facebook’s algorithmic architecture transformed the platform into an accelerant for ethnic hatred at scale [8]. The resulting harm challenges traditional legal frameworks that focus on individual speech rather than systemic amplification.

Legal Complicity and the “Aiding and Abetting” Standard

International criminal law increasingly recognises that liability for mass atrocities extends beyond direct perpetrators. Under the doctrine of aiding and abetting, an actor may be held responsible where it knowingly provides assistance that has a substantial effect on the commission of international crimes [9].

Crucially, shared intent is not required. Knowledge of foreseeable consequences, coupled with continued assistance or deliberate inaction, may suffice. Legal scholarship has increasingly explored the application of this standard to corporate actors whose business models generate predictable human rights harms [10].

In this context, claimants in legal actions against social media companies argue that platforms whose revenue models depend on toxic engagement cannot plausibly claim neutrality when their systems foreseeably facilitate mass violence. These cases represent a developing frontier in international law, particularly with respect to corporate omissions and structural complicity.

A Strategy for Systemic Reform

Addressing corporate and digital complicity requires structural reform rather than isolated prosecutions. First, mandatory human rights due diligence laws must be adopted to impose binding obligations on corporations operating in conflict-affected contexts. Second, algorithmic transparency is essential to allow independent auditing of ranking systems that shape public discourse, as advocated by organisations such as Ranking Digital Rights [11]. Finally, targeted economic sanctions against military-owned enterprises, including MEHL and MEC, remain critical to dismantling the financial foundations of repression, as documented by investigative organisations such as The Sentry [12].

Conclusion

An exclusive focus on military perpetrators risks obscuring the broader systems that enable genocide. The Rohingya crisis demonstrates that corporate actors and digital platforms were not neutral intermediaries, but active participants whose decisions enabled violence through profit, negligence, and technological design. If accountability is to be meaningful, it must extend beyond the battlefield to encompass boardrooms and server rooms alike. Only by confronting the full architecture of atrocity can international law respond effectively to contemporary forms of mass violence.

[1] UN Human Rights Council, Report of the Independent International Fact-Finding Mission on Myanmar (2018), https://www.ohchr.org/en/hr-bodies/hrc/myanmar-ffm
[2] UN Human Rights Council, Economic Interests of the Myanmar Military (2019), https://www.ohchr.org/en/documents/reports/economic-interests-myanmar-military
[3] Ibid.
[4] Justice For Myanmar, Military-Linked Businesses in Myanmar, https://www.justiceformyanmar.org
[5] UN Office of the High Commissioner for Human Rights, Guiding Principles on Business and Human Rights, https://www.ohchr.org/en/instruments-mechanisms/instruments/guiding-principles-business-and-human-rights
[6] Amnesty International, The Social Atrocity: Meta and the Rohingya Crisis (2022), https://www.amnesty.org
[7] Reuters, Facebook Failed to Stop Hate Speech in Myanmar, https://www.reuters.com
[8] Steve Stecklow, How Facebook’s Rise Fueled Chaos in Myanmar, Wall Street Journal, https://www.wsj.com
[9] Antonio Cassese, International Criminal Law (Oxford University Press, 2008), https://academic.oup.com/book/3966
[10] Journal of International Criminal Justice (Oxford University Press), Corporate Complicity in International Crimes, https://academic.oup.com
[11] Ranking Digital Rights, Algorithmic Accountability Framework, https://rankingdigitalrights.org
[12] The Sentry, Myanmar: Following the Money, https://thesentry.org
