
Keeping Facebook responsible in Myanmar

Military personnel stand guard in Mae Sot, Tak province, Thailand, 14 April 2024. (Photo: REUTERS/Athit Perawongmetha)

In Brief

In Myanmar, despite the government's efforts to block Facebook, continued access to the platform has been integral to maintaining freedom of speech among the population. Nonetheless, Facebook is criticised for facilitating the spread of harmful content and misinformation. There is an urgent need for more responsible platform design from Facebook, including more proactive verification of political accounts, revisions to content management and compensation for harm resulting from its past policies.

In Myanmar, Facebook is the window to reality. In the early days of the 2021 coup, the Ministry of Transport and Communications imposed blocking orders on Facebook and WhatsApp. Most independent media outlets have been extinguished amid financial and physical threats. Any disruption to Facebook, which over half of Myanmar's population relies on as the country's de facto Internet, could be fatal to freedom of speech.

But Facebook has also attracted criticism for facilitating the spread of information that violates human rights in the country. Despite its own attempts to ban Facebook, the junta weaponises the platform, propagating divisive rhetoric to attract military recruits and funding. In the wake of the February 2024 forced conscription law, one user reported that 'all the information on Facebook right now is about how to escape [the] country'. In times of crisis, people may be especially susceptible to misinformation disseminated by military supporters.

The need for reliable news is as urgent as ever. This could be achieved if Facebook were more willing to embrace the principles of responsible platform design and remediation.

In 2021, a video of the detained former Yangon region chief minister, Phyo Min Thein, accusing opposition leader Aung San Suu Kyi of corruption went viral. The video was alleged to be a deepfake or to have been filmed under duress. In an under-resourced nation with limited public understanding of technological manipulation, algorithms should not be optimised for engagement alone. In cases where timely fact-checking may be challenging, Facebook should ensure the availability of counter-narratives to combat bias and misinformation.

Information diversity requires an understanding of how algorithms prioritise and de-prioritise content. Notwithstanding internal misinformation policies and a promise to remove ‘advocacy of violence by Myanmar security forces’, Facebook’s page-recommendation algorithm has been accused of promoting pro-military content. As articulated in the Christchurch Call, algorithms should redirect users away from extremist content or promote credible counter-narratives.

To better recognise and encourage credible content, Facebook could develop a more specific version of its universal Community Standards in the form of ethical guidelines for political content in Myanmar. This requires actively engaging the myriad stakeholders — ethnic communities, experts, youth and diaspora — to develop a nuanced understanding of what is newsworthy, credible or harmful.

With a more tailored framework to bump up desirable, albeit less engaging, news, algorithms might better prioritise diverse content and provenance over pure sensationalism. This would go a long way towards helping people see the platform as a mere provider of information rather than a peddler of truth.

Facebook should step up efforts to proactively verify the accounts of political parties and affiliated entities. Despite Facebook’s takedown policy against recidivists who post harmful content, detection may not be effective because the junta has thousands of soldiers spreading misinformation via fake accounts. Facebook should invest more resources in working at a grassroots level, such as with defectors already privy to the military’s information warfare tactics, to improve its detection mechanisms.

Due process mechanisms also contribute to transparency. Content takedowns are a double-edged sword that can chill extremist speech but also lead to the over-removal of non-harmful posts. All users deserve the right to appeal restricted content and access mechanisms that reverse erroneous decisions. Given the limited digital literacy of some user populations and the more than one hundred languages spoken in Myanmar, Facebook should devote more resources to training both human and automated moderators to be sensitive to linguistic and cultural nuances.

Platforms should be held accountable when improper content management leads to grave consequences. The Rohingya conflict exposed this need in 2021, when Facebook rejected a proposal for victim remediation on the basis that 'Facebook does not directly engage in "philanthropic activities"'. In response, Ireland's National Contact Point for Responsible Business Conduct called for Facebook to amend its human rights policy and for an Oversight Board mandate to provide rehabilitation or compensation when Facebook is found to have contributed to human rights violations.

Given the gravity of misinformation amid an ongoing civil war, social media platforms have no reason to shirk providing rehabilitation after the fact, or even resources to educate and empower their users.

One suggestion is for Facebook to redirect some of its profits, particularly those derived from military-affiliated extremist content over the years, to fund urgent humanitarian efforts that provide digital services and education. Additionally, in response to complaints from victims of doxing, especially women under threat of violence, Facebook could offer heightened account security and monitor disclosures of victims' personal information across the platform for a period of time. Beyond philanthropy, these gestures would represent Facebook accepting accountability for entrenching strife and recognising its power to mitigate harm.

Despite Facebook's missteps in Myanmar, it remains an essential tool for keeping people in touch with one another and with the outside world. This only underlines its responsibility to protect the safety of its users.
