Silence, Confidentiality Mark Facebook’s Engagement With Local Groups

Facebook, recently rebranded as Meta, said it had a network of local civil society groups to address rights issues, but only one Cambodian civil society representative said they had sustained contact with Facebook. (Kuoy Langdy/VOD)

The fake news story started small on Saturday, with a supposedly first-person account from a NagaWorld worker anonymously posted to a Facebook group with just over 100 followers.

The post alleged the NagaWorld union members currently on strike outside the casino complex were organized by a secretive network of foreigners, an inflammatory claim evidenced only by a series of photos of union leaders posing with non-Cambodians at unspecified events. The anonymous poster went on to add pictures of union organizers and local journalists at the weekend event, misidentifying the photo subjects to serve a narrative of outside influence.

Though it seemed a clear piece of disinformation, the claim didn’t stay small for long. Government-aligned outlet Fresh News quickly boosted the claim with coverage that was then shared — again on Facebook — by an official with the Ministry of Interior. 

The NagaWorld workers’ union and a coalition of civil society groups denounced the spread of the unfounded claim. By Monday, the anonymous group seemed to have disappeared from Facebook, though its claim lived on in Fresh News.

When it comes to protecting the rights and safety of users across its vast digital realm, Facebook’s efforts can resemble a patchy safety net held up by the advocacy of major institutions.  

Company representatives had previously told VOD that Facebook maintains a network of civil society contacts in the kingdom as part of an ongoing push to address human rights concerns across its platforms. But the company revealed no additional information about this outreach, so VOD contacted representatives of more than a dozen civil society organizations to ask whether their groups had any contact with Facebook, the flagship platform of the recently rebranded tech giant Meta.

Over the past few weeks, only one Cambodian civil society representative said they had sustained contact with Facebook, and even that had been sparse since it began a few years ago.

Two other civil society figures who responded to messages from a reporter, Ou Virak of Future Forum and Soeng Senkaruna of Adhoc, said they’d met with Facebook a few years ago as the social media giant began expanding its outreach on human rights matters in Asia. However, both men said they hadn’t heard from Facebook in the years since, and were not aware of other Cambodian civil society figures with connections to the company.

“I have met with a senior representative a few years ago. That’s the end of that,” Virak said. “If it’s for information, that’s legit. If it’s to deal with the mountains of issues with Facebook, that’s a terrible attempt.”

“How do we know [if] it’s purely PR vs. giving civil society organizations a say?” he questioned.

Other Cambodian respondents said they had no contact with Facebook at all, or only in cases of basic technical issues. The most substantial relationship found by a reporter seemed to be between the company and international organization Human Rights Watch (HRW). The group’s Asia division deputy director Phil Robertson, who is based in Thailand, said in a brief email that “we deal with Facebook on various things, relating to freedom of expression, safety of dissidents, etc.”

Robertson said the most notable example of that cooperation in Cambodia came in the case of the monk Luon Sovath. A long-time civil rights advocate, Sovath was driven into exile after being charged with rape based on claims made in a crude Facebook misinformation campaign apparently waged by a government propaganda office.

Robertson did not answer additional questions about HRW’s relationship with Facebook.

In Cambodia, the sole rights worker who reported additional contact with Facebook said the communication was still infrequent.

The rights worker asked that both their name and their organization’s be withheld, explaining that the social media company had requested their communications remain confidential. They did not say whether they knew of other Cambodian organizations working with the company.

This civil rights worker said they had been asked to join an advisory committee for Facebook in Cambodia after first speaking with company representatives in 2018 as the platform commissioned a human rights study of its presence in the country.

The rights worker said Facebook asked them in 2019 to become a more consistent advisory partner, but that they rejected the offer on the grounds of its potential sensitivity. However, the rights worker said Facebook has reached out to them since then, including as recently as about a month ago.

“But it was not successful as when I replied to them, they never got back to me,” they said in a message.

That isn’t necessarily unusual for Facebook’s outreach strategy, say those who frequently deal with the company.

“I think an issue civil society has had is that Facebook reaches out only when they [Facebook] need support and when things have already gotten worse,” said Dhevy Sivaprakasam, the Asia Pacific policy counsel with digital rights group Access Now.

Sivaprakasam said Facebook does actively reach out to civil society groups across the region. From what she’s seen, she believes motivated staffers and other parts of the company are committed to its human rights goals in Southeast Asia. However, as far as she knows, Cambodia does not yet figure seriously in that work.

“For Cambodia, I can confirm that Meta/FB has not reached out [to Access Now] for engagement,” Sivaprakasam said in an email. “I do not remember any instance of them substantially engaging with partners in Cambodia on content governance/safety.”

“I can say that in the region, Cambodia does not appear to be a priority country for the company.”

That lack of focus on Cambodia has fostered an environment in which fake news and disinformation can spread quickly from anonymous Facebook accounts into government-aligned media and on to criminal accusations. Though the NagaWorld claims appeared to have been removed from the platform as of Monday (it was unclear by whom), such thinly evidenced accusations have in the past served as potent weapons against civil society figures and members of the outlawed CNRP.

Facebook’s user safety net does generally exist, Sivaprakasam said, but remains a work in progress. She contrasted the company’s apparently minimal presence in Cambodia to that in Myanmar, which she said remains one of Facebook’s highest-priority countries following the turmoil unleashed by the February military coup.

But Myanmar had emerged as a problem for Facebook long before this year, as users took to the platform to spread genocidal rhetoric linked to the military’s 2017 ethnic cleansing campaign against the Rohingya Muslim minority. Facebook’s role in promoting real-world violence in Myanmar has repeatedly landed the company in hot water, as recently as last week with a $150 billion class-action lawsuit filed against Meta by Rohingya refugees in the US.

Facebook’s attempts to mitigate its problems in Myanmar have also highlighted the difficulty of moderating the digital commons.

In 2015, the company had just four Burmese-language moderators overseeing content for its then roughly 7.3 million users in Myanmar, a sprawling and diverse state home to some 54.4 million people. Facebook now reportedly employs more than 100 Burmese-language moderators handling content for the roughly 22.3 million Myanmar users on its platforms as of last year. Even so, Facebook has struggled to enforce the bans on Myanmar military content it put in place after the February coup and the subsequent return to rule by the armed forces.

Part of that difficulty comes down to the shortcomings of algorithm-based moderation. 

University of Sydney lecturer Aim Sinpeng, who studies social media use in Southeast Asia and consults for Facebook, said the onset of the Covid-19 pandemic led the company to rely more heavily on automated content moderation rather than human reviewers.

“They were making progress pre-pandemic but Covid did reduce their push to add more people,” Aim said. “They’ve reduced staff and need to basically come up with even more powerful algorithms to deal with it. For them, it’s always about solving local problems with scalable solutions that work for them — dealing with local conflicts is tedious.”

Before the onset of Covid, the company had already been in the process of decentralizing its content moderation and policy-building, pushing more oversight to regional headquarters and in-country offices. Aim pointed out that Facebook doesn’t have offices in every country in the region where it operates, including some, such as Myanmar and Vietnam, where she said the company’s platforms hold “basically a monopoly” among internet users.

Facebook has in the past explained that absence by citing in-country security concerns for both staff and data, particularly in states that tightly monitor social media for political dissent. Those safety considerations hold true in several Southeast Asian countries, Aim said, pointing to the fraught political environments of Cambodia and Thailand as potentially risky for Facebook assets.

To overcome the lack of in-country staffing, Facebook has routed most of its regional moderation efforts through its Asian headquarters in Singapore, hiring staff with relevant language skills from countries where the company is active. Some of these workers are full-time in Singapore, Aim said, while others — such as many of those from Myanmar — were flown into the city-state on a regular basis before the pandemic.

Such workers are key to making even the most sophisticated algorithms actually work.

“For algorithms to work well, they need context, they need moderators, local staff or regional staff, and other partners to feed that intel so they can either add on or revise text that could be revoked,” Aim said. “They want their algorithms to remove them before they appear, so the damage isn’t done. … It’s always about scaling up, dealing with millions of messages each day.”

In past releases, Facebook stated it had “almost tripled” its moderation efforts for Khmer-language content. When reached by a reporter, a company representative in Singapore did not provide specifics as to what those efforts look like, including the level of human staffing.

A news release posted by Facebook stated that its teams were also focusing on preventing the spread of disinformation in Cambodia ahead of the country’s elections in 2022 and 2023.

Besides its formal employees or contractors, Aim continued, Facebook does maintain a somewhat informal network of consultants it deems “trusted partners.” Aim said she’s one of these partners, a group that typically includes selected figures from the fields of human rights and academia.

Aside from funding more formal research projects, Facebook calls on these “trusted partners” on a case-by-case basis. For the most part, Aim said, the partners are not paid for their advice, and there’s no real structure as to whom the company approaches for what.

She wasn’t aware of any partners in Cambodia working with Facebook, but said that didn’t mean they weren’t out there. Still, civil society groups interested in partnering with Facebook may have few options but to wait for the giant to come to them.

“It’s very out of the blue,” Aim said of contact from Facebook. “They’re not the most approachable organization and I think they want to keep it that way.”
