Facebook has discovered and removed another coordinated hate campaign operated by the Myanmar military, which has used the service to spread false news and insults about the Rohingya people, Myanmar’s mostly Muslim ethnic minority.

In a blog post published Tuesday night, Facebook said it took down 425 Pages and 150 additional Facebook and Instagram accounts “linked to the Myanmar military.” At least 2.5 million people “followed at least one of these Facebook Pages,” the company added.

The Pages — which looked on the surface like “news, entertainment, beauty and lifestyle” Pages — were actually “periodically being used to drive specific anti-Rohingya messages,” said Nathaniel Gleicher, Facebook’s head of cybersecurity policy, in an interview with Recode.

It’s the third time since August that Facebook has taken down Pages and accounts linked to the Myanmar military, and a spokesperson confirmed that these new Pages were part of the same network of accounts Facebook removed previously. Facebook was able to link the accounts to the Myanmar military in several ways. In some cases, military personnel were listed as administrators on the Pages. In other cases, Facebook noticed “infrastructure overlap,” like matching IP addresses, between military devices and the Facebook accounts.

Facebook’s actions here may be too late to fix Myanmar’s problems. The government and military of the mostly Buddhist country have been working to eliminate the country’s Rohingya minority, and more than 725,000 Rohingya, out of a population estimated at just over one million, have fled to Bangladesh to escape genocide in Myanmar since late 2017, according to a recent report from the UN Human Rights Council.

Facebook has been at the center of the government’s campaign of hate and misinformation. The military has used the platform to spread propaganda in support of that campaign, and Facebook has been criticized for moving too slowly to stop it. Military officials in Myanmar “were the prime operatives behind a systematic campaign on Facebook that stretched back half a decade” and targeted the Rohingya, the New York Times reported in October.

That’s particularly damning, considering Facebook’s presence in Myanmar. The social network is used by an estimated 20 million people there, roughly 40 percent of the population, which is about the same as the total number of people in the country with internet access, according to a human rights impact report Facebook commissioned and published in November. The Facebook app comes preinstalled on many smartphones sold in the country.

Those inside Facebook admit that the company was much too slow to find and remove these kinds of posts, and it’s now clear that Facebook was not adequately staffed to carry out the manual review needed to take down posts from Myanmar that violate the company’s community standards.

When Facebook launched in Myanmar in 2011, it had “a couple” of full-time Burmese speakers on its moderation team, said Monika Bickert, Facebook’s head of global content policy, in an interview. Now it has more than 100.

Facebook claims that it is getting better at removing bad content, and Tuesday’s takedown is evidence that the company is making some progress. Its artificial intelligence systems, for example, can proactively flag hate speech in the country at a much higher rate than they could a year ago: Facebook says its systems detected 63 percent of the hate speech it removed or suppressed last quarter, up from just 13 percent at the end of last year.