The question of whether controversial, harmful posts should stay up on a social media platform has puzzled content moderators for years. And the answer becomes even less clear when the poster is a government official.

Recently Facebook found itself facing this very issue—and took a significant step in response. This week, the company banned a major government official from its platform for the first time.

It remains to be seen whether Facebook's move is a new line in the sand for content moderation—but what is clear is that platforms face numerous difficulties deciding how best to respond when politics, human rights and social media collide.

After criticism from United Nations investigators probing the genocide of Rohingya Muslims in Myanmar, the Menlo Park, California-based company this week removed 20 individuals and organizations in the country from its platform. Among them was Myanmar's top general, Min Aung Hlaing.

“International experts, most recently in a report by the UN Human Rights Council-authorized Fact-Finding Mission on Myanmar, have found evidence that many of these individuals and organizations committed or enabled serious human rights abuses in the country. And we want to prevent them from using our service to further inflame ethnic and religious tensions,” Facebook said in a press release.

The company acknowledged it was “too slow to act” in its crackdown on government officials' accounts and that it's aiming to improve reporting tools and identification of hate speech on its platform.

But some content moderation lawyers have noted that, even in a case this extreme, the decision of whom to allow on a platform is a difficult one.

Twitter and Facebook have said that a post's newsworthiness factors into whether it stays up, even if the post goes against other parts of their content moderation policies. Both companies have run into controversies over moderation decisions before.

Cathy Gellis, an outside policy counsel who works on internet platform law and other areas of tech law, noted that in the case of Myanmar, and many other cases, companies will take at least some criticism no matter what they decide.

“Whether [companies] leave it up or take it down, both options solve some part of the problem, but doesn't solve the other parts, or creates more problems,” she said. “There's no silver bullet. But there is a lot of public pressure to come up with silver bullets.”

Gellis said that, in cases where politicians are committing genocide, kicking them off of Facebook isn't likely to stop those atrocities. It could just move the violence further from public view.

Once exceptions are made to kick some government users off of a platform, she added, the public may ask why that exception wasn't made in other cases.

In the U.S., some called for Twitter to delete a January tweet from President Donald Trump to North Korean leader Kim Jong Un, on the grounds that it threatened nuclear war. The tweet was not removed, despite arguments that it violated Twitter's policies against violent threats.

Eric Goldman, a professor at Santa Clara University School of Law, said that it's hard to apply decisions that different social media platforms have made in one country to those made in another, because the laws of each country and the policies of each platform can vary so much.

A country's free speech laws, freedom of the press, ease of internet access, political stability and other factors can all play into whether a post stays up.

“The same content in different countries can have different results,” Goldman said. “It's about the cultural context, not just the legal context.”

Goldman said that's why it is important for platforms to have content moderators who understand the culture and language of the country in question, as much as possible. Most platforms still have not fully attained that goal, he said.

Gellis said it can also be helpful for companies to explain their decisions publicly—as Facebook did—so that users can understand why an exception to the policy was made and can trust that moderation decisions are being handled transparently.

But there can still be difficulties. “Transparency has helped control reactions [to removals], but transparency has also invited some troubles. People said, 'You made this decision over here, but not over here,'” Gellis explained. “You can't win if you're running a platform, but you can maybe choose your poison.”