Nearly a year into the war that began with Russia's invasion of Ukraine, Facebook parent company Meta is revising its content moderation policies.
Most recently, the far-right Ukrainian military unit Azov Regiment was removed from the social media giant's list of dangerous individuals and organizations. The Azov Regiment will now be able to create profiles on Facebook and Instagram and post content without it being taken down, unless that content violates the platforms' guidelines. The change also allows other users to openly praise and support the group's work.
The policy change comes after months of scrutiny over where the social media juggernaut draws the line between promoting free speech about the conflict and restricting speech that could have violent or harmful repercussions offline.
In recent months, the independent Meta Oversight Board, a group of academics, activists, and experts that reviews Meta's content moderation decisions, has argued that the company has gone too far in removing content that challenges authoritarian regimes or leaders.
The Azov Regiment has sparked controversy before. It is one of Ukraine's most capable military units and has fought Russian forces in strategic locations such as the besieged city of Mariupol and near Kyiv. But there were concerns that the unit's ties to far-right nationalist ideology were attracting extremists. Putin was referring in part to Azov forces when he characterized his assault on Ukraine as an effort to "de-Nazify" the country and sought to portray Ukrainian nationalism as fascist.
In this instance, Meta contends that the Azov Regiment is no longer affiliated with the far-right nationalist Azov Movement, and says the unit is now formally under the command and control of the Ukrainian government.
According to a statement from Meta, other elements of the Azov Movement, notably the National Corps party and its founder Andriy Biletsky, remain on the company's list of dangerous individuals and organizations.
The statement added that hate speech, hate symbols, calls for violence, and any other content that violates the company's community standards remain prohibited and will be removed if discovered.
Ukraine's minister for digital transformation, Mykhailo Fedorov, praised Meta's decision and singled out Nick Clegg, the former British deputy prime minister who now serves as Meta's president of global affairs.
As Fedorov tweeted, "Means a lot for every Ukrainian. New approach enters the force gradually. Big contribution @nickclegg & his team in sharing truthful content about war."
Fedorov had protested in a letter to Clegg last summer that, at a time when Russian propaganda was rampant online, Meta's automated content filtering systems were unfairly preventing Ukrainian media organizations from spreading truthful information about the war. Fedorov also pressed Apple, Facebook, and other companies to impose a "digital blockade" on Russia in the early phases of the war.
The Azov decision is not the only recent shift in the company's policies. Earlier this month, the Oversight Board announced it had overturned Meta's decision to remove a Facebook post criticizing the Iranian government's treatment of women, particularly Iran's strict rules requiring women to wear the hijab.
The post in question featured a caricature of Iranian Ayatollah Ali Khamenei, his beard forming a fist that grips a woman wearing a hijab, her ankles shackled. The Farsi caption demanded "marg bar," or death to, the anti-women Islamic regime and its "filthy leader Khamenei."
Facebook originally removed the post for inciting violence, but after the Oversight Board agreed to hear the appeal, Meta restored it under an exemption for newsworthy content.
In its decision, the Oversight Board noted that "marg bar" is often understood to mean "down with." Because the post had never violated the company's guidelines in the first place, the board argued, Meta did not need to invoke the newsworthiness exemption. The post's language, the board said, was being used as a "political slogan, not a genuine threat."
Alongside its decision, the board said it had made recommendations to better protect political expression in critical situations such as Iran, where unprecedented mass demonstrations are being forcefully suppressed, including permitting the widespread use of "marg bar Khamenei" during the Iranian protests.
The board deliberated as numerous Iranians were protesting the death of Mahsa Amini in the custody of Iran's notorious "morality police."
The Oversight Board also overturned Meta's decision in November to take down a Facebook post comparing Russian invaders of Ukraine to Nazis. The board determined that the post, which included an image of what appeared to be a dead body and referenced a poem calling for the killing of fascists, did not breach the company's content guidelines or its obligation to uphold human rights.
After the board selected the case, Meta reversed its original decision to remove the post for violating its hate speech policies, which bar users from publishing content that "dehumanizes" particular racial or ethnic groups. The company later added a warning screen alerting viewers that the image might be violent or graphic. The board overturned that warning decision as well, and Meta said at the time that it would review other posts containing the same material to decide whether to take action on them.
This year, Meta made a rare exception to its long-standing ban on hate speech by allowing certain calls for violence against Russian invaders. In an internal post viewed by The Washington Post, Clegg said the company would refer the guidance it had given moderators to the Oversight Board.
Meta later withdrew its request that the Oversight Board review its policy on content about the war, citing "ongoing safety and security concerns." The board criticized the withdrawal.
In a statement at the time, the board said, "While the Board understands these concerns, we believe the request raises important issues and are disappointed by the company's decision to withdraw it."