Meta is changing how it handles high-profile accounts on Instagram and Facebook

Meta said it will change how it handles posts by celebrities, politicians, and other users with large audiences on Facebook and Instagram, in an effort to keep commercial interests from influencing content decisions. The practice has drawn criticism.

The company pledged to adopt, in whole or in part, most of the 32 recommendations for its "cross-check" program made by the independent oversight board it funds, a body that serves as a kind of supreme court for content and policy decisions.

This will mean considerable changes to how Meta manages the system, the company's president of global affairs, Nick Clegg, wrote in a blog post.

The improvements, he said, will make the system more effective, accountable, and equitable.

Meta declined, however, to publicly identify which accounts receive preferential treatment in content moderation decisions, and it declined to create a formal, transparent process for applying to the program.

Meta reasoned that publicly labeling users covered by the cross-check program could make them targets of abuse.

The changes respond to the oversight board's December request that Meta overhaul the cross-check system. The board said the program appeared to prioritize business interests over human rights by giving certain users' posts preferential treatment when they broke the rules.

In its report at the time, the board concluded that the initiative appeared to be structured more directly to address business concerns.

By giving extra protection to a select group of users chosen largely for business reasons, cross-check allows content that would otherwise be removed immediately to remain visible longer, potentially harming users.

According to the report, Meta told the board that the program's goal is to prevent content-removal errors by adding a second layer of human review to posts by prominent users that at first glance appear to violate the rules.

"We will continue to ensure that our content moderation decisions are made as consistently and accurately as possible, without bias or external pressure," Meta said in its response to the oversight board.

"While we acknowledge that business considerations will always be inherent to the overall thrust of our activities, we will continue to refine guardrails and processes to prevent bias and error in all our review pathways and decision making structures."