Last modified 16 April 2026

Meta is changing how high-profile accounts are handled on Instagram and Facebook

In an effort to keep commercial interests from influencing its content decisions, Meta said it would change the way it handles posts by celebrities, politicians, and other users with large audiences on Facebook and Instagram, a practice that has drawn criticism.

The internet giant pledged to implement, in whole or in part, most of the 32 improvements to its "cross-check" program suggested by the independent review panel it funds to serve as a kind of supreme court for content and policy decisions.


The improvements will result in considerable changes to how Meta manages the system, the company's global affairs president, Nick Clegg, wrote in a blog post.

They will make the system more effective, accountable, and equitable, he said.

Meta declined, however, to publicly identify which accounts receive preferential treatment in content moderation decisions. It also declined to establish a formal, transparent procedure for applying to the program.

Meta reasoned that publicly identifying the users covered by the cross-check program could make them targets of abuse.

The adjustments respond to the oversight panel's December request that Meta overhaul the cross-check mechanism. The panel said the program appeared to prioritize business interests over human rights by giving certain users' posts preferential treatment when they broke the rules.

In a report at the time, the panel concluded that the program appeared to be structured more to address business concerns than to protect users.

By offering extra protection to a small group of users chosen largely on the basis of corporate interests, cross-check allows material that would normally be deleted immediately to stay up longer, potentially harming other users.

According to the report, Meta told the board that the program's goal is to prevent content-removal errors by adding a second layer of human review to posts by prominent users that at first glance seem to violate the rules.

"We will continue to ensure that our content moderation decisions are made as consistently and accurately as possible, without bias or external pressure," Meta declared in its answer to the monitoring board.

"While we acknowledge that business considerations will always be inherent to the overall thrust of our activities, we will continue to refine guardrails and processes to prevent bias and error in all our review pathways and decision making structures."
