Bluesky steps up content control as millions flock to the platform
More users means more problems, and social media wunderkind Bluesky is no exception. On Monday, Bluesky announced new moderation efforts to address a rise in concerning user content amid the platform's explosive growth.
Bluesky Quadruples Content Moderation Team
In an exclusive with Platformer, Bluesky explained that it would quadruple its content moderation team, currently a 25-person contracted workforce, in order to curb a worrisome influx of child sexual abuse material (CSAM) and other content that violates the site's community guidelines: cases that have so far slipped through the existing moderation systems and warrant human oversight.
Bluesky’s Challenges with User Reports
More broadly, the platform is navigating an explosion in user reports being handled by a very small company. On Nov. 15, the platform posted that it was receiving 3,000 reports per hour, compared with the 360,000 reports it received in all of 2023. “We’re triaging this large queue so the most harmful content such as CSAM is removed quickly. With this significant influx of users, we’ve also seen increased spam, scam, and trolling activity — you may have seen some of this yourself,” the platform wrote at the time.
Bluesky’s Approach to Content Moderation
Bluesky’s bolstering of its human workforce supplements what is often a complex and confusing world of automated, AI-powered content moderation. In a follow-up thread, Bluesky noted issues with “short term” moderation policies put in place over the past week to tackle harmful content in high-severity policy areas, including CSAM. Bluesky explained that several accounts were temporarily suspended in response to automatic flags and actions by its trust and safety team. The platform is reinstating the accounts of users who feel they were unjustly removed.
Bluesky’s in-house content moderation systems are also paired with third-party tools, such as Safer, a reporting tool created by child safety nonprofit Thorn. Branded as a user-powered, decentralized social network, Bluesky prioritizes an “ecosystem of third-party providers” and eschews a “centralized moderation authority” in favor of user customizability.
