Cocoland revival raises concerns after Coco closure

The reappearance of a chat platform tied to thousands of cases and serious crimes has provoked sharp warnings from public officials and child protection groups

The sudden reappearance of a chat platform that many authorities and associations remember as Coco has drawn a sharp reaction. Originally closed by the courts in 2026 amid allegations that it had been used to recruit perpetrators and facilitate serious crimes, the service now presents itself under the name Cocoland. Officials, led by Sarah El Haïry, the high commissioner for childhood, have publicly denounced the relaunch and announced stepped-up measures to detect, report and shut down illicit activity.

News outlets noted that the new site reuses several visual and functional elements familiar to former users of Coco, while the platform itself insists it is independent and operated without a human administrator. The historical context is crucial: the original chat service had been linked to more than 23,000 judicial files, and investigators established connections to high-profile crimes, including the case in Mazan involving the victim Gisèle Pelicot and the suspect Dominique Pelicot. Any apparent continuity between the old and new platforms therefore triggers immediate concern among victim advocates and law enforcement partners.

What the authorities say

On 18 April, Sarah El Haïry described the platform’s revival as “a slap” to commitments to protect minors, stressing that these chat services are not innocuous. She warned that procedures were already under way and promised persistent legal and administrative pressure: “we will track them, we will harass them, we will not give them a moment’s respite.” Officials intend to rely on coordinated reporting to the national online reporting platform PHAROS and on judicial requests to obtain blocking or takedown orders, while criminal investigations continue to pursue those suspected of operating or facilitating illegal networks.

Legal and criminal background

Understanding the relaunch requires remembering prior actions: the original service was taken offline by judicial order in mid-2026, and the person identified as its founder, Isaac Steidl, was placed under formal investigation on 9 January 2026 on several counts, including complicity in the possession and dissemination of child sexual abuse images and criminal conspiracy (association de malfaiteurs). Authorities stress that closing a platform does not always eliminate the infrastructure or the users, because technical components and domain registrations can shift across borders, complicating enforcement.

Regulatory responsibility

Legal frameworks such as the LCEN and the Digital Services Act set out duties for platforms: they are not passive bystanders. Under these regimes, a site that becomes aware of illegal material and fails to act can face liability. That is why officials and child protection associations monitor relaunches closely and why rapid, documented reports to judicial authorities and to regulators are essential to trigger blocking, removal and possible criminal probes.

What Cocoland claims and how the service differs

The site presenting itself as Cocoland displays an explanatory notice asserting it is a “brand new, independent” platform with no legal, technical or organizational link to the former service. It claims to be operated by an artificial intelligence and describes itself as a resurrected fragment of a historical francophone chat, abandoned years ago. The operators also emphasize user responsibility and disclaim liability while implementing visible content warnings and mandatory pop-ups addressing forbidden behavior.

Technical choices and monetization

Unlike the purported anonymity of the old service, the new platform advertises processes that reduce anonymity: SMS authentication, paid premium accounts, and an age verification step using third-party tools that may request biometric selfies or automated age estimation. The site also shows advertising and partners with third-party marketing services, creating revenue streams. Observers note that these design choices change the risk profile but also raise questions about data protection and the security of sensitive personal information.

Experts underline that cross-border hosting and the use of multiple intermediaries—reported to include servers in places such as Ukraine or Bulgaria—mean that national measures alone are not always sufficient. For this reason, Sarah El Haïry and other officials are pressing for persistent reporting to PHAROS, cooperation with platform moderators, and recourse to judicial measures where needed. The situation demonstrates the continuing challenge of balancing technical countermeasures, legal accountability under instruments such as the Digital Services Act, and the urgent task of protecting vulnerable people from exploitation.

Written by Sarah Palmer