Ofcom Opens Formal Investigation Into Telegram's Content Practices
The UK's communications regulator Ofcom announced on April 21, 2026, that it has launched a formal investigation into Telegram following evidence that the messaging platform is being used to distribute child sexual abuse material. The investigation marks a significant escalation in regulatory scrutiny of the encrypted messaging service, which has faced mounting pressure from governments worldwide over content moderation practices.
The regulator's decision follows months of evidence gathering that revealed systematic failures in how Telegram handles illegal content, particularly CSAM. Ofcom's investigation will examine whether Telegram has adequate systems in place to detect, remove, and prevent the sharing of such material, as required under the UK's Online Safety Act 2023, whose duties on illegal content are now in force.
Telegram, founded by Pavel Durov in 2013, has positioned itself as a privacy-focused alternative to mainstream messaging platforms such as WhatsApp. The platform applies end-to-end encryption only to its optional "Secret Chats" feature; regular chats and group conversations use client-server encryption, meaning Telegram itself retains access to their contents. This hybrid approach has made the service popular among users seeking privacy but has also created challenges for content moderation.
The investigation represents the first major enforcement action under the UK's comprehensive online safety framework, which requires platforms to proactively identify and remove illegal content. Under these regulations, companies can face fines of up to 10% of their qualifying worldwide annual revenue for non-compliance. For Telegram, which reported revenues of approximately $1 billion in 2025, this could translate to penalties of up to $100 million.
Ofcom's announcement follows similar regulatory actions in other jurisdictions. The European Union has been investigating Telegram under its Digital Services Act, while France arrested CEO Pavel Durov in August 2024 on charges related to the platform's alleged failure to moderate criminal content. These international pressures have created a complex regulatory environment for the messaging service.
Platform Users and UK Digital Services Face Regulatory Scrutiny
The investigation directly affects Telegram's estimated 900 million global users, including approximately 15 million users in the United Kingdom. UK-based users may face service restrictions or changes to platform functionality if Ofcom determines that current moderation practices are insufficient. The regulator has the authority to order internet service providers to block access to non-compliant platforms, though such measures are typically considered a last resort.
The investigation also impacts the broader UK technology sector, as it establishes precedent for how regulators will enforce online safety requirements. Other messaging platforms operating in the UK, including WhatsApp, Signal, and Discord, are closely monitoring the proceedings as they may face similar scrutiny if their content moderation practices are deemed inadequate.
For enterprise users and organizations that rely on Telegram for business communications, the investigation creates uncertainty about the platform's long-term viability in the UK market. Many companies have already begun evaluating alternative communication tools in anticipation of potential service disruptions or compliance-related changes that could affect functionality.
The investigation particularly affects privacy advocates and digital rights organizations, who have long supported Telegram's encryption practices but now face the challenge of balancing privacy protections with child safety concerns. The outcome could influence how encrypted messaging services implement content moderation without compromising user privacy.
Regulatory Process and Platform Response Requirements
Under Ofcom's investigation process, Telegram must provide detailed information about its content moderation systems, including technical specifications for how the platform detects and removes illegal material. The company has 30 days to respond to initial information requests and must demonstrate compliance with UK online safety requirements through documented policies and technical implementations.
The investigation will examine Telegram's use of automated detection systems, human moderation processes, and reporting mechanisms for illegal content. Ofcom will specifically assess whether the platform's encryption practices create barriers to effective content moderation and whether alternative technical solutions could address these challenges without compromising legitimate privacy interests.
Organizations using Telegram for business purposes should review their communication policies and consider implementing additional safeguards. IT administrators should document current usage patterns and prepare contingency plans for potential service changes or restrictions. Companies should also evaluate whether their use of Telegram complies with internal data governance policies and regulatory requirements in their respective industries.
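As a starting point for that kind of audit, an IT team might maintain a simple machine-readable inventory of communication tools and flag any business-critical tool that lacks a documented fallback. The sketch below is purely illustrative: the field names, the sample data, and the flagging rule are assumptions for this example, not requirements drawn from Ofcom guidance or any Telegram API.

```python
import json

# Hypothetical inventory sketch. Field names ("business_critical",
# "uk_users", "fallback") and the flagging rule are illustrative
# assumptions, not regulatory requirements.
def build_comms_inventory(tools):
    """Summarize messaging-tool usage and flag entries needing a contingency plan."""
    inventory = []
    for tool in tools:
        critical = tool.get("business_critical", False)
        entry = {
            "tool": tool["name"],
            "business_critical": critical,
            "uk_users": tool.get("uk_users", 0),
            # Flag tools that are business-critical but have no documented fallback.
            "needs_contingency_plan": critical and not tool.get("fallback"),
        }
        inventory.append(entry)
    return inventory

# Example usage with made-up figures.
usage = [
    {"name": "Telegram", "business_critical": True, "uk_users": 120, "fallback": None},
    {"name": "Email", "business_critical": True, "uk_users": 400, "fallback": "Phone"},
]
report = build_comms_inventory(usage)
print(json.dumps(report, indent=2))
```

A JSON report like this can be versioned alongside other governance documentation and revisited as the regulatory picture changes.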
The regulator has indicated that it will work with international partners, including the European Commission and US authorities, to coordinate enforcement actions. This collaborative approach suggests that Telegram may face simultaneous regulatory pressures across multiple jurisdictions, potentially leading to comprehensive changes in how the platform operates globally. The investigation timeline extends through 2026, with preliminary findings expected by the third quarter.