Britain’s privacy watchdog, the Information Commissioner’s Office (ICO), has launched an investigation into TikTok, Reddit, and Imgur to assess how these social media platforms safeguard children’s privacy.
Social media companies rely on complex algorithms to prioritize content and keep users engaged. However, the same algorithms can also amplify harmful material, serving children ever more inappropriate content.
The ICO’s investigation focuses primarily on how TikTok, owned by Chinese firm ByteDance, uses the personal data of 13- to 17-year-olds to recommend content. Reddit and Imgur, meanwhile, are under scrutiny over how they verify users’ ages and determine whether a user is a child.
“If we find there is sufficient evidence that any of these companies have broken the law, we will put this to them and obtain their representations before reaching a final conclusion,” the Information Commissioner’s Office said in a statement.
This is not the first time TikTok has faced scrutiny in the UK. In 2023, the ICO fined the platform £12.7 million ($16 million) for breaching data protection law by processing the personal data of children under 13 without parental consent.
A Reddit spokesperson told Reuters that the company has been working closely with the ICO and intends to comply with all relevant regulations. They also noted that most Reddit users are adults but confirmed plans to roll out new age assurance measures this year in response to updated UK regulations.
ByteDance, TikTok, and Imgur have yet to respond to requests for comment.
The UK has already introduced tougher rules for social media platforms, requiring them to enforce age limits and adopt stronger age-checking measures to keep children away from harmful content. Under proposed regulations, companies such as Facebook, Instagram, and TikTok would also have to adjust their algorithms to filter out or downgrade harmful material, giving younger users a safer online experience.