Rise in Online Child Luring Cases Sparks Concern Over Underreporting, Experts Warn

Key Takeaways

  • Reported cases of “luring a child via a computer” rose nearly 20% in 2025, from 2,882 incidents in 2024 to 3,456.
  • Experts believe the true scale of online child exploitation is far higher due to under‑reporting, easy access to smartphones, and limited supervision.
  • Predators are using a widening array of platforms—including Snapchat, Instagram, TikTok, Telegram, and even newer chat features on services like Spotify—to groom victims in secrecy.
  • Legal advocacy groups are pushing courts to treat social‑media platforms as “defective products,” seeking financial liability that could incentivize safer design.
  • Several countries (Australia, France, Türkiye) have enacted age‑based bans on social‑media use; Canada’s Online Harms Act stalled in early 2025 but consultations are underway for a revised bill with broader scope, covering private‑messaging functions and emerging technologies.
  • Upcoming rallies, such as the Children First Canada event, aim to pressure federal leaders to reintroduce robust online‑safety legislation.

Rise in Reported Child Luring Cases
According to Statistics Canada, cyber‑related violations reported to police remained steady at just over 85,000 cases in both 2024 and 2025. However, the specific offence of “luring a child via a computer” showed a sharp uptick, climbing from 2,882 incidents in 2024 to 3,456 by the end of 2025—a rise of roughly 19.9%. This increase stands out against the flat overall trend, signalling a growing problem that warrants closer examination.


Under‑Reporting and Contributing Factors
Cybersecurity analyst Ritesh Kotak warned that the official figures likely underestimate the true magnitude of online child luring. He cited three main drivers: ubiquitous technology access, the proliferation of social platforms, and insufficient adult supervision. Kotak argued that the absence of robust safeguards has created an environment where predators can operate with relative impunity, and the reported numbers merely scratch the surface of the issue.


Platforms Favored by Predators
Jacques Marcoux, director of research and analytics for the Canadian Centre for Child Protection, explained that most reports originate from Snapchat and Instagram, with growing activity on TikTok and the encrypted messaging app Telegram. He noted that groomers deliberately cultivate relationships over extended periods, using the secrecy afforded by private or semi‑private channels to avoid detection. The pattern reflects a calculated strategy rather than opportunistic abuse.


Expansion of Chat Functions Across Services
Marcoux highlighted that the risk landscape is broadening as more apps introduce private‑messaging capabilities. He pointed out that Spotify, traditionally a music‑streaming service, rolled out private chat functions a few months prior, joining a trend where platforms add engagement‑driven features to boost user retention and advertising revenue. Each new chat avenue increases the potential for strangers to contact minors, amplifying overall exposure.


Economic Incentives Behind Feature Roll‑outs
According to Marcoux, the motivation behind embedding chat tools is straightforward: higher engagement translates directly into greater profits for tech companies. Platforms compete for users’ attention, and interactive features keep users logged in longer. While these functionalities may seem convenient, they simultaneously lower barriers for malicious actors seeking to exploit children, creating a tension between business goals and child safety.


Legal Strategies Targeting Platform Liability
Child‑advocacy lawyer Matthew Bergman has pursued a novel legal approach, arguing that social‑media platforms should be treated as “defective products” that cause foreseeable harm. In a landmark Los Angeles jury verdict earlier this year, YouTube and Meta were found negligent for designing addictive products that contributed to plaintiff injuries, resulting in a $6 million award. Bergman contends that only financial consequences that hit companies in the pocketbook will compel them to redesign their services with child protection in mind.


Public Pressure vs. Economic Change
Bergman expressed skepticism that moral outrage or negative publicity alone will drive meaningful reform. He observed that despite extensive media coverage of platforms exploiting kids, substantive change has been elusive. Instead, he believes altering the economic calculus—making platforms financially liable for harms—will align corporate incentives with safety, prompting the development of safer designs and stronger protective measures.


International Legislative Responses
Several nations have already enacted age‑based restrictions on social‑media use. Australia, France, and Türkiye have passed laws that ban children below a certain age from accessing major platforms. These measures aim to curb exposure at the source by delaying young users’ entry into environments where grooming is prevalent. Policymakers elsewhere are watching these experiments closely, considering similar bans or stricter age‑verification regimes.


Canada’s Stalled Online Harms Act and Future Directions
Canada introduced the Online Harms Act in 2024 to regulate harmful content and shield children online, but the bill collapsed in early 2025 after Prime Minister Justin Trudeau’s resignation and the subsequent prorogation of Parliament. Marcoux noted that the government has signalled intent to reintroduce a revised bill, with ongoing consultations addressing emerging challenges such as artificial intelligence and the need to capture a wider range of companies beyond traditional social‑media giants.


Elements Sought in the Proposed New Bill
Advocates like Marcoux are urging the forthcoming legislation to include several key provisions. First, the scope should extend to any firm offering private‑messaging or chat services—not just platforms commonly labelled as “social media.” Second, such services ought to incorporate design requirements: accessible reporting tools, parental controls, user‑blocking capabilities, and other basic safety features. Third, there is support for establishing a minimum age for social‑media access, mirroring the approaches taken by Australia, France, and Türkiye, to delay children’s exposure to high‑risk environments.


Grassroots Mobilization for Reform
On the horizon, Children First Canada plans a rally calling on federal leaders to reinstate and strengthen online‑safety legislation. The event aims to amplify public demand for concrete action, leveraging the momentum generated by rising statistics, expert testimony, and legal victories. Organizers hope that a visible, unified call for change will persuade legislators to prioritize child protection in the digital age, balancing innovation with the imperative to safeguard vulnerable users.
