Australia’s Kids’ Social Media Model: Lessons for Canada’s Regulatory Path

Key Takeaways

  • Many Canadian teens, like 13‑year‑old Sadie Dooley Thiffault, feel socially excluded when they are not allowed to use platforms such as Snapchat, which their peers rely on for instant, disappearing‑message communication.
  • Australia became the first country to ban social‑media use for anyone under 16; the ban took effect in December 2025 and places the compliance burden on platforms, which must shut down under‑16 accounts and face fines of up to roughly CAD 45 million for repeat violations.
  • Early enforcement has been uneven: while Meta reported shutting down ≈ 550,000 accounts in the first days, most platforms still rely on self‑declared age and lack robust age‑verification tools, allowing many kids to bypass the ban via VPNs or false information.
  • Surveys show mixed outcomes in Australia—about 61 % of parents notice more offline social engagement and better parent‑child relations, yet 25 % report reduced connection for their children, indicating the ban’s impact is not uniform.
  • In Canada, political momentum is growing: the federal Liberal party adopted a motion to set 16 as the age of majority for social media, Manitoba’s premier announced plans for a provincial ban, and an Angus Reid poll found 75 % of Canadians support restricting under‑16 access.
  • Critics argue that bans may drive youth activity underground rather than eliminate harm, citing risks of cyberbullying, grooming, and compulsive use; some advocate for stronger platform safety measures, age‑appropriate design, and legal actions (e.g., Ontario school‑board lawsuits) to hold companies accountable.
  • Experts suggest that any effective policy must combine age verification, parental education, and platform responsibility, recognizing that outright bans alone cannot address the complex social and psychological dynamics of adolescent online life.

Sadie’s Perspective: Missing Out on Digital Social Life
Thirteen‑year‑old Sadie Dooley Thiffault from Toronto describes feeling left out because she does not have a smartphone or access to Snapchat, the app her friends use for rapid, disappearing‑message chats. She recalls secretly using Snapchat on her computer for a year before her parents discovered the account in March and shut it down. Sadie notes that her friends message one another at all hours and rarely wait more than five minutes for a reply, whereas without the app she often learns about gatherings only at the last minute. Her mother, Mary Dooley Thiffault, echoes the frustration, saying she wishes there were government‑imposed age restrictions similar to Australia’s or to rules governing driver’s licences and alcohol, to help protect her daughter from the pressure of constant online comparison and exclusion.


Australia’s Social‑Media Ban: How It Works
In December 2025, Australia’s nationwide ban on social‑media accounts for users under 16, the first of its kind in the world, took effect, targeting platforms such as Facebook, Instagram, TikTok, Snapchat, and X (formerly Twitter). The legislation places the onus of compliance on the companies: when the ban took effect, Meta reported shutting down approximately 550,000 accounts registered to under‑16 users in the initial days. New users must now self‑declare their age, and platforms are expected to flag and delete accounts suspected of belonging to minors. The government has encouraged the use of additional age‑verification technologies—such as ID uploads, facial or voice recognition, and behavioural analysis—but enforcement relies heavily on the platforms themselves, with repeat offenders facing fines of roughly CAD 45 million.


Enforcement Challenges and Work‑arounds
Despite the law’s ambitious design, enforcement has been patchy. A recent report by the Age Verification Providers Association found that nine of the ten major platforms still do not perform real‑time age checks at sign‑up, relying instead on self‑declaration. Consequently, many under‑16 users circumvent the ban by using virtual private networks (VPNs) to mask their location, providing false birthdays, or maintaining multiple accounts. Amy Farrell, mother of 12‑year‑old Kaydee in Brisbane, reports seeing her daughter’s friends continue to post publicly on TikTok in school uniforms without being removed from the platform. Experts like Iain Corby of the Age Verification Providers Association argue that true verification—not merely self‑attestation—is required, and that the current leniency is intended to give companies time to adapt, though stricter measures are expected soon.


Kids and Parents: Mixed Reactions Down Under
The ban has produced a blend of positive and negative experiences. Twelve‑year‑old Kaydee Farrell, who had hoped to get Snapchat for her 13th birthday, says the restriction makes her feel “very sad” because she cannot join her friends’ plans, such as trips to the mall in Brisbane. She acknowledges her parents’ safety concerns but laments missing out on spontaneous social arrangements. Conversely, some parents report benefits: a January YouGov survey indicated that 61 % observed their children engaging more in offline social activities, and 38 % noted improved parent‑child relationships after the ban. However, 25 % of respondents said their children felt less socially connected, highlighting that the policy’s impact varies widely across families and communities.


Canadian Political Momentum and Public Opinion
Australia’s experiment has resonated in Canada. Earlier this month, the federal Liberal party adopted a motion at its convention to set 16 as the age of majority for social‑media use. Manitoba Premier Wab Kinew became the first provincial leader to announce plans for a social‑media ban (including restrictions on youth use of AI chatbots), though details remain pending. A nationwide Angus Reid Institute poll revealed that 75 % of Canadians support restricting under‑16 access to platforms like Snapchat and Instagram. Similar discussions are underway in Austria, Denmark, France, Germany, the U.K., Malaysia, and Indonesia, indicating a growing international appetite for regulatory action.


Critiques: Why a Ban May Not Be Enough
Legal scholar Sonia Nijjar, representing 22 Ontario school boards in lawsuits against Meta, TikTok, and Snapchat, warns that outright bans could push youth activity underground rather than eliminate harm. The lawsuits allege that pervasive, compulsive use of these platforms disrupts education and that companies should be compelled to make their products safer and compensate school boards for the interference. Nijjar stresses that the goal is not to police children but to obtain recognition that these platforms pose risks akin to alcohol or tobacco, thereby encouraging safer design and greater accountability. Others, like Sabrina Caldwell of the University of New South Wales Canberra, note that while the ban sends a clear societal message protecting kids from cyberbullying and grooming, determined users will still find ways to evade it, meaning the law must be paired with robust education and technical safeguards.


Alternative Approaches and Ongoing Initiatives
Beyond bans, experts advocate a multifaceted strategy: strengthening age‑verification systems, enforcing platform‑level safety features (e.g., default private settings, limits on data collection, and rapid removal of harmful content), and promoting digital‑literacy programs for both parents and children. In Ontario, the government prefers engaging directly with social‑media firms rather than joining the school‑board lawsuits, aiming to collaborate on federal‑level solutions. Meanwhile, Australia’s eSafety Commissioner is rolling out additional age‑verification enforcement controls, signaling that the initial leniency may tighten as regulators assess compliance. The evolving dialogue suggests that while age‑based restrictions can set an important baseline, sustainable protection will likely require continuous adaptation, corporate responsibility, and informed parental guidance.
