

SOCIAL MEDIA LEGISLATION

It's a delay, not a ban

Paul Litherland

Founder, Surf Online Safe
AMA (WA) Youth Friendly Doctor Workshop speaker


Although access to social media is now regulated for Australian children under 16, big challenges lie ahead, particularly because of US hostility to similar legislation.

From an international cyber safety perspective, we will be living in “interesting times” over the next few years.

Over the past decade, there has been a significant rise in pressure from international organisations for the regulation of social media and the ethical design of those environments.

Australia’s very own eSafety Commissioner and the European Commission have become world leaders in pressuring Big Tech to adopt safety-by-design principles for their networks.

That pressure has included billions of dollars’ worth of fines over the past decade for breaches of international law or the non-removal of harmful content.

The Act to protect our children

Now, from 10 December 2025, children under 16 in Australia can no longer operate an existing social media account, or create a new one, on many social media networks, following the coming into force of the Online Safety Amendment (Social Media Minimum Age) Act 2024.

The Act was legislated to help protect young people from aspects of the platforms that try to keep them scrolling for too long, and from content that can affect their safety, health and wellbeing.

It’s important to view this law not as a ban, but as a delay to help children under 16 build skills and awareness about the risks of technology.

The legislation is aimed at any service or environment that meets the definition of a “social media platform”. The platforms currently targeted by the law are Facebook, Instagram, Snapchat, TikTok, X, YouTube, Reddit, Kick and Twitch.

The networks currently included are likely to grow or change in the coming months as Australia’s eSafety Commissioner, Julie Inman Grant, makes decisions on which platforms might fall under that definition.

At this stage, the excluded environments are messaging apps (WhatsApp, Messenger, Signal, Telegram, etc), email services, voice/video calling apps (Zoom, Teams, FaceTime, etc), online games (Fortnite, Roblox, Minecraft, etc), and professional networking (LinkedIn).

The key component of the legislation is the requirement for declared social media networks to take “reasonable steps” to ensure under-16s in Australia are removed from their platform within a reasonable amount of time once an underage account is identified, and to prevent a child from creating a new account before they turn 16.

The eSafety Commissioner has published regulatory guidance describing examples of “reasonable steps” platforms can take to identify underage accounts – such as age verification or estimation tools, parental-control options, and better detection systems.

If the networks fail to take these actions, they can be fined up to $49.5 million.

The legislation will have to rely on cooperation from the platforms, whose response will be based on their own interpretation of the law – especially the “reasonable steps” requirement. Excuse the cynicism, but I’m not very confident about what they will deem to be “reasonable”.

I’m strongly of the opinion they will argue that trying to actively remove all Australian children under 16 from their environments is not reasonable.

I believe they will introduce detection methods that pick up only the most basic under-16 identifiers. They might also tweak the code behind Australian accounts, or design an algorithm that flags signals such as a date of birth recently added to or changed on an account, a daily location at an Australian school, the ages of most of an account’s “friends”, and the nature of the websites being visited.

Based on these very simple steps, the networks might well remove 100,000 Aussie kids. They could deem that to be “reasonable” under the law, despite Australia having about two million children aged 10 to 15!

I expect the networks will also rely heavily on parents, schools and the eSafety Commission reporting underage accounts and essentially doing the work for them.


The Big Tech hurdle

While Australia is tackling this head-on, the government that needs to act the most – in the country where the vast majority of the most popular social media apps are based – sadly refuses to do so.

In fact, instead of introducing legislation to address online harm, the US Congress in the 1990s introduced laws to protect Big Tech from civil and criminal litigation.

Section 230 of the US’s Communications Decency Act has allowed online environments to flourish without regulation for years, and Big Tech giants have taken full advantage of the protection that law provides.

The stubborn nature of Big Tech has remained a constant hurdle for legislators and regulators such as our own eSafety Commission.

Even politicians within the US who support the global drive for social media regulation are facing staunch opposition, often being beaten down by the “free speech” argument.

This defiance by Big Tech means that the collective army of cyber-safety advocates and legislators outside the US, pushing for much-needed improvements in online safety, will continue to encounter a massive wall of resistance.

Who can forget Meta boss Mark Zuckerberg’s announcement in January that he would be scaling back on moderation and content filtering?

His pushback against those who seek to address the lack of safety and regulation on social media networks was condemned internationally.

But Zuckerberg has been here many times before. He continues to deny the clear and present danger of social media.

As he told a US Senate Hearing into Child Online Safety in 2024: “I do not support the conclusion that social media causes changes in adolescent mental health.”

And he said that with more than 30 parents sitting behind him who had lost children to suicide on Instagram and Facebook.

His statement would be utterly destroyed by those poor parents, and by the families of the 11 children lost to suicide, as a result of harms experienced on social media, in the Australian schools I have worked with over my 15 years in this space.

Echoing her boss, Meta’s Vice-President and Global Head of Safety, Antigone Davis, told our Senate’s Joint Select Committee on Social Media and Australian Society in June last year: “I don’t think social media (Facebook & Instagram) does harm to our children.”

Recently there was a great deal of hype around Instagram, owned by Meta, introducing a “PG-13” setting on its network. A number of Instagram influencers (most on Meta’s payroll) touted this as a game changer.

For his part, Instagram head Adam Mosseri says he (and Meta) is committed to protecting teens on the platform, and that this change will help parents. I don’t trust that apparent commitment, nor any such statement coming out of Meta.

To support why I say this, here’s what Mosseri told a US Senate hearing in 2021: “I believe at Instagram we try and respond to all reports; and if we fail to do so, that is a mistake and something we should correct.” But an in-depth online study I conducted last year showed that less than 2% of teen reports of harm were acted on by Instagram in the first instance, and only 22% of reports were responded to after multiple attempts at removal of harmful material. So, the supposed “mistake” has not been corrected, despite Mosseri’s commitment four years ago to fix it.


The whistleblowers’ testimony

Just weeks before Instagram’s PG-13 hype, at yet another congressional hearing in the US into the unethical practices of Meta, two former employees came forward to rip back the curtain on exactly what Meta thinks of our children.

On 9 September this year, whistleblowers Cayce Savage and Jason Sattizahn provided testimony to a US Senate Judiciary Committee hearing on Privacy, Technology and the Law about how Meta buried child safety research.

Over nearly two hours of evidence, the pair went into explicit detail regarding their experiences.

Most concerning was the manner in which Meta was said to be acting in relation to research into the harms being faced by children using Meta’s VR headsets, especially within Horizon Worlds.

“Meta are fully aware children are being harmed in VR and all Meta products.”
– Cayce Savage

“Meta is aggressively ambivalent to people!”
– Jason Sattizahn

The statement by Ms Savage, who worked at Meta from 2019 to 2023, directly contradicts what Antigone Davis told the Australian Parliament last year.

Mr Sattizahn, who has a PhD in Integrative Neuroscience, was at Meta from 2018 to 2024 and worked exclusively in user research and safety, tasked with understanding users and their needs. He used this research to try to make Meta products safer.

He testified that Meta deliberately deleted evidence of children being exposed to harm on their products, and failed to report such harm to investigative or judicial bodies.

He and others in research were told not to investigate harms against children, and all research was monitored by a “legal surveillance” team in the wake of the testimony of Facebook whistleblower Frances Haugen in 2021.

For the record, Meta has described these recent claims by Mr Sattizahn and Ms Savage as “nonsense” – but their sworn testimony perfectly reflects what other online safety advocates and I have been saying for years.

Medicus
© 2026 Medicus - All Rights Reserved.
