
Wellbeing

Mr. John Ryan - Deputy Principal

 

New Social Media Laws

How did we get here?

 

In March 2024, Meta (Facebook) informed Australian publishers it would not renew its News Media Bargaining Code agreements in Australia.

In response, the Australian government fast-tracked debate over the perceived imbalance of bargaining power held by big social media companies. This included discussion of a digital platform levy (effectively arbitrated taxation).

It also led to the establishment of a Joint Parliamentary Select Committee into the influence and impacts of social media on Australian society more generally.

 

Findings of the Joint Select Committee

The Committee received 220 submissions and held 10 public hearings. It formally tabled its final report on 18 November 2024.

Social media’s role in adolescents’ mental health, addictive algorithms, and child exploitation—plus risks to democracy and journalism—were central.

The Committee expressed significant concern regarding the business practices of social media platforms: advertising-based platforms prioritise engagement, boosting harmful content through algorithmic recommendations.

 

There are also privacy and redress failings: social media platforms lack age verification, adequate protections, and effective user complaint mechanisms.

The Committee also raised concerns about individual vulnerability. Harm can include scams, cyberbullying, mental health concerns, child sexual exploitation, self-harm, and suicide. Vulnerability varies with content type, screen time, and user factors.

The evidence suggests social media acts as an amplifier of existing mental health pressures, rather than being a direct cause. But more research is needed.

  

Mental health concerns for kids

 

Social media is ubiquitous. In Australia, 96% of 10-15-year-olds are using social media.

 

Kids are targets. 70% of 10-15-year-olds report having been exposed to harmful behaviour, and 14% report having experienced grooming.

 

Use can become “problematic”. “Problematic social media use” (PMSU) refers to patterns of compulsive use, such as mood-regulation dependency and functional impairment.

 

Effects of PMSU. 10-20% of adolescents show addictive or harmful levels of social media use, which correlates directly with depression, anxiety, and stress.

 

The response?

The response from the Australian government was to introduce world-first legislation to regulate social media platforms. There are also related privacy reforms in the pipeline, which together aim to create a significantly safer online environment for children.

 

Social media minimum age requirement

Children’s Online Privacy Code

Digital duty of care

 

The New Laws

 

Minimum age requirements

From 10 December 2025, children under the age of 16 will not be able to create accounts on “age-restricted social media platforms”.

Under recent amendments to the Online Safety Act 2021 (Cth) (Act), age-restricted social media platforms:

1.  Must take reasonable steps to prevent underage account holders from creating or maintaining accounts.

2.  Must deploy reliable age-verification or inference mechanisms, which are scalable and transparent. The platforms cannot mandate federal or digital ID.

3.  Must comply with the Privacy Act 1988 (Cth) and the Australian Privacy Principles (APPs), subject to stricter requirements.

4.  Must comply with requests for information, inspection of governance systems, and compliance notices from the Office of the Australian Information Commissioner (OAIC).

5.  Face fines of up to $49.5 million for failing to meet the Act’s requirements.

 

Which platforms are affected?

The list of age-restricted platforms is not static and may grow. The eSafety Commissioner may re-evaluate platforms over time if their functionality or risk profile for children changes.

For some platforms, such as YouTube and Reddit, children under 16 will still be able to watch videos or view threads, as an account is not required to access them. However, they will not be allowed to create accounts, or to post comments or content.


Can anyone else breach the new laws?

 

No. Children, parents, internet providers, and schools are not at risk of breaching the new legislation.

The government insists the law is about protecting children from harmful content on platforms and believes punishment would have a detrimental impact. 

As a result, children who access an age-restricted platform after the ban takes effect will not face any punishment, nor will their parents or carers. 

Similarly, there are no sanctions for internet providers or schools.

 

We hope this information is valuable for both students and parents. At Mary MacKillop Catholic Regional College, student online safety remains a top priority. All students using a College laptop must adhere to the Computer User Cyber Safety Agreement. In addition, our Digital Learning Policy, Cyber Safety Policy, and Social Media Policy work together to guide our approach to maintaining a safe, responsible, and positive online environment for every learner.

 

https://www.mackillopleongatha.catholic.edu.au/wp-content/uploads/2025/07/COMPUTER-USAGE-AND-CYBER-SAFETY-AGREEMENT-2025.pdf

https://www.mackillopleongatha.catholic.edu.au/wp-content/uploads/migrated/2022/07/Digital-Learning-Policy.pdf

https://www.mackillopleongatha.catholic.edu.au/wp-content/uploads/migrated/2018/08/Cyber-Safety-Policy-1.pdf

https://www.mackillopleongatha.catholic.edu.au/wp-content/uploads/migrated/2021/06/Social-Media-Policy-1.pdf

 

 
