Cyber Safety

Check out our Cyber Safety Hub
We are delighted to introduce a new resource made available to your family through our partnership with Family Zone - our new school Cyber Safety Hub.
As you may already be aware, our partnership provides your family with access to the Family Zone tools to use at home with your children if you wish. The purpose of the Cyber Safety Hub is to complement those tools with practical guidance and information to further support you in engaging with your children in their digital development. These tools and resources also allow the school and parent body to work together on creating a holistic approach to guiding each student's online journey.
You can access the Cyber Safety Hub using the link below:
https://johnxxiii.cybersafetyhub.com.au/
About the Parent Cyber Safety Hub
The Cyber Safety Hub includes resources to help your family better understand the different Family Zone tools available to you and how to use them, plus access to regular cyber safety events to help you stay informed about the latest digital trends.
The Cyber Safety Hub also provides expert advice from leading cyber safety experts, ySafe, on the most pertinent issues and frequently asked questions around platforms like TikTok, Fortnite, Instagram, and more. There are app reviews with age and safety recommendations, along with a range of guides to help ensure healthy boundaries around screen time and gaming, plus step-by-step instructions for using parental controls and filtering out inappropriate content.
We are very excited to be able to offer you this level of expertise and support. We look forward to working closely with you as we develop the cyber safety conversation within our school community.
We don't allow under-age users ... and other myths tech companies hide behind
Social media companies are evading responsibility for kids' safety by pointing to official age restrictions - allowing abuse of children to go unaddressed.
True or false? No child under the age of 13 is on social media.
Obviously true - and just as obviously completely false.
Yet the ways in which this paradox endangers underage children are a lot more complicated than they may first appear.
Sure, every platform under the sun - from Facebook to TikTok - has a minimum age requirement of 13. But verification procedures are universally either flimsy or downright non-existent.
Result? Underage users comprise a healthy proportion of the most popular social apps.
For obvious reasons, tracking their precise numbers is nearly impossible. Yet the most recent research, based on self-report data from 1,000 children, gives a pretty clear idea of what you no doubt already suspected: kids - maybe even your kids - are using adult social media in droves.
Kids' top social media platforms
The study, conducted by the US-based nonprofit Thorn, found that among kids aged 9-12:
- 45 percent say they use Facebook daily
- 40 percent use Instagram
- 40 percent use Snapchat
- 41 percent use TikTok, and
- 78 percent use YouTube.
Now, none of that may seem particularly surprising or even alarming. But wait, there’s more.
Platforms where the most harm occurs
Because the same study also found disturbing numbers of children reporting potential harm on these same platforms - while their attempts to deal with bullying, grooming and unwanted contact from strangers often ended in frustration.
The platforms with the highest number of minors reporting potential harm were Snapchat (26 percent), Instagram (26 percent), YouTube (19 percent), TikTok (18 percent), and Messenger (18 percent).
Those where the most minors said they had an online sexual interaction were Snapchat (16 percent), Instagram (16 percent), Messenger (11 percent), and Facebook (10 percent).
Reporting and blocking of abuse
The good news was that kids who experienced abuse were keen to take advantage of platform-based blocking and reporting tools. The bad news was how inadequate these in-app tools proved at addressing their needs. As one observer noted, such tools “can feel like fire alarms that have had their wires cut.”
Typically, kids were frustrated that none of the choices “fit the situation” - especially that of being pestered for nudes, whether by an adult or another child.
One in three children who reported an issue said the platform took more than a week to respond. And nearly a quarter said the concern they raised was never resolved.
So kids are experiencing real distress and abuse on social platforms. Yet tech companies have managed to evade responsibility by pointing to their official age restrictions - which, as we have seen, bear little resemblance to real-world usage by kids.
These companies say, in effect, we don’t allow children to be users. Therefore, we’re under no obligation to protect them from harm, or devise reporting tools that are appropriate and kid-friendly.
The time has come, argue the authors of the Thorn report, to expose that self-serving myth, and hold tech companies accountable.
How? For starters:
- By making age-verification procedures much more robust. (Among the report’s other findings, it revealed that 27 percent of 9- to 12-year-old boys had used a dating app.)
- By integrating crisis support numbers into messaging platforms.
- By getting much more serious about ban evasion, to prevent blocked users from simply creating alternate accounts and carrying on the abuse.
“Let’s deal with the reality that kids are in these spaces, and re-create it as a safe space,” says Julie Cordua, Thorn’s CEO. “When you build for the weakest link, or you build for the most vulnerable, you improve what you’re building for every single person.”
Reference: https://www.familyzone.com/anz/families/blog/underage-users-tech-companies