Cyber Safety

Introducing our new Cyber Safety Hub

We are delighted to introduce you to a new resource made available through our partnership with Family Zone - our school Cyber Safety Hub. 

 

As you may already be aware, our partnership provides your family with access to the Family Zone tools to use at home with your children if you wish. The purpose of the Cyber Safety Hub is to complement those tools with practical guidance and information to further support you in engaging with your children in their digital development. These tools and resources also allow the school and parent body to work together on creating a holistic approach to guiding each student's online journey. 

 

You can access the Cyber Safety Hub using the link below: 

https://johnxxiii.cybersafetyhub.com.au/

 

About the Parent Cyber Safety Hub 

The Cyber Safety Hub includes resources to help your family better understand the different Family Zone tools available to you and how to use them, plus access to regular cyber safety events to help you stay informed about the latest digital trends. 

 

Also, the Cyber Safety Hub provides expert advice from leading cyber safety experts ySafe on the most pertinent issues and frequently asked questions around platforms like TikTok, Fortnite, Instagram, and more. There are app reviews with age and safety recommendations, along with a range of guides to help ensure healthy boundaries around screen time and gaming, plus step-by-step instructions for using parental controls and filtering out inappropriate content. 

 

We are very excited to be able to offer you this level of expertise and support. We look forward to working closely with you as we develop the cyber safety conversation within our school community.


Instagram is coming for our kids

A child-friendly version of Instagram promises to “help kids keep up with their friends, discover new hobbies and interests, and more!” But beyond the spin, what will it really mean for our children?

 

“Increasingly kids are asking their parents if they can join apps that help them keep up with their friends,” a Facebook spokesman told the media this week.

 

“Right now there aren’t many options for parents, so we’re working on building additional products — like we did with Messenger Kids — that are suitable for kids, managed by parents. 

 

“We’re exploring bringing a parent-controlled experience to Instagram to help kids keep up with their friends, discover new hobbies and interests, and more.”

 

Good news, right? Well … sort of. Let’s break it down.

 

Why age limits matter

13+ is pretty much the universal age restriction for all social media accounts. That’s not to say that underage kids don’t sign up in droves anyhow.

 

After all, there’s no age verification and - as every parent of a tween has no doubt heard a million times already - “all my friends have it, so why can’t I?”

 

But experts point out that having an age restriction on the app does send a message to kids - and to their parents too - to proceed with caution (if at all), and to keep in mind that, while millions use it without coming to harm, the app poses real and present dangers for younger kids. 

 

It can be a portal to porn. It’s the number-one app for cyberbullying. The negative impact of influencers on girls’ self-esteem and body image has been well documented. And then there’s the danger of lurking predators.

An internal company post obtained by BuzzFeed News broke the news last week, ahead of a formal company announcement (image: BuzzFeed News).

 

The age limit is useful to parents "in the same way that buying alcohol ... and having consensual sex at 16 could be: ‘Not only do I not approve, but it’s ALSO THE LAW,’” notes mum and journalist Sarah Rodrigues in an incendiary opinion piece for telegraph.co.uk.

 

Given the risks to kids of interacting on an adult platform, a P-plate version - specifically designed to keep the guardrails up - sounds like a great idea. 

 

Reasons to worry

But as Rodrigues points out, even if proposed rules around messaging strangers and excluding adult users are scrupulously observed - and you might recall there were serious issues with predators on Messenger Kids early on - there are plenty of other reasons to worry about starting kids on social media so early.

 

“Bullying, comparison, mental health issues, lack of activity – what ‘rules’ do these platforms have planned to mitigate the effects of these?” she asks.

 

Facebook, Instagram’s parent company, claims to have nothing but good intentions in offering a dedicated platform for children: providing an opportunity to discover “new hobbies and interests,” for example.

 

“Just how many hobbies and interests do you think the younger generation share on social media?” asks Rodrigues. 

 

“Let me give you a clue – there aren’t a whole lot of cross stitch samplers, book recommendations or calligraphy tutorials taking place. Competitive dieting, self-harm, gender identity crises? Knock yourself out, kiddo, you’ll be spoiled for choice.”   

 

Do our under-13s really need “more time online, more exposure to blue light, more rabbit holes of distraction, more exposure to potential hatefulness?” 

 

How kids harm kids

Yes, children should be protected from contact with adult strangers. But what about the risks that kids face from other kids - and themselves?

 


“Unkindness, teasing, bullying: these have long existed, but when I, and other adults, were young, we had a reprieve from it. The school day would end; we would, if we were lucky, go home to the security and love of our families.

 

"We’d have a chance to patch up the chinks in our armour before venturing back into the fray the following day.

 

“Now, the bully and their taunts have the ability to come home with you.”

In the opinion of Rodrigues - and it's a point echoed by many other experts - “Social media is difficult enough to navigate as an adult.”

 

How much more testing would it be for vulnerable children, whose immature nervous systems are already carrying an unprecedented load?

 

Finally, let’s not forget that, marketing spin aside, the primary purpose of a kids’ version of Instagram is exactly the same as that of the adult version: to make money. And to do it by getting users hooked into spending more and more time online, so that their attention, and the data collected about their online lives, can be onsold to the highest bidder.

 

Really, Instagram?

Clinical psychologist Jordan Foster of ySafe, Australia's leading cyber safety educator, shares these concerns. "Will the proposed platform minimise harm for young users - especially around psychosocial issues such as online self-comparison, technology self-regulation and validation loops?" she asks. 

 

The answer is almost certainly no.

 

"I am skeptical as to whether offering a safer alternative to the platform is at the betterment of the child, or is just an opportunistic, commercial reaction to a genuine problem. The real question is, do children need this platform? Or are there alternatives to promoting their safety and wellbeing online?"

 

Foster also points out that, in her professional opinion, Instagram still underperforms with respect to teen safety in the current version of the app. 

 

"If they haven't excelled in their current safety offerings," she says, "I think their priorities of child safety are misplaced."

 

Reference: https://www.familyzone.com/anz/families/blog/instagram-is-coming-for-our-kids