AI in Education
Lisa Kerekes, Head of Digital Technology

The proliferation of AI programs has had a noticeable impact on educational spaces over the past few years. Numerous tools are now available that support students’ deep learning and critical thinking, rather than just providing quick or easy answers.
As with any technology, we don't want to allow AI tools to usurp students’ God-given human creativity, intelligence and relational vocations. Instead, our aim is to equip students to leverage technology in ways that support their learning and endeavours.
At DCC we regularly reflect on best practices in all aspects of teaching, including the use of emerging technologies, so that students become discerning users who can critically evaluate and engage with the technologies they will encounter beyond the school gates.
In the Secondary school, a small working party continues to engage with rapidly changing AI tools, along with ethical and educational considerations. Staff are now participating in training that covers the basics of AI tools to increase efficiency in our work, including guidance around privacy considerations, data bias and ethical issues, and transparency. We are also considering positive classroom applications and best practices to ensure that AI supports learning rather than replaces it. As tools and technologies such as AI continue to evolve, this work is ongoing.
Over the next semester we plan to begin similar conversations with Secondary students. This will involve educating them about the strengths and limitations of AI tools, ethical considerations, and when it may be appropriate to use AI intentionally to enhance learning.
An authentic learning process involves grappling with problems, considering different perspectives, reflecting on and learning from mistakes, and building understanding through exploration and collaboration, all while developing creative and critical thinking skills. These cannot be replaced by automated answers.
Artificial intelligence is increasingly part of how young people study, create, and solve problems in both their personal and school lives. For this reason, we all share responsibility in helping young people develop discernment when using technology. While teaching staff carefully design learning experiences with technology in mind, parents play an important role in shaping how these tools are used at home.
Parents can support this by having occasional conversations with their teenagers, modelling thoughtful and selective use of AI tools, and discussing their limitations. Rather than asking, “Did you use AI for this?”, parents might try curious, non‑judgemental questions that encourage more open and productive conversations, such as:
What do you use AI for, and why?
How does it help you think or understand the task better?
What parts of your work did you then edit or improve yourself? Are you still learning, or mainly producing an answer using the tool?
Was the AI response accurate and reliable? And how did you check this?
When parents and staff model thoughtful behaviour, ask reflective questions, and focus on growing as discerning disciples of Christ who use technology well, rather than on mere compliance, students are more likely to engage with AI and other technologies in considered, ethical and educationally beneficial ways.