Guidelines for using Artificial Intelligence (AI) in SACE assessments
Students have always been keen to use new and exciting tools to support their learning. Public access to generative AI through ChatGPT has given students access to a powerful tool to aid their studies, including when completing school-based submitted assessments and external investigations. This access will only improve as AI tools are integrated into the websites and software that students use every day. AI technologies have already had a profound impact on education and student outcomes, improving student access to learning and assessment through tools like text-to-speech, and in supporting schools to manage and support student learning through tools such as Learning Analytics.
This page outlines the SACE Board’s approach to generative AI in SACE school-based assessments and submitted investigations. Central to this approach is consideration that the work submitted for assessment is the student’s own, and that any sources used are acknowledged. Guidance provided aligns to the existing suite of assessment and academic integrity policies of the SACE, relying on a shared partnership between schools and the SACE Board to ensure the integrity of school-based and external assessments.
These guidelines should be read alongside the Supervision and Verification of Students’ Work Policy and Procedure [PDF 180KB], which is underpinned by the principle that students must only submit work "that is their own, produced without undue assistance from other people or sources." The breach of rules page provides more advice for schools about the rules for students who undertake SACE assessments and the procedures to follow when a breach of rules occurs.
On this page
- Can students use generative AI in their school-based assessments and investigations?
- How much AI can be used in a submitted task?
- What ways can AI be used in a submitted task?
- How can students acknowledge their use of generative AI?
- How can I use AI in my assessment task design?
- Can students use generative AI tools in their external exams?
- What can teachers do to support integrity of student work?
Can students use generative AI in their school-based assessments and investigations?
Yes.
Generative AI tools, including ChatGPT, are like the many other information sources available online. They are research tools that can provide students with access to knowledge.
The SACE Board does not specify which ways of accessing knowledge are valid across subjects. We rely on the discipline expertise of educators to teach students how to evaluate the many sources available to them, and which are appropriate for use in their disciplines. We encourage broad research from a number of sources and for students to always view sources critically, as we know their teachers have taught them to.
How much AI can be used in a submitted task?
There is no limitation on the amount of information a student can gather from a generative AI source to use in an assessment task. Teachers should work with students to ensure that they understand the limitations of AI sources, and how the overuse of limited sources can impact the quality of their work.
What ways can AI be used in a submitted task?
There are many appropriate ways that students can use generative AI in their submitted assessment tasks. Students can access generative AI sources to research and inform their assessment, just as they would for a textbook or other traditional information source. As with these traditional sources, students cannot use text from generative AI and present it as their own.
Students may also use generative AI tools to support their assessment development and writing. These tools are increasingly integrated into the software students use, and at times students may be using them without realising it (for example, the predictive writing support provided by many word processors). While the use of these tools is appropriate, students must ensure that the work submitted for assessment is their own. These tools can only be used to support a student’s own writing processes, not to replace them.
For example, it may be appropriate for a student to use an app to provide suggestions that they can consider to improve their writing during their drafting process. However, it is not appropriate for students to enter their draft into an app which could change the syntax and structure of the text without the students making decisions about phrasing.
Students must not submit work generated by AI as their own work.
How can students acknowledge their use of generative AI?
In all contexts, students must provide an acknowledgement of any generative AI used as a part of their task. It is expected that students will acknowledge any use of AI in a way that is appropriate for the subject and school context. This acknowledgement should declare which tools were used and provide a list of all prompts that were entered to generate any information for the task. This practice is particularly useful for tasks where individual sources are not directly referenced throughout, or where the AI provided broader support for the student’s work. In some cases, such as image-generating AI, it would also be appropriate to provide the output images generated and any reference images entered into the tool.
In some cases, it is also appropriate for students to make specific references to AI generated work when used throughout their task, as they would when citing other information sources. In most cases, this would include students providing a reference to work created by generative AI when quoted or paraphrased in their task including: the name of the AI tool used, a link to access this resource (if appropriate) and any prompts that were entered to generate the response.
The SACE Board Guidelines for Referencing [DOC 160KB] have been updated to include some suggestions of how schools and students might choose to reference generative AI.
How can I use AI in my assessment task design?
Educators across schools and tertiary providers have begun to consider how they might use AI in task design and assessment. Secondary schools across South Australia are already exploring how to assess students in light of access to AI tools, and how this technology can be leveraged to help their students grapple with and understand content. Considering how learning design and pedagogy can be supported with access to AI – as well as questioning the what and the how of student assessment – is emerging as a positive way to navigate the uncertainties that this (and other) new technologies pose. Some emerging suggestions include:
- integrating working with AI tools into task design, requiring students to generate initial ideas or improve AI outputs as a starting point for tasks
- analysing prompts used in the assessment task with AI tools and evaluating the response the tools can generate
- focusing assessment on the processes and skills used to create an outcome
- developing assessment tasks that AI tools do not, or cannot, know about including self-reflection, responding to classroom discussion prompts and embedding student experience into tasks.
Can students use generative AI tools in their external exams?
No.
ChatGPT and other generative AI tools are not permitted in external exams. The SACE Exam Browser undergoes regular, rigorous testing to verify that generative AI tools are blocked in e-exam settings.
Verification and trust of assessment is a shared partnership between students, schools and the SACE Board. Exams are an extension of the trust and verification in established policies and practices. Raising the security settings of the e-exam platform is one method that will support existing academic integrity policies and the active invigilation of examinations.
The SACE Board is re-engaging and collaborating with IT experts in schools to ensure clear communication and best practice across the system. We are also reviewing our e-exam support materials to best prepare schools, invigilators and students for exam day, and how best to respond to anything unexpected or out of their control.
What can teachers do to support integrity of student work?
The integrity of student results is a shared responsibility between students, teachers, schools, and the SACE Board. As part of this shared responsibility, teachers are best placed to guide students on the acceptable support they may receive in their learning and assessments, including the use of AI tools. Expectations and parameters should be clearly communicated regarding what forms of assistance are permitted or prohibited in the work submitted for assessment. Teachers are also encouraged to work in partnership with students throughout the learning process to develop shared confidence in the authenticity of student work, and to establish trust in whether a submission genuinely reflects the student’s own understanding and effort.
Various SACE Board resources include suggested practices that can improve teacher confidence in the integrity of students' work.
- Supervision and Verification of Students’ Work Policy and Procedure [PDF 180KB]
- Avoiding Plagiarism: Guidelines and Examples for Teachers and Students [DOC 420KB]
- Supervision and verification of student work (making sure a student's work is their own)
Some suggested practices include:
Use a variety of assessment activities to assist in detecting anomalies.
Different types of tasks allow teachers to observe a student’s typical style, strengths, and challenges.
e.g. course work, group discussions, fieldwork, practical or laboratory activities, research, oral/multimodal presentations, short tests or quizzes.
Require parts of the assessment task to be done in the classroom.
Teachers can verify key thought processes and problem-solving approaches by breaking a task into layers and supervising key layers.
e.g. students complete a structured essay plan in class, including selecting a theme, main argument, key evidence, and interpretation notes.
Re-frame drafting as feedback and verification.
This enables teachers to monitor progress and identify inconsistencies or signs of external assistance early. Splitting the drafting process to include a whole or part feedback interview allows a side-by-side comparison of the student’s demonstrated understanding in an oral interview with their written response. Alternatively, when reviewing a draft, compare it to in-class work to check for changes in tone or complexity, use of advanced vocabulary not previously demonstrated, or unusual formatting and citation styles.
Conduct check-ins to gauge student understanding and confirm authorship.
e.g. during class, have 5-minute one-on-one chats where students explain their project focus and how they’re approaching the task. Note any discrepancies in understanding or fluency.
Require process documentation at key moments throughout the development of a task.
Have students maintain a journal or logbook that records their steps, or ask students to submit work-in-progress notes such as brainstorming, outlines, and reflections to show their thinking journey.
Embed peer review and collaboration as part of the assessment response.
e.g. have students exchange drafts and provide documented feedback, which can then be used in the verification process where a student adopts aspects of the feedback provided.
Create a safe space for questions and disclosure.
Foster transparency by encouraging students to ask about AI use and disclose their methods without fear of punishment.
e.g. consider a Q&A box in your classroom where students can anonymously submit questions.
Set tasks that incorporate student reflection and analysis of information rather than fact or information gathering.
Where appropriate to the subject outline requirements, shift focus from product to process, rewarding originality, reasoning, and reflection.
e.g. as part of the task, ask for “evidence of original thinking”, “reflection on process”, and “ethical use of tools,” alongside content and structure.
Have students complete an academic honesty statement.
Require students to declare the originality of their work and disclose any use of AI tools.
e.g. students complete a declaration at the end of each assignment: “I confirm that this work is my own and that I have disclosed any use of outside sources, including AI tools.”
Post hoc discussion with the student.
Where concerned about the authenticity of student work, give the student a chance to explain or clarify their response by asking open-ended questions.
e.g. “Can you walk me through how you researched and wrote this?”, “What sources did you use?”
Teach students about academic integrity, plagiarism and proper citation.
Explicitly guide students through the SACE Board’s academic integrity policy. Plagiarism may be unintentional, resulting from poor citation practices. Provide feedback on proper referencing and academic integrity, and offer the opportunity to revise and resubmit work.