FAQs on Use of AI in Teaching, Learning and Assessment
These FAQs were compiled in response to questions from the learning community by members of the GenAI Working Group and organised around three strategic recommendations.
1.1 Are there standard statements we can provide about appropriate vs. inappropriate uses of AI?
Statement for module handbooks:
“It is important for you to understand the difference between appropriate and inappropriate use of artificial intelligence in your assignments, so that you approach your studies effectively. Complete the ‘Use of Artificial Intelligence’ course on Moodle to learn about making ethical decisions in using AI in your assignments. When you have completed the course, upload your completion certificate via the link on your module. Remember that if you use AI tools in your assignment, you must declare them using the declaration form on Moodle”.
1.2 Should we ban or restrict use of generative AI in student assignments?
Firstly, it would be almost impossible to impose such a ban or restriction, as it is very hard to reliably detect the use of GenAI (except in the most obvious cases). In a recent blogpost, Martin Compton proposed four visions of assessment for 2033, the first of which is a nightmarish vision of ‘panopticopia’: the only way of assessing students that would guarantee GenAI is banned or restricted, but one that is highly undesirable.
More importantly, a ban is not in the best interests of students or the university. Digital and information literacies are a key plank of the Oxford Brookes Graduate Attributes framework; remaining agile to, and embracing, technological change runs throughout the Education and Enterprise Pillar and Educational Character statement of the Oxford Brookes Strategy 2035; and the use of GenAI and skills in prompt engineering are increasingly likely to become essential employability skills.
Restricting the use of GenAI may, however, be desirable where there is no other way to assess the learning outcomes (e.g. demonstrating knowledge of, or competence in, accurate translation or professional practice), or where you want to develop independent thinking, critical analysis or distinctly ‘human’ skills. This entails either devising authentic assessments that purposely engineer out ways in which GenAI could be used, or facilitating the deployment of GenAI as a platform for students to critically engage with its outputs. More information on authentic assessments and the types of assessment genres that can utilise or potentially circumvent GenAI can be found in our guidance or in the JISC Assessment Ideas for a GenAI Enabled World.
1.3 What should I do if I suspect a student may have inappropriately relied on AI?
- Contact sirt@brookes.ac.uk if you have any queries about making a referral.
- Assess the work, and take a look at the following guide for general tips which may help identify the inappropriate use of AI: Tips for Identifying Inappropriate Use of AI in Assessment.
- Please note, this is just a guide, not a definitive list. Cases are considered using a range of factors, and although indicators can be a helpful starting point, the investigation will not be based on these alone.
- AI is not easy to detect and techniques for detecting its use will need to constantly evolve to keep up with the frequent advancements in AI.
- Submit an academic conduct referral form using this link: Academic conduct referral form. Explain clearly the reason for referral. The following link provides examples of information that is useful to include when making a referral: Key information to include when referring a case for inappropriate use of AI.
1.4 Do we need to update our academic integrity and plagiarism policies to address AI?
The academic conduct procedure was updated in September 2023. The following addition was made to breach 3.1(f), which now allows SIRT to award an appropriate penalty specifically in relation to the inappropriate use of AI.
3.1 (f) Custom writing services – the use of materials created by third parties and/or web sites and/or Artificial Intelligence (AI) software/paraphrasing/image tools, and passed off as one’s own work. This includes all forms of contract cheating, such as the use of, running of, or participation in, auction sites and essay mills to attempt to buy or use assessments or answers to questions set. It is also an offence to provide one’s own work to others with the intention of personal gain.
The use of Artificial Intelligence is evolving very fast; we will continue to monitor developments, and it is likely that we will develop a specific breach related to inappropriate use of Artificial Intelligence in future.
2.1 Where can I learn more about different types of AI?
The Jisc AI primer has lots of useful information and outlines the key points about the different types of AI:
- AI text generators such as ChatGPT are trained on a large amount of data scraped from the internet, and work by predicting the next word in a sequence
- All AI text generators can - and often do - produce plausible but false information, and by their nature will produce output that is culturally and politically biased
- Bing Chat and Google Bard work in a similar way to ChatGPT, but can access information from the internet, and are aimed more at being search tools
- Image generators such as Midjourney and DALL-E 2 are similarly trained on data scraped from the internet
- Many other applications are being developed that make use of generative AI technology
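The first bullet above can be made concrete with a minimal sketch. The toy ‘bigram model’ below (an illustrative example only, not how any real product is built) predicts each next word purely from counts in a tiny corpus; large language models such as ChatGPT apply the same next-word-prediction idea at vast scale using neural networks trained on internet data:

```python
import random

# A tiny corpus; real models are trained on vast amounts of internet text.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count which word follows which: a "bigram" model of the corpus.
transitions = {}
for prev, nxt in zip(corpus, corpus[1:]):
    transitions.setdefault(prev, []).append(nxt)

def generate(start, length=5, seed=0):
    """Generate text by repeatedly sampling a plausible next word."""
    rng = random.Random(seed)
    words = [start]
    for _ in range(length):
        options = transitions.get(words[-1])
        if not options:  # no known continuation, stop early
            break
        words.append(rng.choice(options))
    return " ".join(words)

print(generate("the"))
```

Because the model only knows which words tend to follow which, its output is fluent-sounding but has no notion of truth: this is, in miniature, why text generators can produce plausible but false information.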
Kings College London Macro and Micro AI Guidance is a useful starting point for staff interested in immersing themselves in GenAI through a short, week-long course.
2.2 How can I develop my own understanding and skills in using generative AI?
Brookes has devised guidance for staff which links to carefully curated resources:
- University-wide policy and guidance
- Guidance for Schools, programmes and modules
- GenAI events:
  - Talking Teaching Across the Globe: monthly seminar series
  - Gen AI and Assessment Think Space: fortnightly Q&A
- Subscribe to the Staff Learning and Career Development Newsletter to learn about further guidance and activities.
2.3 How can we help students develop skills for using AI appropriately and effectively?
As a starting point, make sure that all students have completed the Use of Artificial Intelligence Moodle Course. Brookes has also published Information for Students, a useful page that outlines concerns around academic integrity. Think about building critical AI literacy in the same way you would build any other academic literacy: try to build in opportunities to engage with, and evaluate the implications of, using Gen AI across a range of activities. Students who are more practised in establishing the strengths and limitations of Gen AI are more likely to use it appropriately. Keep conversations about AI open, and consider providing spaces (virtual or physical) where students can share their learning with you and their peers, so that appropriate guidance and support can be offered as soon as a need is identified.
You might also find the Times Higher Education article on prompt engineering interesting.
Guidance for Schools, programmes and modules outlines how Gen AI might be embedded in TLA (teaching, learning and assessment); we advise a strategic, whole-programme approach.
2.4 How can we design assessments to minimise inappropriate reliance on AI while still allowing beneficial uses?
Jisc’s ‘Assessment ideas for an AI-enabled world’ PowerPoint gives a list of assessment ideas that can be used alongside, or to mitigate, the use of artificial intelligence.
2.5 What changes may be needed to grading criteria and rubrics in light of AI?
It depends on what your learning outcomes are. Grading criteria and rubrics need to align accurately and transparently with the module or programme learning outcomes, so if you want to include the use of GenAI, or discourage its use, the learning outcomes have to reflect that before you modify the grading criteria. Once that is achieved, grading criteria and rubrics can be modified in two ways. If you want to encourage the ethical use of GenAI, criteria and rubrics need to award marks for the use and development of skills associated with using and responding to GenAI (e.g. prompt engineering, critical thinking and evaluation skills). Depending on what you want to assess, the weightings can also be adjusted so that skills in critical analysis, evaluation or reflection are more heavily weighted than those associated with utilising the technologies. On the other hand, if you want to ‘engineer out’ the use of GenAI, weight marks more heavily towards things that GenAI does less well (such as reflection, human skills, empathy, thinking processes and authentic assessment). More information on authentic assessment and diversifying assessment can be found on our webpages: Using Generative AI Applications for Learning, Teaching and Assessment.
3.1 How can we capture and share effective practices to embrace and adapt Gen AI?
We have a number of initiatives across Oxford Brookes that you might like to contribute to, and that will create a resource from which to draw inspiration:
- Talking Teaching Webinar series
- Call for case studies and pearls of wisdom in the innovative use of Gen AI in TL&A (AHE, ANTF funded project)
3.2 Which applications can I use in TL&A that are data-secure?
ITS regularly review the availability of university-wide GenAI applications offered by Google and other established service providers.
Oxford Brookes recommend Microsoft Co-pilot
Oxford Brookes students and staff have access to the data-secure Microsoft Co-pilot AI chatbot, available through a Microsoft academic institutional licence.
When using Microsoft Co-pilot, signing in with your Oxford Brookes log-in ensures your data is protected in accordance with Microsoft’s privacy notice. However, we do not recommend uploading any confidential or protected data or information, because Co-pilot does not comply with Oxford Brookes’ expectations of good practice with regard to information security.
Among the benefits: you can use Co-pilot to create images, learn about new topics, compare and contrast text in documents, summarise content, and generate ideas or solve problems.
The guidance below explains how to access Oxford Brookes Microsoft Co-pilot. You must first register with Microsoft using your Oxford Brookes email address; you can do this by visiting Office 365 Education. Then, to access Co-pilot:
- Go to https://copilot.microsoft.com/
- Click 'login' in the top left of the page
- Choose the login with a work/school account option
- Login using your Office365 username and password
Google Gemini
Google Gemini is a useful information-search, production and reasoning tool, which you can sign into using your Oxford Brookes log-in.
However, Google’s privacy notice tells the user not to share anything personal, as conversations are seen by human reviewers and your data is retained for three years. Google’s approach may therefore not be compliant with the General Data Protection Regulation (GDPR). Do not upload any confidential or protected data or information to Gemini; use Co-pilot for this instead.
Other available AI models and software tools
With so many AI tools emerging, it is impossible for Brookes to review them all. We must all develop AI literacy, i.e. knowing which AI to use, and how and when to use it. This will help us all stay safe and secure in this evolving AI landscape. Please see Guidance for Schools, Modules and Programmes for principles to support you in deciding when and how to use GenAI.
Any use of GenAI software not supported by the University must be in accordance with the IT Acceptable Use Policy and sanctioned by the relevant authority in the IT Directorate before being introduced (please contact info.sec@brookes.ac.uk to discuss your requirements).
Note: adhering to the above approval process helps to minimise both the data security risks of the AI software or tools and the data privacy risks to Brookes.
We recommend seeking this approval for applications that might be used across a programme of study. Please ensure that the application is available to all students who require it (Principle 1: Ensure equity in student access to GenAI).