Policies
AI use will follow the school’s (generic) AI policy. Some guidelines and expectations specific to art and design might be helpful.
Thinking allowed:
- What formal and official regulations, guidance and AI policies exist that will inform schools’, teachers’ and students’ use of AI?
- What are teachers’ professional obligations with regard to AI?
- Is the teaching of art different to other subjects with regard to AI?
- How?
- Is there a need for AI policy guidelines at departmental level?

Five Principles to bind them all
‘…recent advances in technology mean that we can now use tools such as ChatGPT and Google Bard to produce AI-generated content. This creates opportunities and challenges for the education sector.’
A pro-innovation approach to AI regulation.
Department for Science, Innovation and Technology.
Updated July 2023
This government white paper represents a broad approach to AI. It acknowledges that AI will unleash great innovation and drive productivity, and it recognises the speed at which these technologies are evolving. The paper explains that the government will avoid rushing to legislate too early, taking time instead to properly understand new and emerging risks through proportionate monitoring and a pragmatic approach. Alongside this pragmatism is a recognition of the need to build public trust through proportionate regulation.
The paper proposes five principles to guide and inform the responsible development and use of AI in all sectors of the economy. These principles will filter down to school level and will likely inform a school’s AI policy.
- Safety, security and robustness
- Appropriate transparency and explainability
- Fairness
- Accountability and governance
- Contestability and redress
Generative artificial intelligence (AI) in education.
Department for Education. Updated October 2023
This Department for Education paper is informed by the government’s white paper on a pro-innovation approach to AI regulation (above). It is essentially descriptive and reflects the broad understanding of what AI technologies encompass (audio, code, images, text, simulations, videos) and that AI is already in use in everyday life. It notes that recent technological advances include AI tools, such as ChatGPT, which are able to produce AI-generated content. The paper acknowledges that this creates opportunities and challenges for the education sector.
Opportunities
The paper identifies that generative AI tools are good at quickly:
- Analysing, structuring, and writing text
- Turning prompts into audio, video and images
They can:
- Reduce workload across the education sector
- Free up teachers’ time, allowing them to focus on teaching.
Challenges
Content produced by generative AI could be:
- Inaccurate
- Inappropriate
- Biased
- Taken out of context and without permission
- Out of date or unreliable.
The paper explains that although AI can be convenient, it cannot replace human judgement and deep subject knowledge. In short, responsibility for any content generated through AI remains with the user, who must verify and edit it before using or presenting it. The paper suggests that
“Schools and colleges may wish to review homework policies and other types of unsupervised study to account for the availability of generative AI.”
The paper goes on to suggest that school AI policies should take account of specific issues. These may, in part, be traced back to the white paper ‘A pro-innovation approach to AI regulation’.
These issues include the need for schools to:
- ensure that any data entered is not identifiable
- protect personal and special category data in accordance with data protection legislation
- not allow intellectual property, including pupils’ work, to be used to train generative AI models without appropriate consent or exemption to copyright
- review and strengthen their cyber security, as generative AI could increase the sophistication and credibility of attacks
- ensure that children are not accessing, or creating, harmful or inappropriate content online, including through generative AI, by:
  - protecting students online
  - limiting students’ exposure to risks from the school’s IT system
  - making sure appropriate systems are in place
- be open and transparent, ensuring students understand that their personal data is being processed by AI tools
- provide guidance on what counts as AI misuse
- set out requirements for teachers to help prevent and detect malpractice.
In addition to identifying issues, the paper explores the need to prepare students to develop the right skills and to make the best use of generative AI safely and appropriately. This will involve teaching students about AI, its benefits and its potential for harm.
Art and design teachers will need to recognise and understand the implications of these statements. They will reappear at school level in the AI policies that schools will begin to create. Teachers will need to negotiate the potential dissonance between the safeguarding obligations placed on school networks and the expectation that schools should prepare students to develop the right skills and make the best use of AI safely and appropriately. These are whole-school issues, but art and design teachers will wish to contribute to the debate.
Further reading
Resources
Click here to download an open source school policy template by CAS (Computing at School).
Click here to download the Russell Group paper ‘Principles on the use of generative AI tools in education’.