The Scottish Parliament's think-tank

AI and Education with the Goodison Group in Scotland

A robot teacher of the future

Tuesday 7 November 2023, at the Scottish Parliament and online


This event brought together leading figures from education to discuss how education can respond to developments in artificial intelligence.

Featuring an expert contribution by Professor Judy Robertson from the University of Edinburgh, this forum debate explored the levers that will help us make the most of the opportunities AI offers in education.

This is the second in a series on Learning through Life in the 21st Century, which the Goodison Group in Scotland is running with support from the Futures Forum.


Report


Introduction

Chaired by Sir Andrew Cubie on behalf of the Goodison Group in Scotland, this event explored the levers that matter in the future of AI in education and learning. It covered the following questions:

  • How can Scottish education develop the AI policies, strategies and practices that drive and influence the right AI innovation future for education and learning?
  • What do we want AI to do in Scottish education and how can we ensure that our policies benefit all in the education ecosystem – learners and educators – into the future?
  • What are the fears and opportunities we need to discuss? How can we support and enable educational reform to promote responsible and ethical use of AI?

This report, produced by Scotland’s Futures Forum and the Goodison Group in Scotland, includes a lightly edited version of the introductory presentation and a summary of the key messages to emerge from the workshop groups.

The report may not represent the views of everyone present at the event.

Welcome

As convener of the Education, Children and Young People Committee, Sue Webber MSP welcomed participants to the event.

She noted that the subject is of great interest to the Parliament, especially given the great opportunities from the development and use of artificial intelligence in education, as well as the risks it presents.

As Sue noted, we will have to be brave to embrace the opportunities and balance them with the risks.

Key points

  • As we respond to these new technologies, we must remain strongly rooted in our own values and what we want to achieve through Scottish education.
  • AI is already being used in schools in Scotland. Early adopter schools show that it can help with reducing teacher workload, creating learning and teaching resources, and engaging parents/carers more in students’ learning. 
  • The Scottish Parliament, the policy community and the education sector will have to work more quickly than we have traditionally done to make the most of the opportunities.
  • We need clarity on who is leading the development of AI in schools and an approach within which the education sector can work through the implications and opportunities in AI.
  • Teachers need to be AI ready: they have to understand it to be able to use it to help students. 

  • The school is important as a community of young people and educators together. AI can help but it cannot do everything.


Presentation: The Future of AI in Education and Learning in Scotland

Professor Judy Robertson, Professor of Digital Learning, University of Edinburgh

Watch Judy Robertson’s presentation on YouTube


I have a degree in computer science and artificial intelligence, and when I studied that in the early 90s, the kind of AI we have now seemed impossible. It’s a personal astonishment for me to see it coming to fruition.

Throughout my presentation, to entertain myself and hopefully you, I’ve added in some images generated by my students in conjunction with various generative AI tools. We asked the students to think about schools of the future.

A large part of my talk is about how we can integrate AI and discussions about AI into classroom discussions. It’s been about a year since generative AI really took off with ChatGPT and other large language models. When I did my degree, the promise of these things seemed far in the future, but we have now reached the point where the advances have become so rapid that, if you want to keep on top of this, you have to read the news daily.

Large language models are based on a particular sort of statistical approach to artificial intelligence. They’re based on machine learning, and that gives them certain properties. One is that they rely on large amounts of data on the Internet and the other is that they rely on huge amounts of computing power to train them.

Reliability

Another feature is that even the people who develop them can’t explain how the large language models generate the text that they write. That is really important: we know from trial and error that they’re not reliable and they’re not always accurate.

On the one hand, ChatGPT and other competing products can generate high-quality texts, stories, poems, computer code or, crucially, essays, which is a large part of the discussion about education. They can also pass some university-level exams or the bar exam for law. But, on the other hand, they can just make stuff up.

Our challenge is to distinguish between those cases. When is AI telling you something useful and when is it making it up?

Implications for society

The rise and popularity of these generative AI tools bring huge implications for society and particularly for education. A lot of the time, especially in education, the fear is about academic misconduct: people cheating, using ChatGPT to write their essays, and then pretending that they wrote it themselves. But that’s a narrow view, and we’re going to widen that out this evening.

In September, UNESCO published guidance on generative AI in education. It stated that generative AI tools are not just another thing that teachers have to learn about, on the to-do list after learning how to use Teams. Generative AI tools are so profound that they force us to think about what we want people to learn when they go to school. They force us to ask: what is the purpose of our education system? What matters in education?

UNESCO also said that it’s important to have a human-centred use of AI. It was disappointing to many people, myself included, that the Bletchley Summit didn’t include representation from civil society as part of the discussion. Big tech companies and government need to be involved, but so do members of society, including the education sector, and children and young people.

UNICEF have published “Policy guidance on AI for children”. We know that children use AI products in their lives now. Very young children use Alexa: it’s just part of the background of everyday life. And sometimes children use products which have AI but were not designed specifically for children, which brings up a lot of issues.

We need AI to support children’s development and well-being and be inclusive, and we need it not to be discriminatory. We need it to protect children’s data and their privacy and make sure that children are safe from the darker sides of AI.

Role of children and young people

I want to focus on two aspects: the need to prepare children for what’s happening with AI now and in the future; and the need to create an environment in which children and young people are consulted about AI in a way which has action associated with it.

Learners should be consulted about the use of AI in their education, and I am proud to say that this is happening in Scotland. The Children’s Parliament and the Turing Institute are running a project on the issue. As one of the Children’s Parliament members said, “It is important for children to know about artificial intelligence because it is the future and it is good to learn new things when they affect our lives.”

Professor Muir’s report speaks about putting learners at the heart of the new Scottish education system as a building block of educational reform. The learner voice should be part of that, which is why it is important that we have young people as part of the discussions today.

Preparing for the future

There are various options for how education systems can respond to AI.

One option is to do what has been done in Seattle, Paris, New York and parts of Australia, which is to ban AI in the school system. Another option is to try to evade academic misconduct through AI by moving to different forms of assessment. If you have everybody sitting high-stakes exams in a room with pen and paper and invigilators walking up and down the aisles, there’s no way for AI to get into the loop.

Either banning or evading AI would be ducking our responsibility. Whether we want it or not, we have AI in our lives, and we have a responsibility to help citizens become AI literate.

As part of helping to prepare children and young people for their lives with AI, we should be developing new methods of assessment and policy, and adapting and reshaping our education system around AI.

In Scotland, we are in a period of educational reform where all kinds of new and interesting things are happening. It would be a mistake to ban or to evade AI, because that would be the tail wagging the dog, which I was trying to capture in this strange picture generated by DALL-E of a robotic dog with an enormous wagging tail.

We shouldn’t assess what’s easiest to measure. Sometimes it’s tempting to say that high-stakes exams are the best way to avoid cheating and are easy to mark, but that misses the point. We should be assessing what we want and need people to learn: the knowledge, skills and attributes that are required by citizens in the age of AI.

We’re all living in the age of AI, and education should reflect that.

Education in the age of AI

What might that look like? The Hayward report, the “Independent Review of Qualifications and Assessment in Scotland”, brings out the values of Scottish education as the words which are written on the Mace of the Scottish Parliament: compassion, wisdom, justice and integrity.

As part of educational reform, we need collectively to ensure that learners leave the Scottish school system able to create and use AI with just these values: wisdom, compassion, justice and integrity.

They need to be able to use AI and to know how to create with it. There will be lots of opportunities and jobs, and the creation of new AI needs wisdom and integrity just as much as everyday life does.

We need to go back to basics and ask what it means to be an educated citizen in Scotland at this time. What knowledge and skills are needed? The report acknowledges that AI is one of the skills that learners will have to be able to work with. The report says “cope with”, but I think it should be “thrive with” if learners are to be part of the future of education.

AI literacy

This brings up the concept of AI literacy, which can be defined as:

“a set of competencies that enables individuals to critically evaluate AI technologies; communicate and collaborate effectively with AI; and use AI as a tool online, at home, and in the workplace.”

Incidentally, I don’t think that this requires curriculum reform; it is more a question of supporting teachers.

The proposed changes to the Scottish assessment system in high schools include a Scottish Diploma of Achievement. Within that, there would be programmes of learning in individual subjects, which may be like some of the subjects on offer now, assessed in various ways. There would also be personal reflections from each student about other skills they have learnt in their lives, maybe through community work or sport volunteering.

Interdisciplinary project learning on challenge projects is also part of this, and that is where AI can be integrated with what we’re trying to do within the education system. The proposal is that project learning would be assessed by teachers in a form of continuous assessment in the classroom. This is particularly well suited to addressing the fears that people have around generative AI.

Teachers are already well placed to spot when learners use old-fashioned forms of cheating, like getting somebody at home to help them or copying from another student. Because they know their learners so well over a long period of time, it is easier for teachers to work out when somebody’s performance changes dramatically. Teachers’ knowledge of learners is a good way to guard against plagiarism from generative AI.

Also, the skills that we want to promote with project-based learning are ideal for developing the critical thinking and problem-solving skills that we need for AI literacy and which are needed to use generative AI effectively.

Generative AI can be part of such project-based learning but also the subject of the project-based learning. Someone could do a large project about how AI is used in their community or how it could be used within the school to make things better and to improve people’s lives.

AI in assignments

You might be wondering what that might look like in practice, so I thought I would share a sample assignment from the course that I teach at university for 4th-year students.

The students are welcome to use AI for their portfolio, but when they submit their portfolio, they write a statement that explains how they worked: whether they created all the work in the portfolio by themselves, whether they adapted the materials from elsewhere, and whether they worked in collaboration with one or more students and what they each did.

Finally, the student has to say which AI tools they used in which parts of the portfolio and what they learnt from working with AI. This last part is critical because it covers the extent to which they think that working with the AI made their final submission better. Why is it better to have used the AI than doing it all yourself?

This gets to the whole question: we want to use AI when AI makes what we as humans can do better; otherwise, it’s not worth doing. We need to teach people how to think through their strengths and weaknesses, and how they can use AI to augment the areas they are not so strong in.

Limitations of AI

One reason why people have to be critical of AI is because generative AI is not reliable: it makes stuff up. This is called hallucination. For example, if you get it to write an essay, it might just make up entire research studies, including references, that do not exist. Generative AI can get facts wrong as well, and not all systems will be able to access current information.

Last spring, some colleagues and I ran our assignment and exam questions through ChatGPT, and our impression was that ChatGPT at that time performed like a high D or low C student. It appeared to understand a lot of key concepts, but it was quite muddled in places.

Newer generations such as GPT-4 and its competitors are quite a lot better than the original ChatGPT, but they currently still hallucinate. As far as I know, it is a property of the underlying statistical models that they will never be 100% reliable.

There are also concerns about bad actors deliberately using large language models to generate more misinformation. As large language models write more and more of the text online, the Internet gets more and more swamped with misinformation. Being critical about information online, particularly of information generated by an AI, is crucial for education.

How can we use generative AI in education?

If you ask ChatGPT how it thinks it should be used in education, it gives you reasonable output. It recommends that people should be aware of potential biases and discrimination within the AI systems, that students should check and edit what it comes up with, that students should respect intellectual property, and that they should be absolutely transparent that they are using AI.

You’ll notice that, next to the strange pictures I’ve decorated the slides with, I’ve always put the student’s name plus the AI system that they used. We need to get into the habit of attributing when AI has been used.

Recommendations

I will finish with some recommendations:

  • The Scottish Government should establish a cross-sector commission on AI and education as advised by the Hayward report. It cannot wait six months or a year.
  • Learners should be at the heart of decisions about AI within their own lives and within the education system. Children already seem to be included more as part of educational discussions, which is really encouraging.
  • AI literacy should be integrated within the Scottish education system from early primary school. This may not require curriculum changes but will require support.
  • There should be ongoing and flexible professional learning opportunities for teachers, so they can learn about AI, can discuss it with their colleagues and, crucially, have time to work out how it fits in with their new practice.
  • Finally, procurement decisions should be informed by pedagogy and educational values. Otherwise, we might end up with AI in our classrooms without ever meaning to. For example, the next release of the Microsoft Office suite might have generative AI built in. Before any decision is made about that, we should be asking whether we want that in classrooms.

Workshop groups

Following a short Q&A, Professor Robertson set the following two questions for the workshops:

  • How can AI benefit everyone in the education system? What fears and opportunities should we consider?
  • How can we make sure that AI is used responsibly and ethically in our education system?

The following points were raised during the Q&A and workshop group feedback.

How does AI fit in with our aims for education?

The International Council of Education Advisers’ third report reflects the importance of being clear about how artificial intelligence impacts on our classrooms, our learning and teaching, and all our lives.

As we respond to these new technologies, we must remain strongly rooted in our own values and what we want to achieve through Scottish education. We also need to be clear about the distinction between artificial intelligence and real intelligence.

Our schools must not concentrate simply on working with artificial intelligence, but must also think through the human dimension that we need to continue to build.

Artificial intelligence can help to address some of the constraints within which we operate in education and release us to do things on the human side of the educational process.

What can AI help us to do?

Because of its limitations, AI is better suited as an assistive tool than as a teaching tool. What is happening in early adopter schools may allay some fears over the use of AI and demonstrate some of the advantages and opportunities it offers. These include:

  • Reducing teacher workload.
  • Helping to connect different elements of learning, as envisaged in the proposed Scottish Diploma of Achievement.
  • Creating learning and teaching resources.
  • Providing the opportunity to engage parents/carers more in students’ learning.
  • Learning and practising languages.
  • Developing problem-solving skills and enhancing creativity through idea generation.

However, using AI in class is currently a grey area, with teachers sometimes making the basic assumption that students are just using it to cheat. It would be better to be open and for teachers to tell students when they can use it to help.

AI is already being used in schools in Scotland

Early adopters of AI already exist, such as Dunblane High School, and the Scottish Government has plans to pilot a Norwegian AI-based formative assessment package that is already being used with much success across Norway.

This package is also being piloted in Denmark and is about to be taken on board in Wales. It is being considered for use by the Falkirk-based team, Powering Futures, in the assessment arrangements for their successful SCQF Level 6 “Challenge” programme, which is currently being delivered in 43 secondary schools across Scotland.

At the same time, 34 high schools across Scotland are currently integrating the Daydream Believers Creative Thinking Qualification, at SCQF levels 5 and 6, into their curriculum. This includes an opportunity for both learners and educators to gain insights into the capabilities and challenges presented by generative AI.

Education is a huge market, and companies like Pearson are already developing tools, although they might target parents and young people directly and miss out the schools.

Who should lead the response to developments in AI?

Concerns were expressed at the apparent inability of the education system to adapt and take on new developments, such as AI, at pace. If this continued, it was noted that Scottish education, which is already behind the curve on AI, would fall even further behind, with serious implications for current and future learners. 

The Parliament, the policy community and the education sector will have to work more quickly than we have traditionally done.

For this, we need clarity on who is leading the development of AI in schools. Should it be led in a “top down” manner through Scottish Government or a national agency? This model has been shown to bring problems like the slow pace of adoption, a tendency to be heavily bureaucratic, and perceptions of imposition. Or should local or regional development be promoted, encouraged and funded, building on the limited examples already in place? 

To respond, and as ICEA recommended, we need to bring together the key stakeholders in education and AI, and create a context within which the education sector can work through the implications and opportunities in AI.

AI literacy and skills

Learners and teachers all need to be clear on the extent to which the “linear processing” generally adopted by AI systems can lead to the production of poor outputs. This problem with AI provides an opportunity for students to develop their skills in critical thinking as they distinguish between what are rational responses from any AI tool and what are not. 

There will be extensive professional learning demands for practitioners (as well as inputs for parents and carers) in promoting the positive use of AI in our classrooms. Clarity is needed on who should lead this, and there was a strong feeling that practitioners with experience of AI and early adopters are best placed to do this. 

Teachers need to be AI ready: they have to understand it to be able to use it to help students. One idea was a single, readily accessible portal for those interested in using AI in the classroom to view exemplars and share best practice, including in the key area of procurement. 

AI cannot do everything

We saw during the pandemic the vital importance of the school as a community of young people and educators together. One risk is that we create an individualistic learning process where people are interacting with machines and lose the nature of the school as a community.

We also need to be vigilant to the potential inequalities that AI will create and the existing inequalities that AI may embed. All learners should have access to the opportunities and the skills needed to make the most of those opportunities.

There are questions of trust: we are coming to rely on AI before we fully understand it. This may lead to the spread of false or inaccurate information.

AI is a reality in our lives and in our schools. People in education need to think carefully and quickly about how they harness the huge opportunities that AI offers in a way that does not undermine the fundamentals of what we all want to achieve for our young people.



Speaker

Professor Judy Robertson is Chair in Digital Learning at the University of Edinburgh, jointly between the departments of Informatics and Education.

Judy has a BSc in Computer Science and Artificial Intelligence and a PhD in educational technology. She has written extensively about designing technology with and for children in education and health, and about computer science education in schools.

Judy is the academic lead of the Scottish Government funded Data Education in Schools project, which aims to improve children’s data literacy through teacher professional learning. She has recently completed a study exploring children’s views about AI based on their experiences of using smart speakers like Alexa at home.


Partners

The Goodison Group in Scotland is a charity dedicated to learning through life.