Hello World

What’s next for computer science education in 2026?

Hello World Season 9 Episode 5

In our end-of-year special, we look back at the biggest developments of 2025, from digital skills and data science to advances in AI education. Watch or listen now to hear practical advice on what to expect for your classroom in 2026.

JAMES: Hello world and welcome to the podcast for educators passionate about computing, AI, and digital making. I'm James Robinson, senior learning manager here at the Raspberry Pi Foundation, and today I'm joined by three guests in the studio who'll be looking back at the key developments of 2025 and giving us their insights and tips on the trends to look out for in 2026. We'll also hear from our global colleagues and partners as they reflect on their own highlights and hopes for the year ahead. 

So over to my guests to introduce themselves and to share three words that, for them, define 2025. Rehana, over to you.

REHANA: Hello, my name is Rehana Al-Soltane. I'm a learning manager on the AI literacy team. Together with my team, we work on Experience AI, which is a program around AI literacy that we develop with Google DeepMind. And these resources have gone all over the world, from Canada to Kenya, from Malaysia to Morocco. And I'm very excited to be here today.

JAMES: And your three words.

REHANA: My three words are research, collaboration, and community.

JAMES: Interesting. Right, we'll come back to those three words in a moment. Bobby, can you introduce yourself?

BOBBY: So hi, I'm Bobby Whyte, I'm a research scientist in the research team here at the Foundation. And my three words for 2025 are data-driven, evolution, and collaboration.

JAMES: So two collaborations already. Laura, do you want to introduce yourself?

LAURA: Hi, I'm Laura James, I'm a learning manager and I work on the Ada Computer Science team at the Foundation. We're building lots of content for A level and GCSE computer science students in the UK. My three words for 2025: resilience, creativity, and agentic AI.

JAMES: Just for those that maybe haven't come across those terms before, can you expand a little bit on agentic AI? What does that mean?

LAURA: Well, it was something I only learned about this year. It's the idea that AI is becoming more of a tool that's personalised to each one of us: we can build AI tools that effectively act as our agents. So rather than having something like ChatGPT, which is the same for everybody, we can actually build AI tools that are personalised to our own needs. And I think that is very exciting, but also potentially quite scary.

JAMES: It feels like a really interesting tension. And Rehana and Bobby, it was interesting because you both mentioned collaboration.

REHANA: Yeah. So I think collaboration because we work together with industry experts a lot, and they inform our resources to make sure that they're technically accurate and so on. And collaborating with the research teams as well; I think research and collaboration go hand in hand, because we collaborate a lot and we are research informed.

BOBBY: I think at the Foundation we also have this commitment to practice. And so what's really nice is to think about, when we've completed a research project, what next? Right. It doesn't just sit on a… well, they don't really sit on dusty shelves anymore because it's all PDFs. But it doesn't just sit in a repository somewhere; we think about, you know, sharing it with colleagues.

JAMES: And I think that point about collaboration is really important: that external collaboration we have with researchers, with industry partners, with teachers and schools and all sorts. So I would probably echo collaboration as well. It feels like a really important word.

Throughout 2025, we've discussed a wide range of topics on our podcast, but the three that stand out for me and that are reshaping the future of computing education are data science, AI literacy, and digital literacy. 

Over the past year, we've learned a lot more from classroom practice, from teachers' professional development, and from new research about the knowledge and skills that young people need. And now, with the UK government considering a new A level or level three qualification in data science and AI, it feels like the stakes are even higher as we think about how best to prepare learners for the future. 

Bobby, reflecting on what we've learned in 2025, how much clearer do you think our understanding has become about what we should be teaching about data science? And what might an A level in data science and AI mean for educators and learners?

BOBBY: That's the golden question. Well, technically that's two.

JAMES: If you can answer that question in like 30 seconds, just like, this is it, done. That'd be great.

LAURA: You can do that.

BOBBY: I think at the Foundation, we've learned quite a bit in 2025 about what a data science curriculum might look like. We've undertaken quite a significant piece of work to synthesise evidence from a wide range of sources. So working in collaboration with the learning team, we've looked at academic research, but also grey literature, and we've been trying to synthesise all of this existing knowledge and evidence and think about what that means for the products at the Foundation, but also what a qualification might look like. And that's been really, really exciting, very ambitious.

And we're very excited to finish that work and publish it to the world and say, this is what we think, this is how we think data science should be taught. And we've learned a lot about what pedagogies might be appropriate and what tools might be appropriate, and that's quite fundamental. It's such an emerging picture, and I think we can learn a lot from looking abroad, as we say, to inform our own work as well.

So yeah, to answer your second question: the potential A level qualification, and also, if we're talking about the curriculum and assessment review, the revised computing curriculum, I think is really, really exciting. For me, it reflects a kind of redressing of the balance. When the computing curriculum changed in 2014, which is when I moved to the country as well, so I felt caught up in the change, it felt like there was less emphasis on digital literacy and data literacy skills. And it feels like this new A level qualification in data science and AI, and this revised computing curriculum, could in some way provide a more balanced learning experience for young people. So I think it's really, really exciting. Loads of challenges, and a lot of clarity that still needs to be defined.

LAURA: What's interesting, though, is, like you just talked about, teachers teaching it. I mean, where are they coming from? Where are these teachers, these mythical beasts who are going to be teaching this A level in data science? Already it's really difficult to recruit computer science teachers. Already we're as rare as purple unicorns. So who is going to teach the data science A level? Are we looking for existing computing teachers? Because they're already scrambling, trying to catch up with this new idea that's suddenly been thrown at them.

Are we looking at maths teachers? If there's a lot of data and statistics in there, potentially they could be upskilled. So yes, I think it's a great aim, but I'm a bit sceptical about how many schools are going to actually think it's worthy enough to put it in their curriculum. Already their A levels are, you know, very STEM based.

BOBBY: And what's interesting is, in one of the projects that we organised in the research centre, we went out to schools and we were interviewing children about, you know, whether they're interested in careers in AI and things like that. And what was interesting was speaking to some of the teachers and asking them, well, who teaches data science?

LAURA: Yeah.

BOBBY: Whose responsibility is it? They are unicorns, but there were some really nice examples where teachers were collaborating together. And so I would hope that in this new, revised curriculum, there's a more shared responsibility.

JAMES: For a bit of context as well, because people outside of the UK might not be aware: we've just had this big curriculum review here. There's a report generated on it, and that's going to inform some of the future curriculum development over the next few years. But as Bobby said, we're already seeing lots of innovation in different parts of the world, lots of exploration of data science. I think this question of where it sits within the curriculum is quite interesting.

Rehana and Laura, what do you think? Where does data science sit within the curriculum? Who should be teaching it?

REHANA: I think data science is such a peculiar topic, a peculiar subject. I think it sits between computer science and maths. And I think we have the same challenge with AI literacy as well, because we see that even though we created our resources for non-technical teachers and educators, most people who take them up are computer science teachers, and those are very rare, right?

LAURA: Yeah.

REHANA: And so I think the goal is to create resources that other subject experts can pick up as well and are confident teaching. And I think you're going to see the same thing with data science potentially as well, further down the line.

JAMES: I think there's also a little bit there about subject specialists, not just within the data science or computing bit, understanding how their subject interacts with others. So particularly in data science: where does the stuff that's taught in maths begin and end, and where do we pick that up? Understanding that relationship means we're not reteaching stuff that's redundant, but also that we understand how it's being taught.

BOBBY: Yeah, I think it's that difference between the kind of foundational understanding of these concepts, skills, and competencies, and then an applied understanding across the disciplines, right? Often when I look online, I see that you can attach the word 'computational' before a discipline and that is a new subfield: computational biology, computational physics, computational archaeology. And so I don't think it just sits within mathematics. It's everywhere. Absolutely everywhere.

JAMES: Rehana, let's move on to AI now. You've been deeply involved in our work on AI literacy this year. Looking back at 2025, what developments have we seen in the space of AI literacy, and how well is education managing to keep up?

REHANA: I think at the beginning we started off thinking that AI literacy is the ability to understand AI tools and what happens under the hood. So you look at machine learning and what supervised learning, unsupervised learning, and reinforcement learning are. And these are still very important skills.

But I think now we're shifting towards having more of a practical understanding of the AI tools around you. And I think the framework that our friends at the Stanford Accelerator for Learning have come up with, called the User Developer Critiquer framework, is really helpful in trying to understand what AI literacy is given the developments that we've seen over recent years. So you can look at AI literacy from a user point of view: how do you use AI tools responsibly and ethically? You can also look at it from a developer point of view: how do you develop your own AI tools, and those kinds of skills. Or you could look at it from a critiquer point of view, really understanding the limitations and affordances of AI tools.

And I think one of the biggest developments that we've seen over this past year is a growing concern around cognitive offloading, and the phenomenon that we see of students using AI tools to replace their thinking, to replace their work. And educators need to come together and really try to understand what makes learning so special, especially if we are in an age where AI tools could complete coursework and generate answers to problems. And so really, we need to start looking at what it is that makes learning so valuable, especially for young children and young people whose frontal lobes are still developing.

LAURA: I think students are now aiming for the final product much quicker. They just want to reach that final thing: here's my essay, here's my project, or here's my program that I've made. And they don't want to do those intermediate steps, and that's what worries me. I didn't know it was called cognitive offloading, but that's effectively it. They don't want to actually put the work in. And not just students; adults as well. They just want to finish that report, and they don't want to go through the steps that actually allow your brain to make those connections, to actually learn, do they? I think that's worrying, that we're more product driven rather than process driven.

REHANA: I read a report somewhere about how AI tools are solving problems, but they're not helping people become better problem solvers.

LAURA: Yeah.

REHANA: And really, it's whoever does the thinking gets the learning, right? And if you're not doing the thinking yourself, then you're not learning, eventually. And so all these skills become half baked. And if you look at the younger generations, as we are preparing them for a future that is unknown, we don't really know what it's going to look like, what the job market is going to be like. They really need to develop a coherent set of concrete skills like problem solving and critical thinking, and being able to evaluate the accuracy of the information that is presented to them. And so feedback literacy, for example, is a very important topic here.

On a daily basis, you're presented with so much information, but as a consumer, you have the ability to critically evaluate where this information is coming from and how accurate it is. But as this information becomes so much more widespread around us, and we have access to tools that can generate information at the click of a button, we really need to become even more aware of this, and we need to start honing those skills and really helping young people develop them even more than before.

JAMES: I think there were two really interesting things in your answer there. One was almost similar to what Bobby was saying, about there being your fundamental skills and then your applied skills: your fundamental knowledge of what AI is, how it works, and its limitations, and then applying it in the Stanford model and the different dimensions that you then talked about. And then I think there's cognitive offloading.

One of the things that I would expect adults to be doing is to be selective and critical about when and how they're using AI. And that almost feels like the thing we should be targeting with young people through those applied experiences: this is when AI is a useful tool for us to use, and here is actually a time when you shouldn't, when you're much better off exercising your own brain. And so, yes, there's a sweet spot. Again, I like a tension. There's a tension between those two things: here's a tool that can help you achieve something, but if we over-rely on it, it's undermining your thinking capacity. And so finding that balance in between...

BOBBY: Yeah.

JAMES: ...is, is the difficult bit.

REHANA: And I think foundational to that is understanding what helps you learn, what is helpful for your learning. And I think young people, and younger children and younger students especially, don't yet have the ability to be metacognitive about their learning, to think about their thinking. And so I think it really is our job to create activities and units of work and resources that help them become more metacognitive, more aware of how they learn and how they can be the best learners they can be.

LAURA: It becomes quite difficult when you've got, let's say, students in year seven who've got ten pieces of homework that they have to do every week. It's got to be quicker just to quickly type it into ChatGPT and, oh, there's my essay about...

JAMES: Well, it's: how do we incentivise that challenge? Because I think learning should be a bit challenging, right? You should stretch your brain and your capacity. So if it's not challenging, and there is this option where you can take a shortcut, you'll take it. So how do you incentivise and direct people more towards: it's worth me struggling, because it's going to be valuable in the long run?

REHANA: Yes.

JAMES: So how do we incentivise that more invested thinking in their future?

BOBBY: And I think it's also about recognising, in terms of adoption, that they are using these tools.

LAURA: Oh my goodness, yes.

BOBBY: They are using these tools. And so then the learning process, the learning goal, isn't writing an essay; it's evaluating an essay written by something else, which is a different...

LAURA: A completely different thing.

BOBBY: ...different thing, you know.

JAMES: And then maybe there is an onus on teachers: maybe they shouldn't be setting the kind of homework that can be really quickly done. So it's a shift in the educator mindset as well.

LAURA: Oh, absolutely. We have to be a bit more creative about how we're assessing students in this age of AI, but also encourage them to understand the ethical implications as well as being metacognitive about how they're learning: the ethical implications of not plagiarising, not passing off work as your own when it's not.

JAMES: So we've spoken there a little bit about what we would like to see in terms of education and our learners' behaviours and so on. But let's think a bit more about where we are currently. More broadly, what do you think is driving or limiting the take-up of AI education in schools? Is it things like teacher confidence, learner demand, school priorities, government? Or is it companies out there pushing? What's driving or limiting that uptake, do we think, at the moment?

LAURA: I would say teacher confidence is a big thing, because a lot of teachers probably don't have a lot of experience with AI or understanding of how it works. So I think projects like Experience AI really help; I delivered that for a couple of years with my year nine students, and that was brilliant. I learned a lot, my students learned a lot, and they got very excited about it. So those sorts of things are good. My advice to teachers out there would be: dive in, ask questions. There is definitely a demand for learning about AI, even if it's exploratory, if you know what I mean, rather than being a teacher who has all the answers. And as we can see here, no one has all the answers about what AI can do for us and how it works. So I think teacher confidence is a big thing, but here at the Foundation we can offer them Experience AI.

REHANA: Yeah, we find that a lot of teachers don't feel confident about delivering these resources because they think it's too complicated or too technical, or they just don't have enough information around the subject. And that's why we invest so much in teacher training. We work together with phenomenal partners all across the world, more than 20 and counting, who help us with the teacher training. We have a cascade model, train the trainer: we train partners, and then they go on to upskill teachers in their local contexts. Another point that limits the uptake of AI literacy resources or AI education is government policy. Because how do you design resources if there is no space in the curriculum, or if it's not part of the national requirements? If there's no push from a systems level, then no one's going to take it up. So that has been a problem in the past, but I think more and more governments are starting to see the crucial importance of AI literacy education, and more support from a systems level means that we can spread out more widely and more quickly.

JAMES: I think that policy and systems-level thing is really important. We can probably take a lot of those arguments and apply them to the data science conversation we've just had as well. Like, where's the space? At least with AI, it's in the zeitgeist at the moment, which is driving more rapid policy change. But yeah, it's interesting. Laura and Bobby, what else is driving or limiting, do you think?

BOBBY: I think technological change, and the pace of technological change, is quite an important factor, probably more so for curriculum developers than teachers, because we're trying to think about ways to design and teach about technologies that become outdated quite quickly. And so the challenge is thinking about a general, almost future-proof way of talking about these tools. And actually, to go back to the curriculum review, can you tell I read it recently?

JAMES: Yeah.

BOBBY: One of the points they made was to be agnostic about these tools: to not say, oh, we need ChatGPT in the curriculum, but rather we need something a bit more foundational about how these general kinds of things work. How do these language models work? How do these foundation models work? And so I think that's a challenge. If I was still in teaching, I think I would look at what tools my students are using and think, it changes so frequently, how do I keep up with that? And we know that technology changes quicker than government and curriculum, so it's about finding that balance, future-proofing.

JAMES: And with that in mind, I think focusing on the core concepts and skills you're trying to deliver, and then asking what tool you can use to deliver them, is typically how we've thought about curriculum development here. But you still need to understand what those key concepts and skills are, and they're moving at pace with the technology too. Laura, what are your thoughts?

LAURA: One of the challenges I had when I was using the Experience AI curriculum was basically how locked down everything was within my school: trying to even run little chatbots, or to use any kind of ChatGPT, a lot of these things were blocked by my school. I think a lot of schools are still quite fearful of allowing students access to LLMs or AI, though I imagine that must have changed somewhat. So I think that perhaps could be a challenge, or could be a limiting factor: if teachers are interested in teaching about it, how could you actually implement it within your school if you are experiencing quite a locked-down scenario?

REHANA: Yeah, exactly. And I think especially for teachers right now, because they're being bombarded with all these tools that are going to help them mark and grade, or even do the marking and grading for them, or help with lesson planning. I don't think it's our job to tell teachers what to use and what not to use, or when they should use this and when they should use that. I think that's really up to the teacher to decide. What we can do as educators is equip them with the knowledge to make informed, context-specific decisions around AI use that could be beneficial to their students, to their context, and to them. But before they actually go on to use these tools, they need to know the shortcomings or limitations, the strengths, and the kinds of biases that are inherently embedded in them. Only then can they make informed decisions.

JAMES: How inclusive and accessible do we think the AI literacy education is at the moment? Who is being served well and who's at risk of being left behind?

REHANA: Well, I think the biggest questions around inclusivity and accessibility of AI literacy education are: who can access it, who can read it? A lot of AI literacy education and AI literacy resources are developed in English and English only, and that immediately excludes more than half of the people on this planet. So we translate all our resources into a range of languages, and then we also localise them. Localising means that we make these resources relevant to the local context of the educators who are going to roll out and teach them to their students. And localisation is not just swapping out words, right? It's making sure that the activities, the scenarios, and the concepts that we talk about all make sense within that local context and local community. So language, I think, is one of the biggest barriers to adoption of AI literacy education.

BOBBY: I think also cost. Embedded within AI literacy curricula are tools, and it's really encouraging to see lots of free tools being developed, like Machine Learning for Kids and Google's Teachable Machine. But if we think more generally about the AI tools that students use, a lot of them are free or have a free tier, but there is also this divide between those who use the free tier and those who can afford the paid tier. One of the things I found quite interesting, that I read recently, was about this new divide being introduced in higher education between the students who pay for the pro models, and therefore can access higher-quality responses and answers and generated essays and things like that, and the ones who just use the free tier. And that introduces huge kind of ethical...

LAURA: That's another digital divide.

BOBBY: It is. It absolutely is another digital divide. And so, you know, the tools are free for the moment. But with these additional paid tiers, I feel like that's another barrier to adoption: who can afford these more capable, let's say, tools and technologies?

LAURA: I mean, also, completely obviously, there's the accessibility of devices, isn't there? In rural schools, they might not have devices for everyone. They might not have stable internet. And oh my goodness, you need that if you're going to be connecting to all of these tools.

JAMES: All this connects with the wider topic of digital literacy, because even as AI tools become more common, learners still need strong foundational digital skills to use technology safely, creatively, and critically. What 2025 has really shown us is that digital literacy isn't something that young people just pick up through being around technology; as discussed in one of our earlier podcast episodes, the notion of the digital native has been pretty firmly disproven. What we're seeing instead is that learners need explicit, intentional teaching to develop the skills required to navigate an increasingly complex digital world. So let's explore that a little bit more. Coming to you, Laura: if we move away from the idea that learners simply pick up digital skills by being around technology, how can educators support the development of strong and meaningful digital literacy?

LAURA: Well, I mean, digital literacy, and this thing called digital skills, is probably what we would previously have been focusing on. When I taught year sevens coming into my classroom, it was: right, can they use a keyboard, can they use a mouse, do they know how to copy and paste? There were those kinds of actual digital skills that allowed them to perform tasks. But above that, I would be thinking more about having critical evaluation skills: what they're doing and how they're doing it, and if they're using AI, how they're evaluating that, and being able to use a computer or whatever technology for the right reasons, knowing when to use which tools, as it were. So it's more than just a checkbox of what they can actually perform; it's about the why and the reasoning behind it. I think your original question was, how do we teach those skills? How do we teach digital literacy? The things I wrote down when I was thinking about this were: don't assume they already have the skills. That's very easy as a tired teacher; you just assume they know how to copy and paste, things that come naturally to you. So don't assume. Then make sure you are modelling what you want them to do. Make sure you are repeating and reinforcing it, so it's not just a one-off lesson at the beginning of the year that you then forget about. Recap often. And make sure that they are always using and experimenting with the digital tools that they have access to.

JAMES: I think use the space around you as well. I'd much rather see a display in a classroom that has a bunch of keyboard shortcuts on it than the history of computers, because it's far more relevant. Need to copy? It's right there, done. Great. How do educators stay up to date, and how do we ensure that learners can develop really good quality digital literacy?

BOBBY: I think one of the things that's been really encouraging over the last couple of years in the research team has been the seminar series that we've run. The idea is that we invite academics and researchers from around the world to share their wares, basically, and talk about a new pedagogy, a new tool that they've developed, or a new way of doing things. And obviously that's at the point of inception, right? It's not an off-the-shelf product they're coming up with... I think this year we had academics from Finland share a new tool that they developed to teach young people about how social media tools work, allowing them to open up what we call the black box to see how these things work. And what's really lovely about the seminars is that teachers come along, and when we have breakout rooms, they get really, really excited when they're talking to us: "Oh, I've never seen this before, but I'm definitely going to try it out." And then in the follow-up seminar, you meet those teachers again, and they say they loved it. And again, research shouldn't just exist on that digital dusty shelf.

JAMES: Yeah.

BOBBY: But that it's actually taken into the classroom. And so I think keep up to date with research activities. I'm not going to be agnostic here: I think the seminar series is a great way of doing that.

LAURA: I would say don't suffer in silence. If you're a teacher, don't just sit there struggling, because often computing teachers are on their own in single-person or two-person departments. So don't suffer in silence: reach out. You know, there are the CAS forums, there are Facebook groups, there are lots of different places where other computing teachers meet up and share resources. You can often find great resources on the Computing at School forum. Come along to some of the conferences; I mean, we go to the Festival of Computing or the CAS conference, and you can meet with other computer science teachers there. You know, if you're worried about something, you can bet that other computer science teachers have, if not solved the problem, at least gone a few steps towards solving it. So those are my tips: if you're worried about something, or you want to look for schemes of work or lessons about AI or digital literacy, there are people who've been in your situation before. So don't suffer in silence.

REHANA: I think a very good resource is our Quick Reads. We produce Quick Reads quite regularly, and they are basically summaries of research around topics that we think are very important or very relevant to teachers. These are usually computer science topics or AI literacy concepts. We digest the research and we put it into a two-pager, so it's very easy and quick to read. And that's our way of trying to make research accessible to teachers who are time-poor and who don't have a lot of time to read through all these papers and all the research. So Quick Reads are, I think, a very good resource.

JAMES: Thank you for the plug there, Rehana. And they also connect to the research, so you can follow on that reading and go deeper if you want to. Great. 

It's important to remember that the challenges around computing education look very different depending on where you are. I asked our colleagues and partners about what these barriers looked like in their education systems. This is what they had to say. 

NANGAMSO: Although there has been significant progress in introducing the coding and robotics curriculum in 2024, one of the biggest barriers is the financial resourcing of it. When you are introducing a subject like coding and robotics, it often requires devices as well as connectivity and skilled teachers. Having ring-fenced budgets to implement this is one of the most pressing challenges for 2025. 

The second one, linked to financial constraints, is that South Africa has a bimodal education system. Pretty much 80% of the schools in South Africa are non-fee-paying schools, and often these are kids coming from low-resource backgrounds or low economic status households, while only up to about 20% of the kids are in fee-paying schools, from much more advantaged households. If we don't resolve this bimodal education model in South Africa, we're probably going to end up with a large population of youth that are not fully equipped to participate actively in the future economies of this world. 

The third one is probably around teacher knowledge and teacher skills. The teachers that are coming into the system either don't have the specialist knowledge or the strong training in how to deliver and teach coding and robotics in a system like South Africa's. So we're having to really think about how we get teachers that are not specialists trained, confident, and equipped to teach this subject as it is rolled out across the education system. 

LEONIDA: In Kenya, there is a vast gap in equitable access to digital literacy and the infrastructure around it, like connection to power; there are places that don't have access. This has really hindered the utilisation of these resources in some areas, but it's exciting to see schools adapting in different ways, either printing or downloading the resources, or using their smartphones to carry on this learning in the classroom. Another barrier has been capacity building for the teachers and their confidence to teach. You find most of these teachers are not specialists in computing, so they tend to struggle and lack confidence going into the classroom. But with the recurrent retooling training that we've been doing, we've seen teachers picking it up very well. Still, that has been an issue for them in integrating and using these resources confidently in the classroom. 

MARIEVA: The main challenge in 2025 was the rapid evolution of technology, which required educators to continuously adapt to new tools and methodologies, especially those related to generative AI. Keeping teaching practices aligned with fast-changing technological developments demanded robust training and ongoing support. The educational community asked for continuous professional development, awareness activities, and detailed guidance on responsible AI use. Furthermore, Greece being geographically fragmented, with numerous islands and mountainous and remote areas, it is an extra challenge to mobilise and to reach remote communities, which naturally encounter increased difficulties compared to urban areas. 

JAMES: Thanks for sharing those insights. What really comes through is the real sense of contrast. The ambition and enthusiasm is there, but the conditions on the ground vary hugely. But let's look ahead for a moment. What might 2026 have in store for computing education? 

LEONIDA: In 2026, and also beyond, these educators are now becoming confident enough to deliver in the classroom, going beyond just digital literacy skills and having the learners gain other skills: programming, robotics, physical computing. With these skills, we see our learners becoming so innovative and then building our digital economy within Kenya. When the teacher is well equipped, the teacher will be able to deliver quality learning to the learners. And when the learners receive quality learning, we'll have innovation out of our learners. We envision a digital economy that cuts across everywhere within Kenya, to solve issues not just within our country, but even across Africa. 

NANGAMSO: For 2026, one of the exciting initiatives is working strategically and closely with two provinces in South Africa, Gauteng being one, and with the Department of Education, piloting how you implement this curriculum and then roll it out across their districts. 

And then the second one: as much as literacy and numeracy remain a challenge in our country, I am very encouraged that in the early grades, coding and robotics does enable this, and it really supports literacy and numeracy skills development. 

Probably the last one to add is that, with the bold goal of this curriculum accelerating and building kids' skills in problem solving, digital skills, critical thinking, and project-based learning, we've seen, and research has told us, that these skills correlate with kids' work readiness and their active participation in the economy. 

MARIEVA: The greatest hope for 2026 is the continued development of a forward-looking national strategy for computing and AI education, one that ensures all students develop strong digital skills, computational thinking, and critical awareness of emerging technologies, even in the most remote areas of the country. The continued deployment of advanced, high-quality resources, such as those from the Experience AI program, which are constantly kept at the forefront of technological advancements, will be central to this effort. By empowering educators and inspiring students, Greece can prepare the next generation for a future in which AI will redefine professional skills and influence virtually every aspect of modern life. 

JAMES: A big thank you to everyone for sharing their thoughts. There's so much inspiration and plenty to think about in what we just heard. So with that in mind, let's return to our guests here in the studio. 

So, Rehana, let's start with you. As you look toward the year ahead, what should we be watching for? What are you most excited about? 

REHANA: What you should definitely be watching for is our new resources. We have lots of new resources in the pipeline, including our Foundations of AI updates, so keep your eyes on the website. As for what I'm excited about for 2026, I think there is a lot of opportunity to look into creating interdisciplinary AI literacy resources, and to find that organic connection point between AI literacy and subjects that students already have at school, like geography or history or languages or art. So, for example, if you're teaching geography and you're looking at flooding or flood forecasting and wildfires, there is a very high chance that there are AI tools out there that are tackling problems like floods, wildfires, and natural disaster prevention. So interdisciplinary AI education is what I am very much looking forward to. 

JAMES: Thanks, Rehana. That's great, and I'm really excited to see those cross-curricular resources come to light as they progress over the year. 

Moving to Bobby, how about you? If we take a step back and look at the bigger picture, what do you think will be the biggest shift in computer science education in 2026? And also, what are you excited about? 

BOBBY: Okay. Well, I think at the moment we're in a period of significant change, right? If we think about even just the countries that we work in and that we're familiar with, there's been a lot of curricular reform, going back to that curriculum review again. And so I think there's a sort of realignment. That's why my word was evolution, by the way. 

JAMES: Yeah, yeah, yeah. 

BOBBY: I think, you know, it wasn't... not revolution, but just a slight shift. 

JAMES: Yeah. 

BOBBY: And so I think the thing I'm really excited about is thinking about that data-driven paradigm shift, right? Like, what does it mean? What does computing education look like in a data-driven world? What are those technologies? What are those concepts? 

And if I think of the work that we do in the Foundation, in both the learning team and the research team, 2025 was a lot of learning. It was a lot of thinking about what this might look like and, in the research team, how we might research that, whereas I think 2026 will be, for us, a year of doing: going into the classroom, trialling things out. 

So I think one of the things I'm most interested in is trying things out with teachers and learning a bit about what this new landscape might look like. And I'm really excited to work with teachers, because I think, again, it's that commitment to practice: ultimately, we're serving the young people and the educators that we work with. And from a research perspective, that's particularly important, because you could very easily just stay at your desk, you know, and not get out into the classroom. 

JAMES: Yeah. Yeah. Great. 

Laura, for teachers who will be starting to teach computing in 2026, what's your advice? What should they focus on as they plan for the year? 

LAURA: Oh, that's a big question. If you're a new teacher to computing, as I said before, don't suffer in silence. Ask for help. There are plenty of resources out there, plenty of forums, plenty of places with collections of people and resources that can help you. 

I would also say use AI to help you plan lessons. Why not? I mean, that's what it's good at, as long as you check for errors. The focus, I think, would again be looking at what your learners need, rather than necessarily just blindly teaching the curriculum. Work with your learners. Do they look like they're going to be able to understand, you know, Python programming straight off the bat, or do you have to do a lot of work with them beforehand? It kind of depends on how they were taught in primary. So it's a how-long-is-a-piece-of-string question. 

I would say my number one thing is don't suffer in silence. We at the Raspberry Pi Foundation have loads of resources that are available. So, does that answer your question, James? 

JAMES: I think it does. And I like the idea as well that the work we're doing producing resources not only saves teachers time, it limits the amount of planning they have to do themselves, which means they can be more focused on their learners and their needs, and on that contextualisation and adaptation, which I think is so crucial. 

Right. And finally, let's wrap up with a quick fire round. I'm going to stress the word quick fire. I want a sentence or two of bold predictions for 2026. Who wants to go first? 

BOBBY: I think one of the things I've found most exciting, and am looking forward to going forward, is the idea of explainable AI. We've talked a lot about these data-driven systems and tools that we use, and we've talked a little bit about these tools not being explainable: we can use them, but we don't understand much about how they work. 

And so one thing I'm quite excited about are new tools that allow children and young people to open the black box and open these opaque systems, and learn a bit more about how they work. 

JAMES: Awesome. 

REHANA: I'm very excited about integrating AI literacy into existing subjects that are already in schools. So looking at geography and history and arts and languages, like I mentioned earlier, I think there are just so many exciting opportunities to get students excited about AI by looking at AI through a different lens. So it's not necessarily a technical subject, but, you know, an interesting concept that exists all around them. 

JAMES: Great. And Laura, what's your prediction? 

LAURA: I'm going to go a bit leftfield here. I think cyber. I think this is something we have forgotten. It was a big thing maybe 2 or 3 years ago; the government put a load of money in, you know, we had the CyberFirst competitions, but it feels to me like it's been sidelined and everyone's like, "Yay, AI!" However, we need to be teaching students and teachers about the importance of cyber security, and I think that's something that's really missing at the moment. This is something I was always teaching my students. I think we're missing a trick, because we need secure networks more than we need AI. 

There you go. Bold statement. Drop the mic. 

JAMES: Well, I think with every new development, every time that technology advances or changes, new risks and benefits come with it, you know. And I think we talk a lot about the benefits of AI in terms of productivity. We talk about the risks in terms of bias, or in terms of, like, cognitive 

REHANA: offloading 

JAMES: offloading. But we don't necessarily talk about the security implications that might come associated with that. So I think it is an area we could be focusing more on. 

So that's it from us today. It certainly sounds like 2026 is shaping up to be a bumper year, and we'll have plenty to discuss and explore in future episodes. A huge thank you to my guests here in the studio, to Rehana, Laura, and Bobby, and to all our contributors who joined remotely from our offices in Kenya, South Africa, as well as from our global partner FORTH, who are in Greece. We hope today's discussion has given you some ideas of what to look out for over the next 12 months. 

And of course, one thing definitely worth keeping an eye out for is the March issue of the Hello World magazine, which focuses on safety and security in computing education. In the meantime, if you haven't already, head over to helloworld.cc to subscribe and explore all our back issues, and you'll also find our full podcast archive there. Perfect if you want to catch up between episodes. 

We'll be back in March with more episodes exploring the key themes from the magazine, so stay tuned! Thank you very much for listening. All the best for the year ahead. 

Goodbye. 

ALL: Goodbye.