Courtesy of Jeff Sheid
Jackie Valley, a Monitor education writer, reports from her Las Vegas base, Feb. 13, 2024.

Where school meets AI, a writer sorts perils and promise

Input ideas, get back a research paper? Generative chat, a low-tier but pervasive form of artificial intelligence, has been cast as a threat to learning. That’s only part of the story. Our writer found educators and students discovering fruitful ways of leaning into AI.

Artificial Intelligence, Real Learning


Artificial intelligence has pierced the sphere of public education, as it has most other areas of life. Is it the ultimate cheat code or an aid to learning? 

Education writer Jackie Valley began tracking that question more than a year ago as ChatGPT, a generative tool trained on vast amounts of internet text to produce predictive conversation, became more of a presence.

“There are a lot of legitimate worries surrounding it,” Jackie says on the Monitor’s “Why We Wrote This” podcast. “But … as the months progressed, what I started noticing in little pockets was this other side of, well, how can [AI] be used for good in education, too?” 

In reporting a recent story on the overlap, Jackie found schools that were teaching responsible AI use. She found others using AI around the edges to optimize learning. Central to the story: the engagement level and joyfulness of young learners.

One Georgia high school student told her that AI made him more eager to attend school. A lab project had him using it alongside different types of batteries and model electric cars.

“And it had just really excited him,” Jackie says. It added a layer of interactivity. “So you’re not just sitting there absorbing information,” the student told her. “You’re actually involved in the process.”

Show notes

Here’s the story about AI and education featured in this episode:

And here’s Jackie’s most recent appearance on this show:

Last year, she came on the show to talk about teacher pay and fairness:

You can find links to all of Jackie’s work on her staff bio page.

In early 2023, Monitor reporter Laurent Belsie joined this podcast for an early look at ChatGPT: 

Episode transcript

Clay Collins: Artificial intelligence in its different forms overlaps with, or will eventually overlap with, practically every sphere of human experience. Plenty of uses already are – or will be – benign and helpful. Of course, there’s an ominous side too.

Generative AI can make deepfakes ever more persuasive, causing political chaos. AI can exploit human loneliness in a way that creates more of a culture of isolation. 

Its overlap with education has naturally been prompting a lot of concerned conversation, but also some innovative thinking about new opportunities.

Collins: This is “Why We Wrote This.” I’m Clay Collins. Education writer Jackie Valley last joined this show to talk about micro schools. She’s back this week to talk about some schools in which AI is being cautiously embraced at all grade levels. 

Welcome back, Jackie. 

Jackie Valley: Hi, thanks for having me.

Collins: So when we talk about young minds and learning, there’s a sense that the integrity of the materials used to teach them needs to be ensured and also that the way those materials are taught doesn’t shortchange learners. I remember when using SparkNotes was seen as a cheat, right?

Valley: Yes.

Collins: The internet then put mountains of both good and bad information a few keystrokes away. So, this idea that generative AI can go even further – you know, process on request and spit out full essays and remove the impulse to learn – really is kind of legitimately concerning, right?

Valley: Definitely. I mean, it’s garnered so much conversation in the education sphere, and rightly so. It’s a paradigm shift to say the least.

Collins: You write about the pervasiveness of AI, and the inevitability of there being more uses of it. Keeping a handle on AI will require critical thinking, and so it’s kind of, “to know it is to own it,” right? And that starts with kids.

Valley: I think what we’re seeing in the K-12 sector of education is twofold. There’s AI in education, which is referring to the different tools that teachers or students could use to either more effectively lesson-plan or get homework help, all those types of things. But then there’s AI education, and that’s more on the learning side, as in what should students know and at what age and how should they be learning AI. And that’s really what I became interested in, because it hasn’t been talked about quite as much, although that’s starting to shift.

Collins: There are these special sensitivities around education, as we said, and they’re so easily amplified, even inadvertently. I saw an email subject line recently that referred to AI school buses, right? And it seemed calculated to cause concern. It turned out it was just about optimizing bus routes, which falls more into that first category you were just talking about.

What did your reporting show you about educators’ ability to cross into that zone of framing AI as an innovation that they could work with?

Valley: Yeah, so it’s really interesting, and I think it’s safe to say that it’s a gradual process. Like anything, you’re going to have the early adopters, the teachers or the students who are so excited about it, they dive right in and take hold. But that’s not the case for everyone, and of course there are teachers of all different ability levels and ages and degrees of technological savvy, so to speak.

And so this isn’t an overnight shift in any sense. We’re seeing some school districts take a little bit more of a proactive stance, building this into a curriculum. But others are just sort of tiptoeing into it, if that makes sense, and I can give you an example. 

There was a principal at a high school in Colorado I spoke to, and his school’s having conversations about this. There’s nothing super formal about what they’ve decided, but he used AI to create clues for a scavenger hunt for his students related to homecoming. His students were so impressed with the clues and the rhyming in them, and so he revealed afterward that he used AI to help generate some of those clues and serve as inspiration.

And so that was like a small example, I think, of a school that is showing the wonders of it and how it can be applied for good use, but, you know, it doesn’t exactly reflect a standard curriculum, so to speak.

Collins: What was the grade level in that case and were kids already up to speed on what AI was, fundamentally?

Valley: Yeah, that was a high school in Colorado. So, they had some knowledge and certainly some teachers were using AI in class in different ways, but it’s not like they had a specific curriculum spread out across the grade levels. On the flip side, I spoke to a district in the Atlanta area, the Gwinnett County Public Schools, and they have gone that route.

And so they have identified that AI is something that kids should know for their futures to be able to get jobs in this new society. And they’re embedding it at all grade levels. Early on, that might just be conversations about “how does Alexa work?,” but at the high school level, there’s actually a dedicated pathway where students can take three or so classes that build upon each other, diving a lot deeper into the AI technology.

Collins: You mentioned what drew you to the story was the idea of, kind of, the study of AI and “to know it is to own it,” that kind of thing. Was this an innovation story when you started thinking about it or did that angle sort of evolve?

Valley: I would say it evolved, and you know what really got me thinking about it was more than a year ago, when I was on my way to Boston to start my first day at the Monitor. I was at the airport and we were all lining up to board the plane. And, usually those are situations where no one’s really talking, but in this case, a conversation got sparked about AI, and it had to do with students using it as a way to cheat, you know, write their essays. And pretty soon, so many people in that line were talking about it, and it just emerged as this talking point for, you know, the five minutes or so before we boarded.

And so it got me thinking a lot about how ChatGPT at that point had, I think, debuted less than a month earlier, and it was already becoming such a flashpoint for concern.

And again, rightfully so, there are a lot of legitimate worries surrounding it, but I think as the months progressed, what I started noticing in little pockets was this other side of like well, how can it be used for good in education, too? And certainly there is that tool side I mentioned versus the learning side and we’re still in the early stages, and it’s a rapidly evolving technology.

So this is going to keep changing, but there’s certainly an effort right now to come up with some standards or guidelines for how schools should be thinking about teaching AI.

Collins: How in general do you bring the Monitor’s values lenses to bear in your education writing?

Valley: I try to look at what the core issue is. I think in this case, we’re talking about technology that’s going to revolutionize society, so naturally schools are going to have to be more innovative in how they think about teaching and bringing students up to speed for the society they’re entering.

So innovation felt like a good fit for this particular story, obviously transformation probably could have worked, too, because sometimes they go hand in hand. But I’d say there are plenty of education stories that are also about perseverance or simply compassion. I think you just have to dig a little deeper in the layers to see what really stands out, what we’re trying to get at here.

Collins: What is this story really about? 

Valley: Right.

Collins: You mentioned misuse and, you know, at a high level, a colleague of ours, Laurent Belsie, came on this show to talk about ChatGPT when it was pretty new. And he made the point that this technology’s growth was almost certain to outpace society’s ability to put down any kind of guardrails.

And you’re talking about educators kind of doing that ad hoc. What’s your sense of how equipped and how empowered educators are to create, enforce and teach best practices around student use of AI?

Valley: I think it goes back to what I said earlier: there are some who are going gung-ho into it, really motivated, and then there are others who are a little bit more hesitant, still worried. And there have been two states that have launched actual policy guidelines for AI in the classroom. That’s not a very large percentage across the country. So it’s a work in progress, and we can’t expect it to stay the same. It’ll be a very fluid process, I imagine, as the technology changes even in the next few months or couple of years.

Collins: So then from the student side, Jackie, you ended your story talking about student engagement and joyfulness. What’s your confidence in kids at different levels, including high school, being able to figure out and then benefit from the right uses of AI? 

Valley: Yeah. Well, what I thought was really interesting in reporting this story was that students are pretty captivated by it. And I spoke to a student in high school at Gwinnett County Public Schools, and he said that it just made him more eager to attend school. He talked about doing a lab project and it had something to do with using AI and different types of batteries and little electric cars.

And it had just really excited him. And he said it’s the type of lesson where you have to put your thinking cap on. So you’re not just sitting there absorbing information. You’re actually involved in the process.

Similarly, I spoke to a teacher in the Los Angeles area, who works for a charter network called Da Vinci Schools, and they are probably on the early adopter end of the spectrum, and they’re thinking critically about the idea of AI.

So he helped launch this mechanism called Project Leo, and Project Leo uses artificial intelligence to take standards from lessons, whether it’s in a mechanical engineering class or an English language arts class, and then melds that with student interests to produce project ideas. And so, for instance, he mentioned one, there was a girl who really liked cats.

And so she wanted to do something with cats. And I think, if I remember correctly, how to help her cat lose weight. And so, what they’re doing is they’re using this platform to find projects that give students a hands-on activity directly related to the instruction happening in the classroom.

And, you know, he said students have been finding a lot of joy and that’s sort of the end goal in education too. Like if students aren’t having fun and eager to learn, you’re missing a lot right there. And so that was an example of AI more as a tool. Certainly some kids are using AI in their projects on a more direct basis, but right now it’s being used to help make more creative lessons.

Collins: I love the persistence of the “thinking cap” and the introduction of the cat. I mean, those are things that ... really represent the persistence of something bigger than artificial intelligence in education.

Valley: And he made a really good point, the teacher at the Da Vinci School, he said these are projects that the students can work on over a longer period of time in some cases and they sometimes are helping solve societal problems. One had to do with skateboarding and perfect form for tricks.

That’s not exactly a societal problem, but you get my drift. Like, they’re things that students can work on and potentially show employers when they get out of school as part of a portfolio of taking an idea and applying a solution. And so I think there are deep ramifications for students entering the workforce with maybe a little bit of an advantage as well.

Collins: Thank you, Jackie, for coming back on to talk about your very fast-moving and fundamentally important beat.

Valley: Thank you, I appreciate the time.

Collins: Thanks for listening. You can find more, including our show notes with links to all of Jackie’s work, at CSMonitor.com/WhyWeWroteThis. This episode was hosted by me, Clay Collins, and produced by Mackenzie Farkus. Jingnan Peng is also a producer on this show. Our sound engineers were Noel Flatt and Alyssa Britton, with original music by Noel Flatt. Produced by The Christian Science Monitor, copyright 2024.