A publication of the Association of California School Administrators

From noise to music

Navigating the evolving role of AI in education

By Ken Montgomery | November | December 2024
When you get a new musical instrument, the first thing that you want to do is to make some noise. That’s because the first stage of learning to use something is usually the “play around stage.” You want to blow on the horn, or bang on the drum to make some noise. This is what has been happening in education with AI; people have been making a lot of noise.
Now that it’s been almost two years since ChatGPT was launched, we’ve entered into a much different phase, and you can tell by the questions that are being asked. For example, a recent article in Scientific American did a deep dive into whether we should be nice to ChatGPT. Spoiler alert — the answer is yes, because not only are polite prompts likely to generate better responses, but it’s also important that we don’t develop bad habits while using AI. Parents have already seen this happening as they have reported that their children speak to them less respectfully because they are used to telling Siri what to do all day, and unlike a dutiful parent, Siri does not remind children to say “please” and “thank you.”
Articles like this illustrate just how much our conversations about AI are changing. We no longer just talk about what AI can do for us, but we are beginning to have conversations about what AI might be doing to us. While this conversation is important in and of itself, it illustrates a wider point. School leaders do not have the luxury of thinking narrowly about AI. AI is not a passing fad like fidget spinners and water bottle flipping (although that one has some staying power). AI is here to stay, so we must think about AI across many different dimensions.
As we move on from the play-around stage, school leaders should consider conversations beyond the two primary topics we’ve been hearing about: 1) AI is powering a revolution in tutoring; and 2) AI is powering a revolution in cheating. These topics, along with whether you should be polite to ChatGPT, are important, but school leaders have a responsibility to consider the implications of even more fundamental issues.
Here are some strategies to move your school or district into the next stage. Let’s start with an easy one.
Openly question the purpose of school
Shortly after ChatGPT was released in 2022, our senior English teacher said that he was in the midst of an existential crisis. He had been teaching for 20-plus years, and for the first time he was really struggling with what he should be teaching. Over his career he had seen many different approaches to teaching come and go, but in general he could always anchor on teaching kids to write. Now there was a machine that seemed to make writing an obsolete skill. He has since navigated his way through this crisis and has developed guidelines for his students regarding when and how to use AI in their writing. These guidelines are relatively easy to develop, and you can find some with a quick internet search, but more important than the guidelines themselves is the change in mindset that came with them.
We all openly acknowledge that ChatGPT can write some essays as well as many students and in a fraction of the time. If I want to travel a mile, a car will usually get me there faster than walking, but walking is better for my body. So yes, ChatGPT can write that essay, but writing it myself is better for my brain. The act of writing the essay is truly about human development as opposed to producing an essay. It’s to everyone’s benefit to be transparent about this. Our government teacher made it a goal to give students assignments that ChatGPT could not do. This meant that students spent a lot of time in our makerspace creating statues and exhibits to communicate concepts. Additionally, they were asked to make their writing more personal by discussing the personal impact of different pieces of legislation or amendments. These are small changes that people made because when ChatGPT was released, we joined the students in asking, “Why should I learn this?” We have a long way to go, but by openly questioning the status quo as opposed to blindly defending it, we are moving in the right direction.
Make it a learning journey for your community
Not only should you be open to students questioning the purpose of school, but you should also embrace that this is a learning journey for everyone. Last year we held a parent town hall to talk about AI. We were not trying to roll out a new policy or develop a strategic plan. We just wanted to hear what was on our parents’ minds. We brought in two people who were creating ed tech tools that utilized AI so that parents could get a sense of what was happening in the field. We then asked parents to break into small groups and discuss what excites them about AI, what concerns them about AI and what questions they have. As one speaker pointed out, parents have a rich opportunity to go on an AI learning journey side by side with their children. Parents may not want to learn how to use Snapchat or do the latest TikTok trend, but they will want to have a basic understanding of how to use AI, and their kids probably do as well. Our parents were excited about the possibilities of using AI for instant and targeted feedback and were concerned that it gives students the answers too easily. They were also concerned about how to prepare students for careers if the skills needed for work will change dramatically. But mostly they had questions such as:
  • How do you assess if students are learning the important concepts?
  • Aside from a personalized tutor, what can AI do for education?
  • How will AI evolve?
  • Does AI help students?
  • How do we keep up?
  • How do we implement AI in schools with equity in mind?
It’s that last question that leads to my last recommendation.
Get AI in the hands of students
Any conversation about AI and equity is multi-layered, so I want to acknowledge that I am only addressing one slice of it. As one teacher said, “We may be headed toward a world divided between people who tell AI what to do and people who are told by AI what to do.” He said this knowing he was being provocative. The point was not lost, though, that as AI moves forward, we run the risk of furthering the inequities in our system. AI is here to stay, and students should not be afraid of it because of rules imposed by a school.
I’ve heard of some schools and universities that have developed such strong positions that “ChatGPT is cheating” that many students are afraid to use it for fear of being accused of cheating. I recently attended a panel discussion where one of the speakers commented that it is primarily students of color at large public universities who are receiving this fear-based AI strategy. I understand that large public universities have resource constraints that require them to make decisions with a lens toward efficiency. In a biology class of 800 students, it may be difficult to quickly work through appropriate use of AI in a nuanced manner. With an average class size of 12, the Harvard professor has more options and can more easily implement their university’s guidance of supporting responsible experimentation with generative AI tools.

Responsible experimentation is not just for Harvard; it’s where all high school leaders should land in their policies. Of course, not all the experimentation will be responsible at your school, just like I’m sure that not all the experimentation at Harvard is responsible. Nevertheless, you need to find ways to have students experiment with AI tools. This is the approach we have taken, and it led to one of our students creating an AI-assisted tool that can help improve the way people use design thinking. We have highlighted this tool in our community so that everyone knows that we want students using AI. I know that it is easy to see the problems that may arise, but as leaders we must acknowledge the opportunities as well. If our approach toward AI veers too far toward command and control, we may end up denying students important opportunities in their futures.

If you are able to do the three things listed above, I guarantee that you will not have all the answers. These approaches are going to create more questions and more complexity than if you were just deciding which AI tutoring or detection software to use.
Going down the path I suggest will be more challenging than simply deciding on a tool and then managing all the other challenges of the school year, but it may open the door to a more fundamental transformation of your school. As AI evolves, more and more questions will arise, so you may as well try a few things that will surface the deeper issues, and hopefully you will be able to move beyond some of the noise and start making music.

References
  • https://www.scientificamerican.com/article/should-you-be-nice-to-ai-chatbots-such-as-chatgpt/
  • https://oue.fas.harvard.edu/ai-guidance

Ken Montgomery, Ph.D., is executive director and co-founder of Design Tech High School, a public charter school at the Oracle Corporation campus in Redwood City.