Teachers Are Going All In on Generative AI

Tim Ballaret once dreamed of becoming a stockbroker but ultimately found fulfillment helping high school students in south Los Angeles understand the relevance of math and science to their daily lives. But making engaging class materials is time-consuming, so this spring he started experimenting with generative AI tools.

Recommendations from friends and influential teachers on social media led Ballaret to try MagicSchool, a tool for K-12 educators powered by OpenAI’s text generation algorithms. He used it for tasks like creating math word problems that match his students’ interests, like Taylor Swift and Minecraft, but the real test came when he used MagicSchool this summer to outline a year’s worth of lesson plans for a new applied science and engineering class.

“Taking back my summer helped me be more refreshed for a new school year,” he says. “When I’m not spending so much time at home doing these things, I’m able to spend more time with my family and my friends and my wife so I can be my best at work, instead of being tired or rundown.”

Students’ soaring use of AI tools has gotten intense attention lately, in part due to widespread accusations of cheating. But a recent poll of 1,000 students and 500 teachers in the US by studying app Quizlet found that more teachers use generative AI than students. A Walton Family Foundation survey early this year found a similar pattern, and that about 70 percent of Black and Latino teachers use the technology weekly. As more companies adapt generative AI to help educators, more teachers like Ballaret are experimenting with the technology to find out its strengths—and how to avoid its limitations or flaws.

Since its launch roughly four months ago, MagicSchool has amassed 150,000 users, founder Adeel Khan says. The service was initially offered for free but a paid version that costs $9.99 monthly per teacher launches later this month. MagicSchool adapted OpenAI’s technology to help teachers by feeding language models prompts based on best practices informed by Khan’s teaching experience or popular training material. The startup’s tool can help teachers do things like create worksheets and tests, adjust the reading level of material based on a student’s needs, write individualized education programs for students with special needs, and advise teachers on how to address student behavioral problems. Competing services, including Eduaide and Diffit, are developing their own AI-powered assistants for educators.

All those companies claim generative AI can fight teacher burnout at a time when many educators are leaving the profession. The US is short about 30,000 teachers, and 160,000 working in classrooms today lack adequate education or training, according to a recent study from Kansas State University’s College of Education.

Study author Tuan Nguyen says generative AI is unlikely to fix the shortage, which stems from poor pay and working conditions and a perceived lack of prestige, not just long hours. “AI tools can potentially save teachers time and can even help teachers target and individualize their instruction, but at this point, I don’t think they are going to change the teacher labor market,” says Nguyen.

That remains to be seen, but many teachers are experimenting with or getting introduced to the technology. The AI Education Project, a nonprofit funded by companies including Google, Intel, and OpenAI, has trained more than 7,000 teachers this year in how AI works and how to use AI-powered tools in classrooms. Cofounder Alex Kotran says teachers most commonly use generative AI for lesson planning and to write emails to parents. In training sessions, he finds that many teachers have used generative AI in the past week, but few know tricks such as “prompt hacking,” which can help draw out better answers from language models. “Now that AI is available for people to use, it’s important to show—rather than tell—educators what it looks like and how it can be used effectively,” Kotran says.

At the Ednovate group of six charter schools in Los Angeles where Ballaret works, teachers share tips in a group chat and are encouraged to use generative AI in “every single piece of their instructional practice,” says senior director of academics Lanira Murphy. The group has signed up for the paid version of MagicSchool.

In her own AI training sessions for educators, she has encountered teachers who question whether automating part of their job qualifies as cheating. Murphy responds that it’s no different than pulling material off the internet with a web search—and that, as with any material, teachers must carefully check it over. “It’s your job to look at it before you put it in front of kids,” she says, and verify there’s no bias or illogical content. Still, Murphy says roughly 10 percent of the Ednovate teachers she encounters worry AI will replace them.

MagicSchool’s Khan says the threadbare legacy of education technology causes some teachers to be skeptical of new AI services. “It’s an industry that’s been burned by technology over and over again,” he says.

Joseph South, chief learning officer at the International Society for Technology in Education (ISTE), whose backers include Meta and Walmart, says educators are used to gritting their teeth and waiting for the latest education technology fad to pass. He encourages teachers to see the new AI tools with fresh eyes. “This is not a fad,” he says. “I’m concerned about folks who are going to try and sit this one out. There’s no sitting out AI in education.” ISTE recently partnered with education nonprofits Code.org and Khan Academy to release an AI 101 video series.

AI also differs from past classroom technologies in that it can introduce problems not found in more conventional software. The Charter School Growth Fund, which helps charter schools open new campuses, formed working groups to advise schools on AI policy after a survey of school leaders found the technology was a top concern. Ian Connell, the fund’s head of innovation, says that in addition to understanding the benefits of AI tools, schools must also monitor the quality of content the tools create.

Past research shows that large language models are capable of generating text harmful to some groups of people, including those who identify as Black, women, people with disabilities, and Muslims. Since 90 percent of students who attend schools that work with Charter School Growth Fund identify as people of color, Connell says, “having a human in the loop is even more important, because it can pretty quickly generate content that is not OK to put in front of kids.”

April Goble, executive director of charter school group KIPP Chicago, which has many students who are people of color, says understanding the risk tied to integrating AI into schools and classrooms is an important issue for those trying to ensure AI helps rather than harms students. AI has “a history of bias against the communities we serve,” she says.

Last week, the American Federation of Teachers, a labor union for educators, created a committee to develop best practices for teachers using AI, with guidelines due out in December. Its president, Randi Weingarten, says that although educators can learn to harness the strength of AI and teach kids how to benefit too, the technology shouldn’t replace teachers and should be subject to regulation to ensure accuracy, equity, and accessibility. “Generative AI is the ‘next big thing’ in our classrooms, but developers need a set of checks and balances so it doesn’t become our next big problem.”

It’s too early to know much about how teachers’ use of generative text affects students and what they can achieve. Vincent Aleven, co-editor of an AI in education research journal and a professor at Carnegie Mellon University, worries about teachers delegating nuanced tasks to language models, such as grading or deciding how to address student behavioral problems, where knowledge of a particular student can be important. “Teachers know their students. A language model does not,” he says. He also worries about teachers growing overly reliant on language models and passing information to students without questioning the output.

Shana White, a former teacher who leads a tech justice and ethics project at the Kapor Center, a nonprofit focused on closing equity gaps in technology, says teachers must learn not to take what AI gives them at face value. During a training session with Oakland Unified School District educators this summer, teachers using ChatGPT to make lesson plans discovered errors in its output, including text unfit for a sixth grade classroom and inaccurate translations of teaching material from English to Spanish or Vietnamese.

Due to a lack of resources and relevant teaching material, some Black and Latino teachers may favor generative AI use in the classroom, says Antavis Spells, a principal in residence at a KIPP Chicago school who started using MagicSchool AI six weeks ago. He isn’t worried about teachers growing overly reliant on language models. He’s happy with how the tool saves him time and lets him feel more present and less preoccupied at his daughter’s sporting events, but also with how he can quickly generate content that gives students a sense of belonging.

In one instance three weeks ago, Spells got a text message from a parent who was making a collage for her son’s birthday and asked him to share a few words. Starting from a handful of adjectives describing the student, Spells responded with a custom version of the boy’s favorite song, “Put On,” by Young Jeezy and Kanye West.

“I sent that to the parent and she sent me back crying emojis,” Spells says. “Just to see the joy that it brought to a family … and it probably took me less than 60 seconds to do that.” KIPP Chicago plans to begin getting feedback from parents and rolling out use of MagicSchool to more teachers in October.
