- AI Sidequest: How-To Tips and News
Homework is the second biggest driver of chatbot use
What are teachers and students to do?
Issue #39. AI in Education
On today’s quest:
— Is the whole AI market just homework help?
— Are students right to worry about their AI future?
— Is your kid’s professor an AI (or worse)?
— A productivity tip for Google Sheets users
— Quick hits
Homework is one of the biggest reasons people use AI chatbots. A recent Washington Post study found that getting homework help is second only to getting help with creative writing such as fan fiction, movie scripts, and jokes.
Further evidence that students make up a noticeable part of AI traffic comes from a Bloomberg report that last year, ChatGPT traffic dropped 10% in May, 15% in June, and 4% in July, and then rose again in late August as many students in the U.S. returned to school.
So for the back-to-school season, here’s a roundup of education-related AI stories I’ve been gathering over the last few months.
Are we in the AI homework apocalypse?
After GPT-4o became available for free, Ethan Mollick said the “Homework Apocalypse will reach its final stages.” He’s bullish on AI as “a powerful tutor and teaching tool” and thinks a big reason it hasn’t been used more widely in education is that many students couldn’t afford to pay for models that worked well.
Mollick said, “GPT-4 can do almost all the homework on Earth. And it writes much better than GPT-3.5, with a lot more style and a lot less noticeably ‘AI’ tone. Cheating will become ubiquitous, as will universal high-end tutoring, creating an interesting time for education.”
Khan Academy launches a free AI assistant
Khan Academy and Microsoft are teaming up to give teachers a free AI assistant: a new deal lets Khan Academy make its Khanmigo AI assistant free to K-12 teachers. Fast Company says the tool is “more than just a chatbot, the software offers specific AI-powered tools for generating quizzes and assignment instructions, drafting lesson plans, and formulating letters of recommendation.” — Fast Company
Students feel like they need to learn AI
A new study found that 70% of recent college grads wish they had been taught how to use AI, and 61% of Gen Z graduates felt uneasy about their “facility with AI.” When I was a recent grad, I was anxious about what my future career held, but in this survey, 39% of respondents said they “feel threatened that generative AI could replace them or their job entirely.” — Inside Higher Ed
But are students being sold a bunch of hype?
A user going by Pavel posted the following chart on Bluesky in response to the Inside Higher Ed survey above about students wanting to learn AI. Could it be that graduates who feel anxious and wish they had learned AI are just responding to AI hype?
The chart shows a relatively low percentage of companies saying they are using AI or plan to use it in the next six months. The overall number looks to be around 7%, which is lower than in other surveys I’ve seen, so I’m not sure what to make of it. Use obviously varies greatly by industry, but it probably also varies greatly by company: I imagine most companies either significantly embrace AI or aren’t encouraging its use at all.
Also, since homework is such a big driver of AI use, could it be that recent graduates currently understand its power better than corporate managers do?
Is your kid’s professor an AI?
People online were fretting about Harvard using an AI instructor for its entry-level programming course, but when I clicked through, it didn’t sound terribly different from the flipped classroom concept that was popular when I was a professor. The flipped classroom deemphasizes lectures (the “sage on a stage”) and instead has students do most of their learning outside class, reserving class time for the instructor to answer questions (a “guide on the side”).
The fretting led me to a jaw-dropping article about a university running a class with videos from a professor who was actually DEAD. As in the AI class, teaching assistants handled the tasks that couldn’t be done by the not-alive instructor.
I have some thoughts:
— Students read books by dead writers all the time as part of the learning experience; what makes videos different?
— Since one of the great fears about AI is that it will replace humans, how often are educational institutions already using pre-made videos and supplementing them with lower-wage, in-person workers?
— Although the difference for students may be negligible if the videos are good, is reusing videos blocking the career advancement of other faculty?
But getting back to AI, it reminded me of my objections to AI math tutors in a previous newsletter. As one reader pointed out, human tutors make errors too, and I frequently hear from parents who are dismayed by how little their kids’ teachers seem to know about grammar and usage.
One argument I’ve made about the danger of AI in the past is that it presents information so confidently — in such a believable way — that it undermines our skepticism. But most people treat professors the same way. Because of their status and position of authority, we tend to believe they’re always correct. And they usually are. But in my experience, AI often is too.*
So even if an “AI professor” makes some mistakes (say it’s as good as an average professor, or even a below-average-but-not-horrible one), is that really such a bad thing for students if it means they get more one-on-one attention from real humans? And is it even really different from having a human instructor? Are we setting the bar too low, or are we being realistic? I guess we’re going to find out.
It’s complicated
Looping back to the story at the top of the newsletter, the Washington Post reporter, Jeremy B. Merrill, posted an anecdote on Mastodon that didn’t make it into the article. He said:
“I had an interesting conversation with a guy who helps refugee kids get tutoring who said that a lot of the kids use ChatGPT for English help because they want to demonstrate their mastery of history or chemistry or whatever, but don't have the English skills yet. It's (sometimes) not cheating, it's kids putting in serious effort to excel in school, when resources aren't necessarily easily available.”
As often happens with AI, I come away thinking it is both good and bad. As with any tool, it’s all in how we use it.
Productivity Tip: Changing your text case in Google Sheets
This isn’t AI, but if you use Google Sheets, I think you’ll like the Change Case extension that gives you six different ways to reformat text: all uppercase, all lowercase, first letter capitals, invert case, sentence case, and title case.
You can change multiple cells at once, and I’m going to use it to clean up a messy spreadsheet I have with more than 1,000 rows because even though I’m all about following a style guide in professional work, apparently when I’m throwing text into a spreadsheet, I’m as chaotic as everyone else.
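If you’re curious what those six options actually do to your text, here’s a minimal Python sketch of the same transformations. (Change Case is a Sheets add-on, so this isn’t its actual code, just an illustration; the function name and title-case word list are my own.)

```python
def change_case(text: str, mode: str) -> str:
    """Apply one of six case styles, mimicking the Change Case menu options."""
    if mode == "upper":
        return text.upper()
    if mode == "lower":
        return text.lower()
    if mode == "first_letter":
        # Capitalize the first letter of every word
        return text.title()
    if mode == "invert":
        # Swap uppercase and lowercase
        return text.swapcase()
    if mode == "sentence":
        # Capitalize only the first character
        return text[:1].upper() + text[1:].lower()
    if mode == "title":
        # Title case with short connector words left lowercase
        # (except at the start or end of the phrase)
        small = {"a", "an", "the", "and", "or", "of", "in", "on", "to"}
        words = text.lower().split()
        return " ".join(
            w if w in small and i not in (0, len(words) - 1) else w.capitalize()
            for i, w in enumerate(words)
        )
    raise ValueError(f"unknown mode: {mode}")
```

For example, `change_case("the tale of two cities", "title")` keeps “of” lowercase but capitalizes the first and last words. In a spreadsheet, the add-on applies the same idea to every selected cell at once.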
Quick Hits
Duolingo expands AI tutor after seeing 54% increase in subscriptions. — WSJ
Major record labels unleash an army of lawyers on AI music companies Suno and Udio, claiming copyright infringement. — Musically
Tim O’Reilly, of the publishing house O’Reilly, is building an AI tool that pays authors when their material is used to generate answers. — O’Reilly
Popular YouTuber Drew Gooden has a 30-minute rant about how AI is ruining the internet. I don’t usually watch long videos, but this made me laugh and kept my attention. (It also made me think about “YouTuber voice,” which is something I’ve heard of but feel like I’ve never really heard. I’m still not sure if that’s what Drew has, but halfway through, his cadence started reminding me of rappers.)
What is AI sidequest?
Using AI isn’t my main job, and it probably isn’t yours either. I’m Mignon Fogarty, and Grammar Girl is my main gig, but I haven’t seen a technology this transformative since the development of the internet, and I want to learn about it. I bet you do too.
So here we are! Sidequesting together.
If you like the newsletter, please share it with a friend.
* In an earlier version of this section, I wrote “AI usually is too,” but it feels like I’ve been getting more wrong answers lately.
Written by a human.