My nerves rattled like I’d swallowed a washing machine as the girl next to me started reading her podcast script. All the fancy recording equipment at this summer journalism program already had me on edge, and it didn’t help that her script sounded flawless.
When she finished reading, I gushed, “That was so good. How did you write that in such a short amount of time?” I wasn’t really expecting an answer, assuming she’d just say thanks and move on.
Naive me.
She leaned in, lowering her voice as if she were about to give me valuable advice. “Just use ‘Chat.’”
Oh. Of course.
“Smart, smart,” I said, pretending I wished I’d thought of doing that. I turned back to my own script, suddenly less impressed and a lot more unsettled.
This wasn’t the first time I’d been given this advice, and it certainly wouldn’t be the last. “Just use ‘Chat’” can be heard everywhere: in the back of classrooms, in the library, in the SAC.
What alarms me most is how casually we refer to AI by a nickname. Our imagined relationship with ChatGPT builds trust and dependence, leading us to reach for it without thinking.
Students routinely violate academic integrity policies without questioning how normalized the behavior has become. If a homework question is too long or confusing, a document needs to be paraphrased or an email to a teacher needs to sound less passive-aggressive, most students turn to ChatGPT or another form of generative AI for assistance.
AI is easy to use, more accessible than ever and, like so many algorithms, purposefully designed to maximize engagement. Its responses are often tailored to users’ needs and biases, validating their opinions and questions. Even without ChatGPT, I find myself unintentionally relying on questionable AI overviews when I try to Google something. Avoiding AI has become much more difficult than using it.
As more students go straight to AI at the slightest inconvenience, their reliance grows, and eventually they’ll struggle to solve problems on their own. This habit teaches students to outsource their thinking instead of pushing through confusion, so it’s no wonder students miss test questions that precisely mirror the homework. I see how many of my peers opt to use AI, missing out on the valuable skills they could’ve built by doing the work themselves.
I say this as someone who also once just needed help understanding a homework assignment, the same kind of help I’d ask of a friend or teacher. So I thought: why not ask AI? At the time, it felt morally sound. I fed ChatGPT prompts, tweaked the responses and scolded it for falling short of my expectations, all without realizing how much it hollowed out my ability to learn and problem-solve on my own. Admitting this is easy. Breaking the habit is harder.
I urge students not to trade real feedback and experience for AI-generated work that offers nothing beyond convenience. A desire to learn is the very reason the other girl in the podcast studio and I signed up for the journalism program in the first place.
Four years spent copying and pasting prompts into AI won’t pay off. Every essay, assessment and annotation should be a chance to improve. If AI is doing the work, then who’s truly learning? Certainly not you.
When I walk across the stage at graduation, I want to be the one earning my diploma, not ChatGPT. You should be able to say the same with confidence.
