On May 16, 2025, a Medium piece titled “I’ll Instantly Know You Used ChatGPT If I See This” made the rounds. Its premise was simple: students ask AI to make their writing “more human,” and that’s supposedly “dumb.” The author boasted that he could always tell when ChatGPT wrote something.
Here’s the irony — that essay, meant to mock others for sounding artificial, perfectly demonstrated how shallow our standards have become.
The true crisis isn’t AI. It’s declining literacy, failing logic, and intellectual laziness.
1. The article’s premise misses the point
The original essay reduces AI-generated writing to a gimmick: students are lazy; teachers can tell; problem solved. Yet it fails to address the deeper issue of why students rely on tools like ChatGPT in the first place.
Students reach for AI not just because it’s easy, but because they were never properly taught how to write, argue, or structure ideas. Children are being taught by teachers who were themselves never properly taught basic English. The essay treats this as a symptom of student laziness rather than a warning sign that the education system has collapsed in on itself.
2. The literacy data no one wants to face
America’s reading and writing proficiency has been collapsing since long before AI entered classrooms.
