
Generative AI arrived like a firecracker lighting up the night: Suddenly, no one could look away.
-Damon Beres, The Atlantic Intelligence, 8/30/2025
It's difficult to check facts, or to talk about fact checking, without coming off as a know-it-all, a fussbudget, or a snob. But knowing things is hard. Checking is a practice. It's not omniscience.
-Zach Helfand, "Vaunted: How this magazine gets its facts straight," The New Yorker, 8/25/2025
Last academic year, when I asked middle school and high school educators about their challenges, they would discuss phones in the classroom, book bans, or parental pressure regarding book bans and curricula. This year, with phones out of sight during the school day, thanks to an initiative from New York State Governor Hochul, the challenges for educators are AI in the classroom and the continuing necessity of lockdown drills. (More on lockdown drills another time.) University professors are returning to Blue Books for handwritten exams in situ in an effort to ascertain what students have learned. And school districts, community colleges, and universities are devising protocols for the use of AI in the classroom. As we still live in a free, or free enough, country, they cannot monitor what happens at home.
With the inevitable partial return to less advanced technology, educators must now also reconsider ethical standards regarding plagiarism, which has exploded. One professor told me that his students generate essays and stories by asking AI a question, then revise the language, structure, and content so that the work appears "original." Is this deliberate cheating, ignorance in the wake of lax educational standards, or an echo of the fraught political discourse in the country? False news and outlandish assertions on social media have become commonplace. How can we blame our children for imitating what the adults in their lives have wrought?
For hard-working writers and editors who fear redundancy, there are other questions: Is it possible to resist the temptation of generating sparkling prose? Do you trust, as you are reading this blog post, that I am able to sustain the Authors Guild "author generated" promise? I hope so. If we lose trust in one another, how will we continue to live together peacefully and evolve? In truth (an apt pun), it is almost impossible to know what we know, or how we came to know it. Nor can we edit or fact check our own work with the constant interference of artificial intelligence. Its findings are not definitive. How does AI know what it knows? How can it hear my voice, or understand my intention as I write? What can I do if it questions my memory? Who or what is the final arbiter, the final authority? What if a sentence I have read floats unconsciously into my work and I fail to attribute it? What then?
Forgive me, but I just had to look up Governor Hochul's initiative. I had thought it was a law and almost wrote the word "legislation." AI interrupted my search. I had to deepen it by calling up a primary source: a New York State website. If that hadn't worked, I would have made a phone call.
So here's a suggestion: Swap some writing with a friend and fact check their work without using AI. And then fact check it using AI. As Zach Helfand writes in his fascinating New Yorker piece quoted and attributed above, facts are everywhere and no one, not even a machine, is omniscient. We all must maintain a healthy skepticism beyond the purview of the bots we have created.