Citizen Science by Jamie Zvirzdin
AI Renaissance: A Chance to Reduce Cheating,
Revitalize the School Experience
Middle school is hard. It feels like the Dark Ages. There’s extra insecurity, boundary-testing, and of course, hormones. The pressure to perform well in school ramps up—as does the temptation to cheat. My 13-year-old son and his generation face an additional challenge: smartphones, online answer banks and now artificial intelligence services have made cheating very, very easy. And dishonesty in academics leads to dishonesty elsewhere, including in professional scientific research.
This isn’t a hand-wringing article about the moral decrepitude of cheaters, however; it is a wake-up call to teachers and educational administrators to change the game. Our expensive, competitive, performance-based emphasis on exams and grades has created a conflict-of-interest situation for students. It’s a system-design problem that continues right on into science circles, where fudging results equals grants and promotions. Researchers Nina Mazar and Dan Ariely found in their 2015 paper “Dishonesty in Scientific Research” that even good people who value honesty justify cutting corners in these high-stakes situations. If we use AI to reduce conflict-of-interest situations for students as often as possible, it can actually help us usher in a new age of academic enlightenment, where learning is once again a privilege and a pleasure. If science institutions follow suit, we’ll also have more accurate data sets and better scientists.
To give you a tip-of-the-iceberg look at how systemic the problem is: My son, who is in seventh grade, was taking his end-of-year exams last week and saw a classmate cheat using a phone hidden in their lap. We have apps now where you simply take a picture of a math problem and it will display the answer. The kids—and the money-grubbers who prey on academic anxiety—have always been one step ahead of educators technologically: my son’s middle-school classmates have already figured out how to bypass the firewall on their school laptops to play games and access cheating sites when their teachers and parents think they are diligently working on their homework.
The cheating doesn’t stop in middle school. The anxiety around grades increases exponentially when you’re paying—or borrowing—thousands of dollars for a course you can’t afford to fail. A young friend of mine in college admits to using online “Homework Help” sites like Chegg.com, where you can type in the exact question and get the exact answer. As a test—I have never used Chegg, nor will I—I typed in a graduate-level physics problem from a final exam I took at Johns Hopkins, a problem I know my professor wrote in his own words. Lo and behold, as the first search result, Chegg promised to give me the answer to that exact problem—if I paid for their service. What an absurd game: You pay thousands of dollars for a course to stress-test you to the point you feel compelled to find and pay for the answer online? This is not education. It is a broken system that rewards cheaters and punishes the honest seeker of truth.
This year saw the public release of ChatGPT and other AI services, which stressed-out students now use to push cheating past the point of absurdity to pure stupidity. A friend of mine who teaches college writing expressed anger at how often she now sees the “blandly perfect grammar” of AI essays, and I agree—good writing teachers can easily recognize ChatGPT’s writing voice. It is disheartening when students don’t have enough confidence to write with their own voice.
Since ChatGPT often makes up fake sources, my friend also knew a student had cheated when the essay cited a source written by “Jane Doe.” Other teachers say some students aren’t upset about cheating; they’re upset they lost points when an AI chatbot gave them the wrong answer.
What makes student fear far worse, especially in science courses, are professors who elect themselves purifiers of their discipline. They are proud of their weeder courses. These self-anointed gatekeepers make homework and exams so difficult that students struggle to understand a fraction of the material. A classmate told me her undergraduate physics teacher proudly whittled 75 students in the physics program down to the six who actually graduated. This is not education either. It is the Academic Hunger Games.
No matter how severe the consequences become for academic dishonesty, students will continue to bend over backward so they don’t fail. My Mathematical Methods professor at JHU told us of a student who hired a professional test taker: The remote cheater-for-hire assumed control over the student’s computer and took a timed, online test for the student. That student was caught and disciplined, but many never get caught, and these are the ones who graduate with great grades and get great, high-paying jobs. They falsify science data and get grants and promotions. The dishonest win the game.
So it is time to change the game.
Rabindranath Tagore, a Bengali polymath, poet, social reformer and test-disliker, wrote in his 1917 book “My Reminiscences,” “The main object of teaching is not to explain meanings, but to knock at the door of the mind.” Teachers and professors can reject the taskmaster role and reclaim the far more enjoyable duty to inspire and spark curiosity in students. Administrators can back off from their obsession with metrics and reward teachers (maybe with fair pay and decent benefits?) who find clever ways to help students engage creatively with class material—using all available tools, including ChatGPT.
Khan Academy, for instance, is actively turning ChatGPT into an online tutor and teaching assistant named Khanmigo, which is brilliant. Similarly, teachers can invite students to write a class book together using ChatGPT—and odds are, such a book will be clearer and more concise than most textbooks. As someone who proofread science and technical textbooks for many years, I state from firsthand experience that academic textbook writing is often abysmal and arrogant, written for the professor’s own ego at the expense of the student’s comprehension and confidence.
Teachers can use AI essays (which are not the best, but not the worst) as starting points for discussion: Write an essay about X using ChatGPT. Now let’s talk about the essay’s strengths and weaknesses: How can we improve word choice and revise the essay using our own writing voice? What personal examples or analogies can we add to strengthen the argument? Why is it wrong to use a source from “Jane Doe”?
We can also redesign the educational game to allow failure and thereby reduce the motivation to cheat. In “How Humans Learn: The Science and Stories behind Effective College Teaching,” Joshua R. Eyler says teachers can provide “opportunities for low-stakes failure so that students can take the risks necessary to enact deep learning.” Students must be allowed to “fail up,” as I like to tell my own science writing students. I ask my students to redo exercises, without punishment, if I see too many mistakes in them, and I’m grateful for teachers who did the same for me when I was growing up. It was in those classes I learned the most.
So I encourage educators and administrators to reexamine their systems, their assignments and exams. Flip the classroom. Ask students to teach something or follow a passion project. Play “Stump the Teacher,” where students bring in difficult problems for teachers to solve; if they can’t, the student shows the teacher how. In this new era of deep machine learning, do not forget how our own deep learning actually happens: by feeling inspired, becoming curious and making mistakes.
I don’t know a single person who loved going to middle school. Maybe some of that is an unavoidable part of growing up, but if we embrace new technology and reduce conflict-of-interest situations—if we can make school truly cool instead of cruelly cutthroat—perhaps we will bring new light to students who feel trapped in the shadow of perfectionism. Perhaps we will find, as Getty medievalist Larisa Grollemond argues, that there was no such thing as the Dark Ages; that the Middle Ages “were always the illuminated ages.”
P.S. I wrote this article myself and found the sources myself. You can check it using https://www.zerogpt.com/ (which is also, by the way, not always 100 percent correct). I invite teachers to borrow “Subatomic Writing: Six Fundamental Lessons to Make Language Matter” from their school libraries for more writing exercises to help students gain confidence in their own writing voice.
Jamie Zvirzdin researches cosmic rays with the Telescope Array Project, teaches science writing at Johns Hopkins University and is the author of “Subatomic Writing.”