I recently received an email from a “consultant” inviting me to help a publisher create an automatic essay-grading technology product for humanities professors to use in introductory-level courses. The consultant claimed that once completed, the program would “accurately auto-grade brief writing assignments – 500 to 900 words.” The program, the email said, “uses specific writing prompts and rubrics to achieve computer grading accuracy.”
And how will these impressive results be achieved, you ask? According to the consultant: “For the program to work correctly, thousands of student essays are scored by hand and loaded into the system. By doing this, the system ‘learns’ how to grade essay questions.” It is mildly reassuring that before a machine will grade subjective work objectively, “instructors” will grade essays by hand for the enlightenment of the machine.
And from what source will these thousands of essays come? Me, apparently, and other professors who can be persuaded to force their students to respond to publisher-generated writing prompts and post them “through an online portal” for the creation of their exciting new educational tool. By violating our students’ privacy and right to ownership of their intellectual property, we will help “build the bank of student essays needed to develop the product.”
What will be the benefit to professors in this beautiful cooperative relationship? For one, they can receive the bribe of $350 (a “stipend,” as the consultant calls it). Beyond that, the benefits are less clear.
Professors will be given “a choice of writing prompts and rubrics,” which would be helpful if we were incapable of thinking for ourselves. Moreover, as the consultant assures me, “you will have several writing assignments graded for you, which will save you time.” Thus, participants could claim to incorporate writing into their humanities courses without actually having to grade it themselves. In addition to these perks, perhaps some educators would find fulfillment in assisting a forward-thinking corporation in producing an “auto-grading functionality.”
It could also be entertaining to see how total strangers view the work of one’s students. And, if anyone complained of having received an unfair grade, the professor could simply respond: “I’m sorry, but a stranger hired by a publisher to help produce a product that will eventually grade your work entirely by automated computer program says otherwise. You’ll need to contact the publisher.” And good luck to any student who attempts to track down the right one among the many instructors at work grading the thousands of essays in auto-program hatcheries while fed a steady supply of soma.
More importantly, surrendering one’s professional responsibilities will also be good practice for the day when professors will be entirely replaced by computers, which will be programmed to present with accuracy all of Western intellectual history as well as to discuss (in the most engaging manner) the most relevant themes and issues with today’s students. Don’t worry — these computerized professors will be developed by the hand-loading into the system of thousands of lectures and conversations with students, obtained at top-dollar rates, so the system can “learn” how to teach.
After reading my email invitation to participate in the creation of Frankenstein’s latest techno-monster, and after spending some time with my head inserted in a paper bag, I emerged and wrote a short response to the consultant expressing my opposition to the project, then forwarded the exchange to my immediate university administrators. Our coordinator of general education is pursuing the matter with the provost to see if we need a new policy to ensure that our students are protected from the kind of exploitation on which propositions like this one depend. I also wrote to the publisher and indicated that I am boycotting the company until it discontinues the development of this unethical product.
I do not know whether any of these actions will do any good. At the least, I wish to make a modest proposal: This is one auto-generated wolf in surrogate-cloned-sheep’s clothing we need to kill now.
Author Bio: Kathleen Anderson is a professor of English at Palm Beach Atlantic University.