ChatGPT, Strollers, and the Anxiety of Automation

28/02/2023
Amanda Parrish Morgan

LAST FALL, I published a book about strollers and what they reveal about our attitudes toward children and their caretakers. Although I pitched Stroller as, in part, a critique of the consumer culture of contemporary American parenthood, I came to love my (many) strollers. In the years when I routinely ran while pushing my kids ahead of me in our jogging stroller, I recorded race times faster than I had as the captain of my college track team. In the long, claustrophobic early days of the pandemic, my son and I meandered slowly up and down the sidewalks of our neighborhood watching that late, cold spring come to New England. Often, at the end of a long stroller walk or run, my kids fell asleep, and on warm days, I’d park them in the shade and myself in the sun to work while they slept, feeling a proud mix of self-sufficiency and frugality (no childcare needed to run or meet a deadline).

In the months after my book came out, friends and family sent me pictures of themselves pushing strollers in iconic places (the Brooklyn Bridge, a protest in front of the Supreme Court, Buckingham Palace) as though to say: Here I am living an adventuresome life with my children right alongside me. In my inbox I had photos of a fleet of UppaBaby Vista strollers outside the 92nd Street Y, a suburban garage filled not with cars but with strollers, movie clips of runaway prams, and, more than once, stories about self-driving strollers. One video clip from my husband’s cousin showed a woman jogging, swinging her unencumbered arms next to a stroller while it matched her pace. To that one, I responded with a quick line about how much faster it would be to run without having to push the 100-plus pounds of my Double BOB.

That kind of casualness was a relic of a time before my inbox started to fill up with another flurry of emails, this time about ChatGPT. I taught high school English for many years and now teach freshman composition, so news about the new—horrifying, amazing, fascinating, or dystopian, depending on how one sees it—large language models, and their role at the nexus of writing and teaching, often made friends and family think of me. Because everyone has a wealth of (often fraught) memories about their own high school years, and because many of my friends now have children around the age of the students my husband and I teach, we end up talking about work in social contexts fairly often. Just how stressed out are the high school students enrolled in multiple AP classes? Are our students’ weekends like an episode of Euphoria or even—and this would be alarming enough—more like what our own adolescent parties were in the late ’90s? What do we wish our students were better equipped to do? How do we keep them off their phones in class? And, most recently, as news about ChatGPT swept through increasingly wide rings of society, I began to get questions that were not so different from those that accompanied the emails about self-driving strollers: What are we going to do about life as we know it being changed by automation?



IT WAS FROM my husband that I first heard of ChatGPT. He teaches high school physics and computer programming, and so its implications in the classroom were on his radar long before my colleagues and I in the English department had even heard of it. “Soon,” he told me, “everyone is going to be talking about this.” He was right of course, but that first night over dinner, it was easier to dismiss his predictions as alarmist or the niche concerns of computer programming teachers. 

My initial response was to insist that there are important differences in how easily AI might produce work mimicking student code as opposed to essays. But what I couldn’t dismiss was a concern much broader than the assignments either of us might give or the implications for our specific students: the ethical and philosophical implications of the program itself. Instead of being built around if-then commands, Nick explained, ChatGPT is a neural network. What is it then, Nick asked me, that makes the neural networks that make up ChatGPT different from our biological network of neurons? The fact that they’re silicon instead of carbon-based? Why would a carbon-based network allow consciousness to develop and a silicon-based network not? How, he asked, could eight extra protons make all the difference? Nick’s line of thinking was almost intolerable to me. Of course, I insisted, there is something beyond carbon—perhaps something we can’t put into words or even prove exists—that makes us human. And though I pointed to emotions and connections and relationships, I could not articulate quite what that human-making something is.

UNLIKE STROLLERS, WHICH I will happily discuss all day, I hate talking about ChatGPT, and yet I find myself doing so all the time, and often because I am the person who’s brought it up. 

At the beginning of the spring semester, I posed a metaphor for my students to consider: Wasn’t using ChatGPT to complete a writing assignment (without acknowledging having done so) like going to the gym, setting the treadmill at 10 mph, letting it run for 30 minutes, taking a photograph of its display, and then claiming to have run 5 miles at a six-minute pace? It might appear to have happened, and the student, in a very passive way, would have been responsible for bringing the illusion to life, but the student would be no fitter or faster than when he or she had begun, or than the student who’d run one or two minutes at a six-minute pace or 5 miles at a comfortable jog.

Most students seemed to acknowledge the metaphor’s validity. I was pleasantly surprised to hear my students say (even if it was just for my benefit) that they’d avoid using AI to complete writing assignments for a mix of reasons that included fear of being caught, concern over the quality of writing produced, and the sense that, at some point, not having practiced writing for years might catch up with them. But one student was outspoken in his disagreement: The point of a writing assignment, he maintained, was just to receive a grade. He didn’t plan to work in a field that required much writing, and if he did, he suggested, couldn’t he just use ChatGPT for that as well?
 

I was in some ways relieved that he’d brought the idea of writing’s ultimate purpose into the discussion and eager to argue back that the purpose of a writing class is not to appear to have written, but to write—not to receive the credit for the course in pursuit of a diploma and eventually a job, but to practice the skills that said diploma was meant to indicate and that said job was likely to require. 

He was polite, but unconvinced. No matter how capable I was of understanding where he was coming from (thinking of myself in Calculus 131, for instance), I could not bury the defensive sense of panic his remarks incited. Isn’t writing different from learning logarithmic functions? At least because it’s so deeply connected to language and expression and connection? Even to the way we care for the people and the world around us? Not just in the way that writing, for someone who makes her living working with words, is an act of care itself—of noticing and recording and bearing witness—but because if we, collectively, decide that distinguishing between human language and machine language is irrelevant, that language can be automated, aren’t we taking a running leap into a dystopian future stripped of care much more broadly defined? 

I realize that knee-jerk opposition to automation is not just defensive, but also simplistic and nearly always hypocritical. I am generally averse to arguments that hinge on the old way of doing things being superior, not least of all because so often they are (deliberately or not) rife with reactionary attitudes about women’s roles in their families and society. Still, I can’t stop thinking about all that I’ve already lost to automation. Automated strollers, like the Glüxkind Ella now available for preorder, doubtless offer a more accessible option to caretakers needing mobility assistance. It would be an overstatement to suggest that all automation strips meaning from human relationships, that a stroller powered by muscle rather than battery is somehow more meaningful, more real parenting. Still, when my daughter was a baby, she loved the battery-operated baby swing we kept in our family room, and although I knew it was irrational, I sometimes felt vague pangs of guilt over how easy it was to soothe her with it. Shouldn’t true maternal love mean rocking her in my arms until my back ached and my muscles burned with fatigue?
 

Yet, the link between the development of appliances to automate household work and the first wave of feminism is long established, and I haven’t felt similar pangs over other technological advances that have taken place in my lifetime, even when they’ve changed the way I engage with activities I’ve loved or derived meaning from. I got a stand mixer years ago and no longer cream butter and sugar by hand. It was only in the consideration of this essay that it occurred to me that something—what, though? Love? Hand muscles? Some virtue in the toil itself?—might be lost in using it to bake cookies. Once an avid if mediocre photographer, I’d spent hours in my high school darkroom trying to fix the ways I’d messed up the depth of field, focus, exposure, or framing in my pictures. Now, like almost everyone else, I use my iPhone. In Portrait mode on certain occasions. Yes, this is all so much more efficient, but it also makes baking or taking photographs feel less like something I’ve meaningfully done.

At the risk of sounding like a 17th-century Calvinist who sees all instances of work or toil as inherently virtuous, I have been trying to articulate the real feeling that something—is it care? Intimacy? Connection?—is in danger of being lost in all this automation. I can see the care my grandmother’s hands put into the neat rows of stitches on the sweaters she knitted me, precisely because the product took time to make. Is it because taking and developing a picture used to take longer, be less certain to “turn out,” that the portraits I took of my high school friends on film feel more personal? If taking a baby for a walk in an unwieldy Victorian pram takes more work, does that imbue the outing with more meaning—with more love?
 

Still, I don’t remember how to knit, and although I know how to make cookies from scratch, I buy the vast majority of the food my family eats at Trader Joe’s so that I can exhaustedly combine “interesting” flavors at the end of a busy weekday. While I sometimes joke about being a bad cook, I don’t feel any real sorrow or guilt over my rushed—automated—food preparation, and when I do think sadly about not knitting my kids’ sweaters, it’s from a sentimental, not philosophical perspective. I wish I could remember the skill my grandmother taught me because I loved her very much, not because I think I’m an inferior or less caring woman because my kids’ sweaters are from Gap Kids.

Writing is an act of care for me because I’m a writer, and responding to student writing is an act of care for me because I’m a teacher. Do those who argue that writing is no less a utility than food or clothing have a point? It’s an uncomfortable question to consider. This is true personally, of course, but also because it leads quickly to an uncomfortable line of thinking about the role of a liberal arts education. It feels overly cynical to give up on the belief that learning, rather than a degree or networking opportunities, is at the heart of a college education, and perhaps for that reason, I feel it’s reasonable to expect my students to buy into the idea that learning to write more clearly and more thoughtfully is a worthy use of their time no matter how minor a role writing might someday come to play in their lives.
 

I DIDN’T MENTION self-driving strollers to my students—after all, most of them are a decade away from even considering parenthood, and it’s unlikely strollers loom as large in their minds as they do in mine—but I did bring up autonomous cars and the admittedly irrational reaction I have whenever I hear about one that’s been involved in a fatality. Although I’m familiar with all the statistics bearing out the vastly lower chances of a crash in a self-driving vehicle, I can’t imagine surrendering control (I believe myself to be a good driver) of my or my passengers’ safety.

“But what if you could be convinced that you’re wrong about your driving?” one student asked. “What if you saw right in front of you the numbers that convinced you it’s safer for everyone to use a car in autonomous mode?”

I knew he was right, of course, but still I could not reconcile this with my aversion to ceding control of my car to the car itself. Yet, if someone led me into the cockpit of a small airplane and offered me the chance to turn autopilot on or attempt to fly the plane myself, I wouldn’t hesitate to rely on the plane’s automation, because I understand very well that I do not know how to fly a plane and that making a mistake would almost certainly be fatal. It’s no big deal, really, for me to push a stroller, even up a steep hill or for a long time. I like to be outside and enjoy going for walks alone or with my kids. Perhaps it’s in anticipation of my smug perception of stroller-pushing expertise that Glüxkind mentions its automated stroller’s enhanced safety features. “Ella,” as the stroller is called, includes traffic detection and an “enhanced multi-brake system.” The marketing copy promises parents “more peace of mind,” declaring, as if speaking to that horror trope we all know from Battleship Potemkin or Rosemary’s Baby: “Runaway stroller? Not on Ella’s watch.”

But the confidence I have in my strollering ability, including my ability to use the brakes and prevent runaway episodes, is not something many of my students feel about writing. Unlike driving a car or pushing a stroller, writing can be a task that’s daunting both for its difficulty and for its often-opaque criteria for success. Not to mention that my students think about grades very differently than even the most competitive of my peers and I did. To many of them, grades don’t feel like measures of performance in a specific subject or even a specific skill, but like wholesale endorsements or admonitions of their character. My students don’t tend to think of themselves as “good writers,” the way that I—deservedly or not—think of myself as a “good driver.” The stakes of success or failure on a writing assignment feel, to many of my students, much closer to piloting a plane. For them, writing isn’t a way to give care, and receiving feedback on their writing, while it may be something I agonize over endlessly, likely does not feel like a way of being cared for.
 

TOWARD THE END of my conversation with my students, I mentioned that I’d seen some discussion on Twitter about using AI to write letters of recommendation. All of my students said they’d feel betrayed if they learned a professor had done so, and I agreed that it felt like an ethical violation. I’m a relatively quick writer, feel flattered when students ask me to write their letters, and don’t mind doing it. I do, though, absolutely loathe grading. I don’t loathe talking with students about their ideas, their writing, or their understanding of the texts we’ve studied, but assigning the grade itself feels very much like a means to an end. I know that often a student will be upset by the grade and that our conversations about his work will focus on the number I’ve put in the box on Blackboard rather than on his ideas, his writing, or his understanding of the texts we’ve studied. Would I be tempted to use ChatGPT to grade student work? Of course. But similar to asking AI to generate a letter of recommendation, it feels unethical, because while I see a grade as an imperfect measure of a student’s skills at a moment in time, most students see grades as something deeply connected to their relationship with me and with the material in our course. 

In that vein, the line between dystopian and utilitarian is not binary. I don’t think autonomous strollers, or even ChatGPT, signal the end of humanity, but I do think they signal an increasing willingness to view the mundane and tiny daily moments that make up our lives as only a means to an end. I think often of Annie Dillard’s advice to aspiring writers: How we spend our days is, after all, how we spend our lives. The question comes down, for me, to considering the purpose of a given parenting task, an element of teaching a course, or a type of written communication. I’m satisfied with a Crate & Barrel chatbot handling the return of the broken planter I received, just as I’m satisfied with using my washer and dryer to keep my family’s clothes clean. Maybe years in the future I’ll look at ChatGPT the same way I do the invention of the automatic transmission or spell-check: a useful, but ultimately incremental, step in the steady march of technological progress. If I were washing my family’s clothes on a washboard and then hanging them out to dry, I would not be writing this essay. But I also can’t shake the sense that we ought not to relinquish the kind of work—caring, teaching, writing—without a fight.