Features

ChatGPT is not the end of written integrity

March 25, 2023


Design by Olivia Li

When ChatGPT was released to the public in November 2022, professors across the internet bemoaned the death of the undergraduate essay as a method of assessing students. The Atlantic called the moment a “textpocalypse” and a writer from The New York Times said he was “deeply unsettled” following a conversation with Bing’s integrated AI chatbot. ChatGPT, unlike earlier chatbots, has the capacity to generate coherent, long-form writing.

ChatGPT has upended what it means to write. But, upon further analysis, it may not be the game-changer for writing or other industries that the world initially envisioned. AI is already present in academic spaces through tools like Grammarly and spellcheck, according to Phil Sandick, assistant professor and interim director of the Writing Center. Academia and college admissions will likely move away from the formulaic writing tasks where ChatGPT could dominate, but it may become a useful tool in the workplace and a way to conserve effort on select tasks.

While ChatGPT differs from other AI in that it can write complete essays, Jacques Berlinerblau, the Rabbi Harold White Chair in Jewish Civilization, believes that ChatGPT cannot imitate the many kinds of collegiate-level essays required to succeed at Georgetown.

“Can it generate a cluster of facts that are fairly accurate and sequence those facts in a way that seems like it’s a human being? I think sure,” Berlinerblau said. “Can it write creatively? No. Can it write with sparkle and dash and personality? I’m going to say no, there’s no way.”

According to Berlinerblau, writing professors often look for voice and process in written assignments, elements expressed through multiple drafts and revisions of the same essay—skills beyond the capabilities of any existing AI chatbot. 

“Most of my classes utilize case studies and case methods and the Socratic method,” Jennifer Logg, an assistant MSB professor, said. “I’m asking over and over again how are we going to take what we just learned and apply it to the situation, and that’s outside of the boundaries of ChatGPT.”

In small intro-level classes like Sandick’s WRIT-015 or SFS proseminars like Berlinerblau’s, professors work so closely with students that it’s easier to tell when an essay’s voice changes because a section wasn’t written by a student, or when the tone of an essay shifts from previous ones. 

One hallmark of inauthentic essays may be the citations students use: specifically, whether they’re correct. ChatGPT tends to blend information in seamlessly, making it hard for professors to identify where citations belong. “One thing I think about with any kind of citation is does the voice change?” Sandick said.

Berlinerblau conceded that assignments in his upper-level class, listed in both Jewish Civilization and Government, could be vulnerable to AI-generated content because he doesn’t track a student’s writing across the semester, but he was confident that he could not be fooled by ChatGPT in a class where he becomes intimately familiar with a student’s writing voice. “It would be really dangerous to do this [use ChatGPT] in a writing class,” Berlinerblau said. Although the university has yet to decide whether using ChatGPT constitutes plagiarism, according to the Student Advocacy Office, students have been written up by professors for using it in their writing.

Some Georgetown programs may be more susceptible to ChatGPT’s capabilities, however. Sandick has previously taught writing programs where ChatGPT could easily complete the formulaic tasks assigned. Curious to see if this would apply at Georgetown, Sandick fed ChatGPT a prompt for an exploratory essay about the writing process and assessed it in WRIT-015 class discussions.

“I thought it didn’t really do any of the things I asked for,” said Sandick, noting that the essay didn’t get beyond surface-level arguments or construct a narrative, something expected of students.

When listing facts or accomplishments without analysis is sufficient, ChatGPT can probably do the job, but where students are trying to share their voice or conduct higher-level analysis, most professors believe ChatGPT just isn’t sophisticated enough. 

Even if a student isn’t taking a writing-intensive class, using ChatGPT still comes at a cost to their learning, according to Edward Maloney, executive director of the Center for New Designs in Learning and Scholarship and a faculty member in the Graduate Program of Learning Design and Technology. He sees a danger in using ChatGPT to summarize information students haven’t actually engaged with.

“You could get a summary from ChatGPT and it may or may not be right, but how are you going to know if you haven’t read the material?” Maloney said. “If you immediately go from question to ChatGPT to answer, then you’re missing some steps.”

While ChatGPT might help a student pretend to be familiar with a topic by quickly explaining a concept, it cannot do all of the work. ChatGPT is imperfect at determining the truthfulness of information because it draws on information from across the internet, whether or not it is true, which could leave students turning in error-ridden assignments.

“With a tool like this, you still have to evaluate and decide what is good and what is bad,” Maloney said. “So if you don’t have the skill set and what you’re doing is relying on the shorthand answer without the ability to judge the value of that shorthand answer, then you’re putting yourself at a disadvantage because you’re also not learning what it means to evaluate the information that you got.”

As a pattern-analysis tool, for example in a search engine, ChatGPT could help highlight the most popular answers to web searches and draw out connections between pieces of information, according to Logg. She sees browser-integrated chatbots functioning like a histogram, showing which information appears most often in search results and is most relevant.

Some professors believe ChatGPT detracts from the transferable skills universities are designed to impart to students. Students have reported using ChatGPT to summarize lengthy reading assignments, but Jonathan Brown, the Prince Alwaleed bin Talal Chair of Islamic Civilization, believes that is detrimental to the student’s learning.

“If you’re a student, you have to ask yourself, what is it you want out of university?” Brown said. “Students have to decide if they want ChatGPT to do that for them or if they want to learn.”

But the impact of ChatGPT goes beyond the college classroom; some high school students have reported using ChatGPT for college admissions essays, leading to concerns the bot could threaten the usefulness of essays as a tool for admissions. To test this, Georgetown admissions officers reviewed five essays assembled by the Voice responding to the College of Arts and Sciences’ essay prompt: “What does it mean to you to be educated? How might Georgetown College help you achieve this aim?” 

Two of the essays were written by admitted students, and three by ChatGPT, with various additions to the prompt. One ChatGPT essay responded to the prompt with no further instructions, and another responded to the prompt with the instruction to “tie your response to religion and your interest in studying law later in life.” For the final essay, ChatGPT was asked to use a “toolbox analogy.” Of the three ChatGPT essays, admissions officers correctly identified two as being written by AI, and were unsure about the essay that discussed several Jesuit values and a personal ambition of becoming a lawyer.

James Colman and Melissa Costanzi, admissions officers with more than 20 years of experience each, found the essays to be formulaic if not necessarily concerning. “They’re boring essays. There’s nothing in this that’s saying: ‘Wow, this kid’s very interesting.’ Maybe the admissions committee would love to see them, but [the essays] are not bad, they’re just boring,” Costanzi said.

The distinction between the expected high school-level narrative writing and the AI’s work was stark to many of the officers. Kelvia Jaupi (CAS ’22), another admissions officer, thought the essays sounded similar to those produced by middle schoolers just learning basic essay format. “There’s no nuance to it,” Jaupi said.

The officers noted that the only AI-generated essay that could actually enhance a student’s admissions chances, rather than being passed over immediately, would be one extensively reworked by the applicant. The admissions officers also emphasized that students applying to Georgetown must confirm that all work submitted is their own. At this time, however, the admissions team reported that there is no way to determine an essay’s authenticity beyond their own instincts.

In application contexts beyond college admissions, such as job applications where submissions aren’t always assessed by humans, AI may be more useful.

Job applicants and employers are increasingly using AI in the hiring process. Logg, whose research focuses on how algorithms are used in hiring, thinks frequent AI use will cancel itself out. “I can imagine if the algorithm is not just assessing what’s being submitted but also generating it, you have this arms race of how you beat this system,” Logg said. 

Already, research has shown that work sample tasks, such as writing tests, tend to be better indicators of job performance than cover letters, and Logg predicts that hiring practices will shift to reflect that.

While certain industries may innovate away from ChatGPT-vulnerable tasks (like formulaic essays and template cover letters), other fields could benefit from the quick drafts the software provides, such as summary memorandums or other kinds of formulaic professional writing. 

“Think about how much time you spend writing certain kinds of summaries, that this might now be able to do, where you could then use that time to do more high-level thinking,” Maloney said. Short-form areas might be where the tool’s strength lies: collating and summarizing information to make our lives easier. 

A human’s value in a world with sophisticated AI is in their ability to evaluate the AI’s outputs, Maloney said. “This tool, at least right now, is not sentient. It doesn’t have a perspective, it doesn’t have a point of view,” he said. “It has no way of saying this is good or bad, or making ethical decisions, other than what other people have written, so it has no touchpoint to the world that we live in or we act in. You do.”

Even if AI eventually replaces humans in some roles, he believes that the tool is limited by the capabilities of the humans operating it. Humans still need to understand what ChatGPT is trying to achieve, and how to produce the same result manually. “[Knowing] how to use a calculator is as important as having a calculator,” Maloney said. “ChatGPT is the same thing. What are the questions you are asking, and how do you help people understand that these are really good, useful questions and useful ways of using the tool?”

While ChatGPT may have initially been viewed as an essay writing tool, it doesn’t seem to be the academic disruptor or immediate industry changer that it first appeared to be. Instead, it may have a future as a collaborative tool for humans, like Wikipedia.

In thinking about how to prepare students for a future that integrates AI, there needs to be a consideration of how humans can develop skills that complement the technology. “Part of our investment in where we’re going and what we’re doing needs to be helping our students to think about the skills that allow them to actually work,” Maloney said. “Things that are fundamentally human—ethics and compassion and empathy—[are] the things that we want to make sure that we’re always investing in.”


Kulsum Gulamhusein
Kulsum is a senior in the College and the Executive Opinion Editor. She is a transfer student, and her favourite time of the year is when she gets to "play" coursicle.

