ChatGPT might be capable of passing the LSAT, but it struggles to write a college admissions essay. The findings are telling for legal bloggers.
In the article “Can ChatGPT get into Harvard?” Washington Post reporters Pranshu Verma and Rekha Tenjarla tested its essay writing.
Hardly, found former Ivy League admissions counselor Adam Nguyen, who previously advised students at Harvard University and read admissions applications at Columbia University.
Responses written by ChatGPT often lack specific details, producing essays with little supporting evidence for their points. The writing is trite, relying on platitudes to explain situations rather than delving into the author’s emotional experience. The essays are often repetitive and predictable, leaving readers without surprise or a sense of the writer’s journey. And when chatbots produce content on issues of race, sex, or socioeconomic status, they often fall back on stereotypes.
At first, Nguyen was impressed by the AI-generated essays: They were readable and mostly free of grammatical errors. But if he was reviewing the essay as part of an application package, he would’ve stopped reading.
“The essay is such a mediocre essay that it would not help the candidate’s application or chances,” he said in an interview. “In fact, it would probably diminish it.”
I frequently use ChatGPT in my blogging. I’ve experimented with its writing capabilities, whether generating a sentence, drafting a paragraph, or producing a summary. The results have been mixed.
ChatGPT is better used as an assistant and editor. It can help with a range of tasks, including creating outlines, checking grammar, summarizing, drafting social media posts, and crafting engaging titles. It’s a beneficial tool.
I recommend that legal bloggers experiment with ChatGPT to see how it fits into their blogging as an assistant and editor, not as the writer of their posts.