School counselors Stephanie Nelson and Richard Tench, while hundreds of miles apart, give their rising seniors the same assignment when asked for a letter of recommendation: Take a “brag” sheet, fill it out with challenges they’ve overcome or accomplishments they’re particularly proud of, and give it back to the counselors to help guide their writing.
It’s a common counseling technique. What’s also catching on: counselors then plug those student achievements into a generative AI tool to help compose the letter of recommendation.
“I’m not taking away the personal part, and I’m still using my tried-and-true counseling techniques and skills and enhancing what’s already in mind with it,” Nelson, a school counselor in North Carolina, says. “I’ve joked with students who say, ‘This is wonderful,’ and I’ll say, ‘ChatGPT helped.’”
Students are already turning to AI for everything from study aids to mental health needs. And with the gap widening between the number of students and the number of counselors available in schools, generative AI could become a tool that helps both groups do their part to complete college applications.
“I’ve plugged things into AI to help me strengthen the letter; sometimes for time, sometimes I get stuck — when you have to write hundreds of letters a year,” says Tench, who estimates he writes between 120 and 150 letters of recommendation each year at his school in West Virginia. “It’s definitely a useful tool. While it helps them fine-tune their resumes, it can also fine-tune our letters to show the best in our students while also keeping our voice.”
How Widespread Is This Use?
While plenty of headlines have been penned about students’ generative AI use and education institutions’ concerns, the question of school counselors using it to aid the college admissions process has gone largely unaddressed.
There is little data, if any, on that specific use. The American School Counselor Association told EdSurge it does not track it.
For counselors who are seeing rising numbers of students — Nelson sees roughly 380 students, while Tench calls himself “lucky” with a 275-to-1 ratio — AI tools could lighten the load, if only slightly.
But opinions about this are mixed. In a focus group for foundry10, an education research organization, conversations swirled between tech-forward teachers and their more novice counterparts, bringing to light the stark difference in their AI usage.
“You get reactions of teachers that are already AI-forward and the others kind of shocked or surprised, like, ‘Really? You’d use it for something like letters of recommendation?’” says Riddhi Divanji, a technology, media, and information literacy team lead at foundry10.
That discussion led to a 2024 study, which found that roughly 1 in 3 students and teachers self-reported using some form of generative artificial intelligence to help with college essays or letters of recommendation. Divanji, a co-author of the study, acknowledged that number has most likely risen since the data was first gathered in the spring of 2024.
“Students were wanting to experiment with the tools but wanted to do it with boundaries; and no one was helping them understand what those boundaries were,” she says.
The study found students turned to their parents first for help, then to teachers and counselors. For first-generation students whose parents did not attend college, or for students who couldn’t afford college admissions coaches whose fees can run hundreds of dollars an hour, “then it would make sense to turn toward this tool to help,” Divanji says.
Using AI Ethically to Apply to College
The usage should come with guardrails. Hannah Quay-de la Vallee, a senior technologist at the Center for Democracy & Technology, encourages students to write their own essays at first, then be specific with requests.
“If you say, ‘Write my essay,’ it’ll be much more error-prone, versus, ‘Help me come up with a thesis statement,’ or ‘My introductory paragraph isn’t punchy enough,’” she says, adding that a human, not a computer, should always be the last to review an essay. “The more targeted tasks you can give it the better. And keep a real strong eye out for error and bias.”
She also encourages both students and faculty to look at exactly what tool they are using for help. Many education technology companies are touting their own solutions, which are “wrappers for ChatGPT, or Gemini or Claude,” she says, with no actual education research involved.
“Honestly, just use ChatGPT at that point,” she adds.
Tench and Nelson both say they are upfront about their use of AI and expect the same of their students, repeatedly stating the best usage is to help brainstorm or fine-tune ideas.
Each expert also reminded educators and students to check both high school and college AI policies first. Colleges are mixed on allowing students to use AI in their applications, with some encouraging it and others banning it outright.
AI policies are often hyperlinked in the admissions application, but they can take some digging to find, Tench says.
“It’s following our policies as a school but also their policies in college to make sure those guidelines are followed; that’s part of the ethical and responsible AI usage for students that’s so important,” he says. “For some, it’s harder because AI is the easy way out. But I feel the longer it’s around, the more intentional and systemic we can be in training them in the do’s and don’ts.”