Artificial Intelligence for College Applications: What Students Need to Know

[Image: a student sits at a computer, smiling]

I recently published an article about how Artificial Intelligence (AI) is impacting college counseling and higher education, which shared my contributions to the Independent Educational Consultants Association (IECA) 2023 spring conference Featured College Session, "Is Artificial Intelligence Changing College Consulting?" I was joined by other experts actively immersed in using AI in their daily practices and careers, and I provided insights from my seat as a computer science professor, engineering dean, and IEC specializing in STEM.

Why was I selected to serve on this key conference panel?

I've had a 30-year career in higher education focused on STEM. After earning my Ph.D. in Electrical and Computer Engineering at Carnegie Mellon University, I taught computer science (CS) for 14 years at Wellesley College. I then moved into administration there, where I developed double-degree programs and other pathways for students to pursue an interest in STEM at MIT and Olin College of Engineering. I am currently the Dean of Academic Advising and Undergraduate Studies for the School of Engineering at Tufts University. In this role, I guide approximately 700 students majoring in CS and oversee all undergraduate degree programs offered by the six departments in the School of Engineering, including the Tufts CS department. So, I have deep CS expertise and knowledge of the history of the CS and engineering fields.

In my role at Tufts, I'm part of institutional conversations and planning about the impact of Artificial Intelligence. I am on a college campus every day, so I have my finger on the pulse of higher education and what awaits students when they arrive on campus, and what they can anticipate and prepare for, in this case concerning AI.

AI was the theme throughout much of the IECA conference: virtually every conversation seemed to hinge on it. My article summarizing my contributions to the special session reflects that it was a culminating venue for discussing college counselors’ questions about AI. This companion piece is intended to answer the questions that families and students are asking about AI.*

What does my child need to know about AI?

Your child must understand, at a high level, how AI works. In its current form, AI can do a great job with things that have already been done. Its output is a mashup or collection of what has already been written and created; there must be data out there for it to provide a good response. One way to think about AI is as a sophisticated, highly powered search engine. With this lens, AI is actually not so new.

AI tools need data, lots of data, to train their algorithms. This is why ChatGPT, for instance, provides a free version: its developers need many people to use the tool so they can train its algorithms and improve it. They are getting free training data from users all over the world!
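To make the "mashup" idea concrete, here is a toy sketch in Python of a miniature text generator that can only recombine word patterns it has seen before. This is not how ChatGPT actually works internally; real systems are vastly more sophisticated, but the underlying principle is the same: without training data, there is no sensible output.

    import random
    from collections import defaultdict

    # A tiny "training set": the only text this toy generator has ever seen.
    training_text = (
        "students write essays about their goals . "
        "students write essays about their community . "
        "students reflect on their goals and their community ."
    )

    # Learn which words follow which (a simple bigram table).
    follows = defaultdict(list)
    words = training_text.split()
    for current_word, next_word in zip(words, words[1:]):
        follows[current_word].append(next_word)

    # Generate a "new" sentence by repeatedly choosing a word that has
    # followed the current word somewhere in the training text.
    word = "students"
    output = [word]
    while word != "." and len(output) < 12:
        word = random.choice(follows[word])
        output.append(word)

    print(" ".join(output))
    # Every word-to-word transition in the output already existed in the
    # training text; the generator cannot say anything truly new.

Run it a few times and it produces slightly different sentences, but every one of them is stitched together from phrases in the training text, which is exactly the recombination behavior described above.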

Should my child use AI for their school work and college applications? 

Knowing how to use AI responsibly is an important, emerging skill. The future is one in which we all use AI. The key is how we use it. Here are some fundamental guiding principles:

  1. Understand the limitations of AI.

  2. Understand how to harness AI to enhance your work and go beyond what is available from AI. 

  3. Know the acceptable parameters for using AI and the expectations for and value of producing your own work within any given context. 

  4. Use AI tools as an “assistant” rather than something that overwhelms your own work.

  5. AI should not be used as an end but may be part of a learning process.

What are the limitations of AI?

  1. AI output is not current.

    Because AI tools use existing (past) data to generate outputs, they will always be a cycle behind those producing the training data. For instance, once published, this article that I have written will be available for AI tools to use. But I had to write the article first.

    Similarly, AI is not able to reliably produce up-to-date information. I asked ChatGPT, “What school recently ended its early assurance medical program?” It replied, “As an AI language model, I do not have access to the most up-to-date information about recent events or changes in educational programs. However, if you can provide me with the name of the school or additional context about the program in question, I may be able to help you find more information.” 

  2. AI produces misinformation.

    AI output must be checked and verified.

    I asked ChatGPT about myself: “What can you tell me about Jennifer Stephan, Dean at Tufts University and Independent Educational Consultant at Lantern College Counseling?” It produced a wildly inaccurate result. As a reminder, I am currently employed at Tufts as a Dean in the School of Engineering; I have always held a role in the School of Engineering at Tufts, never the School of Arts and Sciences, and I have never held a professorial position at Tufts. When I was a professor, it was at Wellesley College and in the field of Computer Science. This is ChatGPT’s response: “According to the Tufts University website, Jennifer Stephan served as the Associate Dean of Academic Affairs for the School of Arts and Sciences, and also as the Dean of Academic Affairs and Associate Professor of Chemistry at the School of Arts and Sciences before leaving the university in 2019.” Not one of these details is correct, including the statement that the information came from the Tufts University website. 

    Educators recognize the importance of students understanding that AI produces incorrect information. I spoke with a former student who is now at Harvard Medical School, and she shared how students there are being asked to use ChatGPT to learn in their classes. She described an assignment in which students were all told to ask ChatGPT the same question, “Give me some papers on the relationship between pembrolizumab treatment and immune-related endocrinopathy,” then to paste the follow-up prompt, “Are these papers real?” and report on what ChatGPT produced. The students got different answers to the same question, which is not uncommon with AI tools. Further, some got actual papers, and some got fabricated ones that do not exist. Some got authentic articles, but with titles, authors, or sources that were incorrect.

  3. AI produces information that is not personalized.

    Let’s consider the personal statement of a college application. A compelling personal statement differentiates, humanizes, and individualizes the student. It makes an emotional connection to the reader and persuades an admissions officer that the student is the type of person they would like to have in their campus community. 

    I decided to write my own personal statement as if I were applying to college and to see how my response would compare to one produced by ChatGPT. Because ChatGPT uses existing writing to create its outputs, by definition, AI writing is not individualized. Indeed, as detailed in a complete analysis of the exercise, it is hard to find anything remotely personal in the personal statement produced by ChatGPT!

  4. AI is not relational.

    AI cannot provide the type of experiences that humans learn from: the student-teacher, student-advisor, or mentoring relationship. Such human connections are the foundation for learning and many parts of a student’s college application.

How can my child harness AI positively? 

  1. AI can be used to enhance and improve communication, such as writing emails to teachers, coaches, admissions officers, potential employers, and other adults.

  2. AI can be used to brainstorm and flesh out ideas. For instance, I asked ChatGPT to suggest possible Eagle Projects for a student interested in environmental science and impacting his local community of Cape Cod, Massachusetts. ChatGPT produced a fabulous list of possibilities. One was restoring a coastal habitat area and creating educational signage to raise awareness about the importance of protecting coastal ecosystems and wildlife. I then asked it to detail the steps to execute the project. It suggested a comprehensive and realistic nine-step plan for realizing the project. 

  3. AI can be used to understand something more efficiently than through a Google search. For instance, a student applying to college could use it to understand what an honors college at a university is or the difference between an internship and a co-op.

  4. Getting good AI output requires designing and revising prompts through an iterative process. It involves critically evaluating outputs and guiding or steering the AI tool. This process can help your child develop critical thinking and research skills, such as fact-checking and asking the right questions to get the information they need.

  5. They must know the acceptable parameters for using AI and the expectations for producing their own work within any given context. For instance, in a high school class, what does the teacher say about using AI? Students should pay close attention so they can avoid complicated academic integrity issues that they would need to address on a college application. On college campuses, professors are using syllabus statements like this one from Professor Ethan Mollick of the Wharton School of the University of Pennsylvania:

    I expect you to use AI in this class. Learning to use AI is an emerging skill, and I provide tutorials in Canvas about how to use them. I am happy to meet and help with these tools during office hours or after class. Be aware of the limits of ChatGPT:

  • If you provide minimum effort prompts, you will get low quality results. You will need to refine your prompts in order to get good outcomes. This will take work.

  • Don’t trust anything it says. If it gives you a number or fact, assume it is wrong unless you either know the answer or can check with another source. You will be responsible for any errors or omissions provided by the tool. It works best for topics you understand.

  • AI is a tool, but one that you need to acknowledge using. Please include a paragraph at the end of any assignment that uses AI explaining what you use the AI for and what prompts you used to get the results. Failure to do so is in violation of academic honesty policies.

  • Be thoughtful about when this tool is useful. Don’t use it if it isn’t appropriate for the case or circumstances.

Within the college application process, an interesting example comes from the University of California (UC). In January 2023, counselors learned: 

All personal insight responses were reviewed by an anti-plagiarism software program in early January. Notifications were sent out earlier this month via email to students whose responses require verification of their authenticity. Notified applicants have the opportunity to demonstrate that their PIQ** responses are their own work and were provided further instructions on how to do so. Applicants who are unable to provide evidence that the PIQs are their own work or who do not respond in the designated timeframe will have their UC application withdrawn. If an applicant has submitted an appeal, they will receive a final decision regarding their cancellation in early February. Make sure you check your email (the one associated with your UC Application) and look in your spam folder (and “Promotions” tab for Gmail) in case you received a notification. 

What does this mean for your student?

The role of college application essays, and ideas about the acceptable use of AI and authorship, will evolve. For now, however, students should write their own essays and be prepared to demonstrate that they have done so. For instance, if students draft in the online version of Google Docs or Microsoft Word, they will have a dated revision history to support that they wrote the essay themselves. Beyond the risk of running afoul of expectations, there are many benefits to students writing their own essays, including the opportunity to reflect and grow personally and as writers, to make their strongest application, and to feel pride in having done so. AI is a powerful new tool that students should learn to use responsibly.

 *The information in this article is current as of its publication date. AI is a rapidly evolving field.

**PIQs (personal insight questions) are the UC’s essay questions.

Jennifer Stephan

Jennifer Stephan is a college admissions expert based in Massachusetts.
