AI Shows Racial Bias in Student Essay Feedback
28 Apr
Summary
- AI provided more praise to Black students' essays.
- Hispanic students and English learners received more grammar corrections.
- Female students' essays received feedback emphasizing personal engagement.

A recent study of AI-generated feedback on student writing has uncovered significant racial and gender biases. Researchers analyzed 600 eighth-grade essays using several AI models, including versions of ChatGPT and Llama. The findings, published in March by Stanford University researchers, indicate that the AI systems tailored their feedback differently depending on the race and gender attributed to the student.
Essays attributed to Black students consistently received more praise and encouragement from the AI. Conversely, essays labeled as belonging to Hispanic students or English learners were more likely to trigger corrections related to grammar and 'proper' English usage. When essays were identified as being written by White students, the feedback focused more on argument structure and clarity.
Furthermore, the analysis showed that female students' essays often received feedback emphasizing personal engagement, with comments like 'I love your confidence.' In some instances, praise for certain groups, such as Black students or female students, took on overtly stereotyped forms. Researchers expressed concern that this uneven feedback, whether overly positive or excessively corrective, could hinder students' opportunities to revise and improve their writing.