The trial was carried out in 2026 in a large introductory economics class (course 354).
The authors built FeedbackWriter, an LLM‑based tool that generates feedback suggestions for teaching assistants.
Students were randomly assigned to receive either handwritten TA feedback or AI‑mediated feedback.
A total of 1,366 essays were graded using the system.
Students receiving AI‑mediated feedback produced significantly higher‑quality revisions.
The improvement in revisions increased as TAs adopted a greater proportion of AI suggestions.
Teaching assistants reported that the AI suggestions helped spot gaps and clarify rubric criteria.
📖 Full Retelling
In a 2026 randomized controlled trial at a large introductory economics course (course 354), researchers Xinyi Lu, Kexin Phyllis Ju, Mitchell Dudley, Larissa Sano, and Xu Wang introduced FeedbackWriter—a system that supplies teaching assistants (TAs) with AI‑generated suggestions while they provide feedback on student essays. Students were randomly assigned to receive either handwritten feedback from TAs or feedback mediated by the AI, after which they revised their drafts and were re‑graded. The study found that students who received AI‑mediated feedback produced significantly higher‑quality revisions, and the effect grew as TAs adopted more AI suggestions.
🏷️ Themes
AI in Education, Human‑Computer Interaction, Experimental Evaluation of Pedagogical Interventions, Learning Outcomes in Undergraduate Courses, Instructor–Student Feedback Dynamics, Natural Language Processing for Academic Feedback
Deep Analysis
Why It Matters
The study shows that AI-mediated feedback can improve student revisions in large courses, pointing to a scalable way to support feedback at scale. It also shows that teaching assistants can adopt AI suggestions selectively, keeping human judgment in the loop.
Context & Background
Large introductory economics course in which 1,366 essays were graded with the system
Randomized controlled trial comparing handwritten TA feedback with AI-mediated feedback
FeedbackWriter lets TAs adopt, edit, or dismiss the AI suggestions
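The adopt/edit/dismiss workflow and the adoption rate it yields can be sketched in a few lines of Python. This is a hypothetical illustration of the idea, not code from the paper: the names `Action` and `apply_decisions` are our own, and counting edited suggestions toward adoption is our bookkeeping assumption, not something the study specifies.

```python
from enum import Enum

class Action(Enum):
    ADOPT = "adopt"      # keep the AI suggestion verbatim
    EDIT = "edit"        # keep the suggestion after rewriting it
    DISMISS = "dismiss"  # drop the suggestion entirely

def apply_decisions(suggestions, decisions):
    """Build the final feedback from TA decisions and compute an adoption rate.

    `decisions` is one (Action, replacement_text_or_None) pair per suggestion.
    Edited suggestions count toward adoption here -- an assumption of this sketch.
    """
    feedback, kept = [], 0
    for text, (action, edited) in zip(suggestions, decisions):
        if action is Action.ADOPT:
            feedback.append(text)
            kept += 1
        elif action is Action.EDIT:
            feedback.append(edited)
            kept += 1
    rate = kept / len(suggestions) if suggestions else 0.0
    return feedback, rate
```

For example, adopting one of three suggestions verbatim, editing one, and dismissing one gives an adoption rate of 2/3; the study's dose-response finding corresponds to higher rates like this predicting larger revision gains.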
What Happens Next
Future research may explore long-term learning gains and extend the system to other disciplines. Institutions might adopt similar AI tools to support teaching assistants in large classes.
Frequently Asked Questions
What is FeedbackWriter?
A system that generates AI suggestions for teaching assistants to incorporate into feedback on student essays.
Did the AI suggestions improve the quality of revisions?
Yes, students who received AI-mediated feedback produced higher-quality revisions, especially when more suggestions were adopted.
Can teachers reject AI suggestions?
Yes, teaching assistants can edit or dismiss the suggestions before giving feedback.
Original Source
Computer Science > Human-Computer Interaction
arXiv:2602.16820 [Submitted on 18 Feb 2026]
Title: AI-Mediated Feedback Improves Student Revisions: A Randomized Trial with FeedbackWriter in a Large Undergraduate Course
Authors: Xinyi Lu, Kexin Phyllis Ju, Mitchell Dudley, Larissa Sano, Xu Wang
Abstract: Despite growing interest in using LLMs to generate feedback on students' writing, little is known about how students respond to AI-mediated versus human-provided feedback. We address this gap through a randomized controlled trial in a large introductory economics course 354), where we introduce and deploy FeedbackWriter - a system that generates AI suggestions to teaching assistants while they provide feedback on students' knowledge-intensive essays. TAs have the full capacity to adopt, edit, or dismiss the suggestions. Students were randomly assigned to receive either handwritten feedback from TAs or AI-mediated feedback where TAs received suggestions from FeedbackWriter. Students revise their drafts based on the feedback, which is further graded. In total, 1,366 essays were graded using the system. We found that students receiving AI-mediated feedback produced significantly higher-quality revisions, with gains increasing as TAs adopted more AI suggestions. TAs found the AI suggestions useful for spotting gaps and clarifying rubrics.
Subjects: Human-Computer Interaction (cs.HC); Artificial Intelligence (cs.AI)
Cite as: arXiv:2602.16820 [cs.HC] (or arXiv:2602.16820v1 [cs.HC] for this version)
DOI: https://doi.org/10.48550/arXiv.2602.16820 (arXiv-issued DOI via DataCite, pending registration)
Submission history: From: Xinyi Lu, [v1] Wed, 18 Feb 2026 19:33:56 UTC (10,239 KB)