University of Pittsburgh

Automatic Scoring of an Analytical Response-To-Text Assessment

ISP Graduate Student
Date: Friday, September 19, 2014, 1:00pm - 1:30pm

In analytical writing in response to text, students read a complex text and adopt an analytic stance in their writing about it. To evaluate this type of writing at scale, an automated approach to scoring the Response to Text Assessment (RTA) is needed. With the long-term goal of producing informative feedback for students and teachers, we design a new set of interpretable features that operationalize the Evidence rubric of the RTA. When evaluated on a corpus of essays written by students in grades 4-6, our features outperform baselines based on well-performing features from other types of essay assessment.
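The abstract does not spell out the features themselves, but as a minimal sketch of what an interpretable Evidence feature could look like, the toy Python function below counts how many distinct pieces of source-text evidence an essay mentions. The phrase lists, matching rule, and function name are illustrative assumptions, not the authors' actual method.

```python
# Illustrative sketch only: a toy "evidence coverage" feature in the spirit of
# an interpretable Evidence-rubric feature. The phrase lists and matching rule
# are hypothetical; the actual RTA features are not specified in this abstract.

def evidence_coverage(essay: str, evidence_phrases: list[list[str]]) -> int:
    """Count how many distinct pieces of source-text evidence the essay mentions.

    Each piece of evidence is given as a list of alternative key phrases;
    a piece counts as "used" if any of its phrases appears in the essay.
    """
    text = essay.lower()
    return sum(
        any(phrase.lower() in text for phrase in alternatives)
        for alternatives in evidence_phrases
    )


if __name__ == "__main__":
    # Hypothetical example: two pieces of evidence from a source text.
    phrases = [
        ["bed nets", "malaria nets"],        # evidence piece 1
        ["school fees", "free schooling"],   # evidence piece 2
    ]
    essay = "The village improved because children no longer paid school fees."
    print(evidence_coverage(essay, phrases))  # -> 1
```

Because each feature corresponds directly to a rubric criterion, a score built from such features can be traced back to concrete evidence in the essay, which is what makes feedback to students and teachers possible.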
