
ReadME – Enhancing Automated Writing Evaluation

Authors:
Stefan Trausan-Matu
Robert-Mihai Botarleanu
Mihai Dascalu
Scott A. Crossley
Maria-Dorinela Sirbu
Source:
Artificial Intelligence: Methodology, Systems, and Applications (AIMSA), ISBN: 9783319993430
Publication Year:
2018
Publisher:
Springer International Publishing, 2018.

Abstract

Writing is a central skill for learning that is tightly linked to text comprehension. Good writing skills are gained through practice and are characterized by clear and organized language, accurate grammar usage, strong text cohesion, and sophisticated wording. Providing constructive feedback can help learners improve their writing; however, providing feedback is a time-consuming process. The aim of this paper is to present an updated version of the ReadME tool, which generates automated, personalized feedback designed to help learners improve the quality of their writing. Sampling a corpus of over 15,000 essays, we used the ReaderBench framework to generate more than 1,200 textual complexity indices. These indices were then grouped into six writing components using a principal component analysis (PCA). Based on the components generated by the PCA, as well as on individual index values, we created an extensible rule-based engine that provides personalized feedback at four granularity levels: document, paragraph, sentence, and word. The ReadME tool consists of a multi-layered, interactive visualization interface capable of providing feedback to writers by highlighting sections of text that may benefit from revision.
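The pipeline the abstract describes (complexity indices reduced to components via PCA, then a rule-based engine flagging texts for feedback) can be sketched as follows. This is a minimal illustration with synthetic data, not the authors' implementation: the matrix shape, the number of retained components, the percentile threshold, and the `feedback` helper are all assumptions made for the example.

```python
import numpy as np

# Hypothetical data: 200 essays (rows) scored on 12 textual complexity
# indices (columns). In ReadME the real matrix would hold 1,200+ indices
# computed by ReaderBench over 15,000+ essays.
rng = np.random.default_rng(0)
indices = rng.normal(size=(200, 12))

# PCA via SVD on the mean-centered index matrix: project each essay
# onto the first three principal components (the paper retains six).
centered = indices - indices.mean(axis=0)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
components = centered @ Vt[:3].T  # shape: (200, 3) component scores

def feedback_flags(scores: np.ndarray, threshold_pct: float = 25.0) -> np.ndarray:
    """Toy rule: flag essays whose first-component score falls in the
    bottom quartile, marking them as candidates for revision feedback."""
    cutoff = np.percentile(scores[:, 0], threshold_pct)
    return scores[:, 0] < cutoff

flags = feedback_flags(components)
```

In the actual tool, such rules would be defined per component and per granularity level (document, paragraph, sentence, word), with the flagged spans highlighted in the visualization interface rather than returned as a boolean mask.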

Details

ISBN:
978-3-319-99343-0
ISBNs:
9783319993430
Database:
OpenAIRE
Journal:
Artificial Intelligence: Methodology, Systems, and Applications (AIMSA), ISBN: 9783319993430
Accession number:
edsair.doi...........7d94bec1657b253d70c12ff6d17122ea
Full Text:
https://doi.org/10.1007/978-3-319-99344-7_28