Readability Annotation: Replacing the Expert by the Crowd

Publication type
C1
Publication status
Published
Authors
van Oosten, P., & Hoste, V.
Journal
Proceedings of the Sixth Workshop on Innovative Use of NLP for Building Educational Applications
Pagination
120-129
Publisher
Association for Computational Linguistics (Portland, Oregon)
External link
http://www.aclweb.org/anthology/W11-1415
Project
Hendi

Abstract

This paper investigates two strategies for collecting readability assessments: an Expert Readers application, intended to collect fine-grained readability assessments from language experts, and a Sort by Readability application, designed to be intuitive and open to anyone with internet access. We show that the data sets resulting from the two annotation strategies are very similar. We conclude that crowdsourcing is a viable alternative to the opinions of language experts for readability prediction.
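
The abstract does not specify how the similarity between the two data sets was measured; as one illustration only, agreement between expert and crowd readability judgments could be quantified with a rank correlation over per-text scores. The texts, scores, and variable names below are hypothetical placeholders, not data or code from the paper.

```python
# Minimal sketch: comparing expert and crowd readability rankings with
# Spearman rank correlation. All values are hypothetical placeholders.
from scipy.stats import spearmanr

# Hypothetical mean readability scores per text (higher = harder to read)
expert_scores = [2.1, 3.8, 1.4, 4.6, 2.9]  # e.g., from an expert annotation tool
crowd_scores = [2.4, 3.5, 1.2, 4.9, 3.1]   # e.g., from a crowdsourcing tool

rho, p_value = spearmanr(expert_scores, crowd_scores)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")
```

A high rank correlation on such paired scores would be consistent with the abstract's claim that the two annotation strategies yield very similar data sets.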