Please use this identifier to cite or link to this item:
http://hdl.handle.net/10497/24305
Title: A systematic review of automated writing evaluation systems
Authors: Huawei, S.; Aryadoust, V.
Keywords: Automated writing evaluation; Argument-based validation; Automated essay scoring
Issue Date: 2022
Citation: Huawei, S., & Aryadoust, V. (2022). A systematic review of automated writing evaluation systems. Education and Information Technologies. Advance online publication. https://doi.org/10.1007/s10639-022-11200-7
Journal: Education and Information Technologies
Abstract: Automated writing evaluation (AWE) systems are developed on the basis of interdisciplinary research and technological advances such as natural language processing, computer science, and latent semantic analysis. Despite a steady increase in research publications in this area, the results of AWE investigations are often mixed, and their validity may be questionable. To yield a deeper understanding of the validity of AWE systems, we conducted a systematic review of the empirical AWE research. Using Scopus, we identified 105 published papers on AWE scoring systems and coded them within an argument-based validation framework. The major findings are: (i) AWE scoring research showed a rising trend but was heterogeneous in terms of language environments, ecological settings, and educational levels; (ii) a disproportionate number of studies were carried out on each validity inference, with the evaluation inference receiving the most research attention and the domain description inference being the most neglected; and (iii) most studies adopted quantitative methods and yielded positive results that backed each inference, although some studies also presented counterevidence. The lack of research on domain description (i.e., the correspondence between AWE systems and real-life writing tasks), combined with the heterogeneous contexts, indicates that construct representation in the AWE scoring field needs extensive investigation. Implications and directions for future research are also discussed.
URI: http://hdl.handle.net/10497/24305
ISSN: 1360-2357 (print); 1573-7608 (online)
DOI: 10.1007/s10639-022-11200-7
File Permission: Embargo_20230801
File Availability: With file
Appears in Collections: Journal Articles
Files in This Item:
File | Description | Size | Format
---|---|---|---
EIT-2021-112007.pdf | Under embargo until Aug 01, 2023 | 457.34 kB | Adobe PDF
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.