Some software may provide helpful feedback on professional writing, but can it really replace a human?
Feedback is necessary for a better overall product, especially in professional writing. How that feedback is delivered, though, may be expanding to include machines. How would automated feedback compare to human feedback? A study of engineering students and their perceptions of automated feedback offers insight into the future of feedback for professional writing.
In their article “Students’ Conceptions of Tutor and Automated Feedback in Professional Writing,” Rafael A. Calvo and Robert A. Ellis (2010) from the University of Sydney examined how engineering students reacted to feedback on e-business reports that they worked on over the course of six weeks. Students gave feedback on other students’ reports while faculty gave a workshop to help the students improve their writing. The students also received feedback from Glosser, a software program utilizing data mining and computational linguistics algorithms to help students improve their professional writing. After receiving feedback from Glosser, fellow students, and faculty, the students gave their opinions on machine versus human feedback.
Generally, the students concluded that while automated feedback had some advantages, the program did not provide comprehensive enough feedback to produce a polished document. Several students expressed distrust of machines and doubted whether technology could offer useful feedback beyond grammar, sentence structure, spelling, and other mechanics. Some students felt that automated feedback wouldn't be helpful for any aspect of writing and should be reserved for something more "exact," like math problems (434). The study indicated that for machine-derived feedback to be useful, students' perceptions of automated feedback would first need to improve.
Feedback in professional writing, especially within STEM fields, is a necessity. According to Calvo and Ellis (2010), "Feedback is required on both the academic style and the quality of the content" (427). Although machines can certainly provide useful writing feedback, the study's results suggest that writers in STEM fields may not trust that feedback for the seemingly inexact science of writing. Those in the professional writing world would do well to recognize that improved automated feedback programs may be on the horizon, but until then, the demand for peer editing and human feedback has not waned.
To learn more about automated feedback as it compares to tutor feedback, read the full article:
Calvo, Rafael A., and Robert A. Ellis. 2010. “Students’ Conceptions of Tutor and Automated Feedback in Professional Writing.” Journal of Engineering Education 99 (4): 427–438. https://doi.org/10.1002/j.2168-9830.2010.tb01072.x.
—Aaron Tobler, Editing Research
Find more research
Check out “The Truth About Peer Editing” on Editing Research to learn more about the importance of getting feedback on your writing.
Read Timothy J. Beals’s (1998) article to learn more about using software as a means of feedback: “Between Teachers and Computers: Does Text-Checking Software Really Improve Student Writing?” The English Journal 87 (1): 6. https://doi.org/10.2307/822025.