Computer programs that offer automated feedback on writing are quickly becoming commonplace for students—but writers and editors cannot rely on these programs for accurate suggestions.
The ever-evolving frontier of machine learning is steadily inserting itself into every vocational field, and editing and writing are no exception. Computer programs that offer feedback on writing promise to make writing and revising easier, but they risk introducing errors or inconsistencies into a manuscript. Despite this risk, technology-savvy editors and writers might be tempted by the convenience that programs like Grammarly claim to offer. By examining and quantifying Grammarly’s corrective tendencies, we can get a better idea of whether it is worth the risk.
In 2021, Marina Dodigovic and Artak Tovmasyan conducted a study to examine the writing feedback offered by the software program Grammarly and published their results in an article called “Automated Writing Evaluation: The Accuracy of Grammarly’s Feedback on Form.” Specifically, the study attempted to answer two questions: How accurate is Grammarly at identifying grammar errors? And how accurate is Grammarly at correcting these errors? Although Grammarly also offers corrections on style and semantics, this study looked purely at grammar errors.
To answer these questions, the researchers first created a corpus of EFL (English as a Foreign Language) student writing. Next, they asked human raters (an English professor and three graduate students) to identify any errors within the writing. The researchers then reviewed the errors and consolidated them into a single list. Finally, they ran the corpus through Grammarly and compared the program’s corrections with the human raters’ corrections.
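The comparison at the heart of this method can be pictured with a minimal sketch: for each error category, count what share of human-flagged errors the program also flagged. The data and category names below are invented for illustration only; they are not the study’s actual figures.

```python
# Hypothetical illustration of comparing human raters' error flags with
# an automated tool's flags, per error category. All data are invented.

human_flags = {
    "verb tense": {"e1", "e2", "e3", "e4"},
    "relative clause": {"e5", "e6", "e7", "e8"},
}
tool_flags = {
    "verb tense": {"e1", "e2", "e3"},
    "relative clause": {"e5"},
}

def detection_rate(human, tool):
    """Share of human-identified errors that the tool also caught."""
    return {
        category: len(errors & tool.get(category, set())) / len(errors)
        for category, errors in human.items()
    }

rates = detection_rate(human_flags, tool_flags)
print(rates)  # {'verb tense': 0.75, 'relative clause': 0.25}
```

Treating the human raters as the gold standard, as the study does, a rate like 0.25 would mean the tool caught only a quarter of the errors the humans found in that category.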
Dodigovic and Tovmasyan (2021) found that Grammarly’s corrections had a 66% accuracy rate compared to the assumed 100% accuracy rate of the human raters. This discrepancy stood out with regard to errors in relative clauses, verb omission, subject omission, and sentence fragments. In each of these categories, Grammarly detected fewer than 25% of the errors that the human raters detected, whereas other categories of grammar errors, such as verb tense, subject-verb agreement, and determiner errors, overlapped more with human corrections. Notably, the human raters and Grammarly agreed most closely (88.6% of the time) on verb tense errors (Dodigovic and Tovmasyan 2021, 81). The researchers concluded, “In light of the findings of this study, the usage of Grammarly as a stand-alone product may not be optimal” (84).
Editors strive for accuracy and consistency, and any computer program that could introduce error or inconsistency into writing should be used with extreme caution, if at all. This is not to say that there is no use for Grammarly: many people may benefit from the suggestions that the program provides, as long as changes are reviewed and implemented with good judgment.
It is also worth noting that the results of this study are provisional: future developments in machine learning, including the creation of more advanced programs, may yield useful and reliable writing feedback programs for professional editors and writers. For now, though, this study suggests that Grammarly is not a substitute for good grammatical sense.
To find out more about the accuracy of Grammarly’s feedback, read the full article:
Dodigovic, Marina, and Artak Tovmasyan. “Automated Writing Evaluation: The Accuracy of Grammarly’s Feedback on Form.” International Journal of TESOL Studies 3, no. 2 (2021): 71–87. Gale Academic OneFile (accessed February 5, 2022). https://doi.org/10.46451/ijts.2021.06.06.
—Simon Laraway, Editing Research
Find more research
Take a look at Aaron Tobler’s article, “Is Automated Feedback as Effective as a Tutor in Professional Writing?” for more insights into the value and drawbacks of automated feedback.
Read J. M. Dembsey’s (2017) article to find out more about Grammarly’s feedback in a student-oriented setting: “Closing the Grammarly® Gaps: A Study of Claims and Feedback from an Online Grammar Program.” The Writing Center Journal 36, no. 1: 63–100. http://www.jstor.org/stable/44252638.