Editors facing the rise of AI aren’t surrendering control; they’re defining what creativity, credibility, and authorship mean in the age of automation.
As artificial intelligence (AI) infiltrates nearly every creative field, editors are wondering where to draw the line for its use in publications. AI tools promise efficiency and cost savings, but editors aren't convinced those gains are worth the risk to readers' trust. Their hesitation isn't just about the technology; it's about confidence in the AI tools themselves and in the publication's credibility with its audience. Can a machine truly replicate human creativity, judgment, and voice? As publishers grapple with shrinking staffs and growing pressures, the question isn't whether AI can help, but whether it should.
THE RESEARCH
In their 2025 study "Helpful or Hurtful: Legacy Magazine Journalism Wrestles with the Advent of AI Technology in Newsrooms," Adam Pitluk, Jennifer Wilson, and Jeffrey Inman investigated how magazine editors are responding to the rise of generative AI technologies. Through in-depth, semistructured interviews, they asked 15 editors from leading publications recognized by the American Society of Magazine Editors about their current AI usage.
The interviews covered how the editors currently use AI, what concerns they have about its adoption, and how they draw the boundaries between human and machine work. The researchers used grounded theory (a method that builds explanations from observed patterns) and thematic analysis (a way of identifying recurring ideas in qualitative data) to examine the profession's cautious, often resistant, stance toward AI integration. Three key themes emerged.
- Editors widely view generative AI output as content that has been unethically taken from others. Because the technology is trained on copyrighted material, it raises ethical concerns about plagiarism and authorship, both when writers submit AI-generated work and when editors consider using AI tools themselves.
- Participants argued that AI simply cannot replicate the quality of human work, from nuanced storytelling to on-the-ground reporting that captures emotional and cultural context.
- Despite these strong feelings, most magazines lack formal AI policies. Editors admitted that they are too understaffed and overworked to create comprehensive guidelines, leaving decisions about AI use, whether by writers or editors, to individual discretion.
The researchers concluded, "Editors must balance AI's capabilities with their editorial responsibility to readers, ensuring that technology serves the needs and values of the community they serve" (Pitluk, Wilson, and Inman 2025, 13).
THE IMPLICATIONS
This study suggests that for editors, the decision to use AI is extremely nuanced. While AI tools are becoming increasingly unavoidable, editors’ concerns must be addressed to preserve quality, trust, and the human voice. Editors need to ensure that AI is used ethically, avoiding copyright infringement and protecting authorship, while also maintaining the quality of work so that storytelling, emotional depth, and cultural context are not compromised. Developing clear AI policies in the editorial workplace helps guide consistent, responsible decisions with this new technology.
Just as editors had to reevaluate how they operated when publishing moved from print to online, they should now approach AI with similar thoughtfulness. By addressing these concerns proactively, magazines can move forward responsibly, using technology to support rather than replace human editors.
To learn more about how magazine editors are negotiating the rise of AI in journalism, read the full article:
Pitluk, Adam, Jennifer Wilson, and Jeffrey Inman. 2025. "Helpful or Hurtful: Legacy Magazine Journalism Wrestles with the Advent of AI Technology in Newsrooms." Journalism Practice, 1–16. https://doi.org/10.1080/17512786.2025.2458608.
—Macady Whitehead, Editing Research
FEATURE IMAGE BY RON LACH
Find more research
Take a look at Nick Diakopoulos’s (2019) book to learn how automation is reshaping newsroom practices: Automating the News: How Algorithms Are Rewriting the Media. Harvard University Press.
Read Margaret Le Masurier’s (2015) article to explore how “slow journalism” values depth and reflection amid digital acceleration: “What Is Slow Journalism?” Journalism Practice 9 (2): 138–152. https://doi.org/10.1080/17512786.2014.916471.
Check out Meredith Broussard et al. (2019) for an overview of AI’s broader impact on journalism ethics and production: “Artificial Intelligence and Journalism.” Journalism & Mass Communication Quarterly 96 (3): 673–695. https://doi.org/10.1177/1077699019859901.