In a move highlighting AI’s limitations in journalism and the public’s wary reception of it, Gannett, the parent company of USA Today and more than 200 other daily newspapers, recently halted an AI experiment amid criticism over poorly written sports articles.
It would be fair to say that those working in roles under threat from AI – which includes writing in various forms – are probably delighted when the technology fails to deliver work at a standard that could replace them.
Gannett’s experiment involved publishing AI-generated sports articles based on score data.
One article from The Columbus Dispatch, a Gannett-owned paper, awkwardly described a high school football game as “high school football action,” leading readers to label it “terrible.”
Adding to the awkwardness, another AI-generated phrase described an Ohio game as a “close encounter of the athletic kind.”
Yet another glaring error occurred in a Dispatch story dated August 19, where AI failed to generate names for sports mascots. Instead, the article read: “The Worthington Christian [[WINNING_TEAM_MASCOT]] defeated the Westerville North [[LOSING_TEAM_MASCOT]] 2-1 in an Ohio boys soccer game on Saturday.”
It’s quite remarkable that Gannett published these articles with such glaring errors. The story was later amended, and an editor’s note was added.
A spokesperson for Gannett confirmed they were suspending the experiment, stating, “In addition to adding hundreds of reporting jobs across the country, we have been experimenting with automation and AI to build tools for our journalists and add content for our readers.”
The spokesperson continued, “We have paused the high school sports LedeAI experiment and will continue to evaluate vendors as we refine processes to ensure all the news and information we provide meets the highest journalistic standards.”
The problematic articles were generated by software from LedeAI, a tech firm that aims to provide “reliable, readable, accurate local reporting.”
Jay Allred, the CEO of LedeAI, acknowledged the issues. The company says it “has been guiding news organizations in the deployment of useful and productive AI that keeps the control in their hands and puts their readers and staff first.”
He stated, “As with any new technological advance, some glitches can occur. We sincerely regret that a very small number of the 1,000+ articles we produced for Gannett newspaper sites on August 19th included some errors, unwanted repetition and/or awkward phrasing.”
Allred added that the public criticism had sparked a productive dialogue about the technology: “The conversation that started on X is a useful one to have.”
“There were legitimate problems with the reports we produced, and the feedback we received was valid. We took the criticism seriously and acted on it immediately.”
While some, like German publisher Axel Springer, speculate that AI could one day replace human journalists, others see this incident as evidence of the technology’s current limitations.
Nevertheless, Allred remains optimistic about AI’s role in journalism, believing that “content automation is part of the future of local newsrooms.”