Buckingham Palace has warned of legal action over fake, AI-generated books describing King Charles’s recent cancer diagnosis.
King Charles announced he’d been diagnosed with a ‘form of cancer’ found after a minor medical procedure.
We don’t know much more than that, though it seems likely the cancer has been caught early. King Charles has withdrawn from public duties while he receives treatment.
In the days following this announcement, several AI-generated books appeared on Amazon, falsely alleging insider knowledge about King Charles’s health, including specific details about his medical condition and treatment.
They were criticized for spreading false information about the King’s health, including unfounded claims that he had undergone surgery and suffered side effects from chemotherapy.
These books were sold alongside legitimate royal biographies, lending them an unearned air of credibility.
In a statement, Buckingham Palace condemned the publications: “Any such titles speculating about His Majesty’s diagnosis and treatment are intrusive, insensitive and filled with inaccuracies. Our legal team will be looking at the issue closely. We call on any individuals or organisations facilitating their sale to withdraw them immediately.”
Royal insiders and commentators have expressed disappointment, criticizing Amazon for allowing the sale of misleading content – a chronic problem since the rise of generative AI.
Free AI tools like ChatGPT (running GPT-3.5 Turbo) make it possible to generate an entire book in hours or even minutes, and the result can be published through Amazon Books without any upfront fee.
OpenAI has also extended GPT-3.5 Turbo’s maximum output length, making it possible to generate thousands of words in seconds, albeit generally at low quality and with poor factual accuracy.
This has led to a huge influx of poor-quality or outright factually incorrect work on Amazon Books. In some cases, the content could be illegal or dangerous.
For instance, last year, Amazon pulled mushroom foraging books for recommending the consumption of potentially lethal fungi.
In September 2023, Amazon introduced stricter regulations, limiting authors to uploading just three self-published titles per day.
Amazon later required authors to declare whether AI was used in their works, though a voluntary disclosure scheme was never likely to deter bad actors.
Despite Amazon’s efforts, this recent incident involving King Charles’s health reinforces the challenges platforms face in policing AI-generated content.