Cambridge Dictionary reveals an AI-related “Word of the Year” 

November 15, 2023
Cambridge AI

The Cambridge Dictionary has announced “hallucinate” as its Word of the Year for 2023, assigning it a new meaning linked to AI.

The term, traditionally associated with sensory experiences of nonexistent things due to health conditions or substance use, now also encompasses the phenomenon of AI generating false information.

Users of generative AI tools such as ChatGPT, Bard, and Grok are already (or should already be) familiar with hallucinations, which tend to occur when a model ‘fills in the gaps’ in its knowledge by predicting a plausible-sounding continuation rather than a factual one.
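To make that mechanism concrete, here is a deliberately simplified sketch. It is not any real model's API, and the prompt and probabilities are invented for illustration; the point is that a language model samples its next words from a learned probability distribution, which rewards plausibility rather than truth.

```python
import random

# Hypothetical next-token probabilities a model might assign after the
# prompt "The James Webb Space Telescope took the first-ever picture of".
# The values are invented for illustration, not taken from any real model.
next_token_probs = {
    "a planet outside our solar system": 0.45,  # fluent but false
    "distant galaxy clusters": 0.35,
    "the early universe in infrared": 0.20,
}

tokens = list(next_token_probs)
weights = list(next_token_probs.values())

# Sampling favours whichever continuation *sounds* likeliest given the
# training data -- factual grounding never enters the calculation.
completion = random.choices(tokens, weights=weights, k=1)[0]
print("The James Webb Space Telescope took the first-ever picture of", completion)
```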

The newly added definition of “hallucinate” states: “When an artificial intelligence (= a computer system that has some of the qualities that the human brain has, such as the ability to produce language in a way that seems human) hallucinates, it produces false information.”

The addition also illustrates the evolving role of AI in our everyday lives and the cultural shifts that follow.

Despite advances in generative AI models, the reliability of their outputs remains under scrutiny, as these systems can still produce inaccurate or nonsensical information.

There have been numerous high-profile cases of people relying on AI without double-checking its output, only for that output later to be revealed as a hallucination. One notable example involved a US lawyer who submitted a legal brief on behalf of a plaintiff that cited fabricated cases generated by ChatGPT.

Additionally, Google’s AI chatbot Bard made a factual error about the James Webb Space Telescope in a promotional demonstration, embarrassingly claiming that the telescope took the very first pictures of a planet outside our solar system.

Wendalyn Nichols, the publishing manager at Cambridge Dictionary, highlighted the importance of human oversight in AI usage, stating, “The fact that AIs can ‘hallucinate’ reminds us that humans still need to bring their critical thinking skills to the use of these tools. AIs are fantastic at churning through huge amounts of data to extract specific information and consolidate it. But the more original you ask them to be, the likelier they are to go astray.”

That we’ve extended a word rooted in our own human experience to describe AI behavior is linguistically intriguing.

Dr. Henry Shevlin, an AI ethicist at the University of Cambridge, described this as an example of anthropomorphizing AI technology in order to understand it through our own human experience.

Shevlin said, “The widespread use of the term ‘hallucinate’ to refer to mistakes by systems like ChatGPT provides […] a fascinating snapshot of how we’re anthropomorphising AI.” 

New AI and tech terms in the Cambridge Dictionary 

Beyond “hallucinate,” the Cambridge Dictionary added several other AI and tech-related terms:

  • Prompt engineering: Designing prompts for optimal AI responses (a brief illustration follows this list).
  • Large language model (LLM): A mathematical representation of language built from very large amounts of data, enabling human-like language generation.
  • GenAI: Short for generative AI, encompassing AI systems that create text, images, and other content.
  • Train: In machine learning, to develop or improve a system by feeding it data.
  • Black box: A system whose internal workings are not transparent to the user.
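As a quick illustration of the first entry (a hypothetical example, not drawn from the dictionary), compare a bare prompt with an “engineered” one that constrains the format and discourages guessing:

```python
# Hypothetical example of prompt engineering: the same request phrased
# with and without constraints. Either string could be sent to a
# generative model; only the second steers the format of the response
# and discourages the model from inventing missing details.
bare_prompt = "Summarise this article."

engineered_prompt = (
    "You are a careful editor. Summarise the article below in three "
    "bullet points, using only claims stated in the text. If a detail "
    "is missing, write 'not stated' rather than guessing.\n\n"
    "Article:\n{article_text}"
)

print(engineered_prompt.format(article_text="..."))
```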

These additions, along with terms like “shadowban,” “vibe check,” “water neutral,” and “range anxiety,” show how modern language evolves around emerging technological themes and trends, as it has for decades.

Sam Jeans

Sam is a science and technology writer who has worked in various AI startups. When he’s not writing, he can be found reading medical journals or digging through boxes of vinyl records.
