“Authentic” is the Merriam-Webster word of the year, but why?

  • The Merriam-Webster dictionary says the word "Authentic" has seen a spike in searches this year
  • This is partly due to deep fake content spreading across the news and social media
  • The Cambridge Dictionary chose "Hallucinate" as their equivalent, showing the cultural influence of AI

Merriam-Webster has selected “authentic” as their word of the year for 2023, explaining how it’s taken on new relevance with the rise of fake content, including AI-generated deep fakes. 

The word “authentic” was widely searched on Merriam-Webster’s online dictionary throughout the year. 

The word has various meanings, including being “real or genuine,” “true to one’s personality,” and “conforming to fact or an original.”


Merriam-Webster lexicographer Peter Sokolowski told the Associated Press, “We see in 2023 a kind of crisis of authenticity.” He noted the current challenge of trusting what we see and hear, adding, “Authenticity is a performance itself.”

Touching on the subject of deep fake media, Sokolowski said, “Can we trust whether a student wrote this paper? Can we trust whether a politician made this statement? We don’t always trust what we see anymore.”

The choice of “authentic” as word of the year, following “gaslighting” in 2022, marks the 20th anniversary of this tradition. The selection process involves analyzing data on word lookups, filtering out commonly searched words, and, this year, excluding popular five-letter words influenced by games like Wordle.

Runners-up for the word of the year included “X” (Elon Musk’s renaming of Twitter), “EGOT” (an acronym for the Emmy, Grammy, Oscar, and Tony Awards), and “Elemental” (a new Pixar film).

Other shortlisted words included: 

  • Rizz
  • Deepfake
  • Implode
  • Dystopian
  • EGOT
  • X
  • Indict

AI has thoroughly entered the lexicon

The Cambridge Dictionary recently chose “hallucinate” as its own word of the year for 2023.

“Hallucinate,” traditionally associated with experiencing sensations of things that don’t exist, usually due to health conditions or substance use, has now been redefined by AI. 

In AI, a hallucination occurs when a generative model produces false or erroneous information, typically by filling gaps in its knowledge with plausible-sounding but factually incorrect output.

Cambridge Dictionary added a new definition for AI hallucinations: “When an artificial intelligence (= a computer system that has some of the qualities that the human brain has, such as the ability to produce language in a way that seems human) hallucinates, it produces false information.” 

AI, like other influential technologies before it, has become part of our collective cultural expression. 

© 2023 Intelliquence Ltd. All Rights Reserved.
