Search engines Google and Bing readily display deepfake porn

January 12, 2024


An NBC News investigation has revealed that popular search engines, including Google and Microsoft’s Bing, are inadvertently promoting nonconsensual deepfake pornography.

Deepfake technology has become a catch-all term for any AI application that can generate extremely lifelike video, images, and audio.

It’s been increasingly used to create pornographic material without consent, often targeting female celebrities.

NBC News conducted searches for 36 popular female celebrities along with the term “deepfakes” on Google and Bing. The findings were alarming: Google displayed nonconsensual deepfake images in the top results for 34 of these searches, while Bing did so for 35.

Furthermore, these search results often included links to websites known for distributing such deepfake content. 

The problem is compounded by the fact that searching for terms like “fake nudes” on these platforms surfaces numerous links for creating and viewing nonconsensual deepfake porn.

Legal experts, advocates, and victims of this technology have expressed serious concerns. 

Nonconsensual deepfake porn not only violates privacy but also contributes to the sexual objectification and harassment of women. In some cases, deepfakes extend to child sexual abuse material, a problem linked to illicit images found in popular image training datasets such as LAION.

So far, there has been little systematic effort from Google to identify and remove this material, and doing so may not be realistically possible at scale.

A Google spokesperson stated, “We understand how distressing this content can be for people affected by it, and we’re actively working to bring more protections to Search.”

They emphasized that Google’s search algorithms are designed to avoid unexpectedly displaying harmful or explicit content.

Microsoft, on the other hand, has a policy against non-consensual intimate imagery (NCII) and includes sexually explicit deepfakes under this category. Victims can report such content appearing in Bing search results. 

A Microsoft spokesperson remarked, “The distribution of non-consensual intimate imagery is a gross violation of personal privacy and dignity with devastating effects for victims.”

Deepfakes are a thorny aspect of AI, offering malicious actors a means to interfere with elections, stir controversy around events like the Israel-Palestine war, and perpetrate scams.

This presents an intractable issue, as it will eventually become virtually impossible to tell AI-generated deepfakes apart from real media at the pixel level.


Sam Jeans

Sam is a science and technology writer who has worked in various AI startups. When he’s not writing, he can be found reading medical journals or digging through boxes of vinyl records.
