DAI#7 – AI fakes celebs, politicians, and wants your dreams

October 6, 2023

Welcome to the best AI news roundup your inbox has seen.

This week AI dished out deepfakes of some of our favorite celebs and least favorite political figures.

DALL-E 3 is free, but OpenAI still wants to charge you for it.

And AI wants to watch you while you sleep and slide into your dreams.

Let’s dig in.

Fake it ’til they make it

Hollywood’s writers’ strike is finally over and we can watch television again as nature intended. The deals struck are a little fuzzy around the edges as far as how AI will impact the jobs of creatives.

AI gave some celebrities a head-start by having them feature in a growing collection of deepfake scams. No, MrBeast isn’t going to give you an iPhone 15 Pro for $2, and Tom Hanks isn’t overly concerned about your dental health.

The idea of using AI to bring your favorite dead actor back to life on-screen sounds interesting but should we do it? Robin Williams’s daughter raises some good points in her objection to AI-generated versions of her father’s famous voice.

War and AI, together at last

DARPA is throwing money at researching how to use AI to make better battlefield decisions. Once its army of autonomous AI robots picks up its guns, it’ll need to be told where to go pretty quickly.

When the robots do eventually rise up they’ll probably be using a version of Google’s new robotic models and datasets. They’re getting robots to do some pretty clever stuff now.

AI is supercharging cybersecurity threats, so the NSA is getting in on the AI action to protect US national security. The agency is particularly worried about China stealing US AI tech or sabotaging the AI tools the US is using.

Edward Snowden may have his own views on how much safer we should feel now that the NSA is leveraging AI.


Now we can’t even trust politicians

While scammers are trying to make a quick buck with AI fakes, some people are seeing the bigger picture. Why scam a few bucks when you could influence who runs a country?

A sketchy audio clip of a conversation between a Slovak politician and a reporter made its way onto Facebook just hours before election polls opened. It was confirmed as an AI fake, but it highlighted the threat AI poses to election integrity.

In countries that already have a poor record of credible elections, the danger is even more apparent. The apparently fake audio of former Sudanese president Omar al-Bashir that appeared on TikTok is a case in point.

It’s no wonder then that authorities like the UK government want to look inside AI’s ‘black box’ in advance of the global AI Safety Summit.

At least British royalty will feel a little safer. The man who said his AI “angelic” girlfriend told him to kill the Queen with a crossbow in 2021 was sentenced to nine years in prison.

Bing scoops ChatGPT with DALL-E 3 release

The prospect of getting to use DALL-E 3 convinced a lot of people to finally sign up for ChatGPT Plus. Bing promptly swooped in and released the amazing image generator for free on its platforms.

You may want to avoid clicking some ads on Bing Chat unless you’d like a dose of malware.

OpenAI says it adds a digital watermark to images generated with DALL-E 3, but it turns out that AI image watermarks don’t work, and probably never will.

Apparently, lazy researchers use images they’ve grabbed from other papers to fake their results. This AI tool finds them quicker than human specialists can.

But with DALL-E 3 on ChatGPT Plus, dodgy researchers can just make their own fake research graphs or pics and don’t need to steal someone else’s work anymore. The AI giveth, and the AI taketh away.

Enter Sandman

Diagnosis of obstructive sleep apnea (OSA) normally involves sleeping in a hospital for a few nights while hooked up to a bunch of sensors. This AI-powered night-vision camera will watch you while you sleep in the comfort of your own bed to see if you wake up gasping.

If you’d like a little more control while in dreamland, you might like the idea of lucid dreaming on demand. Prophetic says its AI wearable headband will help you do just that. But first, it wants to record your brainwaves while you dream to train its model.

AI watching you while you sleep and recording your dreams sounds like a nightmare in the making.

Meta says ‘Your data is our data’

Don’t want Meta to use your Facebook or Instagram posts to train its AI? Turns out you may not have a choice. But it’s okay, because Meta says it ‘tried to exclude personal data’. If the service is free, you’re the product.


At least the company is making some cool AI tools with your data. Meta will soon launch AI tools for creating generative AI ads. If you’ve ever navigated their Ads Manager platform this will come as a huge relief.

Last week everyone was fawning over the cute stickers Meta’s new custom AI sticker generator could make. Some users manipulated the Emu tool to produce some really funny, and even shocking, stickers for your collection.

In other news…

Here are some other click-worthy AI stories that we enjoyed reading this week:

And that’s a wrap of this week’s juicy AI news. I’m not sure how I feel about AI watching me while I sleep but I really want one of those lucid dream wearables. Would you offer up your dream data to train an AI?

Also, we’re not surprised that Meta is using FB and Instagram data to train its AI, but have you seen the stuff people post? What kind of AI will they end up with?

Let us know if we missed a good AI story. And if you snuck an interesting sticker past Meta’s content filter we’d love to see it.


Eugene van der Watt

Eugene comes from an electronic engineering background and loves all things tech. When he takes a break from consuming AI news you'll find him at the snooker table.
