As generative AI tools become more widely used, teens are facing a growing and disturbing trend: so-called “deepfake nudes” — fake, sexually explicit images of real people, often minors. The nonprofit Thorn, a group dedicated to protecting kids from online sexual exploitation, says one in eight teens knows someone who has been targeted.
Today, I’m speaking with the organization’s vice president of research and insights about the risks kids are facing right now, and what families, schools and tech companies could do to better respond.
Join us again for our 10-minute daily news roundups every Mon-Fri!
Learn more about our guests: https://www.theNewsWorthy.com/shownotes
Sign up for our bonus weekly EMAIL: https://www.theNewsWorthy.com/email
Become an INSIDER for ad-free episodes: https://www.theNewsWorthy.com/insider
Get The NewsWorthy MERCH here: https://www.theNewsWorthy.com/merch
Sponsors:
Go to HiyaHealth.com/NEWSWORTHY to get 50% off your first order of their best-selling children's vitamin.
Shop the SKIMS T-Shirt Shop at https://www.skims.com/newsworthy #skimspartner
To advertise on our podcast, please reach out to ad-sales@libsyn.com
#DeepFakes #AI #Online