
US man used AI to generate 13,000 child sexual abuse pictures, FBI alleges

Steven Anderegg allegedly used the Stable Diffusion AI model to generate photos; if convicted, he could face up to 70 years in prison

The FBI has charged a US man with creating about 13,000 sexually explicit and abusive images of children, which he allegedly generated using a popular artificial intelligence tool. Authorities also accused the man, 42-year-old Steven Anderegg, of sending pornographic AI-made images to a 15-year-old boy over Instagram.

Anderegg crafted about 13,000 “hyper-realistic images of nude and semi-clothed prepubescent children”, prosecutors stated in an indictment released on Monday, many of them depicting children touching their genitals or being sexually abused by adult men. Evidence from the Wisconsin man’s laptop allegedly showed he used the popular Stable Diffusion AI model, which turns text descriptions into images.

In the US, call or text the Childhelp abuse hotline on 800-422-4453 or visit their website for more resources and to report child abuse or DM for help. For adult survivors of child abuse, help is available at ascasupport.org. In the UK, the NSPCC offers support to children on 0800 1111, and adults concerned about a child on 0808 800 5000. The National Association for People Abused in Childhood (Napac) offers support for adult survivors on 0808 801 0331. In Australia, children, young adults, parents and teachers can contact the Kids Helpline on 1800 55 1800, or Bravehearts on 1800 272 831, and adult survivors can contact Blue Knot Foundation on 1300 657 380. Other sources of help can be found at Child Helplines International

© Photograph: Charlie Neibergall/AP

Downranking won’t stop Google’s deepfake porn problem, victims say

(Credit: imaginima | E+)

After backlash over Google's search engine becoming the primary traffic source for deepfake porn websites, Google has started burying these links in search results, Bloomberg reported.

Over the past year, Google has driven millions of visitors to controversial sites distributing AI-generated pornography that depicts real people in fake sex videos created without their consent, Similarweb found. While anyone can be targeted (police are already bogged down dealing with a flood of fake AI child sex images), female celebrities are the most common victims. Their fake non-consensual intimate imagery is readily discoverable on Google by searching just about any famous name alongside the keyword "deepfake," Bloomberg noted.

Google refers to this content as "involuntary fake" or "synthetic pornography." The search engine provides a path for victims to report that content whenever it appears in search results. And when processing these requests, Google also removes duplicates of any flagged deepfakes.


States Move to Ban Deepfake Nudes to Fight Sexually Explicit Images of Minors

Legislators in two dozen states are working on bills, or have passed laws, to combat A.I.-generated sexually explicit images of minors.

Β© Ruth Fremson/The New York Times

Caroline Mullet, a ninth grader, prompted her father, Mark, a Washington State senator, to work on a bill to ban A.I.-generated sexually explicit images of minors. The ban is set to take effect in June.