
Content provided by Malwarebytes. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by Malwarebytes or its podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process described here: https://es.player.fm/legal.

San Francisco’s fight against deepfake porn, with City Attorney David Chiu

20:55
 

On August 15, the city of San Francisco launched an entirely new fight against the world of deepfake porn—it sued the websites that make the abusive material so easy to create.

“Deepfakes,” as they’re often called, are fake images and videos that utilize artificial intelligence to swap the face of one person onto the body of another. The technology went viral in the late 2010s, as independent film editors would swap the actors of one film for another—replacing, say, Michael J. Fox in Back to the Future with Tom Holland.

But very soon into the technology’s debut, it began being used to create pornographic images of actresses, celebrities, and, more recently, everyday high schoolers and college students. Similar to the threat of “revenge porn,” in which abusive exes extort their past partners with the potential release of sexually explicit photos and videos, “deepfake porn” is sometimes used to tarnish someone’s reputation or to embarrass them amongst friends and family.

But deepfake porn is slightly different from the traditional understanding of “revenge porn” in that it can be created without any real relationship to the victim. Entire groups of strangers can take the image of one person and put it onto the body of a sex worker, or an adult film star, or another person who was filmed having sex or posing nude.

The technology to create deepfake porn is more accessible than ever, and it’s led to a global crisis for teenage girls.

In October 2023, more than 30 girls at a high school in New Jersey reportedly had their likenesses used by classmates to make sexually explicit and pornographic deepfakes. In March of this year, two teenage boys were arrested in Miami, Florida, for allegedly creating deepfake nudes of male and female classmates between the ages of 12 and 13. And at the start of this September, the BBC reported that police in South Korea were investigating deepfake pornography rings at two major universities.

While individual schools and local police departments in the United States are tackling deepfake porn harassment as it arises—with suspensions, expulsions, and arrests—the process is slow and reactive.

Which is partly why San Francisco City Attorney David Chiu and his team took aim not at the individuals who create and spread deepfake porn, but at the websites that make it so easy to do so.

Today, on the Lock and Code podcast with host David Ruiz, we speak with San Francisco City Attorney David Chiu about his team’s lawsuit against 16 deepfake porn websites, the city’s history in protecting Californians, and the severity of abuse that these websites offer as a paid service.

“At least one of these websites specifically promotes the non-consensual nature of this. I’ll just quote: ‘Imagine wasting time taking her out on dates when you can just use website X to get her nudes.’”

Tune in today.

You can also find us on Apple Podcasts, Spotify, and Google Podcasts, or whichever podcast platform you prefer.

For all our cybersecurity coverage, visit Malwarebytes Labs at malwarebytes.com/blog.

Show notes and credits:

Intro Music: “Spellbound” by Kevin MacLeod (incompetech.com)

Licensed under Creative Commons: By Attribution 4.0 License

http://creativecommons.org/licenses/by/4.0/

Outro Music: “Good God” by Wowa (unminus.com)

Listen up—Malwarebytes doesn’t just talk cybersecurity, we provide it.

Protect yourself from online attacks that threaten your identity, your files, your system, and your financial well-being with our exclusive offer for Malwarebytes Premium for Lock and Code listeners.


