Over the past decade, the rise of deep learning image synthesis technology has made it increasingly easy for anyone with a consumer PC to create misleading pornography by swapping the performers' faces with those of someone who has not consented. That practice spawned the term "deepfake" around 2017, named after a Reddit user called "deepfakes" who shared AI-faked porn on the service. Since then, the term has grown to encompass entirely new images and video synthesized from scratch by neural networks trained on images of the victim.
The problem isn't unique to the UK. In March, deepfake nudes of female middle school classmates in Florida led to charges against two boys, ages 13 and 14. The spread of open source image synthesis models like Stable Diffusion since 2022 has increased the urgency among US regulators to contain (or at least punish) the creation of non-consensual deepfakes. The UK government is on a similar mission.