The sharing of so-called “deepfake porn” should be made illegal in the UK, according to a government-backed review which warned current laws do not go far enough to cover “disturbing and abusive new behaviours born in the smartphone era”.

The Law Commission on Thursday laid out a series of recommendations relating to deepfake porn, where computers generate realistic but fake sexualised images or video content of an individual without their consent.

The independent body, which looks at whether legislation needs to be overhauled, has been reviewing existing laws on the non-consensual taking, making and sharing of intimate images since 2019.

There is currently no single criminal offence in England and Wales that applies to non-consensual intimate images. The report proposes widening the motivations covered by these crimes to include motives such as financial gain, as well as extending automatic anonymity to all victims of intimate image abuse.

Only victims of voyeurism and upskirting receive these protections under existing law, and prosecutors must prove that perpetrators acted to cause distress or for sexual gratification.

The review comes as advances in deep learning have meant that deepfakes are increasingly available online and cheap to use, with fake videos of politicians and celebrities proliferating on the internet.

The use of these tools in porn, where often a person’s face is superimposed on to a porn actor’s body in a video, has led the Department for Digital, Culture, Media and Sport select committee as well as campaign groups to call for it to be criminalised.

“Altered intimate images are almost always shared without consent,” said Professor Penney Lewis, the law commissioner for criminal law. “[They] often cause the same amount of harm as unaltered intimate images shared without consent.”

The phenomenon has “dramatic under-reporting” as victims do not have anonymity under current laws, which “do not go far enough to cover disturbing and abusive new behaviours born in the smartphone era,” she added.

The review comes as the long-awaited Online Safety Bill makes its way through parliament. Many of the Law Commission’s previous recommendations have already been added to the legislation, including criminalising revenge porn and cyberflashing, where an indecent image is sent without the recipient’s consent.

The government said the Online Safety Bill “will force internet firms to protect people better from a range of image-based abuse — including deepfakes” and that it would consider the proposals.

Companies including Twitter, Reddit and PornHub have already banned deepfake porn generated without the person’s consent. In the US, Virginia and California have made it illegal, while Scotland has criminalised the distribution of deepfake porn.

Last month the European Union also strengthened its disinformation rules to include deepfakes. Under a new EU code of practice, regulators can fine technology companies up to 6 per cent of their global turnover if they do not crack down on deepfakes.
