Cover image via Market Research Telecast & Edventure TV (YouTube)
According to VICE, Masayuki Nakamoto was arrested specifically for selling 10 fake photos at about JPY2,300 (about RM83) each.
It is learnt that he had made JPY11 million (about RM400,000) from selling over 10,000 manipulated videos.
The self-employed man was arrested by the Kyoto Prefectural Police for allegedly violating the Copyright Act and displaying obscene electromagnetic record media, reported Japanese daily The Mainichi.
Using the AI model 'TecoGAN', an engine that can sharpen the resolution of pixelated content, he processed and sold videos at the request of customers.
VICE likened the technology used by Nakamoto to 'deepfakes', software that can perform realistic face swaps in videos.
The controversial technology is not only infamous for creating non-consensual porn by swapping the faces of people, primarily celebrities, onto adult video actors, but it is also seen as a threat to democracy, as the software makes it easy to spread fake news.
In Nakamoto’s case, instead of changing faces, he used machine learning software to reconstruct the blurred parts of the videos based on a large set of uncensored nudes, and sold the finished 'uncensored' content online.
In Japan, it is against the law to display vaginas and penises in adult video and photo content.
He was not charged with any offences for violating the privacy of the actors in the videos.
The 43-year-old admitted that he did it for the money, reported VICE, citing Japan’s public broadcaster NHK.
He was caught during a prefectural police ‘cyber patrol’, reported The Mainichi. The police intend to investigate him for further crimes.
Nakamoto’s arrest marks what is believed to be Japan’s first case involving such an offence.
"This is the first case in Japan where police have caught an AI user," Daisuke Sueyoshi, a lawyer who has tried cybercrime cases, told VICE.
“At the moment, there’s no law criminalising the use of AI to make such images.”