DeepNude, an application that uses machine learning to 'undress' people, mainly women, in photos, has been taken offline. The makers say they did not expect the app's popularity and considered the risk of abuse too great.
"The world is not yet ready for DeepNude," the makers state in a tweet, though earlier that day they had written that they were working to resolve technical issues caused by the high volume of visits and purchases. The makers do not address the reactions to the app. Responses on social media, both to the application and to its withdrawal, were mixed: some people were enthusiastic about the app's possibilities, while others labeled the idea as morally reprehensible. According to The Register, the app does not work well on photos of men.
The desktop version of the app, available for Linux and Windows 10, was only offered for five days. The browser version, however, had been active since March 28. The free version of the application worked, but covered the generated photos with prominent watermarks. The paid version, which cost $50, placed a smaller watermark.
Although the project has been stopped, activated paid copies and downloaded free copies will continue to work. These can be put online again by others, the free version can be cracked, and someone else can publish a clone of the application, which means it is effectively too late to stop DeepNude, although the makers will no longer earn money from it. DeepNude was developed by a team in Estonia.
The use of machine learning to forge photos and videos can produce striking results. The so-called DeepFakes app, for example, replaces the faces of actresses in porn videos with those of celebrities. Such images are prohibited on Twitter. The US Department of Defense has tools to identify these fakes.