A seedy app that used a type of artificial intelligence to “undress” images of clothed women has been taken offline by its developers.

Motherboard first drew attention to DeepNude, reporting on Wednesday that the app used machine learning technology to “undress” the images by algorithmically superimposing body parts on them.

Citing an anonymous programmer who goes by the alias “Alberto,” Motherboard reported that the app had been “trained” using a machine learning technique known as a Generative Adversarial Network (GAN). The app only works on images of women because thousands of nude photos of women are readily available online in pornography, Alberto told Motherboard.
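The article does not describe DeepNude's implementation, but the GAN idea it references pairs two networks: a generator that fabricates samples and a discriminator that tries to tell them from real ones, each trained against the other. The following is a minimal, hypothetical sketch of that training loop in PyTorch; the layer sizes, data, and names are illustrative assumptions, not DeepNude's actual code.

```python
# Minimal GAN training loop (PyTorch). Purely illustrative of the adversarial
# idea; the architecture and random "data" here are invented for the example.
import torch
import torch.nn as nn

latent_dim, data_dim = 16, 64

# Generator: maps random noise vectors to fake "samples".
G = nn.Sequential(nn.Linear(latent_dim, 128), nn.ReLU(), nn.Linear(128, data_dim))
# Discriminator: outputs a logit scoring how "real" a sample looks.
D = nn.Sequential(nn.Linear(data_dim, 128), nn.ReLU(), nn.Linear(128, 1))

opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
loss_fn = nn.BCEWithLogitsLoss()

for step in range(1000):
    # Stand-in for a batch of real training images (here: random vectors).
    real = torch.randn(32, data_dim)
    fake = G(torch.randn(32, latent_dim))

    # 1) Train the discriminator to label real samples 1 and fakes 0.
    d_loss = loss_fn(D(real), torch.ones(32, 1)) + \
             loss_fn(D(fake.detach()), torch.zeros(32, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # 2) Train the generator to make the discriminator label fakes as real.
    g_loss = loss_fn(D(fake), torch.ones(32, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```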

The app sparked outrage and was widely condemned. By Thursday, its servers were also struggling to cope with demand.

On Thursday, the @deepnudeapp Twitter account tweeted that the app was offline. “Why? Because we did not expect these visits and our servers need reinforcement. We are a small team. We need to fix some bugs and catch our breath. We are working to make DeepNude stable and working. We will be back online soon in a few days,” it said.

A subsequent statement posted to the account explained that the app had been taken offline for good. “We created this project for user’s entertainment a few months ago. We thought we were selling a few sales every month in a controlled manner,” it said. “Honestly, the app is not that great, it only works with particular photos. We never thought it would become viral and we would not be able to control the traffic.”

“Despite the safety measures adopted (watermarks) if 500,000 people use it, the probability that people will misuse it is too high,” DeepNude said. “We don’t want to make money this way.”

“The world is not yet ready for DeepNude,” it added.

A free version of the app placed a large watermark across the images, Motherboard reported, while a $50 version placed a smaller watermark in a corner of the image that could easily be cropped out.

The controversy over DeepNude comes amid growing concern over the potential negative impact of AI. The technology, for example, can be used to create “deepfake” videos with superimposed images or sound.

Follow James Rogers on Twitter @jamesjrogers