Now, AI can create fake humans. What's next?
With all the good things that AI brings to the table, there's also the risk of abuse. We have seen this in the case of 'deepfakes', where AI was used to create fake pornographic images of celebrities to damage their reputations. Now, going a step further, an AI has started creating images of people who don't exist at all. Let's take a look.
Japanese researchers' algorithm created 'fake humans'
Researchers from DataGrid, a Japanese start-up based out of Kyoto University, used a type of AI algorithm called a Generative Adversarial Network (GAN) to generate images of non-existent humans. As surprising as it may sound, their system did remarkably well: it generated thousands of humans - complete with different faces, skin tones, hairstyles, clothes, and poses - that looked just like the real deal.
How the AI created these images
The army of eerily realistic humans, who continuously morph from one person into another, was created by pitting two neural networks against each other. One of them (the generator) produces human images, while the other (the discriminator) checks whether each image is real or fake. The process continues until the generator masters image generation and the discriminator can no longer tell the difference.
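The adversarial loop described above can be sketched in miniature. The toy example below is an illustrative assumption, not DataGrid's actual system: instead of images, the generator learns to imitate a simple 1-D distribution of "real" numbers, and the discriminator is a tiny logistic model. All names and parameters here are invented for illustration.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def mean(xs):
    return sum(xs) / len(xs)

# "Real" data the generator must imitate: samples from N(4, 1.25).
# (In DataGrid's system, this role is played by photos of real people.)
def real_batch(n):
    return [random.gauss(4.0, 1.25) for _ in range(n)]

w, b = 1.0, 0.0   # generator: x = w*z + b, with noise z ~ N(0, 1)
a, c = 0.1, 0.0   # discriminator: D(x) = sigmoid(a*x + c)

lr, batch = 0.02, 64
for _ in range(2000):
    # Discriminator step: raise log D(real) + log(1 - D(fake)),
    # i.e. learn to label real samples 1 and fake samples 0.
    xr = real_batch(batch)
    xf = [w * random.gauss(0, 1) + b for _ in range(batch)]
    sr = [sigmoid(a * x + c) for x in xr]
    sf = [sigmoid(a * x + c) for x in xf]
    a += lr * (mean([(1 - s) * x for s, x in zip(sr, xr)])
               - mean([s * x for s, x in zip(sf, xf)]))
    c += lr * (mean([1 - s for s in sr]) - mean(sf))

    # Generator step: raise log D(fake), i.e. fool the discriminator.
    z = [random.gauss(0, 1) for _ in range(batch)]
    xf = [w * zi + b for zi in z]
    grad_x = [(1 - sigmoid(a * x + c)) * a for x in xf]
    w += lr * mean([g * zi for g, zi in zip(grad_x, z)])
    b += lr * mean(grad_x)

# After training, the generator's output distribution should have
# drifted toward the real data's mean of 4.
samples = [w * random.gauss(0, 1) + b for _ in range(1000)]
print(f"generated mean ~ {mean(samples):.2f}")
```

Real GANs replace these two linear models with deep convolutional networks and train on large image datasets, but the push-pull dynamic (each update of one network changes the target the other is chasing) is the same.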
Researchers say this could be very useful for businesses
As the researchers behind these AI-generated humans mentioned, this technology could be employed to create thousands of virtual models for e-commerce companies and brands. Now, this could be a boon for businesses - particularly those in the fashion industry - that tend to spend millions of dollars on the advertising and promotion of their products in the digital space.
But then, there's a risk of abuse too
These images, and those created by other similar AIs, are so realistic that it is hard to tell what's real and what's fake. Now, if AI manages to combine these faces with synthesized speech, expressions, and other human traits, we could see a completely fake person walking and talking in a video clip. That could fuel fake news, fabricated events, and much worse.