Samsung unveiled its 'artificial human' called Neon

NEON is inspired by how humans look, behave and interact


AFP January 08, 2020
PHOTO: Twitter

LAS VEGAS: Avatars touted as "artificial humans" created a buzz Tuesday at the Consumer Electronics Show, even as debate swirled over what exactly the digital entities were.

Star Labs, a startup funded by Samsung, showed the painstakingly detailed AI-powered, two-dimensional digital creations to a large crowd, saying they are able to "converse and sympathise" like real people.

A demonstration at CES showed conversations and gestures from the digital creations modeled after real humans.

According to the California-based unit of the South Korean company, the technology allows for the creation of customised digital beings that can appear on displays or in video games, and could be designed to be "TV anchors, spokespeople, or movie actors" or even "companions and friends."

Pranav Mistry, chief executive of the lab, said the creations known as NEONs are modeled after people but can show highly detailed expressions and gestures, and even new characteristics that can be programmed.

The creations "look very human, in part because they are modeled after a human," but can even speak languages the person had not spoken before, Mistry said.

Developing the creations, Mistry said, "felt like magic to us and we wanted to share this magic."

According to Star Labs, NEON is inspired "by the rhythmic complexities of nature and extensively trained with how humans look, behave and interact."

While digital avatars have long been programmable for specific tasks, such as playing roles in games, NEON goes further by enabling interactions that can incorporate human emotion.

Although artificial humans may borrow features from real people, "each NEON has his or her own unique personality and can show new expressions, movements, and dialogues," the company said.

The NEON creators said the new virtual humans are the product of advances in technologies including neural networks and computational reality.

But the invention did not impress everyone.

Ben Wood of the consultancy CCS Insight said he was "underwhelmed" after seeing the NEONs.

He tweeted that, at the booth, they just looked like "videos of actors which can be manipulated to do certain actions. I must be missing something."

Avi Greengart of the consultancy Techsponential said the avatars could be realistic but also "creepy."

"Leaving aside how impressive the technology is, will NEON be used in ways that people like, just tolerate, or actively hate?" he said.

Jack Gold, an analyst at J. Gold Associates, said Samsung may be ahead of the pack if it can develop avatars that can show emotions and expressions but also questioned the potential for abuse.

"It has major implications for many fields like customer service, help desk functions, entertainment, and of course could also be used to 'fake' a human interacting with a live person for bad or illegal purposes."

The announcement comes amid a proliferation of AI-manipulated computer videos known as "deepfakes," and growing concerns about how they could be used to deceive or manipulate.

But Mistry said the computational techniques behind deepfakes are "completely different" from those used for NEON.

He told AFP that his technology "doesn't manipulate any content" but "creates the content as it goes along."

He maintained that "no one will ever have access to the technology at its core, and that is what we are designing from the ground up."

Mistry offered no specifics on the company's business model but suggested partner firms may want to use these digital creations for various services.

The India-born Mistry is best known for developing SixthSense, a gesture-based wearable technology system built at the Massachusetts Institute of Technology.
