Should you get your child an AI doll this holiday?


(The Conversation) The technological revolution has hit the doll aisle this holiday season in the form of artificial intelligence dolls. These dolls pair a physical toy with either a mobile device and app or embedded sensors to simulate signs of intelligence.

As an education scholar who researches popular toys, I find these dolls make me pause and think about the direction of human progress. What does it mean for children's development to confuse real bodies with machines?

Take Luvabella. She's supposedly as close to a real baby as a doll gets. Luvabella looks at you with an astonishingly life-like, expressive face, her lips and cheeks moving animatronically, her sizeable eyes blinking. She responds to your voice, laughs when you tickle her toes, plays peek-a-boo, babbles and gradually "learns" new words the more you play with her. As babies go, she's pretty upbeat and low maintenance.

It may be that Luvabella is a natural progression in the evolution of dolls for children — from dolls that look real, to dolls that open and close their eyes when standing or lying down, to dolls that 'drink' from a bottle, 'pee' and talk.

She may be a natural progression of technological developments: from robotic vacuum cleaners, self-driving vehicles and automated weapons systems to autonomous dolls.

But when children are involved, questions about their safety and well-being may come to the fore. Is Luvabella problematic, even dangerous?

Children's emotional attachment

It's not the toy's potential to be hacked that is the problem. My Friend Cayla dolls were pulled from the market because children's locations and conversations with these dolls were recorded, tracked and transmitted online.

Luvabella isn't connected to the internet, so she isn't a privacy risk.

It's not the impact of these tech toys on children's imaginations either. Early childhood educators may worry that tech-enhanced toys will stifle children's creativity, but I like to believe that children's imaginative powers are pretty robust, and that an AI doll won't mar their ability to pretend.

My Friend Cayla dolls were pulled and discontinued because of privacy concerns. (Handout)

I'm comfortable — and familiar — with seeing a child derive companionship from a cuddly toy. But a robot doll is a new concept.

Children develop emotional connections with their stuffed animals and turn to them for comfort in moments of anxiety. The English paediatrician and psychoanalyst Donald Winnicott described the importance of a child's emotional attachment to his or her "transitional object" as a means of lessening the stress involved in separating from a parent.

Children instill special qualities of warmth and comfort onto their teddy bears, dolls and cuddly toys. They imbue their action figures with vitality and make voices and movements for them.

Like Calvin with his toy tiger Hobbes, acting as if a toy or soothing object is real is what most of us would deem a healthy part of growing up — what childhood is supposed to be. It's a form of childhood magic that makes us adults feel good. A robot doll does not.

Uncanny deception

With a regular doll, the child is the active agent in the relationship. Whatever the toy may give in return is the child's invention. We adults know that the teddy bear or cuddly doll is not alive. At a certain level so does the child.

Children can hold the real and the pretend in mind at the same time. This ability helps them engage in developmentally rich imaginative play. But it is not without moments of confusion.

When a robot doll responds, there may be uncertainty for the child (and at times the adult) about whether the doll is a living creature or not. This experience of confusion between human and automaton is how the German psychiatrist Ernst Jentsch explained the term "uncanny" in 1906.

The deception is unnerving. Maybe, as MIT technology professor Sherry Turkle worries, such toys offer children the illusion of companionship without the demands of a real relationship. It's the same reason I'm nervous about the idea of robot companions for the elderly.

Should objects mimic empathy?

To be sure, some forms of robotic assistive technology are downright interesting — robots that help a person move from bed to wheelchair, for example.

But expecting a robot to provide companionship or caring, or to somehow help a person feel less alone, is cause for concern. It raises a basic question:

Should an object mimic sentient presence, comfort, even empathy?

Indeed, Luvabella's features are similar to those of Paro, a robotic baby harp seal designed for use in care settings. This small automated animal simulates pet therapy by responding to voice direction and to being stroked.

Uncharted territory

Toy companies are under pressure to make artificial intelligence toys in order to keep up with societal and technological trends, and to keep turning a profit in a highly competitive business. It isn't their job to consider the human implications.

While the technology isn't quite there — at least not with Luvabella — to fully replace human contact, the direction of technological development is clear: artificial intelligence is generating ever more convincing simulations of human presence.

I have to ask: Is this where we want to go with our technological achievements? To develop convincing animatronic dolls to give to four-year-olds to befriend?

If so, stay tuned. We are stepping into uncharted territory.


The Conversation


