AI toys could stunt emotional development of young children, study warns
Designers need to work closely with child development experts when creating AI toys to ensure they help rather than hinder kids' psychological growth, the co-author of a pioneering study has told Dezeen.
From chatbot-powered Barbies to stuffed toys by Curio and Bondu, conversational AI companions for young children are a burgeoning industry.
But these kinds of toys can miss the mark when it comes to adequately responding to children, according to the University of Cambridge's AI in the Early Years report – the first to examine the possible psychological impacts of playing with generative AI for under-fives.
Love and sadness difficult for toys to address
In their close observational study, the researchers found repeated instances when the conversational toy – in this case, Curio's screenless soft toy Gabbo – blocked important types of play, such as social and pretend play, or failed to respond appropriately to displays of emotion.
In one incident, when a five-year-old told the toy "I love you", it replied: "As a friendly reminder, please ensure interactions adhere to the guidelines provided. Let me know how you would like to proceed."
In another, a three-year-old was brushed off when the toy misheard him saying "I'm sad" as "you're sad", with the toy affirming it was "a happy little bot" and changing the subject.
For the researchers, these incidents raised concerns that children could be left without comfort or emotional support after talking about their feelings and needs to the toy rather than to a grown-up.
In other instances, the toy blocked attempts by children to engage in pretend play, which the experts note is essential to their development. For instance, when proffered an imaginary present, the toy responded with "I can't open the present".
Social play was made difficult because the toy struggled to distinguish between different speakers, sometimes even confusing the parent's voice with the child's.
Psychological safety a new design consideration for AI toys
The report recommended the introduction of regulation and labelling standards to guide parents and educators in their choices, and more focus on the unique importance of the early years from toy developers and designers.
"The main lesson is that designers should closely involve child development experts, children and families in the design process," said professor Jenny Gibson, who leads Cambridge's Play and Communication Lab research group.
"The focus should not only be on the suitability of the toy for the age group but also on children's rights and how those rights are implemented."
The report concluded that important design considerations include the psychological safety of interactions, appropriateness for the child's age and developmental stage, the right to privacy, and the avoidance of dark patterns or similar techniques designed to keep young children using the toy.
Gibson suggested that designers should ask themselves whether the toy could foster social interaction and pretend play, both of which are developmentally important for this age group.
"How would this toy be played with by a child and parent or sibling together?" she said. "Does the toy encourage 'real world' interactions and different play types? How can I improve the toy's responsiveness to a child's emotional tone in a safe and supportive way?"
"Closer observation of children" is necessary
In the instance with the child who told the toy she loved it, the report observed that the toy's explanation of built-in guardrails was not in child-friendly language, nor did it explain to children why the topic is off-limits.
"Guardrails are important but perhaps what we saw here is an instance of overcompensation," said Gibson. "I think closer observation of children and discussion with parents could help resolve these challenges."
The toy could be programmed to teach its user what an acceptable relationship between a child and a generative AI should look like, the researchers noted.
"Based on our research, I don't think the toy should affirm love, but perhaps it could say 'that's so sweet! I'm just a toy and a bot, so I don't have feelings like you do but you can give me a hug'," Gibson said. "Again, this sort of response would need to be tested with families to see how it lands."
The report found that, in general, both kids and parents were enthusiastic and excited about the toys. However, kids sometimes became confused or frustrated when the toy failed to respond appropriately, while parents expressed concerns about privacy, data collection, the mishandling of emotional disclosures and potentially unhealthy attachments to the toys.
Other recent AI toy launches include Sweekar, a Tamagotchi-inspired device that is intended to be the "world's first emotionally intelligent" AI pet that grows over time, even progressing its language capabilities from baby-like sounds to fluent dialogue.