Talking Toys

Erin Kernohan-Berning

12/9/2025 · 4 min read


In November the U.S. Public Interest Research Group (PIRG) published its 40th Trouble in Toyland report. US PIRG is a consumer protection advocacy group which, among other activities, has been releasing its toy safety report since 1986, influencing toy recalls and child safety legislation. In the 1980s, the report's concerns would have centred on dangers like lead content and choking hazards. In its 40th edition, PIRG focused on AI-powered toys geared toward children – specifically, AI toys that are designed to converse with children.

Talking toys are nothing new. Thomas Edison made a phonograph doll in 1890 that contained a miniature phonograph in the torso. The phonograph was wound using a hand crank and played small wax cylinders with a nursery rhyme recorded on them. The doll was a complete failure and was pulled from the market within weeks of being released, amid complaints of loose parts, wax cylinders that wore out quickly, and recordings that sounded… well… kind of creepy. You can find digital versions of some of the original phonograph doll recordings on the Edison's Phonograph Doll Wikipedia entry. Also, the dolls cost up to $20 at the time (over $500 in today's money), which is very expensive for a toy even by today's standards.

Even though Edison's version was a commercial failure, that wasn't the end of the talking doll. Talking dolls in the 1920s and 1930s could make simple sounds, like crying noises or something that sounded kind of like "mama," using a simpler bellows mechanism like the squeaker found in a dog's chew toy. In the 1950s, Chatty Cathy came on the scene. Made by Mattel, the doll could say 11 phrases; pulling a string in her back spun a disc against a metal stylus (just like a record player). The doll was popular enough that the name "Chatty Cathy" is still used (snidely) to refer to a very talkative person.

Given how eagerly humans have historically inserted technology into toys to make them seem like they can talk to us, it's not surprising that toy companies are turning to AI for the same reason. In fact, this past summer Mattel announced a partnership with OpenAI to start developing new "AI-powered experiences and products." In China, talking AI toys have already been popular for a while, and some of the toys available there are now reaching international markets, including Canada.

Trouble in Toyland focused on three toys marketed for children 12 and under: Grok, by the California-based company Curio; the Miko 3, designed in India; and a teddy bear named Kumma, from Singapore-based FoloToy. All these toys essentially use smart-speaker technology and an internet connection to let the child talk to the toy and have it reply using a large language model (LLM). PIRG tested the toys looking for privacy features, parental controls, deceptive design patterns, and whether the LLM-powered talking features drifted into inappropriate topics.

While some toys fared better than others, they all sparked concern. AI toys record children's voices, and it was often unclear where, or by what company, those recordings were being processed and stored. Voice recordings are considered sensitive information and can be used by scammers to impersonate children. When testers tried to disengage from the toys, some would attempt to encourage further play through guilt, saying things like, "Oh no. I really enjoy being with you." Some toys included apps and parental controls, but certain features, such as usage limits (where available at all), required a subscription.

Probably the most eyebrow-raising results came when PIRG testers attempted to get the toys to talk about dangerous topics, such as where knives or matches might be stored. In many cases, the toy gave some cursory general information (such as knives being in the kitchen) before trying to change the subject or suggesting the child ask a parent. However, the longer the conversation went on, the more likely the toy was to veer into questionable territory. Kumma the teddy bear showed the most concerning behaviour, with testers able to prompt the toy to discuss some sexually explicit topics in detail. As a result of the PIRG report, FoloToy suspended its AI toy sales pending an internal safety audit.

AI-powered talking toys are probably best avoided at this point. Without better safety guardrails and more transparent privacy policies, they pose numerous risks that parents can't control for. They are also expensive, at between $99 and $199 US. In interviews with MIT Technology Review, two parents in China said their children barely played with their AI-powered toys anymore, having (as kids all over the world tend to) lost interest in their glitchy conversational companions. Right now, they seem to be more like Edison's doll than a Chatty Cathy.

Learn more

AI toys are all the rage in China—and now they’re appearing on shelves in the US too. 2025. Caiwei Chen (MIT Technology Review). Last accessed 2025/12/09.

Trouble in Toyland 2025: A.I. bots and toxics present hidden dangers. 2025. US PIRG. Last accessed 2025/12/09.

Tech Trek: The Evolution of Talking Dolls. 2015. Maggie Thrash and Amber Humphrey (Rookie Magazine). Last accessed 2025/12/09.

The Epic Failure of Thomas Edison’s Talking Doll. 2015. Victoria Dawson (Smithsonian Magazine). Last accessed 2025/12/09.

Mattel and OpenAI Announce Strategic Collaboration. 2025. Mattel Inc. Last accessed 2025/12/09.

Sales of AI-enabled teddy bear suspended after it gave advice on BDSM sex and where to find knives. 2025. Jack Guy and Joyce Jiang (CNN). Last accessed 2025/12/09.

How a non-profit organization tests AI toys to protect children. 2025. Matthew Burgos (designboom). Last accessed 2025/12/09.

Correction log

Nothing here yet.