Are Digital Assistants Like Alexa Safe For Children?

Do you hear Alexa’s voice so often it sometimes feels like you have a new family member? You’re not alone. Digital assistants, also known as voice assistants, are becoming a part of daily life for many families. They are useful and entertaining, offering endless trivia and access to scores of songs and games. Often, they provide much-needed respite for busy parents. Jellies creator Ken Yarmosh, for example, uses multiple digital assistants in his home. His young children enjoy interacting with them and the Jellies app on a regular basis.

But are digital assistants actually bad for kids? We spoke to Axios and The Washington Post about some of the concerns parents have with this new technology. We then dove in to deepen our understanding of how digital assistants impact our children’s social skills, emotional and cognitive development, and privacy.

Here’s what we learned. 

Do I Have To Be Nice? Or “Alexa, Give Me a Cookie Now!”

Before we get into the obvious question, “Will Alexa make my child bossy and rude?,” let’s tackle the benefits digital assistants can have for your child’s social skills.

Your family may have noticed that there’s a certain way you have to communicate with digital assistants. It’s difficult enough for adults to understand young children. It seems to be even more so for artificial intelligence algorithms. Alexa and other digital assistants encourage children to phrase their questions and commands a certain way, change the pitch of their voice, and enunciate their words. For example, children can’t use the Amazon Echo without learning how to use a certain keyword, “Alexa,” to start an interaction. Alexa also encourages children to be quiet when another person is talking. These restrictions provide opportunities for parents to teach their children to think about the way they speak.

This is sometimes a hard lesson to learn. Since the devices don’t explain what children need to do to get proper responses, children have to learn by watching others. Without those examples, it’s easy for kids to become frustrated. Also, as good as these skills are to master, they don’t always carry over into dealings with other people. That can be a good thing, considering that the most obvious issue with kids and digital devices is that the devices don’t discourage inappropriate ways of communicating.

That brings us to the downsides. The way they are today, digital assistants don’t discourage inappropriate ways of communicating. Alexa responds to “Alexa,” not the more socially appropriate “Alexa, please…”. And it doesn’t ask for a “thank you,” either. It’s not uncommon for children to bark orders and scream at the devices, talk over others to get Alexa’s attention, and display other rude behaviors that go uncorrected. If these interactions create a pattern of behavior used with other people, then that’s a problem.

If they have a demanding, controlling approach to technology and they have unlimited access to that, then we should expect that pattern of behavior is going to carry through when they're talking with mom or dad, a grandparent, a neighbor.

- Ken Yarmosh

Is this something parents should worry about? Kaveri Subrahmanyam, a developmental psychologist and former member of Child and Family Studies at California State University, Los Angeles, is less concerned about digital assistants turning our children into bratty little monsters. She’s more worried that the other features of digital assistants — the ones that turn on lights, shows, music, and other things — will deter kids from doing things for themselves. Even still, “I don’t think we have to be worried about it or paranoid about it, but I do think it’s something to be watchful for,” she told Technology Review.

Jenny Radesky, developmental behavioral pediatrician at the University of Michigan and co-author of the American Academy of Pediatrics’ guidelines for media use, shares similar views.

Of course parents worry about these devices reinforcing negative behaviors, whether it's being sassy or teasing a virtual assistant. But I think there are bigger questions surrounding things like kids’ cognitive development—the way they consume information and build knowledge.

- Jenny Radesky

This brings us to…

Do Digital Assistants Teach Kids Anything? Or “Alexa, How Many Stars Are In the Sky?”

Having an endless fount of information for curious kids is one of the appeals of keeping a digital assistant in the home. After all, what family doesn’t want their child to imagine exploring their world or even the space surrounding it?

This intended benefit is illustrated in one of the Amazon Echo Dot Kids Edition commercials, where a young girl asks, “Alexa, how many stars are in the galaxy?” Alexa answers, “Many astronomers believe it contains at least 100 billion stars.” The girl makes an astonished noise and goes back to sticking glow-in-the-dark stars up on her bedroom wall.

The interaction ends there, without any follow-up or context. And though the scene makes you think that this girl may continue to pursue her interest in space, professionals and parenting advocates wonder if digital assistants, like Amazon’s Alexa, might hinder meaningful learning rather than promote it.

“Learning happens when a child is challenged by a parent, by another child, a teacher — and they can argue back and forth.” That’s according to Justine Cassell, developmental psychologist, director emeritus of Carnegie Mellon’s Human-Computer Interaction Institute, and expert in the development of AI interfaces for children.

While Alexa might be able to answer your child’s questions, there’s none of the back and forth that Cassell emphasizes. What’s more, the way children have to phrase their questions could discourage deeper thought and nuance. Your child may hear “Neil Armstrong” when asking “Who was the first person to walk on the moon?” But digital assistants can’t answer anything more complex. Alexa can’t describe the enormous challenge of building a space shuttle, or what Armstrong felt when he hopped down off the Lunar Module. If there’s no answer waiting, why even wonder?

Digital assistants teach children that they will always have access to enormous amounts of information on request. This abundance might mean that information carries less weight or significance for children as they grow older. Since it’s so easily obtained, children may get in the habit of relying on Alexa over other, more thorough methods of research, like visiting a library or asking people they trust. They’re also not taught to conduct their own searches or to question their sources for accuracy.

This may not be the case as technology improves. A 2007 Tanaka study, for example, immersed a social robot in a toddler care center for five months. By observing the interactions those toddlers had with the robot, researchers concluded that the technology is close to being advanced enough to bond and socialize with children over long periods of time. This level of interaction “…could have great potential in educational settings assisting teachers and enriching the classroom environment.”

The robot in this study could socially interact with the children, walk, dance, sit down, and giggle, among other things. That’s a bit different from the digital assistants we know now, but perhaps not far in our future. Digital assistants may soon reach the point where they can interact in more complex ways with children. They may be able to carry on more meaningful conversations and be a benefit to young minds.

Are Robots People? Or “Hey Alexa, How Old Are You?”

Digital assistants communicate in some of the same ways humans do. They talk in human-like ways with human-like voices. They answer questions and even joke and laugh in response to what we say. So what do children think about digital assistants? Do they believe there’s actually a woman named Alexa inside that small, circular device?

Here’s what 4-year-old Hannah had to say about her Alexa in the Technology Review article “Growing Up with Alexa”:

Alexa is “a kind of robot” who lives in her house, and robots, she reasoned, aren’t people. But she does think Alexa has feelings, happy and sad. And Hannah says she would feel bad if Alexa went away. Does that mean she has to be nice to Alexa? Yes, she says, but she’s not sure why.

- Hannah

While what children think about technology seems to depend on the age and experience of the child, researchers have made a few observations:

  • Children try to interact with technology the same way they do with people. They touch and hug, talk to, play with, and attempt to take care of robots and other tech (Tanaka study).
  • Kids think of robots differently than other technology and objects. They are more likely to think that a robot has feelings or can be their friend than something obviously inanimate, like a stuffed animal. This is even true when children believe that the robot isn’t alive or “real” (Kahn study).
  • The more children are able to interact with robots on an emotional and psychological level, the more they believe the robots have emotions and intent (Turkle study).

Children also tend to anthropomorphize digital assistants. That is, they assign human characteristics to objects and animals. It’s as simple as referring to Alexa as a “she” rather than an “it” and characterizing digital assistants as friendly and trustworthy, as was the case with kids ages 3–10 in an MIT study. The children also asked the device questions that they would ask other people, like “Alexa, what is your favorite color?” and “Hey Alexa, how old are you?” That same study also indicated that the younger participants were more likely to test Alexa to further understand “her” and see whether “she” could do things that people can do. They asked her, for example, “Can you open doors? What are you?”

So what does this mean for your family? 

Millions of parents have bought computer toys hoping they will encourage their children to practice spelling, arithmetic, and hand-eye coordination. But in the hands of the child they do something else as well: they become the occasion for theorizing, for fantasizing, for thinking through metaphysically charged questions to which childhood searches for a response.

- Sherry Turkle

Digital assistants provide opportunities for children to start understanding the concept of “robot.” As children grow older, these interactions help them make further observations about the nature of technology and humanity. This skill will become more useful as technology becomes more advanced and interacts with us in different, and more human-like, ways.

Do Digital Assistants Share My Data? Or “Alexa, Can You Keep a Secret?”

It’s not uncommon for new technologies to collect and share data about their child users, despite U.S. privacy laws. Digital assistants are no exception. They are privy to an unsettling amount of private information. They hear, record, and share, sometimes unexpectedly, conversations within the home.

Amazon and Google currently share some of the data collected by digital assistants. According to The New York Times, Amazon’s terms of use permit it to share your information requests and zip codes, and Google says it may share transcriptions of what you say with third-party service providers.

In 2017, outcry from parenting advocates encouraged toymaker Mattel to cancel Aristotle, a kid-friendly smart device. The device, an all-in-one baby monitor and virtual assistant that came with a camera and microphone, was meant to soothe upset children and teach them fundamentals, like their ABCs. Parenting advocates warned against the potential surveillance and data collection that could come with a device meant for a child’s bedroom.

Privacy is one of the strongest reasons to keep digital assistants out of your home. Is it enough, though?

People should be more aware of the risks involved with smart home devices, especially the ones that have always-on listening status. Companies should do better to inform people of their privacy rights. And we need better laws to protect people against privacy harms. But I wouldn't say that everyone should necessarily avoid these devices. They are convenient and they can be particularly helpful for people with certain disabilities, for example.

- Tiffany Li, privacy attorney at Yale Law School’s Information Society Project

Unfortunately, this doesn’t look like a trend that’s going away. Both Google and Amazon have filed patents that may increase how much information these devices can monitor and collect. Beyond recording audio files, these patents seek to analyze audio based on specific spoken words, detect a child’s mischief, and determine moods and medical conditions. That data would then inform targeted advertising campaigns. Those patents are still pending.

(By the way, we don’t collect or share your child’s information, and Jellies meets top safety standards.)

Conclusion

Technology is all around us, playing a more significant role in our lives. While you can decide to ban digital assistants from your home, chances are your children will encounter and interact with them at some point. We believe that every family is different, and that it’s important for parents to educate themselves before deciding whether to bring new technology into their homes.

Please reach out and let us know what you think about children interacting with digital assistants. We’re on Twitter and Facebook. Don’t forget to check out our other, in-depth safety and technology resources on the Jellies blog as well as take a look at the Jellies app itself.