World Economic Forum calls for responsible smart toy development

August 4, 2022

Toys embedded with AI are growing in popularity, but they also pose risks to a child’s development, according to the World Economic Forum.

The hazards can be subtle. For example, some smart toys connect to other AI-enabled devices in ways that can violate a child’s privacy or jeopardize their safety.

“Through their growing interaction with children, transferability across contexts, connectivity to other AI-enabled devices, and other ways in which children unconsciously entangle with them, AI-enabled toys regularly impact the upbringing of the youngest generation,” the group said in a blog post.

There are four risks children face:

  • Content risk: exposure to harmful or age-inappropriate material

  • Contact risk: exposure to unsolicited contact from adults

  • Conduct risk: cyberbullying and other harmful behavior

  • Contract risk: data harvesting, commercial pressure and other exploitative practices

While global regulations exist to guard against these harms, they set only minimum requirements. Smart toy manufacturers should be given incentives to go beyond the law and embed safety into their designs.

Smart toys are proliferating: AI-enabled toys are expected to make up 26% of all toys by 2030, a market valued at $107 billion, according to statistics from the Market Research Future Group.

Government actions

The Generation AI Initiative, a project of the World Economic Forum, offers actionable guidelines to “educate, empower and protect children and youth” in the age of AI. It recognizes that smart toys can benefit a child’s development when they are responsibly designed.

In the U.S., the Children’s Online Privacy Protection Act (COPPA) requires websites to meet age-appropriate criteria for access and content exchange. In the E.U., the Digital Services Act prohibits AI-powered marketing aimed at children. In 2024, the European AI Act will go into effect, outlining a four-tiered risk framework under which smart toy developers will be required to evaluate their AI-enabled products for “reasonably foreseeable misuse.”

Because smart toys are used mostly by young people, several organizations offer guidance on their development.

The World Economic Forum has an “AI for Children Toolkit,” which encourages companies to design toys with “fair, inclusive, responsible, safe and transparent characteristics.” UNICEF offers examples of smart toys with best practices, such as Mattel’s ToyTalk. The toy protects privacy by letting adults delete recorded information on ToyTalk’s cloud account.

Other AI-enabled toys that have a positive impact on children include the ROYBI Robot, a smart toy that teaches kids math, science, technology, and languages such as English, Spanish, and Mandarin Chinese.

The Smart Toy Awards, cohosted this year by the World Economic Forum and the Dubai Future Foundation, recognize toymakers whose products prioritize children’s development and interact with them responsibly.
