Facebook recently announced that it will roll out “Messenger Kids,” a standalone messaging app that lets children ages six to 12 chat safely with family and friends.
Much like Facebook’s existing Messenger app, Messenger Kids will include video calling with playful filters, sound effects, and age-appropriate GIFs. Most importantly, Messenger Kids users will only be able to interact with individuals their parents or guardians have approved first.
The social network says its new kid-focused app is the result of more than a year of research, as well as advice and insight from online safety experts and roundtable discussions with parents. A common thread in the discussions? Parents wanted more control over their kids’ online experiences.
To protect the privacy of its youngest users, Facebook says it will collect “little data” from those who use the app. Additionally, kids won’t be able to access their parents’ Facebook accounts and won’t be searchable within the app. The app will also include tools to report or flag inappropriate content and to block users.
Cybersecurity experts like CyberGhost CEO Robert Knapp are skeptical.
"The problem with targeting children is that companies like Facebook and Google and for that matter less sophisticated organisations do not have the capacity to secure their users; we have seen this recently where children were targeted on YouTube platforms, and, despite Google's best attempts, they were not able to fully protect children on their platform," he told ConsumerAffairs.
"The risk beyond getting children hooked on these companies' platforms at a young age is that putting all the young users in one place actually increases the risk of them being targeted," Knapp said.
Parents must set up their child’s Messenger Kids account and approve who their kids can communicate with on the app. Parents can also see what their children are saying in their chats.
“None of the messages disappear or can be deleted, so parents can look at their kid’s device at any time to see their messages (this was a strong point of feedback we heard from parents),” a Facebook spokesperson told ConsumerAffairs.
“Many of us at Facebook are parents ourselves, and it seems we weren't alone when we realized that our kids were getting online earlier and earlier,” the spokesperson said. “We want to help ensure the experiences our kids have when using technology are positive, safer, and age-appropriate, and we believe teaching kids how to use technology in positive ways will bring better experiences later as they grow.”
Knapp's recommendation is simply to limit the time a child spends on any social media platform. "I believe the longer you can keep children away from the screens the better, and especially when it comes to social interaction," he said. "Encouraging them to communicate with their friends through online communication is an extreme step, especially when powered by the largest social network, which is powered by data."
Potential safety risks
Despite its heavy focus on safety and privacy, the app is likely to draw criticism -- especially in the wake of safety flaws recently discovered in other apps geared toward young children.
Back in November, inappropriate cartoons were found on the popular YouTube Kids app, even after the company had updated the app with enhanced parental controls and other key safety measures. Meanwhile, Snapchat -- the social media platform best known for its disappearing messages -- recently made news for being used by child predators.
Children’s online safety experts say it’s hard to predict how a new app will be used and what unintended consequences may follow. But Facebook says Messenger Kids was created with the goal of providing a safe online experience tailored to kids’ needs.