The topic of data privacy and information security is continually highlighted as one of the preeminent challenges of a rapidly advancing society. Issues of privacy and security are often seen as outpacing our current legal and technical capacity to control them, raising questions of risk, responsibility, and governance. This post discusses the challenge of data privacy through the oft-forgotten lens of children, who increasingly participate in a data-driven society at younger and younger ages.

As entertainment for children becomes a more connected enterprise, a child’s data becomes immensely valuable and immensely accessible. As with other consumers, the opportunity to establish a tailored trail for advertising, especially while preferences are initially being formed, creates a large incentive to collect and store a child’s data. The growing use of “smart toys,” “connected toys,” and “connected smart toys” illustrates the increasing connectivity of child entertainment and the accompanying concerns over children’s data. Smart toys are essentially toys embedded with software that allows them to alter their behavior, connected toys are toys with built-in internet capabilities, and connected smart toys are toys with both features. In 2017, the FBI’s Internet Crime Complaint Center issued a consumer notice on its website warning of potential privacy concerns from internet-connected toys. A large part of the concern is the amount of personal information collected and the security measures, or potential lack thereof, of those who handle the collected data. Another concern is the always-on functionality of some toys, which run constantly in the background and are therefore capable of continuously collecting data and at risk of hacking.

Another problematic aspect of child entertainment and privacy is internet connectivity itself. The expanding availability of free online entertainment content means more children and families online, more data available, and more data at risk of compromise. As children become more savvy and online content becomes more attractive and available, children are better equipped to circumvent age restrictions and are sometimes aided in doing so by consenting parents. The challenge persists because there is so much positive and useful content for children online but no clearly good or safe way for kids to behave online. Reaping the rewards of having children online therefore often means taking risks with their ability to be online. Oversight is also difficult for parents, as there are many access points for children to get online beyond a parent’s reach. Even if your kids only use YouTube Kids at home, you have minimal control over them using the regular YouTube platform somewhere else.

The Children’s Online Privacy Protection Act (COPPA), passed in 1998, was meant to protect the privacy of children online. The act essentially comprises (1) Notice of Data Practices – requiring operators to give parents notice of what data will be collected, what security practices are in place, and what privacy policy is employed; (2) Parental Consent – requiring verifiable parental consent before information is collected from a child; (3) No Conditional Participation – forbidding conditioning a child’s access to an activity or reward on disclosing more information; (4) Required Reasonable Security – requiring operators and connected third parties to maintain reasonable security procedures to protect information collected from children; and (5) Limited Data Retention – limiting retention of collected data to only as long as reasonably necessary to fulfill the purpose of collection and requiring deletion of the data once that purpose has been fulfilled.[1]

Issues with COPPA include, but are not limited to, the boundaries of personal information, lack of enforcement, and difficulty of compliance. If personal information is not collected, then parental consent is not required; yet much information that creates a risk of harm can still be collected while falling outside the act’s definition of personal information. Sound recordings are an example of information that is collected but would not naturally fall within that definition. Another issue is that many sites are known to be accessed by children without parental consent but do not face enforcement because of difficulties of proof or of compliance. For some sites, like Facebook, compliance lives in the terms of service or the sign-up process: Facebook requires users to be 13 to create an account, but that requirement is easily circumvented. Section 12 of the YouTube terms of service similarly forbids the site’s use by children. YouTube created a separate platform for kids to limit child traffic on its main platform, where algorithms are set to gather user data, but children can easily use either YouTube platform as long as they have an incentive to do so. Earlier this year, 23 consumer advocacy groups filed a complaint with the FTC alleging that YouTube violates COPPA because it has actual knowledge that those under 13 use its site but continues to harvest their data as it would any other user’s.

The risk with children’s data is not just privacy but security. With the increased prevalence of data breaches and hacks, protecting the data of children becomes all the more urgent and important. There have already been cases, such as the VTech breach, where lax security exposed the data of millions of children, and such breaches are not unique. In the marketing context, children are especially malleable because they have yet to form strong consumption preferences, so access to their data means an opportunity to shape behavior toward certain goods or characteristics. Parents rarely manage their child’s use of and interaction with data, and often contribute to the problem themselves.

The challenge lies in the structure of online privacy itself and echoes similar questions about data privacy for adults. The legal and regulatory means of control simply lag behind the technology. The current best course of action is to improve the technical means of control while making consumers more savvy and aware of the risks that exist and of ways to respond. Legal and technical solutions may ultimately require a complete rethinking of the structure of online privacy and the adoption of new systems.

[1] Sara H. Jodka, The Internet of Toys: Legal and Privacy Issues with Connected Toys, Dickinson Wright, https://www.dickinson-wright.com/news-alerts/legal-and-privacy-issues-with-connected-toys (last accessed Oct. 29, 2018).

Enoch Ajayi is a J.D. candidate, 2020, at NYU School of Law.