One might assume that the U.S. Children’s Online Privacy Protection Act of 1998 (COPPA) and associated FTC COPPA Rule [1] provide actual protection for our children’s personal information when they are online. In reality, however, COPPA and the COPPA Rule merely install a minor speed bump on the well-traveled information superhighway, requiring online content providers in the United States to get a parent or guardian to check a box before marketers, data brokers, and the advertising technology (“ad tech”) industry [2] can proceed to harvest and exploit the personal data of children 12 and under.
This post delves into the U.S.-centric notice-consent privacy structure in comparison with the broader global scheme of data protection and examines why and how COPPA fails to protect the personal information of children in the United States as they consume content – particularly youth-focused entertainment – online.
U.S. Notice-Consent Privacy Rules
Notice-consent privacy rules start with the premise that processing (e.g., collection, use, distribution, transfer, or sale) of personal information is permissible as long as reasonable notice is provided, and consent is obtained.
Under this standard U.S.-centric view, the act of making a privacy policy available serves as the requisite notice. The privacy policy or notice may include broadly written statements of what information is being collected, how it is being collected, and how it will be used. The statements do not need to be specific or limited, but rather can use language like “includes” and “for example”. Terms related to the intended processing of data can be undefined and ambiguous, such as “to improve our services” or “to improve your experience with us.”
Once notice is given, consent is obtained by the absence of immediate objection and the decision to proceed with using the offered service. This is formalized in most privacy policies or notices with a statement that continued use of the site or service constitutes acceptance of the terms. Consent may be revoked, in whole or in part, through a process known as “opt-out,” but generally there are few requirements to ensure an individual can easily opt out. Most often, opting out functions only to limit targeted behavioral advertising, not the continued collection or further processing of personal information previously collected for other uses.
EU Data Protection Regulations
Data protection regulations, like the E.U. General Data Protection Regulation (GDPR), start with the premise that an individual’s personal information will not be subject to processing (e.g., collection, use, distribution, transfer, or sale) absent a lawful basis. This starting premise is based on the societal recognition of personal information (often referred to as personal data or “PD”) as a fundamental human right essential for personal autonomy and freedom. GDPR Article 6 provides an example, limiting the lawful processing of PD to one of the following six specific conditions:

1. The data subject has given consent to the processing of their personal data for one or more specific purposes;
2. Processing is necessary for the performance of a contract to which the data subject is a party, or to take steps at the data subject’s request prior to entering into a contract;
3. Processing is necessary for compliance with a legal obligation to which the controller is subject;
4. Processing is necessary to protect the vital interests of the data subject or another natural person;
5. Processing is necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller; or
6. Processing is necessary for the purposes of the legitimate interests pursued by the controller or a third party, except where those interests are overridden by the interests or fundamental rights and freedoms of the data subject, in particular where the data subject is a child.
Even when a company can establish a lawful basis for processing someone’s personal data, that data can only be collected for specified, explicit and legitimate purposes and it cannot be subject to further processing that is “incompatible” with the original purpose for which it was collected.
While consent is only one of the specified lawful bases for processing personal data under a data protection regime, the definition of what constitutes consent is significantly restricted, particularly when compared with the notice-consent structure. Under data protection, consent cannot be inferred or obtained merely by “continued use” of a website or service. Instead, consent is only valid under four conditions:

1. It is freely given;
2. It is specific to one or more identified purposes;
3. It is informed; and
4. It is an unambiguous indication of the data subject’s wishes, signified by a statement or a clear affirmative action.
Due to the above requirements, the generic, broadly worded notices that are standard in the U.S. under notice-consent rules would generally fail to establish valid consent under data protection laws.
Guarding Personal Data of Children Online Under Data Protection
When it comes to the interests of children, the GDPR, like other data protection laws and regulations, sets increased restrictions on the lawful processing of personal data. For example, Article 6(1)(f) of the GDPR, noted above, specifically calls out the superseding interest of protecting a child over the otherwise legitimate interests of the controller or a third party. More specifically, GDPR Article 8 removes the option of obtaining consent from a child under the age of 16 and requires that verified consent be obtained from a parent or guardian. As explained in GDPR Recital 38:
“Children merit specific protection with regard to their personal data, as they may be less aware of the risks, consequences and safeguards concerned and their rights in relation to the processing of personal data. Such specific protection should, in particular, apply to the use of personal data of children for the purposes of marketing or creating personality or user profiles and the collection of personal data with regard to children when using services offered directly to a child.”
Under the GDPR, children under 16 are protected three-fold when using websites or online services. First, the child cannot provide consent on their own. Second, their parents can withhold consent for the collection and subsequent processing of the child’s personal data while still allowing the child to obtain the services of a website or online service. Third, if consent was given, it can later be withdrawn by the parent, or by the child once the child reaches the age of 16, and that withdrawal stops all further processing.
Guarding Personal Data of Children Online Under Notice and Consent
The title “Children’s Online Privacy Protection Act” indicates that the objective of COPPA is to protect the privacy of children in their online interactions, and officially, COPPA does afford some protection to children in the U.S. in a similar, although more limited, manner than the GDPR. First, children under the age of 13 cannot directly consent. Instead, notice must be provided to a parent or guardian and their verified consent must be obtained. Second, under COPPA a provider cannot “condition a child’s participation in a game, the offering of a prize, or another activity on the child disclosing more personal information than is reasonably necessary to participate in such activity.” [3] Third, a parent can withhold consent for the rental or sale of a child’s information to third parties. And finally, a parent can rescind their consent.
In reality, however, the COPPA Rule merely provides a slightly more burdensome version of the standard U.S.-centric notice-consent rule. Parental notice can still be in the form of a privacy policy that looks the same as the hundreds of other data privacy notices adults click through on a daily basis without reading. Once a parent is shown a data privacy notice for their child’s use of a website, app, or other online service, checks the box or clicks through, and that parental consent is verified, any content delivery platform, supported by a rapidly growing ad tech industrial complex, can then feast upon the cornucopia of data collected from the child.
At a recent FTC workshop on the “Future of the COPPA Rule,” FTC Commissioner Noah Joshua Phillips reaffirmed the view that COPPA is aimed at protecting children from contact by strangers online, not from online marketing, stating, “[t]he ability of a strange person to contact and communicate with a child is not the same as an advertisement appearing when they watch a show. We must recognize that.” [4] Phillips further commented, “COPPA is all about empowering parents and protecting kids. You should keep that in mind.” But how does COPPA empower parents and protect kids when it comes to privacy?
The Illusion of Walled Gardens
If the true purpose of COPPA is to protect children from websites where there may be online predators, then it makes sense to create special places online for children to go and feel safe. This has led to the proliferation of specialized applications or areas within websites or other services that contain limited content expressly intended for children. These are sometimes referred to as “walled gardens.”
Netflix, for example, has created an area on its streaming service labeled “Kids.” Netflix also provides for the creation of multiple user profiles for a single account, including the option to designate a profile as being intended for a “Kid.” [5] See Figure 1 below.
Figure 1
Similarly, YouTube offers a walled garden branded as “YouTube Kids.” Created in 2015, YouTube Kids is presently available as specialized apps on Android and Apple iOS devices. However, there is currently no option to access this walled garden through a website or on apps for the myriad of smart TVs and over-the-top devices such as Roku, Apple TV, and Amazon Fire.
Walled gardens, like those offered by Netflix, YouTube, and many others, provide only an illusion of privacy and protection for children. The COPPA Rule does not prevent websites and online service operators from harvesting and selling the personal data of children who have entered their walled garden with the verified consent of a parent or guardian. Rather, service operators are essentially free to sell targeted advertising to children inside their walled garden, as well as to collect, sell, or rent the personal data obtained during the child’s activities – subject only to rescission of parental consent for further processing.
Operators who do establish a walled garden are, however, responsible for providing notice and obtaining verified parental consent when the company has actual knowledge that the service is collecting the personal information of a child. Failing to meet those core requirements under COPPA can land a company in hot water, as Google, through its subsidiary YouTube, recently learned [6]. Once those obligations are satisfied, the personal information of children within the walled garden can be monetized with few limits.
The Teenage Trap
An often-overlooked quirk in COPPA is that the law specifically defines a child as someone “under age 13.” But what about teenagers, i.e., children between the ages of 13 and 17?
According to the FTC, COPPA’s parental notice and consent model is not designed to address the privacy issues faced by teenagers online. [7] First, according to the FTC, teens are less likely than pre-teens to provide parental contact information and more likely to falsify or lie about their ages to participate in online activities; thus, the FTC asserts, there would be little point in requiring verified parental consent. Additionally, courts have recognized that as teens come of age, they have an increasing constitutional right to access information and express themselves. Finally, the FTC notes that teens are more likely to seek out content that appeals to adults, and placing restrictions on the content teens can access could impinge on the adult users who share those content sources. Thus, the FTC has concluded that it is appropriate that teens aged 13-17 are not covered under COPPA.
Consequently, under COPPA and the current U.S. notice-consent rule, just as with adults, the personal information of teenage children can be collected, packaged, and sold, and then used, repackaged, and resold ad infinitum across the information economy. Meanwhile, Congress does not seem to mind that these minor teens lack the legal capacity to provide the valid consent necessary to form a contract, cannot vote or formally seek redress from the government, and cannot personally avail themselves of the legal system to prohibit the unlawful use of their personal information should their election to opt out of processing be ignored.
Takeaways
Data protection laws, such as the GDPR, are well designed to provide significant safeguards against the unwanted online collection, use, and sale of personal data for all people, with special protections provided for children under the age of 16. The U.S., on the other hand, has COPPA, a law and associated FTC regulation designed for the internet of the 1990s, where children needed to be protected from online predators in chat rooms. In that context, asking parents for permission before letting a child into a chat room makes sense. But COPPA is not designed to protect children from today’s world of manipulative behavioral online marketing run by an expanding, voracious ad tech industry. Instead, COPPA merely establishes a series of procedural hurdles that online service operators must clear before they can deliver content to children 12 and under and monitor their subsequent activity. Meanwhile, anyone age 13 or older in the U.S. is subject to the same notice-consent privacy rules, under which the use of nearly any website, online service, or mobile app will result in that individual’s personal information being captured, bought, and sold on the open market with little, if any, limitation.
[1] 15 U.S.C. §§ 6501-6508; 16 C.F.R. § 312; see also Eric P. Mandel, Examining COPPA and the Google-YouTube Settlement, Driven Co-Pilot (Oct. 9, 2019) (examining COPPA in the context of the September 2019 COPPA settlement involving the U.S. Federal Trade Commission and State of New York against Google and its YouTube subsidiary).
[2] The term “ad tech” refers to a rapidly expanding industry, developed over the past decade, that uses analytics, statistics, and technology to target advertising to individuals based on their behavioral characteristics. A visual graphic showing the complexity of the ad tech ecosystem can be found at:
https://adexchanger.com/wp-content/uploads/2010/09/LUMA-Display-Ad-Tech-Landscape-for-AdExchanger.jpg.
[3] COPPA Rule, 16 C.F.R. § 312.3(d).
[4] Joe Duball and Ryan Chivetta, FTC Workshop Aims to Inform Potential COPPA Updates, iapp.org (Oct. 8, 2019), available at: https://iapp.org/news/a/ftc-workshop-aims-to-inform-potential-coppa-updates/ (sub. req.).
[5] Note: While Netflix provides a specialized “Kids” area and the designation of a “Kid” during profile creation, the company states in its privacy policy: “While individuals under the age of 18 may utilize the service, they may do so only with the involvement, supervision, and approval of a parent or legal guardian.” When creating a “Kid” profile, Netflix does not provide any special notice or request parental consent to collect, use, and distribute the PI of children. See Privacy Statement, Netflix.com, available at:
https://help.netflix.com/en/legal/privacy (last viewed 10/31/2019).
[6] See Examining COPPA and the Google-YouTube Settlement, note 1, above.
[7] See Federal Register, Vol. 76, No. 187 at 59805.