The Children’s Online Privacy Protection Act (COPPA) directed the Federal Trade Commission (FTC) to promulgate rules to protect children’s privacy and data on the internet and to vest decision-making authority over that data in the hands of parents. In September 2019, the FTC and YouTube entered into the largest monetary settlement under COPPA to date for violations of the existing rules. [1] Two months earlier, in July 2019, the FTC had announced that it would begin accepting comments on possible revisions to the COPPA rule. [2]
The notice and comment period generated over 175,000 comments from industry groups, content creators, privacy advocates, government officials, and other interested parties. [3] The battle lines have largely been drawn. YouTube (and its parent company, Google), trade groups, and content creators believe the rules are overly inclusive and will result in a drastic decrease in revenue for child-appropriate content; on this view, any hit to the pocketbooks of content creators is likely to produce an environment in which high-quality, child-appropriate content is simply no longer available because there is no incentive to develop it. Privacy advocates, on the other side, argue that stringent protections are necessary to safeguard children’s online identities and that the rule should be strengthened given how deeply the internet is embedded in our daily lives.
This Comment will begin by explaining the major features of COPPA and the COPPA rule. Next, this Comment will explore the unique nature of YouTube and the difficulties of COPPA compliance for a platform that relies on third-party content creators. The next Part will outline the arguments for strengthening and loosening the rule and will assess their strengths and weaknesses. Finally, this Comment will suggest some prescriptive measures that the FTC should take prior to amending the rule.
The drafters of COPPA sought to achieve four goals with the enactment of the bill: (1) empower parents to protect the privacy of children online; (2) protect children’s safety in their use of online channels; (3) ensure the security of children’s personally identifiable information; and (4) require parental consent when collecting personal information from children. [4]
The Act directed the FTC to promulgate rules requiring website and online service operators to obtain “verifiable parental consent” when the website has actual knowledge that the user is a child. [5] The FTC issued the first iteration of the “COPPA rule” in 1999, and it took effect in April 2000. [6] Since the bill was enacted, internet use among children and teens has increased dramatically. [7] Further, tablets and mobile devices have changed the way children play, learn, communicate, and entertain themselves. [8] The FTC issued the final round of amendments to the rule in 2012, taking effect in 2013. [9] The 2013 revisions expanded the categories of collected data that qualify as personal information, in addition to extending the types of websites covered by the rule. [10] While the FTC normally reviews rules and regulations every ten years, [11] the agency announced in July 2019 that the rule would be reviewed for revisions and opened the floor for public comments. [12] The proposed topics for comment included the verifiable parental consent requirement, the correct factors for determining whether a website qualifies as child-directed, and whether smart TVs and video games fall within the scope of the rule, among others. [13]
The COPPA rule defines a child as any person under the age of thirteen. [14] Operators of websites or online services are prohibited from collecting [15] personal information [16] from children if the website is directed towards children [17] or the operator has actual knowledge a child is using the site or service, [18] unless the operator has obtained verifiable parental consent. [19] Operators are also required to take “reasonable steps” to ensure that when a child’s personal information is shared with third parties, those third parties are also COPPA-compliant. [20]
For purposes of COPPA compliance, the FTC specifies that there are three categories of websites: (1) child-directed websites; (2) general audience websites; and (3) mixed audience websites. [21] A website qualifies as child-directed based on the subject matter of its content, the visual appearance of the website, the site’s marketing practices, and data regarding the actual users of the site. [22] Websites that qualify as child-directed are subject to the restrictions on the collection of personal information and to heightened privacy requirements. [23] Under the COPPA rule, child-directed websites or service operators are required to treat all users as if they are children. [24]
General audience websites are not defined in the rule or the statute. The simplest working definition is a website that hosts content not directed at a child audience. [25] The FTC offered clarification as to the type of content the agency would expect to find on a general audience website in response to questions generated during the 2019 notice and comment period. [26] Content involving “traditionally adult activities like employment, finances, politics, home ownership, home improvement, or travel, [is] probably not covered.” [27] General audience websites do not fall under the COPPA restrictions unless they have actual knowledge that a child is using the site. [28]
Like general audience websites, mixed audience websites are not defined in the rule. Mixed audience websites are intended for a primary audience other than children but may contain content or subject matter that attracts children as a secondary audience. [29] A website that targets teens as its primary audience is not, strictly speaking, directed toward children because the primary audience is over the age of thirteen; however, the website may still qualify as child-directed if the factors balance in that direction. [30] Mixed audience websites are subject to the COPPA rule if they have actual knowledge that a child is using the site. [31] Further, they are permitted to “age gate” [32] their websites to prevent the collection of personal information from children or to obtain consent prior to doing so. [33]
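To make the age-gate mechanism concrete, the following is a minimal sketch of a neutral age screen, assuming a simple sign-up flow; the function names and branching are illustrative, not any actual platform’s implementation. Consistent with FTC guidance, a neutral screen asks for a birth date without signaling the “right” answer and only then branches.

```python
from datetime import date

COPPA_AGE_THRESHOLD = 13  # the rule defines a "child" as any person under thirteen

def years_old(birth: date, today: date) -> int:
    """Whole years elapsed between a birth date and today."""
    had_birthday = (today.month, today.day) >= (birth.month, birth.day)
    return today.year - birth.year - (0 if had_birthday else 1)

def handle_new_user(birth: date) -> str:
    """Branch a hypothetical sign-up flow on the COPPA age line."""
    if years_old(birth, date.today()) < COPPA_AGE_THRESHOLD:
        # Mixed audience path: either block collection of personal
        # information entirely or route the user into a
        # verifiable-parental-consent flow.
        return "require_parental_consent"
    return "proceed_with_standard_signup"

if __name__ == "__main__":
    print(handle_new_user(date(2015, 6, 1)))  # under thirteen -> consent flow
    print(handle_new_user(date(1990, 6, 1)))  # adult -> standard flow
```

The point of the sketch is the branch, not the form: once a site asks, it acquires actual knowledge and must either stop collecting personal information from under-thirteen users or obtain verifiable parental consent.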
A violation of COPPA regulations is treated as an “unfair or deceptive act or practice” and is therefore enforceable under the FTC Act. [34] The FTC Act allows the Commission to enforce regulations through civil penalties, injunctions, and compliance monitoring. [35] State Attorneys General may also bring an action on behalf of state residents under COPPA for violations of the rule. [36]
COPPA also allows for self-regulation through its safe harbor provision. The safe harbor program allows industry groups to set up and administer self-regulatory compliance programs. [37] A safe harbor’s privacy protections must meet a minimum standard amounting to “substantially . . . the same or greater” protections as outlined in COPPA, [38] and the sponsoring organization must submit its guidelines to the FTC for approval. [39] Safe harbors allow application and website developers, content creators, and other entities that publish content for children to receive regulatory guidance without being subject to enforcement action. Existing safe harbors include the Children’s Advertising Review Unit (CARU), [40] the Entertainment Software Rating Board (ESRB), [41] Privacy Vaults Online (PRIVO), and TRUSTe. [42] Once a company has signed on and demonstrated compliance with the safe harbor’s policies, it may display a seal of approval signifying to consumers that the website, game, or application is COPPA-compliant. [43] More importantly, failure to comply with the safe harbor’s standards is considered an unfair or deceptive trade practice subject to enforcement by the FTC.
YouTube is an online video hosting and sharing platform. [44] YouTube is the second-most-visited website in the world, surpassed only by its parent company, Google. [45] One of the features that makes YouTube unique is that content is created and presented by users called “channel owners.” [46] Channel owners post videos to their channels, where other users can watch, comment, like, share, and subscribe to see more content from that channel. [47] YouTube does not require an account for users to watch videos; users need an account only to comment and post their own content. [48] In order to create a Google or YouTube account, users must certify that they are thirteen years of age or older. [49] YouTube is free to use; [50] however, channel owners can monetize their videos by enabling advertising. [51] On average, “YouTubers” pull in almost $10 per 1,000 views in advertising revenue. [52]
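To put that rate in perspective, a back-of-the-envelope illustration (assuming the roughly $10-per-1,000-views average above and ignoring YouTube’s revenue share and per-video variation) shows what a single popular video can earn:

```latex
\text{revenue} \approx \frac{\text{views}}{1{,}000} \times \$10
\qquad\text{so}\qquad
\frac{1{,}000{,}000 \text{ views}}{1{,}000} \times \$10 = \$10{,}000
```

At that order of magnitude, the stakes of losing personalized advertising on child-directed videos become apparent.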
Channel owners uploading content to YouTube fall under COPPA’s statutory definition of “website or online service.” [53] In terms of compliance, channel owners are treated as if they are their own website or online service. [54] The FTC has been clear that the agency intends to bring enforcement action against individual channel owners if they are found to be violating COPPA. [55] Andrew Smith, then-chief of the FTC Bureau of Consumer Protection, indicated that the agency would sweep the YouTube platform to ensure that child-directed content was being properly handled by channel owners. [56]
An analysis of the most popular channels on YouTube revealed that content directed at children, especially content featuring a child under the age of thirteen, was more popular in terms of views and appeared on channels with more subscribers than content aimed at a general audience. [57] The popularity of children’s content on YouTube is probably not a surprise to parents. A Pew Research Center survey revealed “81% of all parents with children age 11 or younger . . . let their child watch videos on YouTube,” with 34% indicating that they allow their child to do this regularly. [58] YouTube itself marketed the platform as “the #1 website regularly visited by kids” and the “new Saturday Morning Cartoons.” [59] While the popularity of YouTube among children is undeniable, YouTube is not a child-directed website. The difficulty of determining which COPPA audience category applies to a platform like YouTube illustrates the complexities of compliance with the rule.
YouTube operates in a COPPA gray area. While some of the content hosted on the site is directed towards children, it is also fairly characterized as a general audience platform hosting content related to sports, politics, instructional videos, and product reviews. [60] YouTube does offer a “kid-friendly” version of its platform called YouTube Kids; [61] however, Common Sense Media conducted a poll in partnership with SurveyMonkey indicating that 81% of parents let their child watch YouTube via the general audience platform, either through the mobile app or on a computer. [62] Only 24% of parents indicated their children watch YouTube via the YouTube Kids application. [63]
The FTC entered into its largest settlement under COPPA against YouTube in September 2019. [64] The principal complaint leveled against YouTube was that, despite hosting numerous channels that met the COPPA rule’s definition of child-directed content, YouTube insisted to channel owners and advertisers alike that the platform was intended for a general audience and did not require COPPA compliance. [65] However, the COPPA rule applies to websites that have actual knowledge that a child is using their website, [66] and the FTC argued that YouTube’s representations to advertisers and channel owners showed the company had actual knowledge that children were using the site. [67]
There is no way to know for sure how much advertising revenue YouTube generated from behaviorally targeted ads on child-directed content. However, the FTC’s complaint against YouTube revealed that among the channels highlighted, “which represent only a few examples of the possible universe of child-directed content,” YouTube earned nearly $50 million. [68] Further, a recent study indicated that 85% of children’s videos contained at least one advertisement, and videos targeting early-childhood-age children, elementary-age children, and teens had higher total ad counts than videos directed towards adults. [69] The FTC’s complaint did spawn a shareholder derivative lawsuit against Alphabet Inc. [70] (Google and YouTube’s parent company), but it does not appear that the $170 million settlement against YouTube will make much of a financial dent in the company’s deep pockets. [71] In some ways, the FTC is limited by its own willingness to levy relatively small penalties on companies for violating consumers’ privacy. [72] Thus, while many may argue that a larger fine was warranted against YouTube, the FTC was bound by the constraints of “injury to the public.” [73] The FTC has other enforcement mechanisms in its arsenal, [74] and it employed some of them against YouTube. [75] The consent decree appeared to incentivize YouTube to change its policies and practices. [76] Yet the question remains whether a prohibition on the activity that led to the enforcement action in the first place will be any more effective at deterring the conduct than the pre-existing rules and regulations.
To comply with the court’s order, YouTube implemented a number of policy changes impacting creators and channel owners throughout its ecosystem. First, YouTube required all channel owners to designate an audience for both already-published and forthcoming videos. [77] This policy impacts all YouTube users regardless of their location. [78] Channel owners must designate their videos as “made for kids” or not made for kids. [79] The audience setting can be applied on a video-by-video basis or at the channel level as a whole. [80] YouTube will also use machine learning to identify and classify videos as made for kids; however, the site warns creators not to “rely on our systems to set your audience for you because our systems may not identify content that the FTC or other authorities consider to be made for kids.” [81] YouTube further advises creators that it cannot provide guidance as to whether an audience setting is correct, deferring instead to the FTC’s guidelines. [82]
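For creators who manage uploads programmatically, the designation is also exposed through the YouTube Data API v3, whose videos.update endpoint accepts a selfDeclaredMadeForKids field in the status part. The helper below is a hedged sketch of how a channel owner might set that flag on one video, assuming OAuth credentials are already in hand; the function name and usage are illustrative, not YouTube’s documented compliance workflow.

```python
from googleapiclient.discovery import build

def mark_made_for_kids(credentials, video_id: str, made_for_kids: bool) -> dict:
    """Set the creator's own 'made for kids' designation on a single video."""
    youtube = build("youtube", "v3", credentials=credentials)
    request = youtube.videos().update(
        part="status",
        body={
            "id": video_id,
            "status": {
                # The self-declared flag; YouTube's machine-learning
                # classifier may still override it, and the effective
                # value is reported back in status.madeForKids.
                "selfDeclaredMadeForKids": made_for_kids,
            },
        },
    )
    return request.execute()

# Usage (the video ID is a placeholder):
# response = mark_made_for_kids(creds, "VIDEO_ID", made_for_kids=True)
# print(response["status"]["madeForKids"])
```

The distinction between the self-declared flag and the effective value mirrors the policy described above: the creator designates, but YouTube’s systems may reclassify.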
The “made for kids” audience setting restricts the features available on YouTube videos and channels. The most significant impact on creators is that personalized advertising will no longer be available on videos with the made for kids designation. [83] YouTube notes that this “may result in a decrease in revenue for some creators who mark their content as made for kids.” [84] The same restrictions apply to videos that YouTube’s machine learning algorithm designates as made for kids. [85] YouTube has not indicated how significant a financial impact creators should expect from these policy changes, but the expectation is that the impact will be substantial. [86] To encourage the continued creation of family-friendly content, one of YouTube’s top-earning content categories, [87] YouTube pledged over $100 million to a fund “dedicated to the creation of thoughtful, original children’s content on YouTube and YouTube Kids globally.” [88] Other features disabled on made for kids videos include autoplay, commenting, channel branding, and the save-to-playlist and save-to-watchlist features. [89] At the channel level, in addition to the features unavailable on individual videos, made for kids channels will not have channel memberships, notifications, or posts and stories. [90]
News of YouTube’s impending policy changes was met with praise from children’s privacy advocates [91] and disdain from content creators. [92] The FTC appears poised to amend the COPPA rule to carve out an exception for mixed audience sites, that is, sites not directed towards children that nevertheless have a large number of child users. [93] Some have viewed this question, along with comments from FTC commissioners, as a sign that the agency is ready to weaken the COPPA rule in favor of incentivizing creators to produce better content subsidized by personalized advertising. [94] The next Part will discuss the debate regarding revisions to the COPPA rule, the positions of the major stakeholders, and the efficacy of their arguments.
The FTC’s request for comments on the future of the COPPA rule generated over 175,000 comments from politicians, privacy advocates, industry and trade associations, YouTubers (and their armies of fans), online businesses, and others.
The FTC requested comments on the definitions provided in the rule. [95] Specifically, the request for comment sought feedback as to whether the rule considers the correct factors, whether the factors need to be clarified, and whether the definition should be amended to account for websites that have mixed audiences. [96]
Google, YouTube’s parent company, responded to the request for comment to articulate its position on this precise question. [97] Google believes that the distinction between general audience content and child-directed content is easiest to draw at the extreme ends of the spectrum, where content is designed to appeal exclusively to children or exclusively to a general audience. [98] The distinction becomes much more difficult to parse for mixed audience channels. [99] This is a problem articulated by YouTubers as well. [100] Gaming-related content is one of the most popular content categories on YouTube. [101] Some gaming content features characters that are likely to appeal to children, [102] yet the general tone and language of the video may be entirely inappropriate for an audience under the age of thirteen. [103] Thus, on its face, the content of the video may indicate that it targets children as a primary or secondary audience. Viewed in context, however, a reasonable observer would probably conclude that the creator had no intention of targeting children, but rather a general audience. [104] The factors provided in the rule make it difficult for content creators to determine whether they are making videos that could be construed as directed towards children for purposes of COPPA compliance; some creators have resorted to shedding their signature costumes in order to avoid the made for kids designation. [105]
Drawing on lessons from the FTC’s enforcement action against Yelp, [106] one method a website could presumably use to avoid acquiring actual knowledge under COPPA is to refrain from asking for a user’s age at all, or to bar users under the age of thirteen from creating accounts. [107] The COPPA rule attempts to account for this willful disregard of a user’s age by requiring that any website that is directed towards children comply with the personal information minimization procedures and verifiable parental consent requirements. [108]
While there is some common sense to the request for clarity, some advocates have argued that Google’s position on this point is disingenuous at best. [109] Google is in the business of personalized marketing, [110] and its algorithms can accurately predict a user’s attributes such as age, education level, home ownership status, and other interests. [111] Advertising networks boast that they can target audiences as specific as people who have visited hospitals or medical care facilities. [112] “In other words, these companies possess information that can and should be used to affirmatively identify websites or online services that are child-directed in practice . . . .” [113] The irony of this proposition is that for companies to accurately predict the age of a user without implementing an age-gate system, they would be required to engage in passive tracking of children, in violation of current COPPA rules.
One possible solution to the audience identification conundrum is to allow website operators to rebut the presumption that users of child-directed content are children. Google advocated for this policy change, arguing that third-party platforms hosting child-directed content should be able to “treat their adult users as adults, with the appropriate safeguards . . . .” [114] This option was on the table during the 2013 COPPA revisions, but ultimately the Commission chose to maintain the policy that users of child-directed content were to be treated as children. [115] Under a rebuttable presumption model, adult users logged in to profiles on general audience platforms who engage with child-directed content would be treated normally. [116] Users engaging with child-directed content while not signed in to a profile would be treated as children. [117] Google’s rationale for the rebuttable presumption policy is that the statute itself only prohibits the collection of data and personal information from children; adults were never intended to be covered. [118] Thus, a rebuttable presumption model more accurately accomplishes the law’s intended purpose. Google’s proposed mechanism for rebutting the presumption is an approach similar to verifiable parental consent. [119]
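The decision logic Google describes can be sketched in a few lines. The following is an illustration of the rebuttable-presumption model as characterized in its comment; the names and the consent check are hypothetical, not an actual Google or YouTube mechanism.

```python
from enum import Enum, auto

class Treatment(Enum):
    ADULT = auto()  # normal data collection; personalized ads permitted
    CHILD = auto()  # COPPA restrictions apply

def classify_viewer(content_is_child_directed: bool,
                    signed_in_as_adult: bool,
                    presumption_rebutted: bool) -> Treatment:
    """Default presumption: a viewer of child-directed content is a child.

    Under the proposal, a signed-in adult account, or some verifiable
    showing comparable to verifiable parental consent, rebuts the
    presumption; a signed-out viewer never does.
    """
    if not content_is_child_directed:
        return Treatment.ADULT  # general audience content, default treatment
    if signed_in_as_adult or presumption_rebutted:
        return Treatment.ADULT  # presumption rebutted
    return Treatment.CHILD      # presumption stands: treat as a child
```

Note that the entire model turns on the second branch: the signed-in adult account is doing all the work of rebutting the presumption, which is precisely the assumption the next paragraph challenges.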
Google’s proposal, while practical, has some obvious flaws. First, the rebuttable presumption standard assumes that anyone signed in as an adult is an adult. YouTube does not allow users under the age of thirteen to register for accounts; [120] thus, many children use their parents’ general audience YouTube accounts. [121] The proposal does not account for the data that will be collected from children using their parents’ accounts.
YouTube is responsible for the content hosted on its platform, but parental supervision will always be the last line of defense against privacy violations and inappropriate content directed towards children. However, many parents have assisted their children in subverting the privacy protections currently available to children. [122]
Second, verifiable parental consent has not been implemented effectively, in part because of the lack of guidance in the rule. Some ambiguity in the rule’s wording is probably necessary to allow flexibility in implementation. However, it is surprising that Google would want to introduce more vagueness into the rule when one of the principal complaints leveled against COPPA is its lack of clarity. [123]
Future implementation of the COPPA rule will require some give-and-take between both sides of the debate. The FTC should certainly not grant Google its wish list of proposed COPPA reforms, but it should balance the interests of both camps. There is undoubtedly a need to protect children online, but COPPA has largely targeted the wrong parties with ineffective methods.
Content creators, websites, and application developers need an economic incentive to create child-friendly content. On YouTube, channel owners who enable personalized advertising have an opportunity to earn more from advertisements displayed on their videos than those who rely on contextual advertisements alone. [124] Further, mobile application developers depend on analytics from application usage to drive monetary earnings. [125] One possible solution to ease the tension between privacy and monetization is for the FTC to explore expanding COPPA’s existing safe harbor framework.
Even though safe harbors purport to offer at least the same level of privacy protection as COPPA currently guarantees, there is evidence indicating that safe harbors may be no more effective at implementing the required privacy standards. [126] One element of the impending COPPA rule change the FTC should consider is strengthening the standards required of safe harbors and requiring the certifying organizations to publicly list the companies, applications, websites, and services that are members. This would ensure that the general public, in addition to developers and operators, can take advantage of the FTC’s reporting mechanism against safe harbor operators.
YouTube could make its platform safer by implementing its own safe harbor program. YouTube has already indicated that it will impose more stringent standards on creators who wish to publish child-directed content; however, it will do so with the aid of machine learning and algorithmic safety measures. [127] YouTube’s algorithm and machine learning capabilities are vulnerable to exploitation by bad actors and by content creators producing parodies of otherwise appropriate children’s content. [128] The YouTube algorithm is designed to keep users on the site for as long as possible; it is built to deliver the user the next video in the chain, not to ensure that the video is appropriate for the viewer watching it. [129] YouTube has hired thousands of human moderators to police content on the platform, yet additional steps may still be needed to ensure a safe platform for children. [130]
The safe harbor model would translate well to YouTube’s existing business and would likely result in greater content and privacy protections for children. YouTube, as the safe harbor, would be responsible for submitting policies and procedures to the FTC for approval. [131] Further, under the rule, YouTube would be required to report regularly on compliance and identify any potential violations of safe harbor procedures. YouTube already has a network of existing channel owners and content creators subject to its guidance and content-control policies. [132] The next step in creating a safer environment for children on YouTube would be to require creators to designate content and channels as child-directed. YouTube should curate a specific feed of child-approved content that is moderated and reviewed by humans rather than artificial intelligence. Finally, YouTube will need to find a way to market this application widely to parents, which will probably require making the platform available on the web as well as in a mobile application. One reason parents purportedly did not use YouTube Kids as frequently as general audience YouTube is that the convenience of the standard YouTube application outweighed privacy and safety priorities. [133] The FTC should also consider requiring YouTube to publicly disclose instances of exploitation of its recommendation algorithm, which could incentivize parents to move to the kid-friendly version of the application.
While the scope of this Comment was limited to the COPPA rule’s application to YouTube, the problems outlined are pervasive across the internet. Platforms like Twitch [134] share similarities with YouTube: content created by third parties, [135] behavioral advertising powered by consumer data collection, [136] and content likely to appeal to a child audience. [137] Any change to the COPPA rule is bound to have broad implications for YouTube, Twitch, and other emerging forms of entertainment. Ambiguities in the rule have done little to ensure effective compliance and have more than likely subverted the rule’s intent by encouraging willful ignorance of child users to avoid lost earnings. The law needs improvement; it should be written to accommodate future shifts in technology as the universe of applications and websites expands. Further, the FTC should consider shortening its scheduled review cycle from every ten years to every five. Rolling back privacy protections for children as they navigate an increasingly dangerous internet is the wrong path forward. The solution is a common-sense approach that provides more stringent privacy protections.