February 25, 2025

Children’s Online Privacy: Recent Actions by the States and the FTC

At A Glance

As the digital world becomes an integral part of children's lives, state legislatures are placing greater emphasis on regulating how companies handle children’s personal information. This Legal Update explores the recent developments in state and federal children’s privacy legislation, examining how states are shaping the future of online safety for minors and shedding light on amendments to the federal Children’s Online Privacy Protection Act.

As social media companies and digital services providers increasingly cater to younger audiences, state legislatures are placing greater emphasis on regulating how companies handle children’s personal information. This Legal Update explores the recent developments in state and federal children’s privacy legislation, examining how states are shaping the future of online safety for minors and shedding light on amendments to the federal Children’s Online Privacy Protection Act (“COPPA”).

I. US State Developments

Social Media Legislation

Several states, including California, Connecticut, Florida, Georgia, Louisiana, New York, Tennessee, and Utah, have passed legislation focused on regulating the collection, use, and disclosure of children’s data in connection with social media use. Below is a brief summary of notable requirements and trends under each state’s law.

  • California. On September 20, 2024, California Governor Gavin Newsom signed the Protecting Our Kids from Social Media Addiction Act. The law prohibits companies from collecting data on children under 18 without parental consent and from sending notifications to minors during school hours or late at night. The Ninth Circuit has temporarily blocked the law until April 2025, when the court will examine whether it infringes on free speech rights.
  • Connecticut. Effective October 1, 2024, Connecticut’s law prohibits features designed to significantly increase a minor’s use of an online service (such as endless scrolling), unsolicited direct messaging from adults to minors, and the collection of geolocation data without opt-in consent.
  • Florida. Effective January 1, 2025, Florida’s Social Media Safety Act requires social media companies to verify the age of users and terminate accounts for children under 14 years old.
  • Georgia. Effective July 1, 2025, the Protecting Georgia’s Children on Social Media Act will require platforms to verify users’ ages and obtain parental consent for users under 16. The law will also require schools to adopt policies that restrict social media access on school devices. 
  • Louisiana. Effective July 1, 2024, the Louisiana Secure Online Child Interaction and Age Limitation Act requires platforms to verify users’ ages and obtain parental consent for users under 16 to create accounts. The law also bans users under 16 from direct messaging unknown adults and restricts the collection of unnecessary personal information.
  • New York. On June 21, 2024, New York Governor Kathy Hochul signed the Stop Addictive Feeds Exploitation (SAFE) for Kids Act (the “SAFE Kids Act”). The SAFE Kids Act requires platforms to obtain verifiable parental consent before providing addictive feeds to users under 18. The law also bans platforms from sending notifications to minors between 12:00 a.m. and 6:00 a.m. and prohibits them from degrading the quality, or increasing the price, of a product or service because they cannot provide a minor with an addictive feed.
  • Tennessee. Effective January 1, 2025, the Tennessee Protecting Children from Social Media Act requires social media companies to verify express parental consent for users under 18. It also gives parents the ability to monitor their children’s privacy settings, set time restrictions, and schedule breaks in account access.
  • Utah. Passed in March 2023 and amended in 2024, the Utah Social Media Regulation Act mandates that social media platforms require parental consent for minors to use their services. Unless required to comply with state or federal law, social media platforms are prohibited from collecting data based on the activity of children and may not serve targeted advertising or algorithmic content recommendations to minors. Enforcement of Utah’s law is also currently blocked by litigation.

This year, children’s privacy bills related to social media regulation continue to be introduced in other state legislatures. For instance, Utah’s App Store Accountability Act recently passed the State Senate and would require app store providers to verify users’ ages. South Carolina’s Social Media Regulation Act would require social media companies to make commercially reasonable efforts to verify the age of South Carolina account holders and require parental consent for users under the age of 18 to have an account. Similar children’s privacy bills have also been introduced in Alabama (HB 276), Arizona (HB 2861), Arkansas (HB 1082 and HB 1083), Colorado (SB 86), Connecticut (SB 1295), Iowa (HF 278), New York (S 4600 and S 4609), and Tennessee (SB 811 and HB 825).

Age-Appropriate Design Codes

Last year, multiple states enacted laws requiring age-appropriate design codes to improve online privacy protections for children. The success of these laws has varied.

  • California. The California Age-Appropriate Design Code Act (the “Act”) mandates that online services likely to be accessed by children under 18 prioritize their well-being and privacy. The Act requires businesses to assess and mitigate risks from harmful content and design features that may exploit children. Initially set to take effect on July 1, 2024, the Act is currently subject to a partial injunction by the Ninth Circuit Court of Appeals. In August 2024, the Ninth Circuit upheld a preliminary injunction of the Act’s data protection impact assessment provisions but lifted the injunction on provisions restricting the collection, use, and sale of children’s data and geolocation data.
  • Connecticut. As of October 1, 2024, Connecticut’s amended Consumer Data Privacy Act includes provisions governing the collection of data on children under the age of 18. The law also requires companies to complete a data protection impact assessment for each product likely to be accessed by children. Additionally, companies must exercise “reasonable care to avoid any heightened risk of harm to minors” caused by their products or services and must delete minors’ accounts and data upon request.
  • Maryland. Effective October 1, 2024, the Maryland Kids Code requires social media platforms to implement default privacy settings for children, prohibits collecting minors’ precise locations, and requires a data protection impact assessment for products likely to be accessed by children.
  • Illinois, South Carolina, and Vermont. Each of these states has introduced a bill requiring age-appropriate design codes in its 2025-2026 legislative session.

Harmful Content Age Verification Legislation

States are increasingly enhancing online privacy protections for children through “harmful content age verification” laws. These laws require companies to implement reasonable age verification measures before granting children access to potentially harmful content (such as pornography, violence, or other mature themes) or face liability for failing to do so. As of January 2025, 19 states have passed laws requiring age verification to access potentially harmful content: Alabama, Arkansas, Florida, Georgia, Idaho, Indiana, Kansas, Kentucky, Louisiana, Mississippi, Montana, Nebraska, North Carolina, Oklahoma, South Carolina, Tennessee, Texas, Utah, and Virginia.

On January 15, 2025, Texas Attorney General Ken Paxton defended Texas’ law (HB 1181) before the Supreme Court. The case centers on whether the law, which requires that websites with harmful content verify users’ ages to prevent minors from accessing such content, infringes on the First Amendment. The Court has not yet issued its opinion on the matter.

Children’s Data Protection Legislation

States’ privacy measures for children extend beyond social media regulation. For example, Texas passed the Securing Children Online through Parental Empowerment (SCOPE) Act last year, which applies to all digital service providers. Effective September 1, 2024, the SCOPE Act prohibits digital service providers from sharing, disclosing, or selling a minor’s personal identifying information without parental consent. It also requires companies to provide parents with tools to manage and control the privacy settings on their children’s accounts. These protections extend to how minors interact with AI products.

Similarly, the New York Child Data Protection Act (CDPA) will prohibit operators of websites, mobile applications, and other online services from collecting, using, disclosing, or selling the personal information of children under the age of 18 unless:

  • For children 12 years or younger, such processing is permitted by COPPA; or
  • For children 13 years or older, “informed consent” is obtained or such processing is strictly necessary for certain specified activities. Requests for informed consent must be made clearly and conspicuously.

Companies will be subject to the CDPA if they have both: (a) actual knowledge that data is from a minor user; and (b) a website, online service, online application, mobile application, or device that is “primarily directed to minors.” The CDPA comes into effect on June 20, 2025.

Other states, including New Hampshire and New Jersey, have passed COPPA-style laws that impose additional restrictions on the processing of minors’ data for targeted advertising. Similarly, Maryland’s Online Data Privacy Act prohibits the sale of personal data, or its processing for targeted advertising, if the business knew or should have known that the consumer is under 18.

Virginia amended its general consumer privacy law to address children’s privacy protections. The amendment to the Consumer Data Protection Act, effective January 1, 2025, requires parental consent for processing the personal information of a known child1 under 13 and requires data protection assessments for online services directed to known children. Similarly, Colorado amended its privacy law to strengthen protections for minors’ data. Companies are prohibited from processing minors’ data for targeted advertising and must exercise reasonable care to avoid any heightened risk of harm to minors. The Colorado privacy law amendment will take effect on October 1, 2025.

Pending Legislation

California is leading the way in enacting legislation to protect children from the risks associated with Artificial Intelligence (AI). On February 20, 2025, the California legislature introduced AB 1064, known as the Leading Ethical Development of AI (LEAD) Act. Among its provisions, the LEAD Act would require parental consent before using a child's personal information to train an AI model and mandate that developers conduct risk-level assessments to classify AI systems based on their potential harm to children. It would also prohibit systems involving facial recognition, emotion detection, and social scoring. Additionally, the LEAD Act would establish an advisory body, the LEAD for Kids Standards Board, to oversee AI technologies used by children.

II. US Federal Developments

COPPA aims to protect children’s privacy online and imposes various requirements on online content providers. On January 16, 2025, the FTC finalized updates to the COPPA Rule, which originally took effect in 2000 and had not been revised since 2013. The new changes will become effective 60 days after publication in the Federal Register, with a compliance date set for one year after publication. The updates include several revised definitions, new retention requirements, and expanded consent requirements. Additionally, there will be increased transparency regarding compliance with the COPPA Safe Harbor Programs.

The revisions to COPPA were approved by a unanimous 5-0 vote and include updates to address new technology and data collection practices, such as:

  • Clarifying definitions to assist companies in navigating compliance “gray areas.” The amendments introduce a new definition for “mixed audience website or online service,” which covers cases where websites might fall under COPPA’s scope. A “mixed audience website or online service” is a website or online service that is directed to children (as further described in COPPA), but which:

a) does not target children as its primary audience; and
b) does not collect personal information from any visitor (other than for certain limited purposes outlined in the rule) before collecting age information or using technology to determine whether the visitor is a child.

Further, to qualify as a mixed audience website or online service, any collection of age information or other means of determining whether a visitor is a child must be done in a neutral manner without defaulting to a set age or encouraging visitors to falsify age information.

  • Accounting for new types of data collection by updating the definition of “personal information” to include “a biometric identifier,” which “can be used for the automated or semi-automated recognition of an individual, such as fingerprints; handprints; retina patterns; iris patterns; genetic data, including a DNA sequence; voiceprints; gait patterns; facial templates; or faceprints.”2
  • Expanding on ways for parents to give their consent. For use cases where personal information is not “disclosed” (as defined in COPPA), operators may use a text message coupled with additional steps to ensure that the person providing consent is the parent. These additional steps include sending a confirmatory text message to the parent after receiving consent, or obtaining a postal address or telephone number from the parent and confirming consent by letter or telephone call. Operators using this method must notify parents that they can revoke any consent given by responding to the initial text message.

There have been other attempts to pass federal legislation regarding children’s privacy rights and online safety in recent years, including the Kids Online Safety Act, which was introduced in 2022 and passed the Senate (packaged together with an update to COPPA) but did not become law in the last Congress. More recently, on February 19, 2025, the Senate Judiciary Committee held a hearing on children’s online safety and efforts to boost safeguards for children.

III. Enforcement

Children’s privacy is also a subject of enforcement scrutiny by state attorneys general and the FTC. For example, the Texas Attorney General has launched investigations into several technology companies regarding their handling of minors’ data and potential violations of the SCOPE Act. In his press release about the investigations, Attorney General Ken Paxton warned, “[t]echnology companies are on notice that [the Texas Attorney General’s] office is vigorously enforcing Texas’s strong data privacy laws. These investigations are a critical step toward ensuring that social media and AI companies comply with our laws designed to protect children from exploitation and harm.”3

The FTC has also been active in enforcing COPPA against website operators. From January 2023 to January 2025, the FTC published six COPPA-related enforcement actions on its website. Earlier this year, the FTC reached a $20 million settlement with an app operator for allowing children under the age of 16 to make in-app purchases without parental consent and for deceiving children about the costs of such purchases.

IV. Key Takeaways and Predictions

  • States are moving to enhance parental controls around children’s privacy. Social media legislation across various states is helping parents maintain greater control over their children’s privacy by requiring companies to obtain parental consent and by giving parents the ability to set time restrictions and monitor their children’s account use. To stay aligned with regulatory trends, companies should develop and provide tools that enable parents to manage their children’s online experiences.
  • States will likely begin regulating the threats AI chatbots pose to children in the coming months. AI chatbots are increasingly central to discussions about children’s safety. For example, one technology company is currently facing litigation from a mother alleging that the platform’s AI chatbot encouraged her son to commit suicide. Another lawsuit claims that the same company’s chatbot service suggested a child should kill his parents over screen time limits. State legislatures may expand their regulatory scrutiny to address these threats.
  • Businesses may face steep penalties and injunctions for violations. Several of these laws grant state attorneys general the authority to impose civil penalties for violations. For example, the New York Attorney General can impose civil penalties of up to $5,000 per violation, issue injunctive relief, and obtain disgorgement of any profits gained from violating the SAFE Kids Act. Companies violating Florida’s social media law may face fines of up to $50,000 per violation. To avoid potential risk and enforcement scrutiny, companies should implement robust age verification processes and obtain the necessary parental consent.
  • COPPA was updated for the first time in 12 years. The amendments to COPPA reflect technological advancements made since the law was last revised in 2013. While some of these updates introduce additional compliance requirements for website operators, others clarify “gray areas” that the previous version of the law did not address.

1 The Virginia Consumer Data Protection Act does not define a “known child” but defines a child as any natural person younger than 13 years of age. Va. Code § 59.1-575.

2 COPPA Final Rule at 16 C.F.R. § 312.2.

3 https://www.texasattorneygeneral.gov/news/releases/attorney-general-ken-paxton-launches-investigations-characterai-reddit-instagram-discord-and-other
