
Protecting Children’s Privacy at Home, at School, and Everywhere in Between
Summary
Young people today face surveillance unlike that experienced by any previous generation, at home, at school, and everywhere in between. Because they use technology constantly while their brains are still developing, they are uniquely vulnerable to privacy harms, including identity theft, cyberbullying, physical risks, algorithmic labeling, and hyper-commercialism. A lack of privacy can ultimately lead children to self-censor and can limit their opportunities. Already-vulnerable populations, including those with fewer resources or less digital literacy and those who are non-native English speakers, are most at risk.
Congress and the Federal Trade Commission (FTC) have repeatedly considered efforts to better protect children's privacy, but the next administration must make this a genuine priority by supporting strong privacy laws, giving the FTC additional resources and authority, and providing support to the Department of Education (ED). The Biden-Harris administration should also establish a task force to explore how best to support and protect students. Finally, the FTC should use its existing authority to deepen its understanding of the children's technology market and to robustly enforce a strong Children's Online Privacy Protection Act (COPPA) rule.