Addressing the Disproportionate Impacts of Student Online Activity Monitoring Software on Students with Disabilities
Student activity monitoring software is widely used in K-12 schools and has been employed to address student mental health needs. Education technology companies have developed algorithms using artificial intelligence (AI) that seek to detect risk of harm or self-harm by monitoring students' online activities. This type of software can track student logins, view the contents of a student's screen in real time, monitor or flag web search history, or close browser tabs for off-task students. While teachers, parents, and students largely report that the benefits of student activity monitoring outweigh the risks, there is still a need to address the ways that student privacy might be compromised and to avoid perpetuating existing inequities, especially for students with disabilities.
To address these issues, Congress and federal agencies should:
- Improve data collection on the proliferation of student activity monitoring software
- Enhance parental notification and ensure access to free appropriate public education (FAPE)
- Invest in the U.S. Department of Education’s Office for Civil Rights
- Support state and local education agencies with technical assistance
Challenge and Opportunity
People with disabilities have long benefited from technological advances. For decades, assistive technology, ranging from low tech to high tech, has helped students with disabilities learn. AI tools hold promise for making lessons more accessible. A recent EdWeek survey of principals and district leaders showed that most schools are considering using AI tools, actively exploring their use, or already piloting them. The special education research community, including the Center for Innovation, Design and Digital Learning (CIDDL), recognizes both the immense potential and the risks of AI in educating students with disabilities. CIDDL states:
“AI in education has the potential to revolutionize teaching and learning through personalized education, administrative efficiency, and innovation, particularly benefiting (special) education programs across both K-12 and Higher Education. Key impacts include ethical issues, privacy, bias, and the readiness of students and faculty for AI integration.”
At the same time, AI-based student online activity monitoring software is being employed more universally to monitor and surveil what students are doing online. In K-12 schools, AI-based student activity monitoring software is widespread – nearly 9 in 10 teachers say that their school monitors students’ online activities.
Schools have employed these technologies to attempt to address student mental health needs, such as referring flagged students to counseling or other services. These practices have significant implications for students with disabilities, as they are at higher risk for mental health issues. In 2024, NCLD surveyed 1,349 young adults ages 18 to 24 and found that nearly 15% of individuals with a learning disability had a mental health diagnosis, and 45% of respondents indicated that having a learning disability negatively impacts their mental health. Given these risks, careful attention must be paid to ensure that mental health needs are identified and appropriately addressed through evidence-based supports.
Yet there is little evidence supporting the efficacy of this software. Researchers at RAND, through a review of peer-reviewed and gray literature as well as interviews, raise issues with the software, including threats to student privacy, the difficulty families face in opting out, algorithmic bias, and the escalation of situations to law enforcement. The Center for Democracy & Technology (CDT) conducted research highlighting that students with disabilities are disproportionately impacted by these AI technologies. For example, licensed special education teachers are more likely to report knowing students who have gotten in trouble and been contacted by law enforcement due to student activity monitoring. Other CDT polling found that 61% of students with learning disabilities report that they do not share their true thoughts or ideas online because of monitoring.
We also know that students with disabilities are almost three times more likely to be arrested than their nondisabled peers, with Black and Latino male students with disabilities being most at risk of arrest. Interactions with law enforcement, especially for students with disabilities, can be detrimental to health and education. Because people with disabilities have protections under civil rights laws, including the right to a free appropriate public education in school, action must be taken to ensure these protections are upheld.
Parents are also increasingly concerned about subjecting their children to greater monitoring both in and outside the classroom, leading to decreased support for the practice: 71% of parents report being concerned with schools tracking their children’s location and 66% are concerned with their children’s data being shared with law enforcement (including 78% of Black parents). Concern about student data privacy and security is higher among parents of children with disabilities (79% vs. 69%). Between the 2021–2022 and 2022–2023 school years, parent and student support of student activity monitoring fell 8% and 11%, respectively.
Plan of Action
Recommendation 1. Improve data collection.
While data collected by private research entities like RAND and CDT captures some important information on this issue, the federal government should collect relevant data to capture the extent to which these technologies might be misused. Polling data, like the CDT survey of 2,000 teachers referenced above, provides a snapshot and has been influential in raising immediate concerns about the procurement of student activity monitoring software. However, the federal government is not currently collecting larger-scale data on this issue, and members of Congress, such as Senators Markey and Warren, have relied on CDT's data in their investigation of the issue because of the absence of federal datasets.
To do this, Congress should charge the National Center for Education Statistics (NCES) within the Institute of Education Sciences (IES) with collecting large-scale data from local education agencies to examine the impact of digital learning tools, including student activity monitoring software. IES should collect data disaggregated by the student subgroups described in section 1111(b)(2)(B)(xi) of the Elementary and Secondary Education Act of 1965 (20 U.S.C. 6311(b)(2)(B)(xi)) and disseminate its findings to state and local education agencies and other appropriate entities.
Recommendation 2. Enhance parental notification and ensure a free appropriate public education.
Families and communities are not being appropriately informed about the use, or potential for misuse, of technologies installed on school-issued devices and accounts. At the start of the school year, schools should notify parents about what technologies are used, how and why they are used, and alert them of any potential risks associated with them.
Congress should require school districts to notify parents annually, as they do with other Title I programs as described in Sec. 1116 of the Elementary and Secondary Education Act of 1965 (20 U.S.C. 6318), including “notifying parents of the policy in an understandable and uniform format and, to the extent practicable, provided in a language the parents can understand” and that “such policy shall be made available to the local community and updated periodically to meet the changing needs of parents and the school.”
For students with disabilities specifically, the Individuals with Disabilities Education Act (IDEA) provides procedural safeguards to parents to ensure they have certain rights and protections so that their child receives a free appropriate public education (FAPE). To implement IDEA, schools must convene an Individualized Education Program (IEP) team, and the IEP should outline the academic and/or behavioral supports and services the child will receive in school and include a statement of the child’s present levels of academic achievement and functional performance, including how the child’s disability affects the child’s involvement and progress in the general education curriculum. The U.S. Department of Education should provide guidance about how to leverage the current IEP process to notify parents of the technologies in place in the curriculum and use the IEP development process as a mechanism to identify which mental health supports and services a student might need, rather than relying on conclusions from data produced by the software.
In addition, IDEA regulations address instances of significant disproportionality of children with disabilities who are students of color, including in disciplinary referrals and exclusionary discipline (which may include referral to law enforcement). Given this long history of disproportionate disciplinary actions, and the fact that special educators are more likely to report knowing students who have gotten in trouble and been contacted by law enforcement due to student activity monitoring, questions arise about whether these incidents result in a loss of instructional time for students with disabilities and, in turn, a potential violation of FAPE. The Department of Education should provide guidance clarifying that such disproportionate discipline might result from the use of student activity monitoring software and explaining how to mitigate referrals to law enforcement for students with disabilities.
Recommendation 3. Invest in the Office for Civil Rights within the U.S. Department of Education.
The Office for Civil Rights (OCR) currently receives $140 million and is responsible for investigating and resolving civil rights complaints in education, including allegations of discrimination based on disability status. Complaints filed with OCR continued to increase in FY2023, reaching 19,201 complaints received. The total number of complaints has almost tripled since FY2009, while over the same period OCR's number of full-time equivalent staff decreased by about 10%. Typically, the majority of complaints received have raised allegations regarding disability.
Congress should double its appropriation for OCR, raising it to $280 million. A robust investment would give OCR the resources to address complaints alleging discrimination that involve an educational technology software, program, or service, including AI-driven technologies. With greater resources, OCR could step up enforcement efforts against potential violations of civil rights law and work with the Office of Educational Technology to provide guidance to schools on how to fulfill their civil rights obligations.
Recommendation 4. Support state and local education agencies with technical assistance.
State education agencies (SEAs) and local education agencies (LEAs) face enormous challenges in responding to a rapidly changing market of education technologies. States and districts are inundated with products from vendors and often do not have the technical expertise to differentiate between them. When education technology initiatives and products are not conceived, designed, procured, implemented, or evaluated with the needs of all students in mind, technology can exacerbate existing inequalities.
To support states and school districts in procuring, implementing, and developing state and local policy, the federal government should invest in a national center to provide robust technical assistance focused on safe and equitable adoption of schoolwide AI technologies, including student online activity monitoring software.
Conclusion
AI technologies will have an enormous impact on public education. Yet if we do not implement these technologies with students with disabilities in mind, we risk furthering their marginalization. Both Congress and the U.S. Department of Education can play an important role in taking the necessary steps to develop policy and guidance and to provide the resources needed to combat the harms posed by these technologies. NCLD looks forward to working with decision makers to take action to protect the civil rights of students with disabilities and ensure responsible use of AI technologies in schools.
This idea is part of our AI Legislation Policy Sprint. To see all of the policy ideas spanning innovation, education, healthcare, and trust, safety, and privacy, head to our sprint landing page.
This TA center could provide guidance to states and local education agencies that lack the capacity and subject matter expertise for both the procurement and implementation processes. It could coordinate its services and resources with existing TA centers, such as the T4PA Center or the Regional Educational Laboratories, on how to invest in evidence-based mental health supports in schools and communities, including using technology in ways that mitigate discrimination and bias.
As of February 2024, seven states had published AI guidelines (reviewed and collated by Digital Promise). While these broadly recognize the need for policies and guidelines to ensure that AI is used safely and ethically, none explicitly mention the use of student activity monitoring AI software.
This is a funding level requested in other bills seeking to increase OCR's capacity, such as the Showing Up For Students Act. OCR projects 23,879 complaint receipts in FY2025. Excluding projected complaints filed by a single complainant, this number is expected to be 22,179 cases. Without staffing increases in FY2025, the average caseload per investigative staff member would become unmanageable at 71 cases per staff (22,179 projected cases divided by 313 investigative staff).
In late 2023, the Biden-Harris Administration issued an Executive Order on AI. Also that fall, Senate Health, Education, Labor, and Pensions (HELP) Committee Ranking Member Bill Cassidy (R-LA) released a White Paper on AI and requested stakeholder feedback on the impact of AI and the issues within his committee’s jurisdiction.
U.S. House of Representatives members Lori Trahan (D-MA) and Sara Jacobs (D-CA), among others, also recently asked Secretary of Education Miguel Cardona to provide information on OCR's understanding of the impacts of educational technology and artificial intelligence in the classroom.
Finally, Senate Majority Leader Chuck Schumer (D-NY) and Senator Todd Young (R-IN) issued a bipartisan Roadmap for Artificial Intelligence Policy that calls for a $32 billion annual investment in AI research. While K-12 education has not been a core focal point of ongoing legislative and administrative actions on AI, it is imperative that the federal government take the necessary steps to protect all students and play an active role in upholding the federal civil rights and privacy laws that protect students with disabilities. Given these commitments from the federal government, there is a ripe opportunity to take action to address the issues of student privacy and discrimination that these technologies pose.
Individuals with Disabilities Education Act (IDEA): IDEA is the law that ensures students with disabilities receive a free appropriate public education (FAPE). IDEA regulations require states to collect data and examine whether significant disproportionality based on race and ethnicity is occurring with respect to the incidence, duration, and type of disciplinary action, including suspensions and expulsions. Guidance from the Department of Education in 2022 emphasized that schools are required to provide behavioral supports and services to students who need them in order to ensure FAPE. It also stated that “a school policy or practice that is neutral on its face may still have the unjustified discriminatory effect of denying a student with a disability meaningful access to the school’s aid, benefits, or services, or of excluding them based on disability, even if the discrimination is unintentional.”
Section 504 of the Rehabilitation Act: This civil rights statute protects individuals from discrimination based on their disability. Any school that receives federal funds must abide by Section 504, and some students who are not eligible for services under IDEA may still be protected under this law (these students usually have a “504 plan”). As the Department of Education works to update the regulations for Section 504, the implications of surveillance software on the civil rights of students with disabilities should be considered.
Elementary and Secondary Education Act (ESEA) Title I and Title IV-A: Title I of the Elementary and Secondary Education Act (ESEA) provides funding to public schools and requires states and public school systems to hold public schools accountable for monitoring and improving achievement outcomes for students and closing achievement gaps between subgroups like students with disabilities. One requirement under Title I is to notify parents of certain policies the school has and actions the school will take throughout the year. As a part of this process, schools should notify families of any school monitoring policies that may be used for disciplinary actions. The Title IV-A program within ESEA provides funding to states (95% of which must be allocated to districts) to improve academic achievement in three priority content areas, including activities to support the effective use of technology. This may include professional development and learning for educators around educational technology, building technology capacity and infrastructure, and more.
Family Educational Rights and Privacy Act (FERPA): FERPA protects the privacy of students’ educational records (such as grades and transcripts) by preventing schools or teachers from disclosing students’ records while allowing caregivers access to those records to review or correct them. However, the information from computer activity on school-issued devices or accounts is not usually considered an education record and is thus not subject to FERPA’s protections.
Children’s Online Privacy Protection Act (COPPA): COPPA requires operators of commercial websites, online services, and mobile apps to notify parents and obtain their consent before collecting any personal information on children under the age of 13. The aim is to give parents more control over what information is collected from their children online. The law regulates companies, not schools.
About the National Center for Learning Disabilities
We are working to improve the lives of individuals with learning disabilities and attention issues—by empowering parents and young adults, transforming schools, and advocating for equal rights and opportunities. We actively work to shape local and national policy to reduce barriers and ensure equitable opportunities and accessibility for students with learning disabilities and attention issues. Visit ncld.org to learn more.