Shadow AI in schools is the use of AI tools by teachers, staff and students without the knowledge or permission of the administration, especially AI tools that haven't been vetted and approved by the district. The National Cybersecurity Alliance stated in its 2024-2025 annual cybersecurity report that 38% of survey participants admitted to sharing sensitive work information with AI without their employer's knowledge. That figure alone is worrying, but the way many AI tools collect data is more worrying still. For example, OpenAI's ChatGPT is the most popular AI chatbot, with over 5 billion monthly visits. OpenAI's own website warns users not to share sensitive information because OpenAI cannot delete specific prompts from a user's history. Unless users change their data control settings, prompts can also be used to train the model and could potentially surface in responses to other users.
Perhaps the most immediate legal risk for districts is staff and students entering legally protected student information into AI tools. Laws that protect student data include:
- FERPA: Protects student education records and related personally identifiable information. A teacher who enters names, grades, disciplinary records or similar data into an AI tool not covered by a FERPA-compliant agreement puts the district at risk.
- COPPA: Requires parental consent before online services collect personal information from children under 13. AI tools often collect this sort of information at sign-up, and many AI tools set age limits in their terms of service.
- PPRA: Requires parental consent before minors can participate in surveys that reveal political affiliation, mental or psychological problems, religious beliefs and more. Districts using AI for counseling or survey-style assignments without parental consent may be at risk.
- Ohio Law: ORC 3319.321 protects students’ personally identifiable information from disclosure.
It is imperative that schools train both staff and students to use AI responsibly. Technical precautions are far less effective if users don't know how to recognize sensitive data and keep it out of their prompts. Schools should also consider implementing AI policies and procedures that align staff and student AI use with the district's mission, vision and goals. Finally, consider using AI education tools created specifically for school staff and students. These are much more likely to have been developed in collaboration with education professionals and to prioritize data security and compliance with applicable student privacy laws.
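Beyond training, some districts add an automated guardrail that screens text before it is sent to an external AI tool. As a minimal illustrative sketch only (the patterns, placeholder labels and ID format below are assumptions, not a compliant filter; real deployments would need far broader coverage and review by the district's legal and IT teams):

```python
import re

# Hypothetical patterns a district might screen for before a prompt
# leaves the network. The 6-9 digit student ID format is an assumption.
PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "student_id": re.compile(r"\b\d{6,9}\b"),
}

def redact(prompt: str) -> str:
    """Replace likely personally identifiable information with placeholders."""
    for label, pattern in PATTERNS.items():
        prompt = pattern.sub(f"[{label.upper()} REDACTED]", prompt)
    return prompt

print(redact("Email jsmith@school.org about student 12345678."))
```

A filter like this is a backstop, not a substitute for vetted tools and signed data-privacy agreements: regexes miss names, free-text disciplinary details and most other FERPA-protected content.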