Back to School Privacy Dispatch
Authored by Maddie Dugan, CIPM and Michele Martell, Esq.
As kids of all ages head back to school, we want to focus attention on the myriad issues that can arise in connection with “Ed Tech” and the information-gathering mechanisms used by schools. Often, parents and children have no ability to refuse to download or use required software, so being aware of potential issues may help you have important conversations with your child’s school administrators about these risks.
To that end, we’ve rounded up some of the latest news and advice about children and online safety.
Security Breaches
Overall, ransomware attacks on US schools and colleges have increased, with 491 incidents recorded since 2018, impacting over 8,000 educational institutions and exposing 6.7 million individual records. The most affected US states included California (43 attacks), New York (42 attacks), Washington, and Ohio.
Children’s Privacy Protection
The Children’s Online Privacy Protection Act (“COPPA”) has been the primary federal law governing apps and online sites directed to children under 13 for the past 24 years.
The Federal Trade Commission has been very active in enforcing COPPA in the Ed Tech space. In 2022, it issued guidance to the Ed Tech industry, saying very directly that “protecting kids’ privacy is your business.” In its Policy Statement, the FTC makes clear that “COPPA-covered companies, including Ed Tech providers, must not condition participation in any activity on a child disclosing more information than is reasonably necessary for the child to participate in that activity.” [emphasis added]
The Policy Statement closes with the express language: “Children should not have to needlessly hand over their data and forfeit their privacy in order to do their schoolwork or participate in remote learning, especially given the wide and increasing adoption of Ed Tech tools. Going forward, the Commission will closely scrutinize the providers of these services and will not hesitate to act where providers fail to meet their legal obligations with respect to children’s privacy.”
Required Software
When your child starts a new academic year, you may be required, on their behalf, to download apps or use third-party software to access homework, grades, etc. Proprietary software used in the classroom, on tablets, or on mobile devices can also pose risks to children’s privacy.
Recently, an Ed Tech company tried to argue that parents had waived their right to sue, claiming that the Court should compel arbitration based on provisions contained in the Terms of Service between the Ed Tech company and the school districts where the plaintiffs’ children attend school. Both the FTC and the parents have argued that COPPA does not create an agency relationship between the schools and parents.
The parents echoed the FTC's contention that COPPA doesn't "impose a conservatorship on parents or their children in favor of schools for any purpose — especially to bind those parents and children to secret arbitration," in addition to asserting that there was no way that they could have directly consented to the software terms of service merely by using public school services "to which they are already legally entitled."
What Questions Should Parents Ask?
Parents can and should ask to see the applicable privacy policies, Terms of Service and other important information that should clearly disclose the limited uses of children’s data.
Bring any issues or concerns to your school administrators to learn about their current protections against possible invasions of data privacy, security breaches and other potential harms.
New York Secretary of State Walter T. Mosley advised, “With technology being used more and more as a learning tool, New York parents and caregivers should know their rights when it comes to protecting their children’s personal information and privacy.”
How else can you protect your child’s personal information? Understand where your child’s information is stored. Ask school administrators, after-school organizations and sports clubs about how secure their digital and physical records are: Are digital records connected to the internet and, if so, are they encrypted? Are physical records locked in filing cabinets? Who has access?
Use caution when providing identifying information. If asked for a Social Security number (SSN), inquire why it is needed and ask to use another identifier. Oftentimes organizations include the SSN request as a formality and it may not be mandatory.
Generative AI in Education
There is a rush to incorporate various forms of GenAI into the classroom, which, while it may provide benefits, also comes with clear risks. Some are concerned that systemic risks, such as discrimination from algorithmic bias or increased surveillance, especially of groups that have historically faced discrimination, outweigh any potential benefits. These concerns become even more pointed in the context of educating children, a vulnerable group that should be more protected, rather than less.
Research funded by organizations with close ties to technology companies, such as the Walton Family Foundation, is using classroom environments to test OpenAI’s ChatGPT capabilities, arguing that because many teachers and students like GenAI, it should be incorporated in schools. Rather than seeing improved learning outcomes, some educators have grown suspicious that county-provided lesson plans and test materials are being crafted by AI because the content is inconsistent, constraining, and incomprehensible. That suspicion is reinforced by county officials’ recommendation that the materials be fed to ChatGPT to clear up any confusion. Top-down curricula that are swiftly implemented without adequate review raise ethical concerns and are likely to cause some students to fall behind.
The US Department of Education’s Office of Educational Technology states, “As AI models are not generally developed in consideration of educational usage or student privacy, the educational application of these models may not be aligned with the educational institution’s efforts to comply with federal student privacy laws, such as FERPA, or state privacy laws.”
The Office of Educational Technology’s recent white paper, AI and the Future of Teaching and Learning, states that “Policies are urgently needed to implement the following:
leverage automation to advance learning outcomes while protecting human decision making and judgment;
interrogate the underlying data quality in AI models to ensure fair and unbiased pattern recognition and decision making in educational applications, based on accurate information appropriate to the pedagogical situation;
enable examination of how particular AI technologies, as part of larger edtech or educational systems, may increase or undermine equity for students; and
take steps to safeguard and advance equity, including providing for human checks and balances and limiting any AI systems and tools that undermine equity.”
Policymakers have noted a “pressing need for guardrails and guidelines that make educational use of GenAI advances safe, especially given this accelerating pace of incorporation of GenAI into mainstream technologies.” The development and deployment of GenAI requires access to data that goes far beyond conventional student records, down to detailed information about what students do as they learn with technology, presenting higher levels of data privacy and security risk. Students themselves are concerned about the ethics of the products they encounter in their lives and have much to say about what products they’d like to see, or not see, in school; it is important to help them understand the legal, ethical, and privacy implications of sharing data with AI-enabled technologies.
As policy development takes time, policymakers and educational constituents together need to start now to specify the requirements, disclosures, regulations, and other structures that can shape a positive and safe future for all constituents, especially students and teachers.
Ways to Protect Personal Information
Whether your child is encountering risks to their privacy in the classroom, in school data collection practices, or on personal devices, here are other ways to protect personal information:
Discuss the risks of providing sensitive information to AI programs, chatbots, or large language models of any kind. Kids have always looked for shortcuts to complete their schoolwork, and it’s easier than ever to have AI assistants complete assignments in exchange for information.
Review internet safety tips with children and remind them to be careful about opening attachments and suspicious emails.
Both parents and students should be careful on all social media platforms: don’t overshare, and ask friends and family to minimize sharing as well, since any information posted can be used by identity thieves. Avoid sharing personal information, including full names, addresses, phone numbers, Social Security numbers, or even where children go to school.
Label books, backpacks and lunches with your child’s full name and any other identifying information only on the inside. Using initials on the outside is okay, but names, even first names, on the outside can create an unsafe situation.
Remember:
Your child’s personal information cannot be sold or released for any commercial purpose. If your child is under age 18, you have the right to inspect and review the complete contents of your child’s education records. The rush to incorporate Ed Tech and generative AI needs to be assessed even more carefully when the data subjects are children. Without attention from consumers to the downstream goals of these technologies, and without the basic privacy best practices that can be applied in daily life, their effects will outpace regulation.
The contents of this article are intended to convey general information only and not to provide legal advice or opinions, and should not be construed as, or relied upon for, legal advice in any particular circumstance or fact situation. Nothing in this article is an offer to represent you, nor is it intended to create an attorney-client relationship.