Photo credit: Lucelia Ribeiro via Flickr, CC BY-SA 2.0

6 Key Takeaways from Congress’ Hearing on Protecting Student Data


Yesterday, the U.S. House Education and Workforce Committee held a hearing titled, “Protecting Privacy, Promoting Data Security: Exploring How Schools and States Keep Data Safe.” Here are some key takeaways:

1.) More leaders are realizing the need to reduce the amount of student data collected.

By far, the best witness from a parental rights and pro-privacy perspective was David Couch, Chief Information Officer for the Kentucky Department of Education. A former military cybersecurity expert, Couch made his most cogent remarks on decreasing the amount of data collected:

We have put KDE on a “healthy data diet” so that we collect only the data that we know is necessary, which has the side effect of improving data quality, which researchers love, and minimizing our attack surface. We regularly encourage our districts to do the same.

These remarks nearly echoed one of the key statements from the testimony of APP’s Jane Robbins during the January hearing on data privacy: “The government should hold as little data as possible, not as much. You can’t hack what isn’t there.”

2.) School districts are making a valiant effort to protect data, but it’s a difficult task.

Dr. Gary Lilly of the Bristol, Tenn., school district highlighted various efforts to protect student data. These include background checks on employees and limiting access to data depending on employee role. Both Lilly and Couch discussed the need for more training for teachers and administrative personnel to avoid inadvertently releasing personally identifiable information via spreadsheets or phishing attacks.

The extreme difficulty of their task became clear, however, when Couch admitted that there were nearly four billion attacks on the Kentucky DOE data system in one year.

3.) The corporate and foundation Big Data interests were well represented.

Amelia Vance, director of education privacy at the Future of Privacy Forum (FPF), was one of the witnesses. FPF is a creation of many of the biggest, worst actors on the privacy front, including the Gates Foundation, Google, and Facebook. She spoke about how necessary it is for taxpayers to spend more money training school districts and corporations to properly protect privacy.

In her written testimony, Vance even had the audacity to call the Class Dojo software “communication software” and to cite it as a sterling example of a wonderful privacy policy. This is highly ironic, given that Class Dojo is actually social-emotional learning (SEL) and behavioral-modification software developed to inculcate, assess, and change students’ personality traits in order to predict and steer children into careers chosen by corporations and governments, not the students themselves. Jane Robbins explained this same phenomenon with the SEL gaming company EdModo in her recent piece at The Federalist. Even more concerning, EdModo has been purchased by a Chinese company, meaning highly sensitive SEL data now flows to an entity not necessarily subject to America’s already poorly enforced data protections.

Vance’s testimony was also problematic because, beyond the many privacy concerns, ed tech applications have a very poor record of actually improving academic achievement.

4.) There was a disturbing lack of discussion on student psychological data privacy and security.

Given the enormous implications of the Facebook-Cambridge Analytica scandal for student data privacy; the efforts of OECD to develop a Facebook-style international personality profile test analogous to its PISA test for reading; the non-consensual psychological experimentation on students via software; and the disturbing lack of data security at the U.S. Department of Education (USED), one would think that some of these critical issues would be discussed at a hearing on data security. Sadly, they were not. This seems emblematic of the many competing interests within and between political parties and the enormous financial clout of Big Data.

5.) One witness was there to preserve harmful Obama-era education policies.

Three of the four witnesses were present to discuss student data security, while one was brought in by the Democratic caucus to discuss civil rights issues on the anniversary of the Brown v. Board of Education Supreme Court decision. The entire and admitted focus of the testimony of Catherine Lhamon, former Assistant Secretary for Civil Rights during the Obama administration, was to attack the Trump administration’s efforts to undo the damaging school safety and transgender bathroom policies.

As discussed, the school safety guidance signed by Lhamon during her time at USED has seriously undermined the safety of students and school staff across the nation. This is especially true in Parkland, Fla., where the district superintendent misled the public and officials about the shooter’s involvement in a federal grant program to decrease on-campus arrests of minority and disabled students. Of course, the disparity in arrests and suspensions is blamed on racism rather than the sad but strong propensity of students growing up in fatherless households to have emotional difficulties and act out.

6.) The witnesses clearly understand the great need to revamp FERPA.

The three witnesses who discussed data were uniform in recommending that FERPA be revamped. As discussed in the most recent national coalition letter to Congress, this is important, but it must be done carefully by:

  1. Rescinding the Obama-era regulatory fiat that gutted the law and made student PII more available to researchers, corporations, and other federal agencies.
  2. Doing whatever is possible to decrease the amount of data collected on students, especially SEL data. Such data should not be collected at all, or at the very least should a) be collected only with informed, opt-in parental consent and b) be treated as medical data.
  3. Treating any mental health, social-emotional, or behavioral data collected for special-education evaluations or any other related program, such as Positive Behavioral Interventions and Supports (PBIS) or Multi-Tiered Systems of Support (MTSS), as medical data that cannot be housed in longitudinal databases.
  4. Using aggregate rather than individual data to the greatest extent possible.
  5. Obtaining parental consent if data collected for one purpose is to be repurposed or shared with another federal agency.
  6. Eliminating the current language in FERPA allowing predictive testing.

Karen R. Effrem, MD

Dr. Karen Effrem and her husband have three children. She is trained as a pediatrician and serves as national education issues chairman for Eagle Forum and president of Education Liberty Watch.
