Friday, April 19, 2024

APP’s Jane Robbins Reveals Dangers of Gov’t Data Mining to Congress (VIDEO)

On January 30th, American Principles Project senior fellow Jane Robbins gave outstanding testimony at a U.S. House Education and the Workforce Committee hearing titled “Protecting Privacy, Promoting Policy: Evidence-Based Policymaking and the Future of Education.” She spoke eloquently about the critical distinction between data privacy and data security, the dangers of social-emotional learning, and the enormous problems with steering students into certain careers based on predictive algorithms. You can watch her opening statement below:

The video of the whole hearing is available here. Below are some of the strongest points from her opening statement:

The Government Has No “Right” to Student Data

The idea that the government should be able to vacuum up mountains of personal data and employ it for whatever purposes it deems useful – without the citizen’s consent, or in many cases even his knowledge — conflicts deeply with [the] truth about the dignity of persons.

Citizens’ Data Belongs to Them — Not the Government

Our founding principles, which enshrine the consent of the governed, dictate that a citizen’s data belongs to him rather than to the government. If the government or its allied researchers want to use it for purposes other than those for which it was submitted, they should get consent (in the case of pre-K-12 students, parental consent). That’s how things should work in a free society.

The Social Emotional “Craze” Is a Problem

And much of this education data is extraordinarily sensitive — for example, data about children’s attitudes, mindsets, and dispositions currently being compiled, unfortunately, as part of the craze for ‘social-emotional learning.’ Do we really want this kind of data to be made more easily accessible for ‘evidence-building’ to which we as parents have not consented?

This Data Could Be Used to Steer Children into Certain Careers

Literally everything can be linked to education. Data-analysis might study how one’s education affects his employment. Or his participation in the military. Or his health. Or his housing choices. Or the number of children he has. Or whether he purchases a gun. Or his political activity. Or whether his suspension from school in 6th grade foreshadows a life of crime. Education technology innovators brag that predictive algorithms can be created, and the government could use those algorithms to steer students along some paths or close off others.

There Is No Perfect Way to Secure This Data

This raises a crucial distinction that the Commission’s report seemed to miss — the distinction between data security and data privacy. “Data security” refers to protecting data once you have it. Despite the Commission’s strong endorsement of enhanced security measures, there simply is no unbreachable system, and the federal government’s alarming history of enormous data breaches brings that point home. The government should hold as little data as possible, not as much. You can’t hack what isn’t there.

Data Collection Without Consent Is an Affront to Freedom

The goal of benefiting others in society, in vague and theoretical ways, or of “helping” citizens lead their own lives and make their own decisions, does not justify the federal government’s collection and dissemination of millions of data points on individuals — without their consent. This should not be happening in a free country. Some lines should not be crossed regardless of their supposed benefits. This is one of those lines.

Robbins also strongly and correctly reiterated this crucial point in her closing: “I suggest that the goal of benefitting others in society or of ‘helping’ citizens make good decisions does not justify the federal government’s collection, analysis, and dissemination of millions of data points on individuals – without their consent. This might happen in China, but it should not happen here.”

Additionally, she made or reiterated several other extremely important points during the question-and-answer period:

The Government Ignores the Data It Already Has

Especially important was her point to Rep. Brett Guthrie (R-Ky.) that the government ignores research and funds failing programs anyway:

…and the other thing I would say about the research is that we have to find a way, if possible, to make the government actually listen to it. There has been a lot of research in the past, ever since the government has been involved in education, that shows that certain things don’t work or they are a waste of money or actually harmful, and they get funded every year. So, some or a lot of people in the country are a little bit cynical about that. Why are we setting up what could become a massive database system when we don’t have a great deal of confidence that it is going to be used properly anyway?

Allowing Access to Social Emotional Data Is a Serious Risk

Robbins also had an excellent exchange with Rep. Lloyd Smucker (R-Pa.) about the concept of data minimization (you can’t hack what isn’t there) and reiterated how inappropriate it is to collect SEL data on children, especially as part of a statewide longitudinal database. Her long-form written testimony, which referenced our co-authored article for The Federalist on SEL, drove this point home:

Compiling this type of [SEL] data is bad enough; allowing nonconsensual access to researchers and other agencies could create a nightmare for children and their families and should be unthinkable in a free society.

While of course not all of the questioning or statements on either side of the aisle were friendly or pro-privacy, Chairwoman Virginia Foxx (R-N.C.) and the committee nevertheless deserve great kudos and thanks for putting together such an excellent hearing. It is also a great testament to pro-privacy and parent groups across the nation, ours included, consistently raising our voices against FEPA, CTA, SETRA, and the other federal data mining bills in Congress. Thank you for your diligence. Please continue to explain to Congress the critical need for parental consent, data privacy, data minimization, and the use of aggregate data, as well as the dangers of SEL and personalized learning. And stay tuned!