New Hampshire Bill Threatens Children’s Personal Privacy Rights

The passage of New Hampshire’s SB 267 would threaten your child’s personal privacy rights.

New Hampshire students are assigned a unique pupil identifier (UPI) to protect their personal identity. The UPI is used when they take the state standardized assessment, which prevents testing companies from using or sharing their personal information. The protections put in place for children are now at risk of being removed for the sake of convenience.
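The idea behind a unique pupil identifier is pseudonymization: the testing vendor sees only an opaque ID, while the table linking IDs to real identities stays with the state. A minimal sketch of that separation, with all names, fields, and functions invented for illustration:

```python
# Illustrative sketch of pseudonymization via a unique pupil identifier (UPI).
# The state keeps the lookup table; the vendor receives only the opaque UPI.
import secrets

state_registry = {}  # UPI -> identifying record, held only by the state

def assign_upi(name, date_of_birth):
    """Assign an opaque identifier so vendors never see the student's identity."""
    upi = secrets.token_hex(8)
    state_registry[upi] = {"name": name, "dob": date_of_birth}
    return upi

def record_for_vendor(upi, score):
    """What the testing vendor receives: no name, no date of birth."""
    return {"upi": upi, "score": score}

upi = assign_upi("Jane Doe", "2008-04-01")
vendor_row = record_for_vendor(upi, 512)
assert "name" not in vendor_row and "dob" not in vendor_row
```

SB 267, as described below, would collapse this separation by handing the vendor the name and date of birth directly alongside the student ID.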

New Hampshire students will be taking the state standardized assessments this spring. Many parents have refused the standardized tests for their children, but now there may be an even better reason to refuse these tests.

SB 267 would give testing vendors the student’s name, date of birth, and student ID, along with the ability to “analyze” the data. If that isn’t bad enough, SB 267 grants exemptions for data sharing and removes the requirement for the testing vendor to destroy data when it is no longer needed. SB 267 also omits any provision for parental consent or recourse. Not only does this violate a child’s Fourth Amendment rights, but also their civil rights of privacy and personal freedom. Nothing in SB 267 includes language protecting the diagnostic portions of the assessment and the data derived from them.

As of right now, all states, as well as all the laws connected to and including the Every Student Succeeds Act (ESSA), function under the GUTTED version of the Family Educational Rights and Privacy Act (FERPA). The Foundations for Evidence-Based Policymaking Act (FEPA) is also a massive data collection system coming out of the federal level. SB 267, as written, takes NONE of this into consideration.

There has been bi-partisan support in New Hampshire for privacy protections. This was illustrated recently by the decisive passage of the privacy amendment to the New Hampshire Constitution. Even before that constitutional amendment was passed, our state had established such a reputation that the Parent Coalition for Student Privacy ranked New Hampshire as one of the best states in the country in terms of protecting the privacy of students. Unfortunately, SB 267 would take us in the wrong direction.

When Massachusetts administered the MCAS several years ago, all test questions were made public after the assessment was completed.  This gave everyone the opportunity to make sure the questions asked were of the quality they expected.  Professors at area colleges could look through the questions and make sure they were free from bias and errors.  This information is not available to the public using the current standardized assessments in New Hampshire.  A lack of transparency on test questions alone should have legislators thinking twice about providing the testing company with our students’ personal information. 

Eleventh-grade students are required to take the SAT as the standardized assessment. As this article from studentprivacymatters.org notes, the College Board “did not deny that they sell students’ personal data – or in their words, ‘license’ the data for a fee to institutions, for-profit corporations and the military.” In addition to selling the data, “….you can see that this script for proctors is written in the most ambiguous way possible, with voluntary questions mixed in with required ones, and no clear indication which is which or that much of this personal data will be shared with third parties for a fee.” That data includes students’ Social Security numbers, which are considered highly sensitive.

They go on to say, “How the College Board gets away with this, year after year, is really a scandal — especially since all the new state laws have been passed banning the selling of student data. Perhaps they are relying on the distinction without a difference of ‘licensing’ the data vs. selling it.”

Dr. Peg Luksik has referenced standardized assessments used in the past, such as the Educational Quality Assessment (EQA), whose internal documents said, “we are testing and scoring for the child’s threshold for behavior change without protest.” When past standardized assessments have included questions that do not test academic knowledge but instead attempt to change students’ values, attitudes, and beliefs, some parents will be concerned about any attempt to provide the testing company with their personal information.

Since these new assessments are adaptive, meaning students will be answering different questions based upon the answers they provide, Dr. Luksik warns about the ability to manipulate the outcome.  
By allowing testing companies to access our children’s personal information, SB 267 would cement into law their ability to do so without parental knowledge or consent.
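In broad strokes, an adaptive assessment picks each next question based on how the student answered the previous ones, so two students can be routed through entirely different question streams. A toy sketch of that branching logic (the difficulty levels and item pools here are invented for illustration, not drawn from any actual vendor’s algorithm):

```python
# Toy sketch of computer-adaptive item selection: the next question's
# difficulty depends on how the student answered previous ones.
# Item pools and thresholds are hypothetical.

POOLS = [["E1", "E2", "E3"],   # level 0: easy items
         ["M1", "M2", "M3"],   # level 1: medium items
         ["H1", "H2", "H3"]]   # level 2: hard items

def next_level(level, last_correct):
    """Move up a difficulty level after a correct answer, down after a miss."""
    if last_correct and level < 2:
        return level + 1
    if not last_correct and level > 0:
        return level - 1
    return level

def administer(answers):
    """Return the sequence of difficulty levels a student is routed through."""
    level, path = 1, []
    for correct in answers:
        level = next_level(level, correct)
        path.append(level)
    return path

# Two students answering differently see different question streams.
print(administer([True, True, False]))   # → [2, 2, 1]
print(administer([False, False, True]))  # → [0, 0, 1]
```

Because the branching rules are the vendor’s and are not public, there is no outside way to verify what the routing is actually optimizing for, which is the core of Dr. Luksik’s warning.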

Dr. Luksik explains in this short video why that is dangerous to our children:

There is still time to contact New Hampshire Senators and Representatives and ask them to vote NO on SB267.  

Technocratic Corporatocracy Hijacks Public Schools for Profit

Photo Credit: Lexie Flickinger (CC-By-2.0)

Making People Transparent for Profit Through Nontransparent Algorithms

Imagine if everything about you was on a giant billboard and you could see who was buying information about you and making lists. That is exactly what is happening without your knowledge or consent each time you use the Internet. Everything a user does online is tracked and monetized — Google, Facebook, Twitter, Amazon, apps — they all collect your data. A provider of computer-run education programs has admitted that they share the data they gather with 18 “partners.” Invisible analytics, profiling, sharing or selling of data collected without consent or knowledge makes every Internet user vulnerable to manipulation and control, ending personal privacy and sovereignty. 

The process works like this: new data are collected covertly through apps; the collected data are transferred to data brokers, who combine the new data with existing data about an individual using nontransparent algorithms. The algorithms create a very detailed profile of each user; vendors are then sold access to the profiles and target individuals based on profile analyses. Google is by far the most-used third-party analytics tracker and makes 90% of its revenue tracking user searches. The attached bibliography includes multiple examples of how the tech industry not only sells data but also sells data collection programs and devices that measure behaviors and infer emotions and thoughts.
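The broker pipeline described above, covert collection, transfer, merging with existing records, and targeted resale, can be sketched in miniature. Every identifier, field, and function below is hypothetical, meant only to show how little data it takes to assemble a profile:

```python
# Miniature sketch of the data-broker pipeline described above:
# app data is collected, shipped to a broker, and merged with existing
# records into a richer profile. All names and fields are invented.

existing_records = {
    "user-42": {"zip": "03301", "age_range": "13-17"},
}

def collect_from_app(user_id):
    """Step 1: data gathered covertly through an app."""
    return {"user_id": user_id,
            "searches": ["college scholarships"],
            "hours_online": 5}

def broker_merge(app_data, records):
    """Steps 2-3: the broker combines new data with what it already holds."""
    profile = dict(records.get(app_data["user_id"], {}))
    profile.update({k: v for k, v in app_data.items() if k != "user_id"})
    return profile

def target(profile):
    """Step 4: vendors buy access and target individuals off the profile."""
    if "college scholarships" in profile.get("searches", []):
        return "scholarship ads"
    return "generic ads"

profile = broker_merge(collect_from_app("user-42"), existing_records)
print(target(profile))  # prints "scholarship ads"
```

The point of the sketch is that the merge step is where the harm concentrates: each individual data point looks innocuous, but the combined profile supports inferences the user never consented to.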

The documentary The Creepy Line explains how tech giants use algorithms to shape behavior and thoughts. Google used algorithms to influence voter behavior in the 2016 presidential election. In a recently leaked video of a Google company meeting conducted shortly after that election, one employee asked if Google is willing to “invest in grassroots, hyper-local efforts to bring tools and services and understanding of Google products and knowledge” so that people can “make informed decisions that are best for themselves.” Google CEO Sundar Pichai responded that Google will ensure its “educational products” reach “segments of the population [they] are not [currently] fully reaching.” Apparently, Google will ensure that Google Chromebooks and the Google-manipulated search engine become standard “education” materials in American schools so that students can make Google-informed decisions.

Tracking and “Educating” Children

Schools funded with tax dollars allow the tech industry to collect billions of data points about every aspect of every student: by mandating that classwork and homework be completed using online tools and apps on school-issued devices, schools hand over a continuous stream of student activity. The data are used to build comprehensive profiles of each student. Every state has a database, and students’ personal data can be shared with researchers and companies. Google launched a public relations campaign, Be Internet Awesome, that includes a curriculum and an online game for Chromebooks, to promote itself as a “good” company; but a critical analysis of Be Internet Awesome concluded,

. . . the program’s conceptualization of Internet safety omits key considerations. Specifically, it does not acknowledge the role of companies in keeping data and personal information secure. Instead, its focus on user-centered strategies obscures the degree to which users are often powerless when it comes to controlling how their personal data is used. [It] generally presents Google as impartial and trustworthy, which is especially problematic given that the target audience is impressionable youth.

Transporting human beings without their consent for exploitation is human trafficking. Transporting human beings’ private data without their knowledge or consent is human data trafficking. Transporting children’s private data by collecting it in compulsory schools without parent knowledge or consent to exploit them in the data market is nothing less than institutional child data trafficking.

Failure of Government

Existing federal laws are inadequate for protecting student data privacy. FERPA generally does not apply to online data collection, and FERPA was changed by executive rule in 2011 to remove parental consent for data collection. FERPA now allows companies (such as Google) to be declared a “school official,” giving them access to student data on par with professionals who have a “need to know” in order to provide appropriate services to students. HIPAA does not apply to student records. COPPA generally does not apply to schools and is rarely enforced even when complaints are filed, and we know thousands of Android apps are improperly tracking children. There is no federal law regulating companies’ use of online student data.

The FBI recently issued a warning about the privacy and security risks of educational technology, and the U.S. Department of Education issued guidance that schools should not force parents to consent to third-party terms of service. Yet parents are told they cannot attend the school if they do not allow their child to have a fill-in-the-blank edtech app or program (e.g., Naviance, NWEA, or a Google G Suite account). EdTech people say education is the most data-mineable industry by far, and we now know that students’ social-emotional data is the new goldmine, despite the pseudo-science propping up social-emotional learning. The West Virginia teachers’ strike in spring 2018 was sparked in part because teachers were being forced to download Go365, a wellness and rewards app that would track their steps and other health data. Teachers were required to upload a variety of personal health information into the app and saw the program as an invasion of personal privacy.

Google’s money wields an enormous amount of influence on U.S. education policy. Under the Obama administration, Google’s lobbyists had essentially unrestricted visits to the White House. A striking number of White House officials now work for Google, or vice versa. The U.S. Department of Education was heavily populated with former employees of organizations associated with Bill Gates, who also advocate for computer-administered education. We know tech firms including Google have recently been lobbying the White House for a new federal privacy law on their own terms; Google even provided its own framework for a favorable privacy bill that does not include opt-in consent. It is time for Congress and the states to kick the fox out of the henhouse — reject corporatocracy and restore our constitutional democracy.

Responsibility of Government

U.S. citizens are protected from the government’s invasion of privacy and from property theft. They must also be protected from corporations’ invasion of privacy and theft of their electronically created property. Citizens are sovereign and cannot be coerced into giving up their data, or be penalized or denied public education services for refusing to consent to sharing it. Given that the infrastructure has already been built, Congress must adopt strong privacy laws at least as stringent as the European Union’s global data standards established in its General Data Protection Regulation (GDPR). The FTC should be given rule-making authority and the resources to investigate and directly prosecute violations; but by no means should Congress abdicate its responsibility to protect the general welfare of Americans and allow Silicon Valley to dictate to Congress or the FTC.

Virginia School Posts Students’ Health Information Online

Tanners Creek Elementary – Norfolk, VA

Norfolk Public Schools in Virginia illegally posted dozens of students’ private health information online as part of the district’s crisis plans, The Virginian-Pilot reports.

For every school, the plans list the chain of command, potential evacuation sites and students and staff with special needs who may require assistance in an emergency.

They also include the cell phone numbers for key staff, including principals and school resource officers.

Not every school identified students or staff, but many did.

One elementary school’s plans, for instance, name a student with autism and two students with mobility issues.

Another school’s plans name 27 students who have asthma, food allergies, seizures, or heart conditions. A third elementary school’s plans name five teachers who are diabetic or have high blood pressure.

There is no reason, no reason at all, that a crisis emergency plan should include student names along with their health information, especially when those conditions have no bearing on mobility in an emergency. Why in the world does a crisis emergency plan need to list students with food allergies? This is absolutely appalling.

The district says it takes student privacy seriously, but obviously that is not true; otherwise, those names would not have been included.

ACT Sued by Disabled Students Over Release of Personal Information

A class action lawsuit was filed against ACT by a group of disabled students and parents of disabled students for the release of personal information.

The nationwide lawsuit was filed in U.S. District Court in Los Angeles by Panish Shea & Boyle LLP and Miller Advocacy Group. They claim that ACT violated the civil rights of disabled students.

The plaintiffs allege that the Iowa City-based testing company acquired the disability status of students taking the ACT college entrance exam and then disclosed that confidential disability information on score reports to colleges and other programs. They also allege ACT sold the information to others for recruitment and enrollment purposes. This activity is a direct violation of the Americans with Disabilities Act (ADA), the Unruh Act, the California Constitution, and California’s Unfair Competition Law.

“ACT flags students’ test scores, discloses their confidential information to colleges pre-admission, and stigmatizes students with disabilities in the admissions process,” Rahul Ravipudi of Panish Shea & Boyle LLP, said. “Not only does this unlawful practice violate the privacy, security and confidentiality of information entrusted to ACT by the students in its care – it does so for profit, and at the expense of America’s most vulnerable students who are striving to further their education.”

Their complaint states two ways that ACT illegally uses students’ disability information:

  1. ACT “flags” student score reports by disclosing detailed student disability information and the use of accommodations on the score report it sends to colleges. This information is collected through questions on the online ACT Student Profile Section filled out when students register to take the exam. On exam day students also fill out the Student Information Form.
  2. ACT sells the detailed student disability data to various postsecondary organizations including colleges, scholarship programs, and other third parties who use it for recruitment and marketing related to the admissions process.

The plaintiffs allege that, unlike the score reports ACT sends to colleges, this information was sold without the student’s or high school’s knowledge.

“I was shocked to learn that ACT was using my disability information against me and making it more difficult for me to get into college and get the money I need to go to college,” Halie Bloom, one of the plaintiffs, said. “I’m speaking out, because I know that someone has to stand up for all of the students who are scared about how their disabilities will be used against them.”

Bloom is a college-bound, 2018 high school graduate who had an Individualized Education Plan (IEP) under the IDEA and a 504 Plan under the Rehabilitation Act since middle school, and she took the ACT several times with approved accommodations. ACT acquired Ms. Bloom’s disability status from her testing registration and annotated her score reports with “learning or cognitive disability” that requires special provisions. ACT disclosed Ms. Bloom’s disabilities on all ACT Test score reports sent on her behalf to colleges to which she applied and thereby flagged her score reports. She had no expectation that ACT would include her disability status with her score reports or otherwise ever disclose her confidential disability information.

Read the complaint filed below:

Do You Know What Data Is Being Collected On Your Student?

Photo credit: Nick Youngson (CC BY-SA 3.0)

Toward the end of the school year in May, a sophomore at Sharpsburg, Georgia’s Northgate High School (in the Coweta County School System) texted her mother to find out her blood type. She said she needed the information for an assignment in her American literature class. Wondering how students’ blood types could possibly be relevant to American literature, the mom investigated and uncovered an appalling invasion of privacy and possible violation of federal student-privacy law.

What she discovered should be a warning to all parents of school-age children: Monitor everything that goes on in the classroom.

The Northgate teacher had required students to fill out a “dossier” of personal information, including height, weight (“DON’T LIE,” she warned), distinguishing features (“tattoos, scars, gold crowns/caps, particular speech/mannerism, or walking traits”), blood type, hair type or texture, and handwriting sample. Each student was also told to imprint a fingerprint on a piece of tape, and to supply a photo and a hair sample (for DNA analysis). When some students objected to providing this personal information, the teacher responded, “It’s supposed to be fun,” and warned that failure to complete the dossier would result in a lower grade.

If this weren’t bad enough, the teacher then posted the dossiers on the classroom wall – in full view of all students and visitors. 

Shocked and perplexed, the investigative mom contacted the principal. It took two emails, but he finally responded with a bare statement that her daughter’s information would be returned to her. No apology, no acknowledgement of impropriety, no information about how this happened.

Not until local school board member Linda Menk contacted the district superintendent did the mom receive any explanation of this bizarre assignment. In response to several direct questions, the principal said the teacher acted on her own in assigning the dossier, that no one else in the department participated, and that it was intended to introduce STEM (science, technology, engineering, math) content – as part of forensic science – in conjunction with the (already completed) study of the play Twelve Angry Men.

How could such a dossier help students understand an utterly unrelated book? And why was an English teacher wasting time with STEM? No explanation of that. The principal also claimed all the dossiers were collected and destroyed, even though at least two students were known to have taken theirs home.

Dissatisfied with this explanation, Northgate parents and Menk contacted public-interest law firm Liberty Counsel. The lawyer there analyzed the facts and wrote to the district superintendent to lay out the multiple federal violations involved in the dossier assignment. The Coweta County School System’s lawyer denied the assignment violated federal law but agreed, in an understatement, that it isn’t “best practice” to gather such personal information. He claimed all the dossiers that could be recovered had been shredded.

The inappropriateness of this assignment should have been glaringly obvious to any teacher. Not only did it shatter the privacy boundaries of students, but it had no connection to the subject supposedly being taught. 

But this is where our students find themselves. Little by little, they are acclimated to losing any expectation of privacy – for their own good, of course.

In the name of “personalized” and “social-emotional” learning, students are required to interact with sophisticated software that vacuums up enormous amounts of data about not only their knowledge, but their mindsets and attitudes. Some of this software comes in the form of video games, which gather sensitive data while heightening the propensity toward addiction. (In Georgia, even the youngest schoolchildren are put on video games as a means of “assessment.”) Digital software uses the data to create algorithms that predict a child’s future behavior, capabilities, or accomplishment based on analyzing his keystrokes. The federal government even touts software that observes students’ physiological reactions to a lesson and feeds that data into the algorithms. Some schools are buying apps that allow constant surveillance and sharing of information about students’ emotional states. And there is little or no control over what technology vendors do with this data, or to whom they might sell it.

In this atmosphere, privacy is downgraded in the name of developing the “whole child.” Is it any wonder that a particularly obtuse teacher might fail to see the harm in asking probing personal questions and posting the answers on the wall?

This strange episode at Northgate demonstrates that parents must be ever vigilant about what’s happening in their children’s classrooms. The days when students’ personal information was off-limits to prying eyes – and when American literature meant American literature — are over. Can the same be said of common sense?

Student Data For Sale

Natasha Singer in The New York Times wrote about how student data collected by the College Board through surveys connected with the SAT and PSAT ends up being sold.

I wanted to highlight an excerpt:

Three thousand high school students from across the United States recently trekked to a university sports arena here to attend an event with an impressive-sounding name: the Congress of Future Science and Technology Leaders. Many of their parents had spent $985 on tuition.

Months earlier, the teenagers had received letters, signed by a Nobel Prize-winning physicist, congratulating them on being nominated for “a highly selective national program honoring academically superior high school students.”

The students all had good grades. But many of them were selected for the event because they had once filled out surveys that they believed would help them learn about colleges and college scholarships.

Through their schools, many students in the audience had taken a college-planning questionnaire, called MyCollegeOptions. Others had taken surveys that came with the SAT or the PSAT, tests administered by the College Board. In filling out those surveys, the teenagers ended up signing away personal details that were later sold and shared with the future scientists event.

Read the rest.

She mentioned that in May the U.S. Department of Education released guidance on this particular practice (which ACT engages in as well). The guidance recommends that schools make it clearer that pre-test surveys are optional. You can read it below:

 

Report Reveals a Lack of Transparency In Marketplace of Student Data

Photo credit: Nick Youngson (CC BY-SA 3.0)

Fordham Law School’s Center on Law and Information Policy has released its findings from a multi-year study on the commercial marketplace for the sale and exchange of student information.

Transparency and the Marketplace for Student Data sought to gain an understanding of the commercial marketplace for student data and the interaction with privacy law. Over several years, Fordham CLIP reviewed publicly-available sources, made public records requests to educational institutions, and collected marketing materials received by high school students.

The study uncovered and documented an overall lack of transparency in the student information commercial marketplace and an absence of law to protect student information.

Key findings of the report include:

  • Parents and students are generally unable to determine how and why certain student lists were compiled, or the basis for designating a student as associated with a particular attribute like race, religion, or purported interests.
  • It is difficult to ascertain the sources of student data; large school districts claim they do not sell directory information except to the military and other educational institutions.
  • Data brokers operating in the student information marketplace frequently change names, merge, and have affiliated relationships, making it difficult to identify student data brokers.
  • Despite all of this, student lists are commercially available for purchase on the basis of ethnicity, affluence, religion, lifestyle, awkwardness, and even a perceived or predicted need for family planning services.

The findings also revealed that a profitable ecosystem for commercial student data exists, but a lack of transparency and accessibility to information remains.

Based on this research and the deficiencies in existing law and regulation of the commercial marketplace for student data, Fordham CLIP makes the following policy recommendations:

  • The commercial marketplace for student information should not be a black market. Parents, students, and the general public should be able to reasonably know (i) the identities of student data brokers, (ii) what lists and selects they are selling, and (iii) the sources from which the data for student lists and selects derive. A model like the Fair Credit Reporting Act (FCRA) should apply to the compilation, sale, and use of student data once outside of schools and FERPA protections. If data brokers are selling information on students based on stereotypes, this should be transparent and subject to parental and public scrutiny.
  • Brokers of student data should be required to follow reasonable procedures to assure maximum possible accuracy of student data. Parents and emancipated students should be able to gain access to their student data and correct inaccuracies. Student data brokers should be obligated to notify purchasers and other downstream users when previously-transferred data is proven inaccurate and these data recipients should be required to correct the inaccuracy.
  • Parents and emancipated students should be able to opt out of uses of student data for commercial purposes unrelated to education or military recruitment.
  • When surveys are administered to students through schools, data practices should be transparent, students and families should be informed as to any commercial purposes of surveys before they are administered, and there should be compliance with other obligations under the Protection of Pupil Rights Amendment (PPRA).

N. Cameron Russell, Executive Director of Fordham Law School’s CLIP, and one of the co-authors of the study, said that Vermont’s recent passage of H.764 – the United States’ first legislation regulating commercial data brokers – is responsive to, and in part inspired by, problems identified in the Fordham CLIP study.

“I recently had the opportunity to testify before the Vermont House Committee on Commerce and Economic Development on the need for closer regulation and oversight of commercial data brokers, and the passage of H.764 requiring data brokers to register with the state as well as include specific information disclosures for brokers of student information underscores the need for an overhaul of the commercial student information marketplace, particularly increased transparency,” said Russell.

Joel Reidenberg, Professor of Law and Founding Academic Director of Fordham Law’s CLIP, says the passage of H.764 in Vermont is likely to have a national impact.

“The Vermont law is likely to become a national model and have a nationwide effect. Data brokers harvest personal information on a national scale and the Vermont registry requirement will result in increased national transparency for the identities and practices of these brokers,” said Reidenberg.

The full report is available below:

Jane Robbins Testifies to House Committee on Education & the Workforce

Jane Robbins, a senior fellow with the American Principles Project, was one of four witnesses at the House Committee on Education and the Workforce’s hearing on “Evidence-Based Policymaking and the Future of Education.”

Robbins and Paul Ohm, a professor at the Georgetown University Law Center, were the only two witnesses who seemed concerned with student privacy. The other two witnesses, Dr. Carey Wright, Mississippi’s State Superintendent of Education, and Dr. Neal Franklin, the program director for Innovation Studies with WestEd, touted the gains that could be made with student data.

You can watch the whole hearing below:

Here are the transcript and video of Robbins’ opening statement to the committee:

Madam Chairman and members of the committee:

My name is Jane Robbins, and I’m with the American Principles Project, which works to restore our nation’s founding principles. Thank you for letting me speak today about protecting privacy when evaluating government programs, especially in the area of education.

The Commission on Evidence-Based Policymaking was created to pursue a laudable goal: To help analyze the effectiveness of federal programs. We all certainly agree that public policy should be based on evidence, on facts, not on opinion or dogma. So unbiased scientific research, for example, is vital for policymaking.

But the problem arises when the subjects of the research and analysis are human beings. Each American citizen is endowed with personal dignity and autonomy and therefore deserves respect and deference concerning his or her own personal data. Allowing the government to vacuum up mountains of such data and employ it for whatever purposes it deems useful – without the citizen’s consent, or in many cases even his knowledge – conflicts deeply with this truth about the dignity of persons.

Bear in mind that the analyses contemplated by the Commission go further than merely sharing discrete data points among agencies. They involve creating new information about individuals, via matching data, drawing conclusions, and making predictions about those individuals. So, in essence, the government would have information about a citizen that even he or she doesn’t have.

Our founding principles, which enshrine the consent of the governed, dictate that a citizen’s data belongs to him rather than to the government. If the government or its allied researchers want to use it for purposes other than those for which it was submitted, they should get consent (in the case of pre-K-12 students, parental consent). That’s how things should work in a free society.

Let’s consider a few specific problems. The Commission’s recommendations to improve evidence-building, while well-intentioned and couched in reasonable language, fail to recognize that data turned over by citizens for one purpose can be misused for others. It is always assumed that the data will be used in benevolent ways for the good of the individual who provides it. But especially with respect to the enormous scope of pre-K through college education data, that simply isn’t true.

Literally everything can be linked to education. Data-analysis might study the connection between one’s education and his employment. Or his health. Or his housing choices. Or the number of children he has. Or his political activity. Or whether his suspension from school in 6th grade foreshadows a life of crime. Education technology innovators brag that predictive algorithms can be created, and those algorithms could be used to steer students along some paths or close off others.

And much of this education data is extraordinarily sensitive – for example, data about children’s attitudes, mindsets, and dispositions currently being compiled, unfortunately, as part of so-called “social-emotional learning.” Do we really want this kind of data to be made more easily accessible for “evidence-building” to which we as parents have not consented?

The Commission recommends that all this data be disclosed only with “approval” to “authorized persons.” But we should ask: Approval of whom? Authorized by whom? There are myriad examples of government employees’ violating statute or policy by misusing or wrongfully disclosing data. And even if the custodians have only good intentions, what they consider appropriate use or disclosure may conflict diametrically with what the affected citizen considers appropriate. Again, this illustrates the necessity for consent.

We should take care to recognize the difference between two concepts that are conflated in the Commission’s report. “Data security” refers to whether the government can keep data systems from being breached (which the federal government in too many cases has been unable to do). “Data privacy” refers to whether the government has any right to collect and maintain such data in the first place. The federal Privacy Act sets out the fair information practice principle of data minimization, which is designed to increase security by increasing privacy. A hacker can’t steal what isn’t there.

Another problem with the “evidence-building” mindset is that it assumes an omniscient government will make better decisions than individuals can themselves. But what these analyses are likely to turn up are correlations between some facts and others. And correlations do not equal causation. So, for example, we might end up designing official government policy based on flawed assumptions to “nudge” students into pursuing studies or careers they wouldn’t choose for themselves.

Human beings are not interchangeable. Our country has thrived for centuries without this kind of social engineering, and it’s deeply dangerous to change that now.

In closing, I reiterate my respect for the value of unbiased research as the foundation for policymaking. But speaking for the millions of parents who feel that their concerns about education policy and data privacy have been shunted aside at various levels of government, I urge you to continue the protections that keep their children from being treated as research subjects – without their consent. This might happen in China, but it should not happen here. Thank you.

You can read her longer, written testimony here.

Different committee members asked questions of the panel. Congresswoman Virginia Foxx (R-NC), the committee chair, asked Robbins the first question.

Congressman Brett Guthrie (R-KY) asked Robbins whether a balance can be found between gathering useful information and protecting privacy.

She had an exchange with Congressman Rick Allen (R-GA) about data collection and student privacy.

Congressman Tom Garrett, Jr. (R-VA) asked her why the federal government couldn’t simply pull together all of the metadata already out there.

She discussed the College Transparency Act with Congressman Paul Mitchell (R-MI).

She also explained to Congressman Glenn Thompson (R-PA) why education reformers will ignore evidence when it is negative, like in the case of digital learning.

She discussed longitudinal data systems with Congressman Glenn Grothman (R-WI).

Congressman Lloyd Smucker (R-PA) asked what student data would be acceptable for schools to collect, and whether any data would be acceptable to share.

A Belated Win for Student Privacy

Photo credit: Nick Youngson (CC BY-SA 3.0)

The U.S. Department of Education recently found that the Agora Cyber Charter School in Pennsylvania did violate the Family Educational Rights and Privacy Act (FERPA). Rules established during the Obama administration weakened FERPA, and there has been concern about how outdated the law has become given the rise of educational technology. So it’s remarkable that anyone would be found in violation.

Unfortunately, the original complaint was filed on December 16, 2012, and it took the Department of Education almost five years to respond.

EdSurge reports:

Last November, after reviewing responses from Agora, the Department found that the cyber charter did violate FERPA. To use services from Agora, which contracted with third-party service providers such as K12 Inc., Blackboard, and Sapphire, parents were required to agree to policies set forth by those providers. K12’s Terms of Use policy required students to enter identifiable data and granted the company and its affiliates “the right to use, reproduce, display, perform, adapt, modify, distribute, have distributed, and promote [information put into the platform] in any form, anywhere and for any purpose.”

The Department ruled that requiring students to use third-party services that share student data with unauthorized parties as a condition of enrollment is a violation of FERPA. In its letter, federal education officials wrote that “a parent or eligible student cannot be required to waive the rights and protections accorded under FERPA as a condition of acceptance into an educational institution or receipt of educational training or services.”

Perhaps this is a sign that the Education Department under the Trump administration will be responsive to parents’ concerns about student data privacy. This ruling is a good first step. Let’s hope the Department also significantly reduces its response time.