DeVos on 2017 NAEP Results: “We can and must do better for America’s students.”

Betsy DeVos at CPAC 2017

U.S. Secretary of Education Betsy DeVos speaking at the 2017 Conservative Political Action Conference (CPAC)
Photo Credit: Gage Skidmore

U.S. Secretary of Education Betsy DeVos responded to the stagnant nationwide scores on the National Assessment of Educational Progress (NAEP), but touted Florida, which saw gains in fourth- and eighth-grade reading and math scores, as a bright spot.

Here is her statement in full:

The report card is in, and the results are clear: We can and we must do better for America’s students. Our nation’s reading and math scores continue to stagnate. More alarmingly, the gap between the highest and lowest performing students is widening, despite billions in Federal funding designated specifically to help close it.

One bright spot in today’s report is Florida, where Sunshine State students are bucking the national trend, showing significant improvement in 4th and 8th grade math and in 8th grade reading. Both low and high performers in Florida demonstrated that improvement, again bucking the national trend and narrowing the achievement gap.

Florida leaders, administrators, and, most importantly, teachers are to be commended for their continued efforts on behalf of students. Florida has been at the forefront of bold, comprehensive education reform for decades. From accountability, to literacy, to teacher certification and recognition, to providing parents more freedom to select the learning environment that best fits their students’ needs, Florida is rethinking education.

Florida’s results show what is possible when we focus on individual students. This Administration is committed to working with States and communities across our country to bring about the much-needed change our students deserve.

Many thanks to the professional staff at the National Center for Education Statistics and the members of the National Assessment Governing Board for their work and for their commitment to continually improving NAEP.

Florida saw an improvement from 227 to 228 on the fourth-grade reading assessment. The average score is still below 240, the score students need to reach to be deemed “proficient.” Forty-one percent of Florida’s fourth-graders are considered proficient or above, up from 38 percent in 2015.

Florida’s eighth-graders saw a bigger jump on the reading assessment, moving from an average score of 264 to 267. A score of 280 is considered proficient. Thirty-five percent of Florida’s eighth-graders are considered proficient, up from 30 percent in 2015.

Florida’s fourth-graders saw a larger jump in math, moving from an average score of 243 in 2015 to 246 in 2017, a new high for the Sunshine State. Fourth-graders are considered proficient if they attain a score of 250 or higher. Forty-seven percent of Florida’s fourth-graders attained that score or higher, up from 42 percent in 2015.

Florida’s eighth-graders saw a smaller bump, from 281 in 2015 to 282 in 2017. That is still two points below the state’s historic high of 284, attained in 2013. The proficiency benchmark is 300 for the eighth-grade math assessment. Only 29 percent of Florida’s eighth-graders are considered proficient or above, up from 27 percent in 2015.


2017 NAEP Results Show Little Change

The 2017 results from the National Assessment of Educational Progress (NAEP) in mathematics and reading have been released, and they show little change.

Here are some of the key findings:

  • Compared to 2015, there was a 1-point increase in the average reading score at grade 8 in 2017 (still below the 2013 score of 268), but no significant change in the average score for reading at grade 4, or for mathematics at either grade.
  • A growing achievement gap: NAEP scores are reported at five selected percentiles to show the progress made by lower- (10th and 25th percentiles), middle- (50th percentile), and higher- (75th and 90th percentiles) performing students. In comparison to 2015, the 2017 mathematics and reading scores were higher for eighth-graders performing at the 75th and 90th percentiles and lower for fourth-graders performing at the 10th and 25th percentiles.
  • Across the fifty states, the District of Columbia, the Department of Defense schools, and Puerto Rico (in mathematics only), average scores for most states were unchanged from 2015 in both subjects and at both grades.
  • In Florida, average scores increased in both grade 4 and grade 8 mathematics. Average scores for students in Puerto Rico increased in grade 4 mathematics and for the Department of Defense schools in grade 8 mathematics. Scores decreased in 10 states in grade 4 mathematics and in three states in grade 8 mathematics.
  • In reading at grade 4, average scores did not increase in any state/jurisdiction, and scores decreased in nine states/jurisdictions. In eighth-grade reading, 10 states/jurisdictions had score increases, and one state, Montana, had a score decrease compared to 2015.
  • Most changes in scores across districts were seen in grade 4 mathematics, where four districts (Duval County (FL), Fresno, Miami-Dade, and San Diego) had increases, and four districts (Charlotte-Mecklenburg, Cleveland, Dallas, and Detroit) had decreases in scores compared to 2015. In grade 8 mathematics, Philadelphia had a decrease in its average score. In grade 4 reading, San Diego had a score increase, and in grade 8 reading, Albuquerque and Boston had increases in scores compared to 2015.
  • Fourth-graders who are eligible for the National School Lunch Program (NSLP), attend a city school, or have a disability saw a decrease in mathematics scores from 2015.
  • Eighth-grade students in public schools, in suburban schools, with disabilities, English-language learners (ELL), and non-ELL students saw an increase in reading scores.
  • Catholic school students outperformed public school students.
  • The performance gap between fourth-grade white and black students widened in Arizona, Arkansas, Kentucky, and Louisiana in mathematics compared to 2015. No state saw the performance gap narrow in mathematics. In reading, Arizona saw the performance gap widen, while the District of Columbia saw its gap narrow.
  • Among fourth-graders, the white-Hispanic student performance gap widened in mathematics in Alaska, Georgia, Louisiana, and New Mexico compared to 2015. Kansas saw its gap narrow. In reading, Tennessee saw its gap widen, while Kansas saw its gap narrow.
  • Boys’ average scores are slightly higher than girls’ in mathematics, but they trail girls’ in reading.

Read the rest of the key findings here.

Co-Author of 1993 MA Ed Reform Act Concerned About Current Policies

Former Massachusetts Senate President Tom Birmingham
Photo Credit: Rappaport Center (CC-By-2.0)

Massachusetts Education Reform Act co-author and former Massachusetts Senate President Tom Birmingham, who now serves as a distinguished senior fellow in education at the Pioneer Institute, spoke at an event at the Massachusetts State House marking the act’s 25th anniversary.

Birmingham praised the historic success that has been achieved since the law was enacted in 1993:

If you had told me then that more than 90 percent of our students would pass MCAS and that we would have 13 consecutive years of improvement on SAT scores, or that our students would rank first in the nation in every category and in every grade tested on NAEP between 2005 and 2013, and that they would place at or near the top on gold-standard international math and science tests like the TIMSS, I would have thought you were unrealistically optimistic. We all had ambitious hopes for education reform on that day 25 years ago, but I doubt any of us would have dared to predict the historic successes we have actually enjoyed under the Act.

He shared what K-12 education in Massachusetts was like before the bill:

Before 1993, we witnessed the grossest disparities in spending on our public schools. In some districts we were spending more than $10,000 per child per annum and in others we were spending $3,000. In those circumstances to pretend that we were affording our children anything remotely approaching equal educational opportunity was nothing short of fraudulent.

And the academic quality of education was materially different in virtually every school district across the Commonwealth. Partly as a result of those disparities in spending, the state did precious little to insist on uniform standards. Pre-1993 there were but two state-imposed requirements to get a high school diploma: one year of American history and four years of gym. Clearly a testament more to the lobbying prowess of gym teachers than to any coherent pedagogical vision.

But the Education Reform Act strove to change all this; to change the state funding mechanism and the academic expectations for all our students. I believe we have largely succeeded.

Addressing Massachusetts’ current standards and tests, he said:

With regard to standards and tests, we have jettisoned our tried and true reliance on higher-quality academic standards and MCAS and replaced them with inferior Common Core standards and PARCC testing. It’s worth noting that the PARCC consortia has now lost over two-thirds of its member states; hardly a ringing endorsement. I fear the implementation of Common Core and MCAS 2.0, which is a rebranded version of PARCC, has contributed to Massachusetts being a negative growth state on NAEP reading and math between 2011 and 2015.

Why Massachusetts would settle for having the same English, math, or science standards and rebranded PARCC tests as do Arkansas or Louisiana, whose students could not possibly meet Massachusetts performance levels, is puzzling to me. The Common Core and its PARCC-style testing regime represent one of those rare instances where what may be good for the nation as a whole is bad for Massachusetts.

Read his full remarks here.

HT: Pioneer Institute

Louisiana’s State School Chief John White Worries About Upcoming NAEP Results


The 2017 National Assessment of Educational Progress (NAEP) results will be released on April 10, and it seems Louisiana State Superintendent of Education John White got an advance look; he is worried.

Matt Barnum with Chalkbeat reported on a letter White sent to Dr. Peggy Carr, Acting Commissioner of the National Center for Education Statistics, which administers NAEP, raising questions about how the switch to computer-based testing will affect student scores.

White wrote:

The 2017 NAEP administration marked a significant transition from paper-based testing to computer-based testing. NCES found that, consistent with research on the NAEP (Bennett et al., 2008; Horkay, Bennett, Allen, Kaplan, & Yan, 2006), this shift in the mode of testing contributed to lower performance on NAEP forms among the general U.S. sample population. Using the small sample of paper-based testers, NCES calculated a baseline level of performance and adjusted nationwide scores to maintain the longitudinal NAEP trend. Based on this mode effect adjustment, NCES has preserved the integrity of its effort to report trends in nationwide math and reading over time.

I understand that NCES may have found disparities in the mode effect on different subgroups of students. However, any disparate effect found was not significant. Thus NCES did not include any difference from one group of students to the next in its calculation of the mode effect. The adjustment NCES made in order to preserve the national trend is the same for every student.

It is my understanding that, though NCES maintained a consistent longitudinal trend at the national level, there remains the possibility that the mode effect in a given state may have been greater than the nationwide mode effect. This could be attributable to a disproportionately large population of a subgroup that experienced a greater mode effect than the national effect. It also could be attributable to the relative capacity of 4th and 8th grade students in a given state to use computers.

As a potential illustration of this point, no Louisiana student in 4th grade or 8th grade had ever been required to take a state assessment via a computer or tablet as of the 2017 NAEP administration. This fact, coupled with a variety of social indicators that may correspond with low levels of technology access or skill, may mean that computer usage or skill among Louisiana students, or students in any state, is not equivalent to computer skills in the national population.

The first problem, as Richard Innes with the Bluegrass Institute pointed out, is that NAEP may have some validity issues. He wrote, “Certainly, the possibility White raises that NAEP might have performed differently for different states could be a real concern. Could the comparability of NAEP data between states and across years have been compromised?”

To that end, White asked Carr for additional information, which Chalkbeat reports Carr said she would provide. White wrote:

I am therefore writing to request that the following information be made available to state chiefs as soon as possible:

  1. The mode effect adjustment applied to each grade and subject nationally
  2. The average mean scores for students taking the paper-based test and for students taking the tablet-based test, at the state level and at the national level, in each grade, subject, and subgroup
  3. Evidence of the random equivalence of the groups of students taking the paper-based tests and students taking the tablet-based tests, at the state level and at the national level
  4. National subgroup performance trends, reported by performance quintile, quartile, or decile.

The second problem, which White concedes, is with using computer-based tests in the first place. He wrote, “I would like to be assured, as soon as possible, that when NCES reports math and reading results on a state-by-state basis over a two-year interval, the results and trends reported at the state level reflect an evaluation of reading and math skill rather than an evaluation of technology skill.”

This is why I don’t favor computer-based assessments.

This, of course, could be spin. Barnum writes, “Even though researchers warn that it is inappropriate to judge specific policies by raw NAEP results, if White’s letter is a signal that Louisiana’s scores have fallen, that could deal a blow to his controversial tenure, where he’s pushed for vouchers and charter schools, the Common Core, letter grades for schools, and an overhaul of curriculum.”

White contends that he’s not concerned about Louisiana scores. “I doubt that any mode effect would have radically vaulted Louisiana to the top or dropped Louisiana further below,” White told Chalkbeat. “The issue is from a national perspective.”

Jay P. Greene, Chair of the Department of Education Reform at the University of Arkansas, says that it looks like “pre-spin” to him. He wrote on his blog, “Maybe it’s just a remarkable coincidence that White has suddenly developed these technical concerns about the validity of NAEP at about the same time that he was briefed on his state’s results. How much do you want to bet that there is a decline in LA?”

I’m not a betting man.

Considering the Council of Chief State School Officers also expressed concern, Louisiana is probably not alone.

Requiring Parental Permission for Students to Receive a Constitution?

I am supportive of local schools requiring parental consent for a whole host of things. I say this in light of the stories I’m sure we’ve all seen about some nightmare curriculum, school assembly, or handout that many of us would never want our students taught from, made to sit through, or asked to read. Not to mention all of the data collection that often takes place without parental knowledge or consent.

A school in New Hampshire, however, required consent back in 2016 for the strangest thing: parental permission before a student could receive a copy of the U.S. Constitution.

Our friends at GraniteGrok report:

It appears the Windham School District’s Administration has a double standard regarding when parental approval is deemed appropriate regarding student instruction. In September 2016, parents were required to sign a permission slip before their children would receive a copy of the United States Constitution on Constitution Day (Sept. 17th). That request rightfully raised some eyebrows in town. (And many students were never even offered a copy, even though the district was offered a donated copy for every student in the district…hmmm…)

The Superintendent’s response was that the donated Constitutions were published by the National Center for Constitutional Studies – and their website (www.nccs.net) had references to religion – so out of respect for parents and their values, the administration said it was appropriate to seek parental approval before children were allowed to receive a copy of the United States Constitution.

Many of our Founding Fathers believed that our Constitutionally protected inalienable Rights come from God – and not from the government… and therefore, the government cannot take our Rights away. That’s an important concept that our children are not being taught, but should be.

I’ve been on the organization’s website and have a copy of their pocket Constitution; any reference to religion is minimal. A reasonable person would not see it as a violation of the First Amendment’s Establishment Clause. I have to wonder how this school handles the Declaration of Independence, which explicitly grounds our rights in a Creator, but I digress.

Now, this is not breaking news since it’s from 2016, but GraniteGrok also pointed out that the same school did not require parental consent for the student walkouts opposing gun violence that were promoted and sponsored by a liberal political group. That is a double standard.

Think about it: the school in practice believes that a student should not need parental permission to exercise their right of protest, but needs parental permission to get a copy of the document that enshrines that right. Does that not seem strange to you?

It seems this school, in particular, encourages political action without giving students the civic context and foundation upon which political action is built.