Saturday, November 22nd, 2014
Gains in Atlanta Public Schools scores on a national standardized test lend credence to "dramatic" score increases on state tests.

Beverly Hall on Wednesday, August 10th, 2011 in a commentary published in Education Week

Former APS head Beverly Hall says national test proves real gains

Ever since Atlanta Public Schools’ cheating scandal began, then-Superintendent Beverly Hall has repeatedly pointed to a national test as proof that the changes she made produced real results: the National Assessment of Educational Progress.

The NAEP is a top measure of student achievement, and APS’ NAEP scores improved during Hall’s tenure, she has said. One of her backers mentioned these scores in a New York Times article last month. Hall herself raised them Aug. 10 in a commentary published by the national trade publication Education Week.

"The results of the standardized tests administered in 2010 and 2011 under this enhanced security have not been questioned -- and most important of all the dramatic improvement in test scores has remained. That improvement has also been confirmed by the National Assessment of Educational Progress, or NAEP, which is independently administered," Hall wrote.

"Dramatic" improvement in scores on Georgia’s Criterion-Referenced Competency Tests, or CRCT, has been "confirmed" by the NAEP? We decided to take a look.

First, a brief aside. We checked Hall’s claim that CRCT scores for 2010 and 2011 "have not been questioned" and gave her a Pants on Fire. State investigators, the Governor’s Office of Student Achievement and The Atlanta Journal-Constitution questioned the validity of both years’ results.

We asked Hall’s attorney for evidence backing her claim about the national test but received no response. After our inquiry, Council of the Great City Schools Executive Director Michael Casserly repeated a similar claim Sept. 7 in The New York Times, but the story included no supporting evidence. Hall previously served on the board of Casserly’s organization.

Now, the NAEP is different from your typical standardized test. Like the CRCT, it tests core subjects such as reading and mathematics, but many students don’t take it. Those who do take only part of it, and they never see their results.

Instead, the NAEP randomly selects a representative sample of students to take the test.  They take slices of the exam over a single day.  

Historically, the NAEP has been used to track student achievement at the state and national level. During the past decade, however, about a dozen large urban districts were recruited for a special trial to see how well the test evaluates district performance. APS joined the trial in 2002.

The test was not the focus of the state’s CRCT investigation, and a federal review found no evidence of NAEP cheating, said Arnold Goldstein, who is program director for the assessment division of the National Center for Education Statistics, the division of the U.S. Department of Education that oversees the NAEP.

The federal review’s findings were not published in a report, so PolitiFact Georgia could not examine its work.

Experts agree that it’s much more difficult to cheat on the NAEP than the CRCTs, and they give the following reasons:

-- NAEP test administrators are federal contract workers, not APS employees, so they have little incentive to cheat. Some are retired APS teachers, but a federal testing official told PolitiFact that they all worked as part of three-member teams with colleagues who had no ties to APS.

-- NAEP test materials are not kept in schools overnight or during the weekend. That’s when much of the CRCT cheating took place, according to the state investigation.  

-- Students are tested on multiple subjects during the same NAEP test session. While one is working on a reading exam, another might be doing mathematics, so they can’t copy each other’s answers.

But cheating is not out of the question. Sonny Perdue, who as governor launched the state investigation into the CRCT scores, is a member of the board that oversees the NAEP.  He suggested at an August board meeting that there may be weaknesses in the system but did not give details. He declined through a spokesman to comment for this story.

APS critics say the district might have tipped scores in its favor by withholding names of low-performing students from the roster the NAEP uses to select its test-takers.

The NAEP doesn’t check student rosters by name. The procedure is to see whether the list it receives from a school matches up with the school’s overall characteristics, Goldstein said.

The NAEP review found no problems, Goldstein said.

But the federal review focused on whether APS followed standard procedures -- not whether the rosters were accurate. To comply with privacy policies, NAEP administrators routinely destroy the rosters schools send them, Goldstein said. Therefore, we think it may be impossible to check whether APS sent accurate lists.

Now, let’s take a closer look at the NAEP results starting in 2002, which is when APS began participating in the urban district program.  

We found that even if you ignore the possibility of cheating, it’s not clear the gains are dramatic.

APS previously trumpeted that since 2002, its students had posted the largest gains on the NAEP reading tests of any large urban school district. This is correct, but over that period, APS’ demographics shifted toward whites and Hispanics -- groups that generally score higher on the NAEP, experts noted.

For instance, in 2002, 6 percent of APS fourth-graders were white and 3 percent were Hispanic. By 2009, the percentage of white students more than doubled to 13 percent. Hispanics increased to 5 percent.

"I think that demographic changes in the fourth grade account for 40 percent or so of the progress, but that’s not something anybody wanted to talk about or even to point out," said Mark Musick, an East Tennessee State University professor who once was chairman of the board that directs the NAEP.

Also, APS’ NAEP marks aren’t all improvements. The score gap between whites and blacks remains the second-largest in the nation among large districts. In addition, children in poverty now lag even further behind their peers in reading.

Furthermore, Atlanta still trails well behind other large urban districts in crucial areas. For instance, Atlanta’s 2009 eighth-grade mathematics scores were lower than those of every other participating district except Cleveland, the District of Columbia and Los Angeles.

And at current rates, it will take Atlanta 50 to 110 years for all its students to become proficient, said Binghamton University education professor Lawrence Stedman, who thinks that the cheating scandal is proof that the U.S. needs to scrap its approach to high-stakes testing. What gains APS made slowed in recent years, he said in a recent analysis of the district’s NAEP scores.

Even if we take the NAEP gains at face value, it’s not accurate to say they support CRCT gains. Though student achievement rose according to both measures, NAEP gains don’t quite match up to CRCT increases, and demographic shifts might account for a portion of the increase.

That said, it’s not clear we can take the NAEP scores at face value. While a federal review of APS procedures found no red flags, the bulk of the district’s cheating on the state CRCT exam only became public knowledge after a rigorous investigation that looked beyond procedures.

Such an inquiry has not taken place for the NAEP. And even if it did, it may never be possible to verify that APS sent accurate rosters to NAEP administrators. The NAEP destroyed its copies.

Both the CRCT and NAEP registered some gains, but the causes are unclear, the results are mixed and the district’s integrity remains in question. So many questions remain unanswered that it makes little sense to conclude that NAEP scores confirm "dramatic" gains on the CRCT.

Hall therefore earns a False.