In tweets and Facebook postings, supporters of labor unions in Wisconsin have promoted statistics that would suggest that collective bargaining for teachers -- which would be severely restricted under a proposal by Gov. Scott Walker -- is correlated with higher scores on standardized academic tests.
On Feb. 23, we found the following version on Facebook by the Democratic Party of Wisconsin:
"Only 5 states do not have collective bargaining for educators and have deemed it illegal. Those states and their ranking on ACT/SAT scores are as follows: South Carolina -50th/ North Carolina -49th/ Georgia -48th/ Texas -47th/ Virginia -44th. If you are wondering, Wisconsin is currently ranked #2."
When a reader brought this to our attention, we thought it deserved a look.
First, we checked to see whether it’s true that only five states bar teachers from collective bargaining. We found a webpage maintained by the National Council on Teacher Quality, an independent research group that urges "reforms of the nation's teacher policies."
The group agrees. It reports that all but five states -- Georgia, North Carolina, South Carolina, Texas and Virginia -- "either require or permit school districts to bargain a contract with the local teachers' union." While the organization notes that states' "scope of bargaining statutes has a considerable impact on the balance of power," we think the Facebook post is accurate because it referred explicitly to states that deemed teacher collective bargaining "illegal."
With that question out of the way, we’ll take a look at the thornier question of how those five states' test scores stack up nationally, and against Wisconsin in particular.
On Feb. 20, 2011, Angus Johnston, an adjunct assistant professor at the City University of New York, published a comprehensive analysis of this question on his blog. He published links to a chart that appears to have been the inspiration for the tweets and Facebook postings. It offers a state-by-state analysis of scores on the SAT and the ACT, the two leading college-admissions tests, assembled by University of Missouri law professor Douglas O. Linder.
Johnston is critical of Linder’s methodology for a variety of reasons, which he explains in more detail here. But without even taking those concerns into account, we find the statistics unreliable. They were published in 1999, meaning that the statistics themselves are likely more than a dozen years old -- far too old to be presumed valid in 2011.
Fortunately, it’s possible to obtain state-by-state rankings for the SAT and ACT of a more recent vintage. Here’s a table of the relevant states:
Mean SAT scores by state, 2010:
| State | Overall Score | National Rank | Participation Rate |
| --- | --- | --- | --- |
| North Carolina | 1485 | 38th | 63 percent |
| South Carolina | 1447 | 49th | 66 percent |
Mean ACT scores by state, 2009:
| State | Overall Score | National Rank | Participation Rate |
| --- | --- | --- | --- |
| North Carolina | 21.6 | 26th | 15 percent |
| South Carolina | 19.8 | 46th | 50 percent |
Before we analyze these figures, let us explain why we included the fourth column above -- "participation rate." That refers to the percentage of high school graduates in that state who took the test in question.
The SAT and the ACT serve the same general purpose -- gauging academic achievement for college admissions -- but the tests are essentially competitors in the marketplace. By force of tradition or other factors, states vary widely in the percentage of students who take one test rather than the other (or both). So, knowing the participation rate is crucial to knowing whether the ranking being measured is statistically valid.
Consider Wisconsin’s third-place ranking in the SAT. It sounds great -- but only 4 percent of graduates in the state took the test in 2010, and those who did likely had a particular need to take the SAT as they applied to certain colleges. That means Wisconsin's SAT takers were a self-selecting group, probably more academically advanced than average.
As a result, it’s fairer to look at Wisconsin’s ranking on the ACT, which was taken by 67 percent of graduates in 2009. And that ranking was 13th in the nation -- not bad, but well short of the 2nd place finish cited in the Facebook post.
Meanwhile, in the five non-collective-bargaining states, the SAT was the more widely taken test, and in those rankings the non-union states placed between 34th and 49th nationally. On the ACT -- where participation ranged from 15 percent to 50 percent -- the non-union states ranked between 22nd and 46th.
So, on neither test did the five non-collective-bargaining states perform as well as Wisconsin, and in general those five states clustered in the bottom half of the national rankings. Given these statistics, it’s reasonable to say that Wisconsin outperformed the other five states significantly -- but not as overwhelmingly as the blog and Facebook posts suggest.
After we contacted the Democratic Party of Wisconsin, Melissa Baldauff, the party’s research director, wrote us to say that "after further investigation, we determined that the data was not the most up-to-date. Accordingly, we have removed the post from our Facebook page."
We should add another key question: What do SAT and ACT data actually tell us about the connection between collective bargaining rights and student achievement? The answer is a little -- but not very much.
Looking only at these six states, there’s a suggestion that the lack of collective bargaining rights for teachers is mildly correlated with lower test scores, though the linkage is far less striking than the Facebook post suggests. Still, it’s impossible to know whether collective bargaining plays any role in causing test scores to rise. That’s because countless other demographic, economic and cultural factors help shape a state’s test scores.
"Most of the states that don’t have teachers’ unions are poorer than Wisconsin and have more English Language Learners in their schools, and rank higher for other demographic factors that make strong academic performance less likely," Johnston wrote. "Rich kids in a school with a teacher’s union will do better than poor kids in a school without one, generally, but that doesn’t have much to do with the union itself."
Consider just one statistic -- the percentage of residents living below the poverty line. Wisconsin ranked 38th in the nation on that measure, similar to Virginia (39th), and far behind higher-poverty Texas (8th), South Carolina (9th), Georgia (13th) and North Carolina (15th). The fact that proportionally far fewer Wisconsin residents were impoverished almost certainly had an impact in shaping the states’ comparative test results.
Matthew Di Carlo, a senior fellow at the Albert Shanker Institute, which studies education policy, added in a recent blog post that there’s "tremendous variation within and between states in the strength of individual locals (there are thousands) and in the terms of the contracts they have negotiated, to say nothing of all the other factors that might influence achievement," such as school-level policies, student characteristics and resource availability.
In general, Di Carlo notes, the available evidence suggests there may be some overall benefit to student test scores from unions, but given all the complex factors involved, there is little basis for drawing strong causal conclusions. He also pointed out that unions confer other benefits, such as improved communication between teachers and administrators.
However, that’s all beyond the scope of our item. Ultimately, the Facebook post uses outdated data based on a questionable methodology. A review using current data finds that Wisconsin does perform better on test scores than the non-union states, but not as dramatically as suggested in the Facebook post. And there is at best limited evidence that unionization played a causal role in shaping differences in test scores. We rate the statement False.