The Arizona Department of Education recently mailed me, and every other parent of a public school child, a slick 32-page brochure on the performance of our public schools.
As most parents probably did, I quickly skimmed the cover message.
"Arizona citizens are entitled to know how student test scores compare with the test scores in other states. I'm pleased to report that Arizona students perform above the national average" (emphasis in original).
The Arizona Department of Education is right; we do deserve to know how our children compare with those in other states.
However, ADE is painting an inaccurate and misleading portrait of Arizona student achievement.
ADE bases its claim of above-average performance on a modified version of the national TerraNova test.
During the past two years, ADE has included some TerraNova questions in the AIMS test. So when a student takes Arizona's Instrument to Measure Standards, he also takes the TerraNova.
This makes Arizona's version of the TerraNova fundamentally different from the version given in other states, so claims that our students are above average are hard to verify.
Another test paints a very different, and far worse, picture of student achievement.
The U.S. Department of Education administers the nation's most widely respected set of exams for comparing state academic achievement.
The National Assessment of Educational Progress, also known as "the Nation's Report Card," shows Arizona students test below the national average on every subject in every grade level.
Since 1992, a total of 29 NAEP exams have been administered to Arizona students in reading, writing, science and math.
In all 29 cases, our students scored below the national average.
The two tests cannot both be right, which raises an important question: Is something wrong with NAEP, or is something wrong with Arizona's TerraNova?
The discrepancy between NAEP and Arizona's version of the TerraNova prompted the Goldwater Institute to ask a national testing expert, Gregory Stone, to examine the state's technical documents supporting the validity of its exam.
What Stone found was discouraging. "I am surprised at the fundamental errors made throughout the defined process," he wrote.
He determined that Arizona's TerraNova results couldn't be compared with those in other states.
A 2005 University of Arizona report also raised serious questions about the validity of Arizona's version of TerraNova.
That review recommended that TerraNova be eliminated entirely in Arizona and the state rely instead on the NAEP for state-to-state comparisons.
Besides technical reasons to question Arizona's TerraNova results, there are also practical ones.
Arizona has a relatively high percentage of children who qualify for the federal free and reduced-cost lunch program (a standard measure of poverty) and a large number of students who are English language learners.
A study by the Manhattan Institute ranked Arizona as having the second "most difficult to educate student body" in the nation.
Credit where credit is due: statistically speaking, studies indicate that Arizona students score higher than their demographic profile would predict.
It would be a small but real triumph if Arizona scored at the national average.
Arizona should remove TerraNova questions from the AIMS exam and administer a stand-alone national test. This would provide parents and lawmakers the accurate information they need.
There is no time to waste. Texas, for example, has an almost identical student profile to Arizona's but scores much higher on NAEP's fourth-grade reading exam.
There is no question Arizona students can do better.
One step in that thousand-mile journey is making sure we have accurate information about the performance of our public schools: testing students honestly and reporting the results without rose-colored glasses.