Technical Appendix

Indicator #1: Academic Achievement

The 2013 score is an unweighted average of 4th-grade reading, 4th-grade math, 8th-grade reading, and 8th-grade math scores on the 2013 National Assessment of Educational Progress (NAEP) test.
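The averaging above can be sketched as follows. The proficiency rates are made-up numbers for illustration; the real figures come from the 2013 NAEP assessments cited below.

```python
# Hypothetical proficiency rates (percent proficient) for one state.
naep_2013 = {
    "reading_grade4": 41.0,
    "math_grade4": 47.0,
    "reading_grade8": 38.0,
    "math_grade8": 36.0,
}

def achievement_score(rates):
    """Unweighted average of the four NAEP proficiency rates."""
    return sum(rates.values()) / len(rates)

score = achievement_score(naep_2013)  # (41 + 47 + 38 + 36) / 4 = 40.5
```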

Source for test scores: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress, 2013 Mathematics and Reading Assessments. Accessed at http://nationsreportcard.gov/reading_math_2013/#/.

The 2005 scores from the first edition of Leaders & Laggards (published in 2007) are available at http://www.uschamber.com/reportcard/2007.

Note: For this report we use NAEP’s reported proficiency rates, not raw scale scores. Translating scale scores into meaningful indicators for lay readers is outside the scope of this report, so, as in previous editions of Leaders & Laggards, we use NAEP’s own definitions of proficiency on its various tests. We recognize that this approach introduces a degree of subjectivity in where exactly the bar for proficiency is set. Given the comparative nature of our data, the bar for proficiency in 2005 quite likely differs from the 2013 bar, just as it differs between the 4th-grade and 8th-grade tests and across subjects. Even so, the proficiency standard is a clear indication of what National Center for Education Statistics (NCES) experts believed students needed to know in 2005 and 2013, respectively, making it a straightforward basis for comparison.

Indicator #2: Academic Achievement for Low-Income and Minority Students

The 2013 score is an unweighted average of NAEP performance on all four exams (4th-grade reading, 4th-grade math, 8th-grade reading, 8th-grade math) for African-American, Hispanic, and low-income students. States were excluded from this calculation if they had fewer than 7 of the 12 possible scores (3 subgroups × 4 exams). For 2013, this excluded only Vermont. To generate the change data, we use the same scores and exclusion rules from Leaders & Laggards 2007, which excluded Maine, Montana, New Hampshire, North Dakota, South Dakota, and Vermont. As a result, no change score (labeled “Progress made since 2007”) is calculated for these states.
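The exclusion rule and subgroup averaging can be sketched as below. The enrollment of missing cells as `None` and the example rates are assumptions for illustration; NAEP suppresses rates for subgroups too small to report reliably.

```python
# Each state can have up to 12 proficiency rates: 3 subgroups
# (African-American, Hispanic, low-income) x 4 exams. Missing
# (unreportable) rates are represented here as None.

def subgroup_score(rates):
    """Unweighted average of available rates; None if fewer than 7 of 12."""
    available = [r for r in rates if r is not None]
    if len(available) < 7:
        return None  # state excluded from the indicator (e.g., Vermont in 2013)
    return sum(available) / len(available)

# A hypothetical state with 8 reportable rates out of 12:
example = [22.0, 25.0, 18.0, 20.0, 30.0, 28.0, 24.0, 33.0,
           None, None, None, None]
score = subgroup_score(example)  # 200 / 8 = 25.0
```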

Source for test scores: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress, 2013 Mathematics and Reading Assessments. Accessed via http://nces.ed.gov/nationsreportcard/. This was also the source for the percentage of African-American, Hispanic, and low-income students in each state.

The 2005 scores from the first edition of Leaders & Laggards (published in 2007) are available at http://www.uschamber.com/reportcard/2007.

Note: As in Indicator #1, we use NAEP’s reported proficiency rates rather than raw scale scores; see the note under that indicator for the full rationale and caveats.

Indicator #3: Return on Investment

The 2013 score is an unweighted average of 4th-grade reading, 4th-grade math, 8th-grade reading, and 8th-grade math scores on the 2013 NAEP test (as reported in the Academic Achievement indicator).

Source for test scores: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress, 2013 Mathematics and Reading Assessments. Accessed via http://nces.ed.gov/nationsreportcard/.

Note: As in Indicator #1, we use NAEP’s reported proficiency rates rather than raw scale scores; see the note under that indicator for the full rationale and caveats.

Cost per student comes from the Census Bureau’s report Public Education Finances: 2012 by Mark Dixon. The report is available at http://www2.census.gov/govs/school/12f33pub.pdf (table 8). It includes the most up-to-date comparable finance information across states.

The Cost-of-Living Adjustment comes from the Missouri Department of Economic Development’s Cost of Living Data Series from the third quarter of 2013. It is available at http://www.missourieconomy.org/indicators/cost_of_living/.

To create the index, we multiply the Census Bureau’s cost per student by the cost-of-living adjustment to get an adjusted cost per student. We then divide the NAEP index from the Academic Achievement indicator (the average of the 4th-grade and 8th-grade reading and math exams) by the adjusted cost per student.
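The two-step calculation can be sketched as below. The proficiency rate, per-pupil cost, and cost-of-living value are illustrative assumptions, not figures from the report.

```python
# cost_per_student: Census Public Education Finances 2012, Table 8.
# cola: Missouri DED cost-of-living index (1.0 = national average).
# naep_index: average proficiency rate from the Academic Achievement indicator.

def roi_index(naep_index, cost_per_student, cola):
    """NAEP proficiency divided by cost-of-living-adjusted spending."""
    adjusted_cost = cost_per_student * cola
    return naep_index / adjusted_cost

# Example: 40.5% proficient, $10,000 per pupil, cost-of-living index 0.95
roi = roi_index(40.5, 10_000, 0.95)
```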

Indicator #4: Truth in Advertising: Student Proficiency

Reproduction of Paul Peterson and Peter Kaplan’s calculations for Education Next, available at http://educationnext.org/despite-common-core-states-still-lack-common-standards/.

Indicator #5: Postsecondary and Workforce Readiness

Advanced Placement (AP) Passage Rate: The College Board, which administers the AP exams, provided us with state-by-state breakdowns of AP passage rates. This represents the percentage of students in the class of 2013 who graduated having passed an AP exam.

Graduation Rate: From Education Week’s Diplomas Count 2014, available at http://www.edweek.org/ew/toc/2014/06/05/index.html?intc=EW-DPCT14-FL1.

Mortensen Chance at College: From Thomas Mortensen’s proprietary reports on college access, available behind a paywall at http://www.postsecondary.org/peocountry.asp. It is Metric #12, listed under the column “College Participation Rate” for the class of 2010.

The score for each state is the average of the AP passage rate, the graduation rate, and the Mortensen chance at college.

Indicator #6: 21st Century Teaching Force

The National Council on Teacher Quality’s state rankings come from the 2013 State Teacher Policy Yearbook: National Summary, available at http://www.nctq.org/dmsStage/2013_State_Teacher_Policy_Yearbook_National_Summary_NCTQ_Report.

Indicator #7: Parental Choice

This is an average of three metrics: the Center for Education Reform (CER) “Parent Power” ranking, the National Alliance for Public Charter Schools ranking, and the state’s choice market share for the 2011–2012 school year.

CER “Parent Power” Ranking: The Center for Education Reform ranked states on their school choice options, teacher quality, transparency, and online learning options. The scores were reported as percentages out of 100, so they are reproduced for this element of the index. The scores are available at http://www.edreform.com/in-the-states/parent-power-index/.

National Alliance for Public Charter Schools (NAPCS) Ranking: Published in January 2014, the NAPCS ranking scored the charter laws of every state. A total of 228 points was possible, so the grade presented is the percentage of points earned (points earned divided by 228). The rankings are available at http://www.publiccharters.org/wp-content/uploads/2014/01/StateRankings2014.pdf.

Choice market share: This has three components:

  1. Number of students enrolled in charter schools as reported by the NAPCS in its data dashboard in the 2011–2012 school year. Available at http://dashboard.publiccharters.org/dashboard/home.
  2. The number of students in private schools in the 2010–2011 school year (the most recent available), as reported by the NCES Common Core of Data.
  3. The number of students enrolled in a voucher or tuition tax credit program in the 2011–2012 school year, as reported by the Friedman Foundation for Educational Choice’s ABCs of School Choice, available at http://www.edchoice.org/Foundation-Services/Publications/ABCs-of-School-Choice-3.aspx.

These three were added together and then divided by the total number of students in the state in the 2010–2011 school year (the most recent available), as reported by the NCES Common Core of Data.
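The market share calculation can be sketched as follows. The enrollment counts are made-up numbers for illustration; the real figures come from the NAPCS dashboard, the NCES Common Core of Data, and the Friedman Foundation’s ABCs of School Choice.

```python
def choice_market_share(charter, private, voucher_or_tax_credit, total):
    """Share of a state's students enrolled in a school choice option."""
    return (charter + private + voucher_or_tax_credit) / total

share = choice_market_share(
    charter=50_000,               # charter enrollment, 2011-12 (NAPCS)
    private=120_000,              # private enrollment, 2010-11 (NCES)
    voucher_or_tax_credit=5_000,  # voucher/tax-credit programs, 2011-12
    total=1_000_000,              # total state enrollment, 2010-11 (NCES)
)  # (50,000 + 120,000 + 5,000) / 1,000,000 = 0.175
```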

Indicator #8: Data Quality

This is a reproduction of the Data Quality Campaign’s state scorecards on their 10 “Essential Actions” that states should take. The full scoring breakdown is available at http://www.dataqualitycampaign.org/files/DataForAction2013.pdf.

Indicator #9: Technology

This is a reproduction of Digital Learning Now!’s 2013 “Digital Learning Report Card,” available at http://digitallearningnow.com/report-card/.

Indicator #10: International Competitiveness

The International Competitiveness score is composed of three parts:

  1. International Proficiency. The average percentage of students in the state scoring “proficient” on the international standard set in Peterson, Woessmann, Hanushek, and Lastra-Anadon’s rankings of state performance, Globally Challenged: Are U.S. Students Ready to Compete? available at http://www.hks.harvard.edu/pepg/PDF/Papers/PEPG11-03_GloballyChallenged.pdf (see page 10 for math and page 16 for reading). This was created by averaging the scores in reading and math.
  2. STEM AP exams passed. College Board provided the state-by-state passage rates on the 10 STEM tests: Biology, Calculus AB, Calculus BC, Chemistry, Computer Science A, Environmental Science, Physics B, Physics C: Electricity and Magnetism, Physics C: Mechanics, and Statistics.
  3. Foreign language AP exams passed. College Board provided the state-by-state passage rates on the eight World Languages and Cultures exams: Chinese Language and Culture, French Language and Culture, German Language and Culture, Italian Language and Culture, Japanese Language and Culture, Latin, Spanish Language and Culture, and Spanish Literature and Culture.

Each of these components is a decimal value between 0 and 1; the three were added together to produce an index score ranging from 0 to 3.
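The index assembly can be sketched as below; the three component rates are illustrative assumptions, not figures from the report.

```python
def international_competitiveness(intl_proficiency, stem_ap, language_ap):
    """Sum of the three component rates, each a decimal between 0 and 1."""
    return intl_proficiency + stem_ap + language_ap

# Example: 32% internationally proficient, 15% passing a STEM AP exam,
# 2% passing a foreign language AP exam.
index = international_competitiveness(0.32, 0.15, 0.02)
```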

Indicator #11: Fiscal Responsibility

This metric had two parts that were averaged together:

  1. Total Pension Funding: Using data from the Pew Center on the States’ Widening Gap series (available at http://www.pewtrusts.org/en/research-and-analysis/analysis/2014/04/08/the-fiscal-health-of-state-pension-plans-funding-gap-continues-to-grow), this is a measure of the amount currently in pension coffers compared with projected pension outlays in the future. Pew’s calculations assume an 8% rate of return, because that is what is required by the Governmental Accounting Standards Board (GASB). Some debate whether 8% is too high a rate of return to assume, but this report follows Pew and GASB.
  2. Fiscal Year 2012 Contribution: Using data from the same Pew report, this metric is a measure of the percentage of the annual required contribution states made toward their pension liabilities in the most recent year of Pew’s data.