ACT now testing more students, but college readiness remains stuck
Posted August 29
The ACT released its nationwide 2016 test results last week. The numbers are troubling but not necessarily surprising.
Roughly one-third of those who took the popular college entrance exam were ready for college, based on ACT's readiness benchmarks. Yet 84 percent of test takers said they hoped to pursue higher education.
ACT judges college readiness by correlating test scores with first-year college grades, matching each of its four tested areas to a standard first-year college course. Students are "college ready" in a subject if their score predicts a 50 percent chance of earning a B or higher, and a 75 percent chance of a C or higher, in the corresponding course.
These numbers, says Mike Petrilli, president of the Washington, D.C.-based Fordham Institute, align "with what we see from the SAT and the National Assessment of Educational Progress. We get about a third of our students to the college ready level by the time they finish high school."
It may be no coincidence that roughly one-third of Americans have earned four-year college degrees. "It lines up almost exactly," Petrilli said. "The percentage that are ready for college is very close to the percentage that complete college."
Stark ethnic and racial achievement gaps continue to dog the ACT, just as they do other benchmark tests. The 2016 ACT report found 60 percent of Asian test takers were college ready, compared with 49 percent of whites, 23 percent of Hispanics and 11 percent of African-Americans.
The "bifurcation" of college readiness has widened over time, said Paul Weeks of ACT. "We need to have more of these underserved groups coming out of high school to get them ready for what comes next."
Assessing our progress in getting more graduating high school students ready for postsecondary training has become a major policy focus. But the ACT is not the only tool for the job, and some experts question whether it is the best one.
Diluting the pool
Just 38 percent of test takers hit college readiness benchmarks in at least three of the four subject areas this year, ACT reports. The exam tests English, math, reading and science.
Last year, 40 percent hit benchmarks in at least three. Meanwhile, 34 percent of test takers met none of the four benchmarks, up from 31 percent last year.
The drop may sound troubling, but it is largely an artifact of who took the test.
The drop occurred because six new states (Minnesota, Mississippi, Missouri, Nevada, South Carolina and Wisconsin) this year required all students to take the exam, dramatically altering the pool of test takers.
With those new mandates, 64 percent of high school graduates nationwide took the ACT this year, up from 59 percent in 2015. The expanded pool drew in many students with weaker skills, driving down average scores significantly in those six states. Averages climbed in 22 other states and held flat in eight.
"We see a mixed bag," said Weeks, ACT's senior vice president of client relations. "But we know far too many students are leaving high school underprepared."
Now that so many students take the ACT, one challenge is to sort out who is going on to what kind of post-secondary training and what skills they actually need.
While not every student leaving high school is aiming for a four-year school, Weeks argues, the ACT still serves as a good indicator of how prepared students are for alternative career paths.
"Our research suggests that too many of these students are not prepared for any of those tracks after high school," Weeks said.
For the past 20 years, ACT has supplemented its college entrance exams with its more basic WorkKeys exam, which tests three skills: applied math, locating information and reading for information.
WorkKeys, Weeks said, has been developed and refined through research on 16,000 different jobs, and students who perform well on it are offered a National Career Readiness Certificate (NCRC) that many employers use as a hiring tool. Many state governments also rely on the certification to measure career readiness.
For the first time this year, ACT compared results from students who took both WorkKeys and the ACT, allowing the company to offer comparison guides so that those who take the college entrance exam can see where they stand on other career paths.
The ACT reports that 68 percent of students who took the ACT this year would be on track to earn a Gold NCRC, which would qualify them for 93 percent of the 16,000 occupations in the WorkKeys mix.
That's a mixed result, since it means 32 percent of those who took the ACT — a pool that still skews toward the college bound — would not be prepared for many basic jobs. A complete sample of high school graduates would, presumably, look worse still.
Petrilli argues that much of the education reform movement — including Common Core and expanded standardized tests — results from a pressing need to give parents a clearer picture of where kids stand.
But he favors earlier testing to identify problems while they can still be handled, and he questions whether the ACT is the best national benchmark.
"We're finally telling parents the truth about whether they are on track," Petrilli said of the school accountability push over the last 15 years. "But we don't have to wait until these kids take the ACT in the 11th grade to tell that they are nowhere near college ready."
Nonetheless, Petrilli sees the ACT and SAT as playing key roles in college admissions. He points to evidence of grade inflation where "everybody gets A's and B's these days." It has become, he argues, ever harder for college admissions officers to make useful distinctions with grades.
"It's easy to hate tests," Petrilli said. "But the reason we have them is that we have no other way to uphold standards. If our schools were willing to make students work hard to earn A's, maybe we wouldn't need these things."
Those are fighting words to Bob Schaeffer, public education director of FairTest, an advocacy group that has long been critical of any high stakes testing, including the use of the ACT and SAT in college admissions.
Over 850 colleges do not require test scores, Schaeffer notes, and they seem to do just fine relying on classes taken and grades earned, along with other elements of an application portfolio. The key is not just grades but also courses taken. That is, admissions officers look for a student who has taken the toughest courses available at their high school and done well in them.
Weeks, a former college dean of admissions, says he doesn't view the ACT as a standalone tool. He agrees with Schaeffer that "the single best predictor is courses taken and grades received." (Weeks has to agree, Schaeffer rejoins, since ACT's own documents have contained that caveat for years.)
There are two key components in college success, Schaeffer says. One is cognitive skills, and the other is noncognitive — problem solving, determination, persistence, and the like. Testing, he argues, gets at the cognitive, but only grades and courses taken get at both.
Weeks, who was dean of admissions at Ripon College in Wisconsin before joining ACT, argues that more information is better and that grades alone don't tell the whole story.
"I advocate looking at the whole student," Weeks said. "If I've done my job right as an admissions officer, I should be able to overlook any blemish, whether it is in the ACT score, or poor grades in a semester."
Even setting aside his beef with high-stakes individual tests, Schaeffer argues the ACT is a poor instrument to get at national academic progress.
"It's totally unnecessary," Schaeffer says. "The U.S. already has an independent, statistically high-quality tool for this. It's called the National Assessment of Educational Progress. You don't need NAEP and SAT and ACT all claiming they are doing the same thing."