
The 'Red Wave' that wasn't: Experts explain how the polls got the midterm elections wrong

On Tuesday, polls around the country seemed to predict a flood of Republican votes that could net the GOP dozens of seats in the U.S. House and Senate. The polls were wrong. Experts from both parties explain how it happened.


By Laura Leslie, WRAL Capitol bureau chief
RALEIGH, N.C. — Coming into Tuesday’s election, polls around the country seemed to signal a massive Republican wave that could net the party dozens of seats in the U.S. House and several in the Senate as well.

By Wednesday morning, it was clear that the polls were wrong. As of Friday, Republicans had flipped nine seats, not 30 or 40, and control of the Senate was still up in the air, with Democrats picking up a Senate seat in Pennsylvania.

So what went wrong?

Tom Jensen is lead pollster with Public Policy Polling, a Democratic-aligned firm located in Raleigh. Jensen said most of its polls were right on the mark this election, and he expected most professional Republican-aligned firms to have accurate results. However, he said the polling averages at many aggregators were way off.

The problem, he said, was caused by “cheap” pollsters who aren’t as skilled or as ethical as more reputable professional firms. In this cycle, Jensen said, a lot of the problematic firms were Republican-aligned.

“Republican pollsters made a real effort to influence the polling averages by releasing lots and lots of polls that showed an overly rosy picture for Republicans," said Jensen. "And that really ended up being the majority of polls that were out in public in the closing stretch of the campaign."

Jensen said good poll numbers tend to energize base voters.

“It's also something that helps with fundraising,” said Jensen. “People like to be part of a winning team. So maybe if you put out polls with a positive picture, that will make people want to give you money.”

Nathan Babcock is a Republican political consultant who commissions and uses polling for a living. Like Jensen, Babcock said the professional pollster he uses got accurate results this year. He agreed that late polls caused the problem.

“In the last week, you had all these kind of fly-by-night operations come out and release their polling results," said Babcock. "And that kind of changed the narrative to 'Republicans are going to have a wave nationally.' If you were to cut off the polling a week ago, the averages in most of these states would have been really close to on the money.”

Babcock attributes the motive more to what he calls a “herding” effect: firms releasing numbers that matched what they already expected the results to be.

“Pollsters want to be right," said Babcock. "And they want to be able to point back to this and say that they were right.”

Jensen and Babcock said choices made by the pollsters could account for some of the inaccurate results.

Babcock thinks some may have overcompensated for what happened in 2016 and 2020. Research has shown that voters of all parties who backed former President Donald Trump in those elections were less likely to answer polls. This may have skewed poll results in favor of Democrats.

“A lot of pollsters were burned,” said Babcock. “Even if you got the right number of Republicans, the right number of Democrats, the right number of independents, you still might have been missing that hidden Trump vote. And so I think a lot of pollsters tried to correct for that this time.”

Jensen said he also saw problems in some pollsters' use of weighting. People who answer polls don’t match up neatly with the people who will turn out to vote. Pollsters have to analyze trends and weight their results to account for the differences.

“Women will answer polls more than men, older people answer polls more than younger people, white people tend to answer polls more than people of color, those sorts of things,” said Jensen. “And a big part of what determines what polls are accurate and what polls aren't, is how well those things are adjusted for.”
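The weighting Jensen describes can be sketched in a few lines of code. This is a minimal illustration of the idea, not any firm's actual method, and the respondent counts, group shares and turnout targets below are invented for the example.

```python
# A minimal sketch of demographic weighting (post-stratification).
# All numbers here are made up purely for illustration.

# Hypothetical raw sample: each respondent has a demographic group
# and a candidate preference.
respondents = (
    [("women", "A")] * 360 + [("women", "B")] * 240 +  # 600 women, 60% back A
    [("men", "A")] * 160 + [("men", "B")] * 240        # 400 men, 40% back A
)

# Group shares in the raw sample vs. an assumed model of the electorate.
sample_share = {"women": 0.60, "men": 0.40}
target_share = {"women": 0.52, "men": 0.48}

# Each respondent's weight scales their group up or down to match the target.
weights = {g: target_share[g] / sample_share[g] for g in sample_share}

total = sum(weights[g] for g, _ in respondents)
support_a = sum(weights[g] for g, c in respondents if c == "A") / total

print(f"Unweighted support for A: {520 / 1000:.1%}")  # 52.0%
print(f"Weighted support for A:   {support_a:.1%}")   # 50.4%
```

Because women answer the hypothetical poll more often than they are expected to vote, their responses are weighted down slightly, moving the headline number. As Jensen notes, choosing those adjustment targets is a big part of what separates accurate polls from inaccurate ones.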

Both experts also saw problems in the way the news media used polls.

“I think the spin was what was wrong this time,” said Babcock. “I don't think the polls themselves are necessarily that wrong.”

Babcock pointed to a recent story in the New York Times that featured poll results in four swing districts in the U.S. House. Democrats were tied or ahead in all four races. Yet, he said, the story claimed the polls were "evidence that Republicans are poised to retake Congress this fall."

“If they were to describe their own polling accurately, they would have said 'Democrats are still holding on, it's gonna be close,'" said Babcock.

Babcock said the New York Times’ polls in 2016 and 2020, like many other polls, were inaccurate, and the outlet is also often accused of being too Democratic-leaning.

"I think they spun their own poll to say it was going to be a good night for Republicans because they're worried about that same trap,” Babcock said.

Jensen said the use of averages from polling aggregators by the news media is also problematic. He said some aren’t selective about the surveys they include.

“Some of the polls, for instance in Pennsylvania, that were really bad down the stretch were conducted by high-school students," said Jensen. "And they just got put into the polling averages the same as somebody who's been doing polls for 30 years at a really high level. When you treat all that stuff exactly the same, you can end up with situations like on Tuesday.

“Having clear standards about what polls we're going to include in polling averages and which ones we aren't would go a long way towards restoring some credibility to the industry, so that you have to have at least some idea of what you're doing,” Jensen said.
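Jensen's point about inclusion standards can be illustrated with a toy polling average. The pollster names, ratings and margins below are entirely invented; the sketch only shows how a flood of unvetted releases can move a simple average that treats every poll the same.

```python
# A minimal sketch of how an inclusion standard changes a polling average.
# Pollster names, ratings, and GOP margins are invented for illustration.

polls = [
    {"pollster": "Firm A", "rating": "A", "gop_margin": 0.5},
    {"pollster": "Firm B", "rating": "B", "gop_margin": -1.0},
    {"pollster": "Upstart C", "rating": "unrated", "gop_margin": 6.0},
    {"pollster": "Upstart D", "rating": "unrated", "gop_margin": 5.0},
    {"pollster": "Upstart E", "rating": "unrated", "gop_margin": 7.0},
]

def average(subset):
    """Simple mean of the GOP margin across a list of polls."""
    return sum(p["gop_margin"] for p in subset) / len(subset)

# Treating every release the same lets a burst of unrated polls swing the average.
naive = average(polls)

# Requiring a track record (a rating) keeps the average near the established firms.
rated = average([p for p in polls if p["rating"] != "unrated"])

print(f"All polls: GOP {naive:+.2f}; rated only: GOP {rated:+.2f}")
```

In this contrived example, three late unrated polls push the naive average to GOP +3.5, while the average restricted to rated firms sits at -0.25, which mirrors Jensen's complaint about aggregators that weigh a first-time pollster the same as a 30-year veteran.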


Copyright 2024 by Capitol Broadcasting Company. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.