One poll has former President Donald Trump leading by 3 percentage points. The next has Vice President Kamala Harris up by 4. Then three in a row show a tie. The next has Harris up by 1, and the one after that has Trump up by 2.
The collective insight from the flurry of national polls leading up to the final day of voting on Tuesday: the presidential race is too close to call.
“The margins are wafer-thin. Very tight. The outcome will also be very thin and very tight. It could go one way or the other,” said Steve Vancore, a longtime Tallahassee-based pollster and political strategist.
In Florida, by contrast, the outcome is quite certain for the first time in a generation. Trump is the likely winner of the state’s 30 electoral votes. Assuming he does win, his margin of victory is the key unknown.
Polls show Trump at +6, +6, +9, +12 and +5 in Florida over the past week.
And the next big race on Florida’s ballot — U.S. Sen. Rick Scott, R-Fla., versus the Democratic challenger, former U.S. Rep. Debbie Mucarsel-Powell — is much closer, according to many of the same surveys.
The polls are ubiquitous in the days leading up to the presidential election. It can be difficult to decipher what they mean, and how much confidence to place in their accuracy.
At least 125 national polls were released in October. There were 21 in Florida. And dozens of others sought to provide insight into what voters think in the seven battleground states that could go to Harris or Trump and determine who wins the 270 electoral votes needed to win the presidency.
Enthusiasts look to polls for a range of information about elections or issues, including whether a candidate is trending up or down and has the potential to pull off a surprise.
One thing they can’t do: say with certainty who will win, said Kevin Wagner, a political scientist at Florida Atlantic University. Each poll is a snapshot, and the picture may change the next day.
Interest in polls
“The public seems to have a great appetite for elections. They want to know who wins,” he said. Wagner is also co-director of FAU’s PolCom Lab, a partnership of the School of Communication and Multimedia Studies and the Department of Political Science, which conducts public opinion polls.
When FAU publishes a poll, it often attracts attention across the country and is “some of the most talked about content the university produces,” said Joshua Glanzer, the university’s vice president for media relations and public affairs.
Polls are often “very misinterpreted,” says Brad Coker, CEO and principal of Mason-Dixon Polling & Strategy. He has conducted thousands of polls across the country since he started polling in Florida in 1983.
Small samples
Skeptics scoff at the idea that a sample size of 1,500 people or less can accurately reflect what millions of Americans think.
Wagner himself raised the issue during a recent presentation at the Osher Lifelong Learning Institute on FAU’s Boca Raton campus. He said a national poll might use 800 responses on the low end, up to about 1,500 for a well-funded survey.
“We want to know what a large group of people think by asking a small group of people,” he said. “How can I ask a small group of people and get a realistic projection of what a large group of people think?”
Wagner compared it to tasting soup.
“If I give you a large bowl of soup and I want to know whether that soup tastes spicy or good, or whatever else I want to know about that soup, should I drink the entire bowl? No,” he said. “A spoonful or two is sufficient, as long as all the ingredients that make up that soup are well mixed and land on your spoon.”
That’s the theory. It’s also why public opinion researchers spend a lot of time trying to make sure their samples represent the people most likely to vote. They also give extra weight to responses from a demographic group if the survey doesn’t get enough of them.
Margin of error
Opinion polls are not exact, even though they are often presented as such. And pollsters say people should be careful about reading too much into them.
Each percentage in a poll is actually a range of possibilities, bounded by a margin of error and governed by statistical probability.
If candidate A has 53% and candidate B has 47% and the margin of error is plus or minus 3 percentage points, A and B could each end up at 50 percent.
Or it could be 56-44.
For example, the Florida Atlantic University poll released Tuesday showing Scott ahead of Mucarsel-Powell 50% to 46% doesn’t strictly mean he has a 4 percentage point lead and is on track to win by that amount.
Maybe.
But with a margin of error of plus or minus 3 percentage points, Scott could be somewhere between 47% and 53% and Mucarsel-Powell between 43% and 49%. Scott is in a better position — it’s always better to be ahead than behind, Wagner said — but the battle will most likely be in those areas.
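The arithmetic behind those ranges is simple enough to sketch in a few lines of code. This is a minimal illustration using the figures above; real pollsters derive the margin of error from the sample size rather than taking it as a given:

```python
def poll_ranges(pct_a, pct_b, moe):
    """Return each candidate's plausible range given a margin of error."""
    return (pct_a - moe, pct_a + moe), (pct_b - moe, pct_b + moe)

def ranges_overlap(range_a, range_b):
    """True if the two ranges share any ground, i.e. the race could be tied."""
    return range_a[0] <= range_b[1] and range_b[0] <= range_a[1]

# The FAU Senate poll: Scott 50%, Mucarsel-Powell 46%, margin of error +/- 3.
scott, mp = poll_ranges(50, 46, 3)
print(scott)                      # (47, 53)
print(mp)                         # (43, 49)
print(ranges_overlap(scott, mp))  # True: the ranges overlap between 47 and 49
```

Because the two ranges overlap, the poll is consistent with anything from a comfortable Scott win to a narrow Mucarsel-Powell upset.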
And, Coker said, there’s only a small chance that the person trailing by a few points will end up winning by a few points, but it’s not impossible.
Coker said he puts more stock in a lead once it reaches 5 or 6 points.
Vancore explained it this way: flipping a coin 500 times would theoretically yield 250 heads and 250 tails. But in reality it could also be 245 heads and 255 tails. That’s expected, random variation.
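Vancore’s coin-flip point is easy to check with a quick simulation. This is just a sketch of the idea, not anything a pollster would actually run:

```python
import random

def count_heads(flips=500, seed=None):
    """Flip a fair coin `flips` times and count the heads.

    In theory the answer is flips / 2; in practice it wanders a bit,
    which is exactly the expected random variation Vancore describes.
    """
    rng = random.Random(seed)
    return sum(rng.random() < 0.5 for _ in range(flips))

for seed in range(5):
    print(count_heads(500, seed))  # values near, but rarely exactly, 250
```

Run it a few times and the counts cluster around 250 without landing on it every time, the same way a well-run poll clusters around the true number without hitting it exactly.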
“The public expects a level of accuracy in polling that pollsters don’t even expect,” Vancore said.
Survey professionals don’t make much of small percentage-point shifts. And they dismiss the false precision implied by an even smaller change, from, say, 50.9% to 51.1%.
Wagner said people want something more precise: “That’s not satisfying, is it? You want a prediction, someone to tell you exactly what will happen. But all the poll tells you is a margin within a margin of error.”
Impact on results
Many, though not all, people involved in politics believe that polls can influence the outcome of an election, especially when one candidate clearly dominates the polls in a high-profile race.
A slew of polls suggesting that Candidate A will overwhelmingly defeat Candidate B could discourage voters on B’s side from actually voting – and that could hurt all the party’s candidates.
Political scientists and pollsters said they didn’t think that was too likely this year, at least not in battleground states, where every vote could make a difference.
But Florida shows what can happen.
During the campaign for the 2022 midterm elections, it was clear for months, and polls repeatedly showed, that Republican Gov. Ron DeSantis would handily defeat Democratic challenger Charlie Crist.
Analysts from across the political spectrum said this contributed to a collapse in Democratic turnout. And that, in turn, contributed to a series of Republican victories and Democratic losses for lower offices.
In Palm Beach County, for example, two County Commission seats flipped from Democratic to Republican.
If the election is “relatively close, I don’t think it will have an effect on turnout. But if it is a blowout like DeSantis-Crist, it does impact turnout,” Vancore said.
Boca Raton City Council member Andy Thomson saw that firsthand.
He was a Democratic candidate for state representative in southeastern Palm Beach County in 2022, when polls showed the top candidates for governor and U.S. Senate headed for defeat.
“When voters see polls coming out showing their candidate being crushed, that absolutely impacts how important they feel it is to vote,” Thomson said. “If, on the other hand, they see an exciting race, I think it will provide extra motivation to vote.”
Thomson lost 51.7% to 48.3%.
He doesn’t think voters lose their motivation to vote when they see their favorite candidate on track to victory. “Everyone wants to be on the side of a winner, and yet they will still be motivated to vote,” Thomson said.
The phenomenon has been compared to fans at a football game. If one team is winning by 30 points, more fans will leave before the end of the fourth quarter than if a single field goal could change the outcome.
“Americans don’t like to vote for losers,” Wagner said. “We want to think that if we take the time to vote, our vote will matter, that our candidate is going to win or has a good chance of winning, or that our vote will affect the outcome.”
Who is being questioned
The way pollsters reach voters is evolving.
For years, phone calls from live interviewers were the gold standard. And for veteran pollsters like Coker and Vancore, they still are.
But technology and changing habits have made that more difficult. At one time, almost every American household had a landline telephone. Now the vast majority rely on mobile phones.
And many more people screen their calls and don’t answer them, meaning live calling is more time-consuming and expensive. (Coker said men answer unknown numbers more often than women.)
Many pollsters have turned to other methods or a combination of approaches, including live callers, automated calls that ask people to press numbers to respond to questions, text messages with a link to an online survey, and online panels where people sign up to answer questions.
Wagner said non-live methods can help fill in the gaps, reaching people who are less likely to answer their phone and participate in a phone survey.
Weighting
A major challenge for pollsters is ensuring that the sample is representative of the population being surveyed.
Even if a survey has enough people to make it statistically valid, it does not necessarily have a representative sample. There may be too few young voters, or too many voters in one party.
So pollsters decide how to weight the answers they receive to more accurately reflect each group’s share of the voting population. In practice, this can mean giving more weight to one demographic group when generating the overall survey results.
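A stripped-down sketch of that idea, using made-up numbers: the group labels, sample counts, and population shares below are all hypothetical, and real weighting schemes adjust for many variables at once rather than a single one:

```python
def weight_sample(sample_counts, population_shares):
    """Compute per-respondent weights so each group matches its population share.

    sample_counts: dict of group -> number of respondents in the sample
    population_shares: dict of group -> that group's share of likely voters
                       (shares sum to 1)
    """
    total = sum(sample_counts.values())
    return {group: population_shares[group] / (sample_counts[group] / total)
            for group in sample_counts}

# Hypothetical example: young voters are underrepresented in the sample.
weights = weight_sample(
    {"18-34": 100, "35+": 400},    # sample is only 20% young voters
    {"18-34": 0.30, "35+": 0.70},  # but they are 30% of likely voters
)
print(weights["18-34"])  # ~1.5: each young respondent counts about 1.5x
print(weights["35+"])    # ~0.875: each older respondent counts a bit less
```

The underrepresented group gets a weight above 1 and the overrepresented group a weight below 1, so the weighted totals reflect the electorate rather than whoever happened to answer.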
Poll failures and poll discrepancies are often the result of assumptions made by the people who design the surveys. The toughest problem is figuring out how to get a sample of people who are likely to vote.
That was especially problematic in 2016, when Trump stirred up excitement among people who weren’t traditionally heavy voters and they turned out to vote.
Wagner said pollsters can’t simply ask whether people plan to vote and rely on that to determine likely voters, because more people will say they plan to vote than actually do.
“They don’t want to admit they don’t vote,” he said.
Anthony Man can be reached at aman@sunsentinel.com and can be found at @browardpolitics on Bluesky, Threads, Facebook and Mastodon.