Photo: "I voted" stickers (Element5, Unsplash)

2020 Has Changed a Lot of Things, and Exit Polling Is No Exception

Nov 3, 2020
What to expect—and not expect—from exit polling

What do TV pundits, advocates, election data geeks, and politicians all have in common? They all want to know who voted, how they voted, and why they voted! Who wouldn’t?

For years, the go-to way to do this has been through the national Exit Poll, the mythical deep-dive into the minds of the voters conducted by Edison Research every election. The Exit Poll has become a staple of television coverage of election night: watching news anchors announce the numbers, watching the panels break them down and guess what they could imply, and watching the numbers scroll across the chyrons at the bottom of the screen are the stuff of tradition in American households every first Tuesday after the first Monday in November of even-numbered years.

But 2020 is proving to be the year that upends tradition, as the COVID-19 pandemic changes the way we approach virtually every aspect of our lives, especially our elections. The Exit Poll is one of many institutions facing significant challenges under the current conditions, and confidence in it had already begun to erode in the aftermath of 2016. Before you dive into those breathless exit poll breakdowns, consider what you can learn from this data—and where it might lead you astray.

HOW IS THIS POLL DIFFERENT FROM ALL OTHER POLLS?

1. “Exit poll” is a catchy, but misleading, name.

The term “exit poll” conjures images of nerds in oversized suits and thick glasses, standing outside the doors of elementary school gymnasiums across the nation, asking voters about their votes as they walk out and jotting down the data on clipboards.

I hate to break it to you, but that’s not how it works anymore.

While it is true that part of the Exit Poll is conducted by in-person interviewers at polling locations, this now represents a small share of the final data that gets reported, and one that dwindles every year. In fact, in-person interviews are held at only a select few polling locations in a select few states, chosen by researchers either because a precinct has a high concentration of a key voting demographic or because it is likely to be a “bellwether” of sorts for a city, county, district, state, or even the nation’s overall voting patterns.

So how is most of the Exit Poll’s data collected? Frankly, much like a normal poll that you would see in the lead-up to the election. Interviewers call (and nowadays, email or text) voters at random and collect information about each voter’s opinions, motivations, demographics, and vote choices.

There are some differences between exit polls and “regular” polls: exit pollsters contact a much larger number of voters (often an order of magnitude more) to build a sample so large that it carries a negligible margin of error. In 2016, the national Exit Poll had almost 25,000 respondents, a sample size that carries a margin of error of +/- 0.63%. By comparison, the average national pre-election poll has a sample of around 1,000, which carries a margin of error of +/- 3%. Exit pollsters also collect data at a consistent pace throughout Election Day, whereas a pre-election poll collects data for only a few hours at a time over the course of three to four days.

But while a larger sample does add credibility and reliability to the numbers, exit polling is susceptible to the same hurdles and shortcomings as traditional pre-election polling: non-response bias, subsamples with a higher margin of error, and systematic underrepresentation of certain demographics, to name a few.
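
To make the arithmetic behind those margins concrete, here is a minimal sketch of the standard 95% margin-of-error formula. Python is used purely for illustration (the article names no tooling), the sketch assumes a simple random sample with the worst-case proportion p = 0.5, and the 10% subsample size is a hypothetical, not a figure from the Exit Poll:

```python
from math import sqrt

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a simple random sample of size n.

    Uses the worst-case proportion p = 0.5; real exit polls add
    weighting and design effects, so treat this as a floor.
    """
    return z * sqrt(p * (1 - p) / n)

samples = [
    ("2016 national Exit Poll (~25,000)", 25_000),
    ("typical national pre-election poll", 1_000),
    ("hypothetical 10% Exit Poll subsample", 2_500),
]
for label, n in samples:
    print(f"{label}: +/- {margin_of_error(n):.2%}")

# Output:
# 2016 national Exit Poll (~25,000): +/- 0.62%
# typical national pre-election poll: +/- 3.10%
# hypothetical 10% Exit Poll subsample: +/- 1.96%
```

The full-sample result closely matches the +/- 0.63% cited above, while a subsample just one-tenth the size already carries roughly three times the error of the full sample, which is exactly the "subsamples with a higher margin of error" problem.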

2. Not many people will be “exiting” this year, and “valid votes” may be contested.

The COVID-19 pandemic has changed the way this election is held. The number of people voting by mail has skyrocketed, particularly in states where few people did so before this year. Early voting numbers have been huge, with some states (including the massive state of Texas) exceeding their total 2016 turnout with days of early voting still to go. Tens of millions of votes have already been banked. And since we know that voter fraud is one of the rarest of rarities, it is safe to assume that these folks will not be exiting the polls on Election Day.

“But you just said that most exit polling isn’t actually done at the polls! Won’t those early voters and mail-in voters just be called on Election Day and still included?”

The answer to that is yes! Early and mail-in voting is not new, and exit pollsters have actually done a great job over the years of reaching those voters and including them in their samples.

The difference between then and now is that this is the first American election in the modern era in which hundreds of thousands (potentially even millions) of early and mail-in votes are under threat of being directly challenged and possibly excluded from the count. And voters may not even know the fate of their ballots.

In this nightmare scenario, there’s no way for exit pollsters to find out who in their sample cast a “valid” vote and who didn’t. Thus, the possibility exists that the Exit Poll could paint a picture of an entirely different electorate than the one deemed “valid” when all is said and done.

3. Even under favorable circumstances, exit polls have gotten it wrong.

Some may remember the 1980 presidential election, when the Exit Poll so decisively pointed to the Reagan landslide that NBC News called the election hours before many states even finished voting. Others may remember election night 2004, when exit polling had President George W. Bush rehearsing his concession speech.

The reality is that exit polling doesn’t hit on every swing it takes, and when it misses, it’s usually for one particular reason: the increasing diversity of America’s electorate. Every election cycle, racial minorities become more powerful voting blocs, and understanding their voting behavior becomes ever more critical. Exit pollsters have struggled to keep up, and their methods often fail to capture accurate reads on racial minority groups.

In 2004, the Exit Poll badly misread support for George W. Bush among Hispanic and Latino voters, a group that helped him decisively capture multiple states and secure a second term. In 2018, the Exit Poll showed Florida gubernatorial candidate Ron DeSantis, a Republican, winning 18% of Black women voters, even though no Republican anywhere in the nation had come anywhere close to such a number with that group in decades, and even though DeSantis was running against a Black man with a Black wife and family and deep ties to the state’s Black community.

Both of these Exit Poll results were contradicted by later precinct and voter file analysis.

Edison Research, by its own admission, has work to do here, particularly with Hispanic and Latino voters. In fact, the company has stated that its methodology “is not designed to yield very reliable estimates of the characteristics of small, geographically clustered demographic groups. …If we want to improve the National Exit Poll estimate for Hispanic vote (or Asian vote, Jewish vote or Mormon vote, etc.) we would either need to drastically increase the number of precincts in the National Sample or oversample the number of Hispanic precincts.”
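
Edison’s point about “small, geographically clustered” groups can be illustrated with the textbook Kish design effect, which shrinks a clustered sample’s effective size. The sketch below is a hypothetical illustration, not Edison’s actual design: the respondent counts, per-precinct sizes, and intraclass correlation (rho) are all assumed values chosen to show the direction of the effect.

```python
from math import sqrt

def clustered_moe(n, m, rho, p=0.5, z=1.96):
    """95% margin of error adjusted for precinct clustering.

    Applies Kish's design effect DEFF = 1 + (m - 1) * rho, where m
    is the average number of respondents per precinct and rho is
    the intraclass correlation (how alike voters within the same
    precinct tend to be). All inputs below are hypothetical.
    """
    deff = 1 + (m - 1) * rho
    n_effective = n / deff
    return z * sqrt(p * (1 - p) / n_effective)

# The same 2,500 respondents from a small demographic group:
# concentrated in a few heavy precincts (about 50 per precinct)...
print(f"clustered: +/- {clustered_moe(2_500, 50, 0.10):.1%}")  # +/- 4.8%
# ...versus spread thinly across many precincts (about 5 each)
print(f"dispersed: +/- {clustered_moe(2_500, 5, 0.10):.1%}")   # +/- 2.3%
```

Spreading the same respondents across many more precincts lowers the design effect, which is precisely why Edison says it would need to “drastically increase the number of precincts” or oversample to get reliable estimates for these groups.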

Some news organizations that used to be part of the Exit Poll consortium, such as the Associated Press, broke with Edison Research after 2016 and launched a new exit polling operation, AP VoteCast, which uses an online methodology to poll many thousands of respondents (in 2018, it surveyed 116,792 voters and 22,137 nonvoters).

HOW TO INTERPRET EXIT POLLS RESPONSIBLY

The reasons listed above paint a dreary picture of what to expect from exit polling this year. Still, there is some reason for optimism: the Exit Poll does a good job of assessing large populations, even if not subgroups, and it could be useful in the fight against voter suppression and against potential litigation aiming to invalidate the votes of disproportionately underrepresented communities.

And there are competitors to the Edison-run National Exit Poll, such as AP VoteCast. That competition means that, as with other polls, we can compare results and get a more accurate picture. It also means each pollster has an incentive to do a good job, including correcting for systematic bias.

But in terms of assessing who voted, how they voted, and why they voted, there are better tools in our arsenal—we just have to be patient. We can assess the ways that certain communities voted by taking a closer look at the precincts in which they are most represented. In due time, the voter file will be updated with details on who voted, so we can assess whether youth voting surged, whether Hispanic and Latino voters turned out, just how big the early/mail-in vote was, and many other details—with far more reliability and accuracy. And, as always, the American National Election Study will be released in the spring with details about turnout, issues, and the underlying attitudes that drive voters.


Meanwhile, in the days and weeks following the election, ReThink Media will dive into high-quality, reliable data as it becomes available. We will provide analysis and recommendations as the data guides us, including whether new data changes any of our previous guidance on messaging and media strategy.


If you have questions about polling methodology, the election, public opinion, or messaging, write to us at analysis@rethinkmedia.org.
