Why this extremely viral poll result might not be real
It's unlikely that Americans are half as patriotic as they were four years ago. Here's what's happening instead.
This chart from the March Wall Street Journal/NORC poll was destined to go viral. It shows that the values we think of as defining America—patriotism, having children, religion, community involvement—are falling off a cliff. And the only thing that people now value more? Money. The decline in old-fashioned values has accelerated over the last few years. In 2019, 61 percent said patriotism was very important to them. Today, that number is 38 percent. Also, just four years ago, 62 percent said the same of community involvement. Now, that number is less than half that: 27 percent.
These findings fit into a declinist narrative we are already predisposed to believe. And that’s what makes this chart so powerful and compelling. It’s exceptionally easy to draw sweeping conclusions from it. For example:
[Embedded tweet: one viral take attributing the decline to social ills like loneliness and teen mental health.]
And this observer may be right about a lot of this, in a broad directional sense. But the data itself is utterly silent on his explanations for the sudden drop: the survey didn't ask questions about loneliness or teen mental health, so any connection to these social ills is purely speculative.
My initial reaction to these numbers was different from most people's. If these numbers had been produced by my firm, I would immediately assume we had made a mistake and send them back to an analyst to double-check.
Take a look at the zig-zaggy pattern on the community involvement question, for instance. That's the only pro-social item on here that went up in the 21 years before the 2019 survey, yet it has declined by more than half in just four years, without any clear inciting event explaining why. One could speculate that people locked inside during the pandemic did not go out and do volunteer work, but a drop of 35 points in four years is implausible on its face.
The point here is not that the Wall Street Journal and NORC released bad data. The Journal is one of the more thoughtful media sponsors of polling, and NORC is the premier survey data-collection organization in the country. Rather, we see dramatically different results between 2019 and 2023 because the data was collected differently. The March 2023 survey was collected via NORC's AmeriSpeak, an extremely high-quality online panel. In the fine print below the chart, we can see that data from previous waves was collected via telephone survey.
Why should this matter? After all, panelists on NORC's AmeriSpeak panel are recruited probabilistically, using the same random sampling methods as a telephone survey. It's more expensive, but when you want online data that looks as close as possible to the old gold-standard telephone survey data, you use NORC's AmeriSpeak.
But survey mode still matters. Surveying the exact same types of respondents online and over the phone will yield different results. And it matters most for exactly the kinds of values questions that the Journal asked in its survey.
The basic idea is this: if I’m speaking to another human being over the phone, I am much more likely to answer in ways that make me look like an upstanding citizen, one who is patriotic and values community involvement. My answers to the same questions online will probably be more honest, since the format is impersonal and anonymous. So, the 2023 survey probably does a better job at revealing the true state of patriotism, religiosity, community involvement, and so forth. The problem is that the data from previous waves were inflated by social desirability bias—and can’t be trended with the current data to generate a neat-and-tidy viral chart like this.
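To see how much a mode switch alone can distort a trend, here's a minimal simulation sketch in Python. Every number in it is a made-up assumption for illustration; the true rates and the size of the social-desirability bump are mine, not estimates from the Journal's data:

```python
import random

random.seed(42)

# All parameters below are hypothetical, chosen only to illustrate the mechanism.
TRUE_RATE_2019 = 0.45   # assumed true share saying "very important" in 2019
TRUE_RATE_2023 = 0.42   # assumed modest real decline by 2023
PHONE_INFLATION = 0.15  # assumed social-desirability bump with a live interviewer
N = 1000                # respondents per simulated wave

def run_wave(true_rate, mode):
    """Simulate one survey wave; phone interviews inflate pro-social answers."""
    reported_rate = true_rate + (PHONE_INFLATION if mode == "phone" else 0.0)
    yes = sum(random.random() < reported_rate for _ in range(N))
    return yes / N

phone_2019 = run_wave(TRUE_RATE_2019, "phone")    # old mode
online_2023 = run_wave(TRUE_RATE_2023, "online")  # new mode

print(f"2019, by phone:   {phone_2019:.0%}")
print(f"2023, online:     {online_2023:.0%}")
print(f"Apparent decline: {phone_2019 - online_2023:.0%}")
print(f"True decline:     {TRUE_RATE_2019 - TRUE_RATE_2023:.0%}")
```

Run this and the apparent decline comes out around 18 points, even though only 3 points of it are real. That's the trap in trending a phone wave against an online wave as if nothing had changed.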
Note that there's far less evidence that social desirability bias affects how people respond to political polling questions. People have for years tried to run down the “Shy Trump” theory of why the polls missed in 2016 and 2020. This theory holds that voters were afraid to admit they were voting for an uncouth figure like Trump, so they lied and said they were voting for Clinton or Biden or were undecided. Numerous research teams have tried to confirm the “Shy Trump” effect, to no avail. We're still not sure, but the problem was more likely one of Trump voters not trusting the polls and not taking surveys to begin with.
Still, the Journal's chart reveals something important about how information-age consumers are wired to process data. Surprising numbers and big shifts garner outsized attention—when best practice is simply to average the polls and be skeptical of outliers. That's never more true than when these big shifts appear to confirm pre-existing media narratives.
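To make that best practice concrete, here's a toy sketch in Python. The series of readings and the 10-point outlier threshold are both my own assumptions, not real polling data:

```python
from statistics import median

# Hypothetical readings of the same question from several polls (percent)
polls = [61, 59, 62, 60, 38, 58]

center = median(polls)  # a robust center: one wild value barely moves it

for p in polls:
    is_outlier = abs(p - center) > 10  # the 10-point threshold is arbitrary
    note = "  <- outlier: check for a methods change first" if is_outlier else ""
    print(f"{p}%{note}")

print(f"Center of the series: {center:.1f}%")
```

Taking the average (or here, the median) keeps one eye-popping reading from defining the story; the flagged value is where you go looking for a methodology change, not a headline.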
Reality is almost always a lot more boring. We know that patriotism and religion have been on the decline for quite some time, but the rate of decline did not quintuple in the last four years—a fact that the Journal's chart obscures by treating the latest four-year increment the same as the previous 21-year one on the x-axis. For example, here's what the trend from Gallup on pride in being an American looks like, with the 2019 level highlighted for comparison.
Pride in the country is certainly quite a bit lower than it was in 2004—just three years after the 9/11 attacks. But today's rate of 65 percent saying they are extremely or very proud is not dramatically lower than it was in 2019, when it was 70 percent. This subtle shift did not generate waves of attention when it first came out in 2022—nothing like the 23-point drop in patriotism reported by the Journal yesterday. And there's a clear reason for the strikingly different results: while the Journal changed its methods between its last two polls, Gallup has measured these things the same way over the years—through old-school telephone interviewing.
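The back-of-envelope arithmetic, using only the figures cited above, shows how far apart the two implied trends are. This is a rough sketch, since the two polls ask differently worded questions; treat the ratio as illustrative rather than exact:

```python
# Figures cited in the text above
wsj_2019, wsj_2023 = 61, 38        # WSJ/NORC: patriotism "very important"
gallup_2019, gallup_2022 = 70, 65  # Gallup: "extremely" or "very" proud

wsj_per_year = (wsj_2019 - wsj_2023) / (2023 - 2019)           # 5.75 pts/yr
gallup_per_year = (gallup_2019 - gallup_2022) / (2022 - 2019)  # ~1.67 pts/yr

print(f"WSJ/NORC implied decline: {wsj_per_year:.2f} points per year")
print(f"Gallup measured decline:  {gallup_per_year:.2f} points per year")
print(f"Ratio: {wsj_per_year / gallup_per_year:.1f}x")
```

The consistently measured Gallup series declines at less than a third the pace implied by the Journal's mode-switched trend. Boring, but far more believable.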
You got a shout out from The Dispatch: "You may have seen a Wall Street Journal/NORC poll in recent days purporting to show the percentage of Americans who value patriotism, religion, having children, and community involvement has plummeted in recent years. Don't jump to conclusions. “If these numbers had been produced by my firm, I would immediately assume we had made a mistake and send them back to an analyst to double check,” Patrick Ruffini—co-founder of the polling firm Echelon Insights—writes on Substack."
I don't know much about polls, but your analysis raises some other questions.
First, most telephone polls I encounter are carried out by machines. How has this affected social desirability bias, and how has that affected current polls' comparability with earlier telephone polls? Does Gallup, for instance, still use old-school polls involving actual human interviewers?
Second, isn't the group of people who will answer an online poll very different from the group who will answer a telephone call from an unknown number? Has anybody done a study contacting the exact same people and giving them the same poll via the two different modes?