Who Does The Polling?

From past articles, you know by now that polls are created to serve one of three purposes: to determine voter opinion, to craft political strategy, or to influence public opinion on important issues. To see how that happens, it’s important to understand who is doing the polling, and how that affects the published results.

A convenient source for checking who is doing the polling is the Polls section of RealClearPolitics.

From January through last Sunday, RCP listed 68 polls on a Trump v. Biden national preference, created by 17 separate polling agencies, listed here by number of polls:

Economist/YouGov: 13 polls
Emerson College: 5 polls
IBD/TIPP: 5 polls
FOX News: 5 polls
NBC News/Wall Street Journal: 5 polls
Monmouth University: 4 polls
CNBC: 4 polls
ABC News/Washington Post: 4 polls
Quinnipiac University: 4 polls
Harvard – Harris: 4 polls
CNN: 4 polls
CBS News/YouGov: 3 polls
NPR/PBS/Marist: 2 polls
The Hill/Harris X: 2 polls
Rasmussen Reports: 1 poll
USA Today/Suffolk: 1 poll
LA Times/USC: 1 poll

That means 10 of the 17 polling entities, more than 58%, are media-sponsored, another 7 involve academic groups, and only 1 comes from a purely professional polling firm; the categories overlap somewhat, since several media outlets hire a university or college to do the actual polling.

What that means on a practical level is that most polls you hear about are paid for by media outlets that want to promote a narrative and have demonstrated an eager desire to control the discussion on any important issue. The second-most common source of polls is college departments, which see a lot of turnover in students, so the people performing the work are often inexperienced and unlikely to notice bias in the process or the results.

On that note, where is Gallup? Poll mavens know that the Gallup Organization made its name on Presidential elections, but in 2016 Gallup was nowhere to be found, and this year Gallup is doing something rather odd.

For 2020, Gallup is polling Americans on a number of issues they say will affect the election, but Gallup is not running head-to-head polling between Trump and Biden, as if the firm wants to stay out of that aspect of polling altogether.

Very strange.

Anyway, going back to those 17 polling entities, here is a summary of what I found from them, listing the firm which did the actual poll work, and then the breakdown by party affiliation of the respondent pool:

CNBC: Change Research
49% Dem, 40% Rep, 11% Ind

NBC/WSJ: Hart Research Associates
45% Dem, 33% Rep, 13% Ind, 12% Other

FOX News: Beacon Research (D) and Shaw & Company Research (R)
43% Dem, 40% Rep, 17% Ind

LA Times/USC: U Southern California
39% Dem, 30% Rep, 31% Ind

Harvard – Harris: The Harris Poll
37% Dem, 32% Rep, 28% Ind, 3% Other

The Hill/Harris X: online by HarrisX
37% Dem, 32% Rep, 29% Ind

Quinnipiac: Quinnipiac University
36% Dem, 26% Rep, 31% Ind, 7% Other

NPR/PBS/Marist: Marist College
36% Dem, 30% Rep, 34% Ind, 0% Other

Monmouth: Monmouth University
33% Dem, 29% Rep, 38% Ind

CNN: SSRS
32% Dem, 25% Rep, 44% Ind (report noted they deliberately “oversampled blacks”)

Economist/YouGov: YouGov
31.5% Dem, 24.5% Rep, 43.9% Ind

ABC News/Wash Post: Abt Associates (Rockville, Md)
31% Dem, 24% Rep, 37% Ind

First, take a look at those twelve polls and notice their party affiliation weighting versus the actual party affiliation in the 2016 election, which was roughly 37% Democrat, 33% Republican, and 30% Independent. Some of those polls were way off that balance. Just imagine how different those polls might look if historically accurate party weighting were used, not to mention accurate weighting by other demographic factors.
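To make the effect concrete, here is a minimal sketch in Python of how reweighting a sample's party mix toward a target distribution shifts a poll's topline. Every number in it is hypothetical, chosen only to illustrate the arithmetic; none of it comes from the polls listed above.

```python
# A minimal sketch of party-affiliation reweighting. All numbers are
# hypothetical, chosen only to show the arithmetic; none are taken from
# the polls listed above.

sample_share = {"Dem": 0.45, "Rep": 0.33, "Ind": 0.22}   # how a poll happened to sample
target_share = {"Dem": 0.37, "Rep": 0.33, "Ind": 0.30}   # e.g. a 2016-style turnout mix
support      = {"Dem": 0.92, "Rep": 0.06, "Ind": 0.48}   # candidate support within each group

def topline(shares, support_by_group):
    """Topline result = sum over groups of (group share * support within that group)."""
    return sum(shares[g] * support_by_group[g] for g in shares)

raw      = topline(sample_share, support)   # what the unweighted sample shows
adjusted = topline(target_share, support)   # same answers, reweighted to the target mix

print(f"Raw sample topline:     {raw:.1%}")
print(f"Reweighted topline:     {adjusted:.1%}")
print(f"Shift from reweighting: {adjusted - raw:+.1%}")
```

With these made-up inputs, pulling an oversampled Democratic share back toward a 2016-style mix moves the topline by several points, which is exactly why undisclosed party weighting matters.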

But that’s not nearly so bad as the last five polls, listed below:

Emerson: Emerson College
“based on 2016 election turnout” but not specifically cited

CBS News/YouGov: YouGov
"2016 presidential vote and registration status" plus "U.S. Bureau of the Census"
(no, they did not explain how that worked out in their methodology)

Rasmussen Reports: Rasmussen
Restricted behind paywall

IBD/TIPP: not revealed
Not revealed at all

USA Today/ Suffolk: Suffolk University
Not revealed at all

Five polls with no clear identification of party weighting, even though such weighting can have a very large effect on a poll's results. While I am certain each of the five would offer some excuse, hiding vital demographics from a published poll is unacceptable: polling is only a science if the data is there to be verified. Hiding important information makes the poll no better than a work of fiction, something written for no better purpose than entertainment. Keep in mind that almost a full third of these polling agencies deliberately hide key data from the public.

So, what about the entities which actually perform the polls? What do we know about them? Let’s go through that list again, separated by poll sponsor for clarity:

College/University Polls:

Quinnipiac: performed by Quinnipiac University
Monmouth: performed by Monmouth University

LA Times/USC: performed by the University of Southern California
NPR/PBS/Marist: performed by Marist College

Harvard – Harris: performed by The Harris Poll

(Notice that in this group, two polls are performed entirely by a university, another two are sponsored by media outlets but performed by a university or college, and one poll carries a university's name but hires the actual work out to a private firm.)

Media-sponsored Polls:

CNBC: performed by Change Research
NBC/WSJ: performed by Hart Research Associates
FOX News: performed by Beacon Research (D) and Shaw & Company Research (R)
The Hill/Harris X: performed online by HarrisX
CNN: performed by SSRS
Economist/YouGov: performed by YouGov
ABC News/Wash Post: performed by Abt Associates (Rockville, Md)

Before I go on, are any of those poll names familiar? Isn't it interesting that polls which claim to reflect the honest opinion of so many American voters are produced under brand names that sell familiarity while obscuring the names and methodologies of the firms that did the actual work? Yes, some of them will publish basic statements of methodology if you dig far enough into the matter, but I have to stop there to address something.

Polls hide their data, plain and nasty. If you go to a site like RealClearPolitics and want to view the polls, you’ll find a lot of handy links.

But if you click on those links, what you will often get is a splash page which announces summary findings of the poll like this one.

You have to dig deeper to get more numbers about the poll which addresses the election.

But even there we don’t have the full data. Scroll all the way down to the bottom and you’ll find some basic demographics, but not the full story. The poll I linked to, for example, has brief summaries of things like age, region, and party affiliation, but it doesn’t say how many of the calls were made in the morning or evening, or how many urban, rural, and suburban respondents were contacted. Frankly, the poll hides a lot of information which, used one way or another, can and does alter the outcome of the poll.

I’m not saying the whole business is rigged, but it’s wise to be skeptical when so much of the matter is hidden.

With that said, time to meet our contestants on ‘Hiding The Truth’:

Quinnipiac University has built a decent reputation for itself as a polling group since 1996, but in my opinion they may have started to believe too much of their own spin. Quinnipiac performs polls using standard polling procedures, but one weakness is its dependence on inexperienced college students to conduct the interviews. Their website even advertises that “Interviewing jobs are available to Quinnipiac students interested in learning the art and science of survey research”. Frankly, though, this means that quality control depends entirely on Dr. Douglas Schwartz and his ability to correct his own bias.

Monmouth University started polling in 2005, under the direction of Patrick Murray. Like Quinnipiac, Monmouth basically follows standard polling methodology, but also depends on inexperienced volunteer students to perform the polls.

The University of Southern California: The USC Dana and David Dornsife College of Letters, Arts and Sciences has cooperated with the Los Angeles Times since 2011 to produce political opinion polls. USC says less about the people running its program than the previous schools do, with ‘CESR Survey Director’ Jill E. Darling running the ‘Economic and Social Research’ team and ‘CPF Director’ Robert M. Shrum handling the political polling. USC does not provide academic titles for the two directors, nor their experience in polling. Kind of weird, that.

Marist College started polling in 1978, giving it seniority over the other three collegiate pollsters discussed here. But like its fellow academic pollsters, Marist has quite a high opinion of itself. I don’t mind a polling firm being proud of its success, but notice how Marist focuses on media attention, especially from liberal poll maven Nate Silver. Silver did well in 2012 but blew the call badly in 2016, enough that he really should be regarded as a left-leaning partisan more than an icon of polling excellence. And like the other college poll groups, Marist depends on student volunteers for the poll work, although Marist has a professional staff who maintain a higher level of academic rigor under the leadership of Director Lee M. Miringoff.

Now on to the private polling companies:

The Harris Poll was founded in 1963 by Louis Harris and has been bought and sold over the years by various investors: Nielsen Holdings (of TV-ratings fame) acquired it in 2014, a company called ITWP bought the business later that same year, and the Stagwell Group took it over in 2017. Notably, the Stagwell Group said it expected the acquisition to bring in $250 million a year. The splash page for the Harris Poll brags about seeking ‘Human Truth’ but says little about the money side of the business beyond noting that they provide “strategic counsel”. Remember the second purpose of polls in that light.

One difference between Harris (and the other media polls) and the academic polls is that Harris polls online. Online polling used to be the subject of a robust debate about valid methodology, because the respondent pool is, by definition, not really random: respondents reach out to take part or, worse, are recruited by specific invitation. Either way, this kind of practice used to be considered a violation of proper standards, but it is cheap and fast, which seem to be what Harris considers its primary goals. Which brings me to Harris’ sluttier sister firm, HarrisX.

The HarrisX Poll is run by the same parent company, and like Harris runs its polls online. The splash page promotes a program called Telephia, which pays people to answer polls. I guess once you abandon the classical ethics of polling, you can just go with whatever gets the results you want.

The reason I hold this practice in such contempt is the obvious character of the respondent pool it attracts: urban, tech-savvy, and likely young people. I certainly want those people included, but the methodology used by the Harris polls discourages participation by blue-collar, rural, and older citizens. You are simply not going to get a valid cross-section of the voting population in such polls, no matter what they promise.
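To illustrate the concern, here is a small, purely hypothetical simulation. The group names, panel join rates, and opinion rates are all made-up assumptions, not data from Harris or anyone else; the point is only that when willingness to join an opt-in panel correlates with the opinion being measured, the raw panel drifts away from the population it claims to represent.

```python
import random

random.seed(0)

# Purely illustrative assumptions: a population split into two groups whose
# members differ both in how likely they are to join an opt-in online panel
# and in how likely they are to hold the opinion being measured.
POPULATION = 100_000
groups = {
    # name: (population share, chance of joining panel, chance of holding opinion X)
    "younger_urban_online": (0.40, 0.15, 0.60),
    "older_rural_offline":  (0.60, 0.03, 0.35),
}

panel, everyone = [], []
for name, (share, join_rate, opinion_rate) in groups.items():
    for _ in range(int(POPULATION * share)):
        holds_opinion = random.random() < opinion_rate
        everyone.append(holds_opinion)
        if random.random() < join_rate:  # self-selection into the opt-in panel
            panel.append(holds_opinion)

true_rate = sum(everyone) / len(everyone)
panel_rate = sum(panel) / len(panel)
print(f"True population rate of opinion X: {true_rate:.1%}")
print(f"Opt-in panel rate of opinion X:    {panel_rate:.1%} (bias {panel_rate - true_rate:+.1%})")
```

In this toy example the opt-in panel overstates the opinion by several points simply because one group joins the panel far more readily; weighting can partly correct the imbalance, but only for the characteristics a pollster actually measures and discloses.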

But moving on …

Next up is Change Research. Aside from its name, the splash page for Change Research is interesting because it touts apparent connections with and support for left-leaning media, namely MSNBC, Politico, FiveThirtyEight, Axios, and Bloomberg. Not a moderate source, much less a conservative one, in the lot. Further down, just to reinforce the point, the site boasts of cooperation with the ‘Democratic Attorneys General Association’, the NAACP, the AFL-CIO, and ‘Future Majority’.

Change Research started in 2017, and the ‘About’ tab tells us that Mike Greenfield and Pat Reilly worried about polling issues like high cost and “lack of innovation”. I went to the ‘Methodology’ tab and found that, like Harris, Change Research advertises for respondents and also uses social media for its polls. They brag about reaching “members of every gender, race, creed and political persuasion”, but given their orientation that last part is dubious, and in any case they certainly cannot claim valid quality control.

So, moving on, we come to Hart Research Associates. Officially known as Peter D. Hart Research Associates and founded in 1971, the group is another left-leaning spin-control firm. The splash page includes boasts about working with Amy Klobuchar and CNBC, about promoting funding for the US Postal Service, and about advancing liberal causes like Caring Across Generations, which wants federal money for Universal Family Care.

Further into the website, I found that the Garin-Hart-Yang Research Group, identified as the “political division of Hart Research”, appears to do most of the poll work. This is important because right there you can read that Hart Research exists to help “Democratic candidates and progressive causes”. I rather think that statement should be included as a mandatory advisory on all of their press releases involving election polls.

Moving on, we come to Beacon Research (D) and Shaw & Company Research (R). Beacon Research is a Democrat-supporting firm, and their splash page confirms they support not only Democratic candidates but also public policy campaigns and labor unions. Founded in 2011 as Anderson Robbins Research, the firm changed its name to Beacon Research in 2019.

Shaw & Company Research was founded in 2011 (I think … there is no history I could find before 2011 for the group, and I could not find a website for Shaw & Company) by Dr. Daron Shaw, who earned his degrees from UCLA before taking a position with the University of Texas. Dr. Shaw worked to elect President G.W. Bush, and before starting Shaw & Company he worked for the Texas Lyceum Poll. Dr. Shaw was also directly connected to the “Fox News Poll”, which appears to fund his company.

As we continue, we come to SSRS. SSRS is a multi-service marketing and research firm. Like other polling firms, SSRS has abandoned established polling practices in favor of “multiple modalities”, a fancy way of saying they mix different data sources, including “in-person interviews, online surveys, or hardcopy mail questionnaires”. As I observed before, this hodge-podge approach allows pollsters to claim they contacted all the different demographics, but the resulting conclusions are less natural than Dr. Frankenstein’s monster. Certainly no one could reasonably claim the respondent pool accurately reflects the voting population. I think SSRS knows this, since they did not even bother to post a statement about their methodology. Scientific, they are not.

We move next to YouGov. The splash page makes clear that we are, yet again, visiting the hallowed halls of the Progressive media. YouGov posts praise from Bloomberg, The Washington Post, and The Guardian, and given the outcome of the last Presidential election, I did find it amusing that YouGov is still boasting about its “reliable data you can trust” and its ‘accuracy’ in election predictions. Like so many other polling firms, YouGov uses online polling, in its case exclusively. YouGov does not even bother to explain its methodology on its company site, saying simply that it uses “sophisticated proprietary sampling technology”, which in plain English means they made up the rules and don’t want you to know that.

Finally, there is Abt Associates, of Rockville, Md. Abt Associates, like all the others, puts up a glossy splash page which makes them look very corporate and successful. One early giveaway, however, is that navigating the site is a bit difficult if you want hard data.

For example, I wanted to understand Abt’s methodology for polling and found a link to “methodological resources”. If you actually click on that link, however, you will find nothing of the sort. Instead you get a selection of projects and publications, in each case a showy list of conclusions with no work to support them.

I kept digging and found a reference to their poll work for ABC News and the Washington Post. But their summary list offered “highlights” which had nothing at all to do with the actual polls, and again nothing about their procedures or methodology. Abt actually included this very misleading comment: “Ours was one of just five polls – – and the only major news organization/media poll – – to receive an A+ rating from FiveThirtyEight, a site that evaluates the methodology and accuracy of polling organizations.”

Sorry, but there are big problems with that statement. FiveThirtyEight is not credentialed to judge the accuracy of polls; it is a website which assigns grades on largely subjective criteria, and it is an out-and-out lie to claim that FiveThirtyEight evaluates methodology, since the articles on FiveThirtyEight do not discuss methodology in any detail. The most Abt can honestly say is that a left-leaning site which blew the call badly in 2016 likes their polls and gave them a grade based on agreeing with them.

By the way, pay attention to where these polling entities are based:

Quinnipiac: Hamden, Connecticut
Monmouth: West Long Branch, New Jersey
USC: Los Angeles, California
Marist: Poughkeepsie, New York
The Harris and HarrisX Polls: Chicago, Illinois
Change Research: San Francisco, California
Hart Research Associates: Washington, D.C.
Beacon Research: Boston, Massachusetts
Shaw & Company Research: Austin, Texas
SSRS: Glen Mills, Pennsylvania
YouGov: New York, New York
Abt Associates: Washington, D.C.

All but one of the lot are based in Blue States, and every single one of them is located in a town which could fairly be said to be hostile to Republicans. If you want to understand why most political opinion polls demonstrate a bias favoring Democrat candidates, the location of their offices is an obvious hint.

So let me summarize: Of the seventeen polling groups reported by RealClearPolitics, five are produced by agencies which do not reveal their demographic weighting at all, and they are rejected on that basis. Of the remaining twelve polling groups, all but one are based in Blue States, and all are based in towns which could fairly be called hostile to Republicans. The four academic polling groups have high turnover among their actual polling staff, with the final report controlled by only a few people at each school who have motive to lean Left. The eight remaining polling groups all concealed their methodologies; five of them essentially admitted they had abandoned classic polling practices in favor of cheap and quick methods that basically mash data from one place into looking like it came from somewhere else, like the political version of sausage. Four of the eight have direct ties to Leftist political groups, and two specifically advertise on their own sites that they work to get Democrats elected.

So here’s how these polls get produced. First, one of these firms is paid to run a poll. The money often comes from patrons who want a particular message sent, and the poll is controlled by one or two people before it is released to the media, who then screen the poll some more to highlight desired points and conceal others. The text of the poll is often offered in a link, well down from the headline so you have to scroll to find it, and while methodologies may be briefly discussed in a one- or two-sentence capsule, the actual weighting of demographic data is not presented except in limited and general terms. The poll is made to look like a careful cross-section of the voting public, but the actual data is massaged to give that impression, being rather different in its original condition.
