
Sunday, March 18, 2012

(RESEARCH) Can You Spot Junk Research?

3-15-2012

Recently, Radio Ink Editor Ed Ryan sent me an e-mail asking for some clarification on the methodology of a study we released. I was so thrilled by this request that it nearly took my breath away. After years of begging radio's trade press to not simply publish every piece of "research" they receive with no critical inquiry, finally someone is doing the right thing.

Of course, we were happy to send Ed our full methods statement, as we are willing to do with every study we do. Everyone at Edison Research belongs to the American Association for Public Opinion Research and adheres to their strict policies for disclosure of public polls. When Edison releases a survey result to the public, whether it be an exit poll we perform for the television networks on election night or a poll for the radio industry, you have access to our methods.

Frankly, what drives me a bit nuts is not that there are companies putting out research that could be termed "sketchy" from a methodological standpoint. It is that the trade journalists don't give them the kind of scrutiny they should. Junk research and good research stand side by side in radio's trades, and seldom are they differentiated.

By comparison, last year we reported the finding (from the "Infinite Dial" series we perform with Arbitron) that Facebook had reached "majority" -- more than half of all persons age 12 and over in America were on this social network.

Two major television news networks wanted to run the story in their evening newscasts, but before they would, we had to supply them with extensive methodological back-up and proof. And as I said, we were glad to do it and impressed that they held our data to their standards.

Meanwhile, others are releasing studies with incredibly vague methods statements, and yet the journalists are not demanding more information. One radio "journalist" (not from Radio Ink) has actually told me that he sees vetting the information he receives as beyond his role -- that "caveat emptor" applies. The "news" is that the company released the information, he feels, and it is up to the reader to judge for himself if the data are credible.

To me, this is complete hogwash. If a company releases research, the radio trade press should be holding these releases to the same standards as ABC, NBC, CBS, Fox, CNN, and the AP are demanding from us at Edison.

One example will forever stick in my mind ... one company that has floated in and out of business (and I believe is currently out) put out a release with results from a "national" survey -- results that were entirely different from what we had found with our correctly sampled studies.

So I went to their website and saw the basis for their "national" estimates: a survey conducted in only five cities, not nationally. Then I looked at which five cities: They were Tampa, New York, Los Angeles, Denver, and ... Toronto! And yet despite this absurdity sitting right there on the company's site, several of radio's trade publications ran the survey's results as though they reflected the U.S. national population -- as the press release claimed.

So kudos to Ed, both for asking me to show my work and, beyond that, for inviting me to write periodic articles about research for Radio Ink. I'll try to help readers understand the difference between good research and junk -- because I believe that in the end, the truth is what we seek.

Larry Rosin is President of Edison Research. The Edison website is www.edisonresearch.com  
-----------------------------------

Here is the link answering the Pandora methodology question raised in the comments below

(3/15/2012 11:03:01 AM)
David:

Our methodology for the Pandora ratings analysis is clearly posted on our site - you can search for "Pandora Methodology" or simply click on this link:

http://www.edisonresearch.com/home/archives/2012/01/pandora-local-market-analysis-by-edison-research-methodology.php

(3/15/2012 9:50:27 AM)
So answer me this, Mr. Rosin. Why does Edison then conduct 'research' for Pandora to produce 'ratings' and refuse to disclose the methodology? What you wrote is true; people should ask questions. But researchers like Edison are also part of the problem in this equation. Guess things are different when you can hide behind a client.

(3/15/2012 9:13:43 AM)
When I was in graduate journalism school, back in the dark ages, we were required to take a research methodology course for the exact purpose of learning to differentiate between credible and shoddy research. Now, I'm afraid, the 24-hour news cycle and the desire for "quick and cheap" make people who should know better cut corners that shouldn't be cut.

(3/15/2012 8:55:08 AM)
Larry,

I spent a number of years working at The Research Group, which at the time was run by Bill Moyes, and I couldn't agree with you more. Flawed or poorly structured research is, in my opinion, worse than no research at all. The line "I can prove anything but the truth with statistics" certainly applies. Sample design and the structure of questions have a tremendous impact on the results. I think it is incumbent on the trade press to at least review the methodology of research they quote and consider not using "sketchy" studies, or at least commenting on their methodology.

