Military Times, the Gannett-owned publisher of Army Times, Air Force Times, and several other military publications, is out with a new poll, showing decreased support for President Bush among members of the armed forces.
As Michelle Malkin (and others) have noted, the moonbat brigade is already spinning the results of the survey as "proof" that President Bush is losing support among military personnel. However, even the pollsters admit that support for Mr. Bush (and the War on Terror) remains significantly higher among military members than among the public as a whole.
Personally, I have serious misgivings about the poll's methodology. For starters, it uses a mail-in survey, veritable horse-and-buggy technology in the polling business. Mail-in polls can be quite reliable--see Mark Blumenthal's 2004 analysis of a long-running mail-in survey of Ohio presidential voters by the Columbus Dispatch--but there are still problems with the approach. While the paper carefully bases its electoral sample on voter registration and demography, it was clearly off the mark in the last presidential election, calling the race a toss-up between President Bush and John Kerry. Blumenthal crawled out even further on the limb, using the Dispatch results to predict that Ohio would break for Kerry. In hindsight, the poll apparently put too much weight on "new" voters in Ohio, who reportedly favored Kerry by a 2-1 margin. In the end, a massive get-out-the-vote effort by Republicans put Ohio in the Bush column by a 51-49 margin. While those results were within the Dispatch poll's margin of error, they did not reflect the "leaning Kerry" trend identified by the mail-in survey.
There appear to be similar problems with the Military Times survey. The survey was sent to 6,000 subscribers of the company's publications, reaching an estimated 4,000 active-duty service members. Of that total, 1,215 military personnel completed and returned the survey, a participation rate of roughly 30%. Not bad for a mail-in survey, but how does the demography of the sample (readers of Military Times publications) compare with the military as a whole? For example, Chief Master Sergeants (E-9) comprise one percent of the Air Force enlisted force structure--does the respondent pool reflect that? Does the number of Captains (O-3s) who responded mirror their proportion of the Air Force? What about the number of respondents who are white, black, male or female? Careful readers will note that Military Times caveats its poll with the statement that the sample "is not necessarily representative of the military as a whole."
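For what it's worth, a sample of 1,215 isn't tiny. As a rough back-of-the-envelope figure--and this assumes a simple random sample, which a self-selected pool of subscribers clearly is not--the sampling margin of error would be about 1.96 x sqrt(0.25 / 1,215) ≈ 0.028, or roughly plus or minus 2.8 percentage points at the 95% confidence level. In other words, precision isn't the issue; the representativeness of the sample is.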
Interestingly, Military Times claims that a mail-in survey is one of the most effective ways to poll military members, who are frequently deployed to hotspots like Iraq and Afghanistan, where reaching them by "conventional" polling methods (namely the telephone) might prove difficult. But ironically, the Times sample seems heavily based on responses from military members who have never deployed to those locations. An alert Michelle Malkin reader observes that 58% of the respondents say they have never deployed to Iraq or Afghanistan. Could those participants have been heavily influenced--or overly influenced--by distorted media reporting on the War on Terror?
So far, the Military Times isn't saying; in fact, you have to request a copy of the poll's internals from their survey director, Robert Hodierne (rhodierne@atpco.com). Not unusual, but it hardly inspires confidence in the survey results. I plan to ask for a copy of the raw data from the publisher, and I'll analyze it in a future post. Without the internals--and a more detailed study of the sample--take the Military Times survey with a huge grain of salt.