What about the polls?

    Old Dog (Expert, Central Indiana):
    And here are a couple more reasons that polls are not accurate:
    1. Some of us simply refuse to answer polls, so they only get responses from willing participants, which skews the results right off the bat.
    2. Some of us lie to intentionally skew the results.
    3. Most poll questions have three to five parts; if you really study the question, you will find it would be very difficult to answer accurately without explaining your answer.
    Have you ever seen a simple, single sentence poll question that you could answer yes or no to? They design them to get the results they want. "Figures don't lie, but liars figure."
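
    To see how much point 1 alone can move a result, here is a minimal Python sketch with hypothetical numbers: true support is an even 50/50 split, but one side answers pollsters only half as often as the other.

    import random

    random.seed(42)

    TRUE_SUPPORT_A = 0.50   # assumed true share backing candidate A
    RESPONSE_A = 0.45       # assumed: A's backers answer the pollster 45% of the time
    RESPONSE_B = 0.90       # assumed: B's backers answer 90% of the time

    answers = []
    for _ in range(100_000):                      # people the pollster dials
        backs_a = random.random() < TRUE_SUPPORT_A
        responds = random.random() < (RESPONSE_A if backs_a else RESPONSE_B)
        if responds:
            answers.append(backs_a)

    polled_a = sum(answers) / len(answers)
    print(f"true support for A:   {TRUE_SUPPORT_A:.1%}")
    print(f"polled support for A: {polled_a:.1%}")   # about 33%, purely from who answers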

    jamil (code ho, Gtown-ish):
    "Have you ever seen a simple, single sentence poll question that you could answer yes or no to?"

    yes

    Libertarian01 (Grandmaster, Fort Wayne):
    I knew a guy here in Fort Wayne who started a polling organization out at IPFW.

    He was a great guy! A firearms owner, ex-Navy, and a professor of sociology. Most would call him a moderate who was definitely fiscally conservative.

    He explained how many factors go into designing a poll: sample size, the length of the poll, and so on. If you really wanted to get the desired information in a neutral way, you would ask the same question in different ways multiple times; that way one slanted question would be cancelled out by another. There is always going to be some degree of slant, so multiple questions help ferret out the truth.
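
    As a rough illustration of that cancellation idea, here is a sketch with entirely made-up numbers: the same yes/no question asked in three phrasings, each nudging the measured share a few points in a different direction.

    # Hypothetical: the true "yes" share is 50%, but each phrasing
    # of the question pushes the measured share off by a few points.
    measured = {
        "neutral wording":    0.51,   # assumed small noise
        "slanted toward yes": 0.58,   # assumed +8 point push
        "slanted toward no":  0.43,   # assumed -7 point push
    }

    # Averaging across phrasings lets the opposite slants largely cancel.
    average = sum(measured.values()) / len(measured)
    print(f"worst single phrasing: {measured['slanted toward yes']:.0%}")  # 58%
    print(f"average of all three:  {average:.0%}")   # 51%, near the true 50%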

    Of course the pollster interpreting the results brings a degree of art to the process.

    Then you have the client. Is the client willing to pay for a larger poll that will get better results but cost more money? Ten (10) more questions may produce far better results but take five (5) more minutes per poll, thus increasing the cost of the poll.
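
    The sample-size half of that cost trade-off is easy to put numbers on. A standard back-of-the-envelope formula for a poll's 95% margin of error, assuming a simple random sample and a result near 50/50, is roughly 0.98 / sqrt(n), so cutting the error in half means quadrupling the sample, and the bill.

    import math

    # Rough 95% margin of error for a proportion near 50%, assuming a
    # simple random sample: MoE = 1.96 * sqrt(0.25 / n) = 0.98 / sqrt(n)
    def margin_of_error(n: int) -> float:
        return 0.98 / math.sqrt(n)

    for n in (250, 500, 1000, 4000):
        print(f"n = {n:>4}: +/-{margin_of_error(n):.1%}")
    # n =  250: +/-6.2%
    # n =  500: +/-4.4%
    # n = 1000: +/-3.1%
    # n = 4000: +/-1.5%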

    People called for the poll will sometimes answer how they think they should answer instead of what they really think. While pollsters know this and do try to adjust for it, they can only adjust so much. That again brings us back to better polls asking the same question different ways to try to pry the truth out - if the truth is what the people paying for the poll want.

    In this last election we also have to take into account the overwhelming criticism that Trump and his supporters were receiving in the media. Anyone who supported Trump was a racist bigot, ignorant, and lacked an education. This would put pressure on some folks to answer counter to their true feelings. While not everyone would, say 5% did. This would skew the results even further. Again, a good pollster would take this into account, but perhaps the pollsters themselves didn't realize the degree to which people being polled would be reluctant to tell their true feelings. Was this a failure of the pollsters? Maybe. Could the degree of reluctance be accurately accounted for? Maybe not. We'll never know for certain. Maybe someday a college sociology project will take the time to review some of the polls and try to figure out what went wrong from an objective angle. Who knows?
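
    That 5% figure is worth working through, because the effect is double-edged: every shy supporter both subtracts from one column and adds to the other. A hypothetical race that is truly tied:

    # Hypothetical: a dead-heat race where 5% of A's supporters
    # tell the pollster they back B instead.
    true_a, true_b = 0.50, 0.50
    shy = 0.05

    reported_a = true_a * (1 - shy)       # 47.5%
    reported_b = true_b + true_a * shy    # 52.5%

    print(f"true margin:     {true_a - true_b:+.1%}")           # +0.0%
    print(f"reported margin: {reported_a - reported_b:+.1%}")   # -5.0%
    # A 5% shy-voter rate turns a tie into a reported 5-point deficit,
    # bigger than the +/-3% margin of error most polls quote.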

    Ask a Fudd if he supports the 2A. Answer = yes. Ask many on these boards if a Fudd truly supports the 2A. Answer = no. Is the Fudd lying? No. But is his interpretation of the 2A radically different from that of many 2A activists? Absolutely.

    Now, would a pollster take that into account by asking additional questions? Probably not. They have a budget given to them by the group conducting the poll. They only have so much time for questions, tabulation, and analysis to determine what the group questioned believes. That is why they have a margin of error, but that margin may not be good enough if the pollsters don't understand the topic itself.

    Here is a simple idea. A restaurant currently doesn't offer a bacon cheeseburger, but they are thinking of adding one. So they take a limited amount of money to conduct a poll to see if a bacon cheeseburger would go over well in their area.

    The first question is: Do you like bacon? Is this the end of the story? No.
    So the second question is: Do you like your bacon crisp? (We stick to yes/no questions so the answers are easy to tabulate.)
    This to be followed up with the third question: Do you like your bacon limp?
    We can add some more.
    Do you like your bacon thick?
    Do you like your bacon thin?
    Do you like your bacon with pepper on the sides of it?
    I could go on but you get the idea.

    Let us presume these questions were paid for, and that the overwhelming response was that 82% of people like bacon, and they like it thin, crisp, and with pepper. In that case the restaurant may well come up with a good bacon cheeseburger that probably sells well.

    But what if they only ask the first question, to keep it simple? Then, when people say "Yes," they make their bacon cheeseburger with thick, limp, plain bacon, because that's how the owner likes it. They're going to be shocked when the bacon cheeseburger doesn't go over well, and they will blame the stupid pollster for getting it wrong. That blame is misplaced, because the pollster wasn't given the money to conduct a good poll; the restaurant owner thought the pollster was just trying to bilk more money out of him with more questions, which wasn't true. How many times does this happen with other professionals who try to get their clients to do better, but at a higher cost?
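
    A quick sketch of the difference between the two poll designs, with made-up respondents; the point is only that the single headline question hides exactly the preferences that decide whether the burger sells.

    # Made-up answers for the hypothetical bacon poll.
    # Each tuple: (likes bacon, likes it crisp, likes it thin, likes pepper)
    respondents = [
        (True,  True,  True,  True),
        (True,  True,  True,  False),
        (True,  True,  False, True),
        (True,  False, True,  True),
        (False, False, False, False),
    ]

    bacon_fans = [r for r in respondents if r[0]]

    # One-question poll: only the headline number survives.
    print(f"like bacon: {len(bacon_fans) / len(respondents):.0%}")   # 80%

    # Full poll: the follow-ups that actually determine the recipe.
    for i, style in enumerate(("crisp", "thin", "with pepper"), start=1):
        share = sum(r[i] for r in bacon_fans) / len(bacon_fans)
        print(f"  of those, like it {style}: {share:.0%}")           # 75% each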

    Regards,

    Doug

    jamil (code ho, Gtown-ish):
    Speaking of polls, sometimes the question asked isn't the one answered. For example, the other day, Michael Smerconish had a poll question on his website, something to this effect: Will Trump's budget proposal adding $54B to military spending make the US stronger?

    It seems objectively obvious that spending more money on the military will make the US stronger. Objectively, there is a fair correlation between military spending and military strength. More people, more weapons, more strength, even if it's inefficiently spent, even if we spent $54B on one tank, we would be one tank stronger.

    But that's not the question most people answered as they justified their yes or no, mostly no. Judging from the responses of the people calling in, it was obvious they thought they were answering the question, "Is it wise to spend $54B more on the military?" Their emotions about Trump and their ideological views about the military colored their interpretation of the question.

    churchmouse (Emeritus, Speedway area):
    One thing I know now is, when the next election comes up, I'll pay no attention to the polls.

    They are an annoyance to us.
    For the last 3 or 4 election cycles the polls have been skewed.
    Hell, John Kerry was sure he was in the money with the Bravo Sierra exit polling.
    Just another media "make the news" tool.