As someone who's engaged in plenty of research over the course of my career, I've learned to discount about 98% of political stats as complete bullshit. The "poll" designers begin with a premise, then structure the sample population, sample location, and questions to get the desired result, rather than the other way around.

Take the claim that "90% of Americans support background checks," with the reporter citing CNN stats. They tend to survey in coastal areas where they're likely to get a good response rate for their effort (wanting a high return is common sense and can't be faulted), but that location isn't representative of the entire national population, not even close. Ask "do you feel we have enough gun laws?" among "registered voters" in Berkeley and you'll get a completely skewed response compared to Little Rock, even though the two are similar in population size. Sure as hell, the pollsters know Berzerkley has more liberals than Little Rock and is more apt to lean one way rather than the other.

Notice they NEVER describe their sample populations as anything more specific than "registered voters" or "likely to vote." If the true numbers were so overwhelming, you'd think voters would have let their representatives know, and the national vote would certainly have gone the other way.
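The sampling-location point is easy to demonstrate with a toy simulation. Every number below is made up purely for illustration (the support rates, city names, and sample splits are assumptions, not real polling data); the point is just that over-weighting one location mechanically shifts the "national" result:

```python
import random

random.seed(0)

# Hypothetical support rates for some policy question -- invented for illustration.
SUPPORT_RATE = {"coastal_city": 0.80, "heartland_city": 0.30}

# Assume the two cities are similar in size, so a representative
# sample would draw from them about equally.
POPULATION_SHARE = {"coastal_city": 0.5, "heartland_city": 0.5}

def poll(sample_plan, n=10_000):
    """Simulate a poll. sample_plan maps location -> fraction of the
    total sample drawn there. Returns the estimated support rate."""
    yes = 0
    for location, share in sample_plan.items():
        k = int(n * share)
        # Each respondent answers "yes" with that location's support rate.
        yes += sum(random.random() < SUPPORT_RATE[location] for _ in range(k))
    return yes / n

# Sample in proportion to where people actually live...
representative = poll(POPULATION_SHARE)
# ...versus over-sampling the coastal city 9-to-1.
skewed = poll({"coastal_city": 0.9, "heartland_city": 0.1})

print(f"representative sample: {representative:.2f}")
print(f"coast-heavy sample:    {skewed:.2f}")
```

With these made-up rates, the representative sample lands near 55% support while the coast-heavy sample lands near 75%, from the exact same two populations. Same country, same question, different answer, purely because of where the pollster chose to knock on doors.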