Why do liberals dislike America so much?

:thinking:

[attached image: rynoBxQ.png]


When Obama is in charge, liberals are barely over the 50% mark.
Otherwise, they're way down in the 30s.

With Republicans, they love America no matter who is president, even if it's Obama, whom Republicans obviously hated with a passion.

It seems that Republicans/conservatives are more principled: they still love America more and don't let their hatred for a president make them dislike their country.

Liberals, on the other hand, do not appreciate America. They don't like free speech, they don't like guns, they don't support our police, and they always whine about moving to Canada. Liberals are pathetic losers.

This thread is the exact kind of divisive, dishonest bullshit that is ruining the shreds of this once great nation.

You should be ashamed of yourself.
 

So long as the Democratic Party exists, America will always be at risk of becoming a once great nation. We should have ended it in 1865.
 

Try being more proud of our country.

I love the salty libs getting butthurt over this thread :laugh:

It's like... the proof is right there in front of all your faces and there is nothing you can do about it except whine. It's a much different reaction than I normally get from liberals. This one stings you guys, I know it does.
 
She got 31% of the vote in 2018 and constantly talks about hating white people, hating America, etc. Just a normal Democrat America-hater.
 
News polls generally are. They never show the questions that were asked, or how loaded they were; they never show who was polled; and they never show the raw data itself.

All polls are worthless, even if the methodology is cited.

Here's why: People lie all the time, and polls only record what people say. What people do is reality.
 
Leftys love America. We want to return to competition and remove the oligopoly the rightys are bringing us. Few of you were around before the wealthy merged and bought out the competition. When I was younger, we had price wars for nearly everything, including gas. Companies competed by offering better customer service. They advertised who had the best return policies. The mantra was "the customer is always right." Wouldn't it be nice to return to competitive capitalism?
Nearly anything you buy now is the same approximate price and comes with the same miserable customer service. Today's mantra: "gimme your money and get out of my face."
 

I'm not ashamed to admit that when I see the flag of the US fluttering in the wind, my heart beats faster and my chest swells with pride at being in a nation like ours, even with all its faults.

Because the US will work towards solving those faults, as it has in the past.
 

Please stop being an idiot. Polls are very valuable and oftentimes very accurate. Don't be a science denier.
 
Then you are, respectfully, retarded.

How so?

Are pollsters able to determine that the responses they get are truthful?

Then there are the other issues:

[attached image: 20170617_IRC024.png]

Predicting the outcome of elections is an inherently chancy endeavor.

“If you look into the crystal ball,” says an experienced pollster, “you’ve got to be ready to eat ground glass.”

But pollsters’ job is getting harder. The number of people willing to answer their questions is plummeting.

Of every ten people in rich countries they contact by telephone, at least nine now refuse to talk.

Far more intractable is the bias that creeps in when samples are not representative of the electorate. Taking bigger samples does not help. The margins of error cited by pollsters refer to the caution appropriate to sampling error, not to this flaw, which is revealed only on polling day.
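The distinction drawn here can be sketched numerically. The margin of error pollsters cite is only the sampling-error term, conventionally 1.96·√(p(1−p)/n) at 95% confidence; a minimal Python sketch (all numbers illustrative, not from any real poll) shows why a bigger sample shrinks it while doing nothing about a non-representative sample:

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% sampling-error margin for a proportion p from n respondents."""
    return z * math.sqrt(p * (1 - p) / n)

# Bigger samples shrink *sampling* error...
print(margin_of_error(0.5, 1000))    # ~0.031, the familiar "plus or minus 3 points"
print(margin_of_error(0.5, 100000))  # ~0.003

# ...but this formula says nothing about bias from an unrepresentative
# sample, which stays the same size no matter how many people respond.
```

This is why the Literary Digest poll mentioned below failed despite nearly 2 million responses: its sampling error was tiny, but its sample was badly skewed.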

New political fault lines are complicating their efforts to find representative groups to question, and voters’ changing behavior blindsides them as they try to discern the truth behind polling responses.

Old political allegiances are weakening and public opinion is becoming more fickle.

Confidence in polling has been shaken. Pollsters are scrambling to regain it.

Sam Wang, a neuroscience professor at Princeton and part-time psephologist, kept a pre-election promise to eat an insect on live television if Mr Trump won more than 240 electoral-college votes.

Statistical models of election outcomes attempt to quantify the uncertainty in polls’ central findings by generating probability estimates for various outcomes. Some put Hillary Clinton’s chance of victory against Mr Trump above 99%.

To deal with non-response bias, pollsters try to correct their samples by a process known as weighting. The idea is simple: if one group is likelier to respond to a survey than another, giving a lower weight to the first group’s answers ought to set matters right.

But adjusting weights is also one of the ways pollsters can do what political scientists call “herding”. If one weighting scheme produces a seemingly outlandish result, the temptation is to tweak it. “There’s an enormous pressure for conformity,” says Ann Selzer, an American pollster. Polls can thus narrow around a false consensus, creating unwarranted certainty about the eventual outcome.

To make weighting work, pollsters must pull off two difficult tricks. The first is to divide their samples into appropriate subgroups. Age, sex, ethnicity, social class and party affiliation are perennial favorites. The second is to choose appropriate weights for each group. This is usually done with the help of a previous election’s exit poll, or the most recent census.

But the old political dividing lines are being replaced by new ones. Increasingly, samples must be weighted to match the voting population for a much larger set of characteristics than was previously needed. Levels of education, household income and vaguer measures such as people’s feelings of connection to their communities have all started to be salient.

Earlier this century, online betting exchanges beat pollsters before several big elections. Economists argued that the forecasts made by punters with money on the line were likely to be more considered than the sometimes offhand responses given to pollsters.


Spotting new electoral rifts and changing electoral habits will require much more data (and data science) than pollsters now use. And picking up changing social attitudes means measuring them, too—which will take never-ending checks and adjustments, since those measurements will suffer from the same problems as pre-election polls. Pollsters will also have to improve their handling of differential turnout and undecided voters. Most accept self-reported intention to vote, which turns out to be a poor guide. And they often assume that undecided voters will either stay away or eventually split the same way as everyone else, which seems not to have been the case in recent contests.

And dealing with declining response rates will probably require new ways to contact prospective voters. During the early days of internet polling, many feared that online samples were bound to be unrepresentative, mainly because they would include too few older people.

A striking example came in 1936, when Literary Digest, a weekly American magazine, asked its affluent readers whom they would vote for in that year’s presidential election. Nearly 2 million replied. But the sample, though large, was horribly biased. Based on it, Literary Digest forecast a landslide for Alf Landon. He went on to lose all but two states to Franklin Roosevelt.



https://www.economist.com/international/2017/06/17/britains-election-is-the-latest-occasion-to-bash-pollsters
 

What I see is polling science getting more and more accurate over the last 70 years.

Polling isn't make-believe. The issue is that the media will take raw data and spin it how they want. For example, I always talk about the difference between sampling "American adults" vs. "registered voters" vs. "likely voters."

None of those polls would be wrong on their own; the problem is when people extrapolate from one set of data and pretend it's a different set. So sampling "American adults" to demonstrate something about electoral chances is stupid, because if you want to know how an election will go you need to poll people who will vote, not randos who are staying home.

Now let's talk oversampling. Let's say you live in a nation with 50 Democrats and 50 Republicans. You do a poll on whether or not to ban pool noodles and you get responses from 60 Democrats and 40 Republicans. All you have to do is weight the Republican answers higher so the results are brought back in line with your community's demographics. This is what happens all the time when people go "herr deerrr they sampled more liburals"... yeah, they sampled more of them and weighted them less, who cares. It's really not that difficult. It's very basic stuff.
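The pool-noodle weighting described above can be sketched in a few lines of Python. Everything here is made up for illustration (in particular, the 70%/30% "yes" rates are assumed, not from the thread); the point is just the mechanics of down-weighting the over-sampled group:

```python
# Post-stratification weighting sketch for the pool-noodle example.
# Population: 50% Democrats, 50% Republicans. Sample: 60 D, 40 R.
population_share = {"D": 0.50, "R": 0.50}  # known demographics
sample_counts = {"D": 60, "R": 40}         # who actually responded
yes_rates = {"D": 0.70, "R": 0.30}         # assumed answers, for illustration

n = sum(sample_counts.values())

# Weight = population share / sample share: the over-sampled group gets
# a weight below 1, the under-sampled group a weight above 1.
weights = {g: population_share[g] / (sample_counts[g] / n)
           for g in sample_counts}

# Unweighted estimate over-counts Democrats...
raw = sum(sample_counts[g] * yes_rates[g] for g in sample_counts) / n

# ...while the weighted estimate matches the true population mix.
weighted = sum(weights[g] * sample_counts[g] * yes_rates[g]
               for g in sample_counts) / n

print(f"raw: {raw:.2f}, weighted: {weighted:.2f}")  # raw: 0.54, weighted: 0.50
```

With these assumed numbers the unweighted poll reads 54% "yes", while the weighted estimate recovers the true 50% the balanced population would give.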
 