I am dying to ask a question about health care reform, and I guess this is as good a thread as any to ask it... so here goes...
We keep hearing from the left that healthcare is a "right." I don't know that I accept the premise that we were endowed by our Creator with the right to never be sick, but just for the sake of argument, I am willing to give it the benefit of the doubt... it's a right! So when did it become incumbent upon government to facilitate our every right? When is government going to start subsidizing my guns and ammo? I have the right to bear arms; shouldn't the government provide me with arms if I can't afford them? Shouldn't government be charged with ensuring there is a church in every town? We have the right to worship, so shouldn't the government provide the place of worship for us?