The only thing they could do is make it cost more by removing insurance companies.
That makes sense to you? Health insurance companies add many billions to the cost of health care in America and do not provide any care at all. They stand between you and the doctors. They should be eliminated.