The Right often talks about how things would be so great if America just embraced Christian values like we supposedly did in the past. I know two Christian values they want to bring back are homophobia and a ban on abortion, but is that it? What else do you guys want to do to make America more Christian?