A lot of people seem to think that there is nothing Christian Americans would like more than to see every public school classroom display the Ten Commandments in order to inculcate Judeo-Christian values in our nation's children at the earliest possible age. Or to have "faith-based organizations" get the entire federal budget for social welfare while the rest are left to beg for handouts. Or to keep gay marriage illegal in every state for no other reason than "because the Bible says so".
This is not true. I don't, for starters. And I consider myself fairly conservative as Christians go. But the more I listen to the protests and outcries on both sides, from people who want more religious morality in American public life versus those who want it banished altogether, the more I think the argument needs to be disarmed entirely. This shouldn't be an issue. It shouldn't even be around to argue over.
My reasons are not the ones you might expect. I recognize that the United States is and always has been a land of religious freedom, and that it's important to respect the beliefs of people who don't subscribe to Judeo-Christian religion. In the context of government, it's important not to pass off any one religion's mores as the truth. It happens sometimes, but we have a Supreme Court to take care of things like that. And on the whole, they've done their job well. Roe v. Wade would never have happened if our government were as maniacally Christian as some people like to think.
But every time the word "God" with a capital "G" works its way into the national consciousness, some atheist somewhere works himself into a fury and loudly demands its removal. News stories are written and read. Letters to the editor appear which seem to imply that all Christians are conspirators in a movement to force their beliefs down every American's throat. And a few more people begin to believe, not knowing any better, that evangelism is just a euphemism for brainwashing.
Which is proof enough, I think, that we Christians need to spend more time evangelizing our faith one-on-one or through church-sponsored ministry and less time trying to get our government to do it for us. We need to spend more time instilling our morals and beliefs in our own children and less time demanding that public schools and the media do it for us. Yes, Christians would like to live in a country which supports the values that they themselves hold--but then, so would everyone else. You can't always get what you want.
I want my God, my faith, and my morality to be as far removed from the government as possible. I want people to know that I say "one nation under God" because I believe it, not because it's habit. I'd rather have "In God We Trust" hanging on the wall of my house than written on the face of my money. I'd rather see a hundred nativities on a hundred privately-owned lawns than one in front of a public courthouse.
I'm tired of people suggesting that my religion is only successful because my government endorses it. I don't want sponsorship for my faith. I want a grassroots campaign. I'd rather see Christianity be illegal in this country than have Congress take it over.