Question of the Day #67: A Christian Nation
Many conservative Christians claim that the United States of America is a "Christian nation," implying that it was founded on principles of Christianity. Others have argued that the "Founding Fathers" did not intend to create a nation for and by Christians, or that even if they did, a modern society that embraces diversity should reject the notion. Yet despite legal prohibitions on religious discrimination in government, in more than 200 years there has never been a non-Christian President, and very few non-Christians have been voted into other national offices.
So, getting to the actual question: do you think the United States is a Christian nation in practice, if nothing else? And what does this imply for non-Christian U.S. citizens today and in the near future?