So is (or was) America a Christian nation? If by that we mean that America is a Christian theocracy, that our government should give Christians preferential treatment, or that members of other faiths aren't welcome, the answer is an emphatic "no."
But if we are talking about the ideals that led to the very colonization of this land, our Declaration of Independence from Britain, and the formulation of our Constitution, then the answer is certainly "yes."
In the words of Professor John Eidsmoe, author of "Christianity and the Constitution: The Faith of Our Founding Fathers," "If by the term Christian nation one means a nation that was founded on biblical values that were brought to the nation by mostly professing Christians, then in that sense the United States may truly be called a Christian nation."
Why does this matter? Simply because our dominant secular culture delights in demonizing Christianity, distorting its character, conflating it with less tolerant faiths, and associating it with all our societal woes. Historical revisionists have convinced many that we mainly owe our liberties to secular humanist ideals and those borrowed from the Greeks, the Romans, and the French Enlightenment.