We just had a thread titled "For anybody that thinks America was not founded under Christianity" (which is already well over the post limit and, I assume, soon to be closed). The OP and others claimed we were, but the fact is that no real discussion can be had until those making the claim explain exactly what they mean by "founded under Christianity". How did this "founding under Christianity" manifest itself in our founding? Where is it found in our founding documents? Which of our founding principles are the Christian ones, unique to and indicative of Christianity? Only then can we discuss whether we were actually founded on those principles. So...
The Founders of this nation had ample opportunity to insert the word "God" or "Jesus" into the founding documents if their desire was for this to be a theocracy. Given that they used the word "God" only once, and then only in the context of "nature", and never once mentioned Jesus, it is plainly clear that, whatever their personal beliefs, they were unanimous that this was not to be, in any shape or form, a "Christian nation".
Or you could say that the majority of Founders had very strong religious beliefs, and that anything they came up with would inevitably align with the morals, values, and ethics they derived from biblical teachings. Whether or not they peppered the Constitution with the word "God" is irrelevant.
Like other myths about the United States, this one in particular may hang on for a while, but it will eventually fade into the obscurity it so justly deserves.