The United States was not founded upon Christianity.
It is not a Christian nation.
You can lie to yourself about it all you want, vtool. The simple, well-documented, historical fact is that the United States was founded largely by men who were not Christian. And the last thing they wanted, even the Christian ones among them, was a nation where a single sect of Christianity took over and became the state religion.
Since you think the Christianity of this nation is "well documented", let's see the evidence. I can shoot down any argument you can make.
The Treaty of Tripoli -- which had the force of law over the entire United States at the end of the 18th century -- is the most obvious evidence that your argument is flawed. You can read the treaty here, but Article 11 is the most important part for this argument. It begins:
As the Government of the United States of America is not, in any sense, founded on the Christian religion...
For a treaty that the United States entered into within a decade of the country's birth to include such specific language shows clearly what the founders intended.