The whole debate over whether the United States is (or was) a "Christian nation" can often be settled - or at least eased - by clarifying what we mean by "Christian."
If by "Christian" nation, we mean that all the Founding Fathers were orthodox, Trinitarian, evangelical, born-again Christians, then....we can't say that the US was founded as a "Christian" nation.
If by "Christian" nation, we mean that the Founding Fathers intended to establish an orthodox, Trinitarian Christian GOVERNMENT, then...clearly, the United States was not founded to be a "Christian" nation.
And if by "Christian" nation, we mean a nation that has always lived up to Christian ideals and has made - as its primary policy goals - the advancement of Christianity, then....once again, we can't say that the United States is a "Christian" nation.
But if by "Christian nation," one means simply that Christianity (broadly defined) has been the most influential religion on the development and shaping of said nation and/or that a majority of the nation's citizens align themselves with Christianity in one form or another, then....I think one would have to say that the United States is a Christian nation.