Since the first English Christian fundamentalists arrived on the shores of what would become the United States in the 1600s, Christianity has become increasingly embedded in the nation's social and cultural fabric. Yet freedom of religion, and freedom from it, is the American promise to all the country's people, whatever their belief or disbelief. This is how the Founding Fathers intended it, not the undemocratic theocracy that zealous evangelicals are trying to force on American society.