For generations, Christianity was not just the dominant religion in America. It was a pillar of national identity.
From town halls to the White House, its language, values, and authority shaped laws, social expectations, and cultural norms. Pastors were community leaders. Churches anchored civic life. Christian morality was marketed as synonymous with American virtue.
That era ended decades ago, even though the illusion of it has lingered ever since. And while church leaders and conservative pundits have blamed secularism, liberalism, or modern distractions for the slow-motion collapse of Christian affiliation, they have routinely avoided the real culprit.
The Christian faith is not under siege from without. It is rotting from within.
Data from the Pew Research Center and Public Re