In the nineteenth century, Christianity was a powerful oppositional force, adding strength to many social movements. In the twentieth century, its impact was largely negative.
I'm putting together a course on comparative social movements in nineteenth-century Britain and the US, and in doing so have been reading a really interesting book, The Democratization of American Christianity.
In the nineteenth century, the author (Nathan Hatch) argues, the individualism and anti-traditionalism spawned by the American Revolution gave rise to a plethora of new Christian movements. Individuals felt that trained clerics no longer had all the answers--a person could read the Bible instead and come up with his own interpretation of the truth. There was a sect to suit every flavor of belief. It was a time of true religious populism (which explains how even truly fringe figures like William Miller and Joseph Smith were able to garner disciples).
At the same time, the 1830s and 1840s saw a flourishing of reform movements and humanitarian impulses pushed forward by evangelicalism. These reforms spanned the spectrum from what we might now consider "left" (antislavery, prison reform, and factory reform) to what we might consider "right" (temperance and sabbatarianism). One would be hard-pressed to argue that during these decades, evangelicalism was not a force for good.
What happened to the power of Christianity to foster critical thinking and social action? Why is Christianity now so forcefully identified with the Right? How did it become such a top-down endeavor? I often wonder about these things.