Throughout my life I've heard Christians, especially Evangelicals, argue that God has removed from America the blessings and covering we took for granted. They say all of the violence, addiction, death, and destruction we are seeing is the result of losing favor with God. Let's say, for the sake of agreement, that's all true. Where was God during the genocide of this continent's original inhabitants, chattel slavery, the Black Codes, Jim Crow, the misogynistic treatment of women, and segregation?
Did God sanction those atrocities? Did a loving God turn a blind eye to the undeserved suffering of those people? Did God bless and cover those responsible for those crimes against their humanity?
When did God turn his back on America? What was the final straw? I have questions.
These aren't questions about theodicy; bad things happen to good people all the time, and we are left wondering why God allows the innocent to be victimized. These are questions about a nationalist ideology that ignores history and prioritizes the culture wars. Too many Christians refuse to address these kinds of questions. Did God bless the dehumanizing and barbaric behavior that was fundamental to the formation of this nation?