As a liberal Christian, I'm disturbed by the attacks on DEI (Diversity, Equity, and Inclusion) and the concept of 'woke' in the early days of the Trump administration.
The average person doesn't seem to understand what DEI is about. Diversity simply acknowledges the many kinds of people in the workplace: by race, gender, beliefs, background, health and disability, and more. Doing away with these programs doesn't make the "other" go away; it simply makes life harder. Equity is not a scary term. There have always been people of privilege, mostly wealthy, educated white males. Equity means making sure everyone gets the same opportunities and earns the same pay for the same job. Inclusion creates policies that promote a sense of belonging.
The more I think about it, DEI sounds like a Christian concept. Perhaps Jesus was 'woke,' because he hung out with sinners. And he was no white, blue-eyed boy himself. Woke is not antithetical to Christianity, as Paul's exhortation in Galatians 3:28 shows: "There is neither Jew nor Gentile, neither slave nor free, nor is there male and female, for you are all one in Christ Jesus."
Most Christians agree this means that in Christ, all people are equal regardless of their gender, ethnicity, or social status. Furthermore, Jesus gave us the Golden Rule: treat others as you want to be treated. "A new commandment I give to you, that you love one another: just as I have loved you, you also are to love one another." (John 13:34-35)
What would our communities be like if we were to follow those two scriptures? If we were to show respect and love and look beyond the labels we put on people?
So far, I've seen little evidence of those two scriptures in the actions taken in the first week of the new administration, which supposedly set out to unify America.
What do you think?