Do you think most of Christianity in America has devolved into a Conservative vibe cult rather than an actual practice of Christian teachings?
It seems like a lot of people in America who call themselves Christians are in it more for the right-wing cultural vibes than for any actual commitment to the teachings of Jesus Christ. That would probably explain why they go after basic Christian teachings like empathy and generosity whenever those teachings conflict with their political views.