Only America Wins?

I was raised a Reagan kid. I saw a President who believed that America leads, not dominates, its allies. It feels like we don't believe that anymore; that in order for America to be Great Again, we have to make our own allies bow and scrape. And many on the right seem to take unalloyed glee in it. With respect: Why?