What’s Happening in America?

I have been wondering why so much disrespect, even hatred, appears to be rising in America, the land of the free, the country that embraced “Send us your poor” for so long, the nation that was founded on religious tolerance and understanding.

What is causing so many people to curse their neighbors, their leaders, even their President? Peace starts within, by first loving ourselves. Is all this due to a lack of self-respect? Do Americans as a whole not feel confident in their own self-worth?

To the media: we need to see role models of people who do love themselves and do respect others because they do not need to be defensive. Please, shine a light on the right way to live, not the worst-case scenarios you focus on now.