How do you feel about the world today? When you look around at your life, the media, and historic or current events, what conclusions do you come to about global trends? Does the world seem like it's getting worse?
In a world of white nationalism, atomic bombs, global warming, patriarchy, mass shootings, election meddling, and increasingly autocratic government policies, it can be hard to view our world as anything but a stream of negativity that has only gotten worse over time.
Here in the US, our media system seems set up to bombard us with constant negativity, to the point that frequent news check-ins (whether your ideology is conservative or liberal) leave you with the sense that our world is becoming ever more amoral, that more people oppose your views, and that more disasters befall us every day than did decades ago.
Today I'm going to argue that this world is not getting worse, that it is in fact getting safer, and that the sense of decline may be a psychological phenomenon that will continue into our future.