Is the United States in Decline?
The United States has been a powerful leading force for democracy and world order since the Great Depression, but there have been clear signs of decline.
This article is not intended to be a biased bashing of the United States. I have no axe to grind. My family and I will be eternally grateful to the U.S. military forces, which assisted UK military forces in the liberation of my family from a POW camp in Hong Kong, where we were prisoners of the Japanese for four years during WWII. I have family and close friends in the U.S. I greatly admire and appreciate the United States and Americans for their significant contributions to the world in science, democracy, global cooperation, economic growth, and culture. That admiration is what prompted me to write this article, partly in the hope that the decline does not continue, because a continued decline could have disastrous effects on the U.S. and the rest of the world. Wherever I describe American decline, I provide documented evidence or expert opinion as far as possible.
The Decline
For decades, the United States has been in relative decline, facing the prospect of someday being overtaken by a rival power. Its main problem, however, is not the relative decline itself; relative decline is a natural phenomenon that occurs as companies, sectors, regions, and countries grow at uneven rates. Instead, its main problem is a failure to recognize this…