In most people's minds, America-- the original concept-- no longer exists. Liberty has been replaced by permission, which can be denied. Every area of life is now subject to government oversight and control, to background checks and licenses-- things intolerable in a free society.
This happened because the concept of America was replaced by a state-- a political government that destroyed the very liberty the idea was built on. It happened with the adoption of the Constitution, if not earlier.
"The United States" replaced America; took its place. You could say The United States is the rotting corpse that is left for all to look upon. The dead remains of what was once America; still a physical thing, but not what it once was. Not what it could have been.
When you hear "The United States", think "the rotting corpse". "The United States of America" simply means "the rotting corpse of America".