The US has always been a settler colony, but it became more overtly imperialist after World War I, when the inter-ally debts made it the world's principal creditor. It did not become the world hegemon until the dissolution of the Soviet Union, however.