How America Became Capitalist
Has America always been capitalist? Today, the US sees itself as the heartland of the international capitalist system, its society and politics deeply intertwined with its economic order. Parisot's book traces the history of North America from the founding of the colonies onward in order to debunk the myth that America is 'naturally' capitalist.