What happened to America after the Second World War?

Answer #1

Wow… that’s kind of a broad topic. Can you be more specific?

In short, the United States applied the manufacturing techniques it learned (and re-learned) during WWII and became one of the world’s foremost superpowers: culturally, economically, and militarily. It became freer on social issues (women’s rights, the rights of minorities) but less free economically (higher and more intrusive taxes, and greater government control over economic affairs).
