What were the important social, political, and economic changes in the U.S. between the end of WWI and the end of WWII?
I know that after WWI the 1920s were booming, that the boom ended in the Great Depression, and that WWII helped pull the country out of it. But I have to write a two-page paper and I am stumped.