The United States in World War II
A Documentary Reader
Summary
World War II profoundly changed America: it not only served as the impetus for far-reaching changes in all aspects of life at home, but also dramatically altered how America was perceived internationally.

