Answer:

WWI changed America by transforming fashion and by making the employment of women more common.