Well, we kicked out almost everyone who was originally here, like a bunch of arrogant meanies, and then shaped it into a place where every race other than white is oppressed. On top of that, we've always clung to the idea that women are weaker than men, which isn't true. So America is a place where white men live freely and happily while everybody else is pushed off to the side.