The history taught at U.S. universities must have changed drastically since 
I attended school in the 1960s. I was never taught this "PC" history. For 
instance, it was rarely, if ever, mentioned that many of our great and
glorious leaders were slaveowners--that they held other human beings as 
their property, like farm animals. (If we had not been slaveowners 
ourselves, how would our history books have described this uncivilized
behavior in other societies?)

Our history classes never spoke of Columbus's genocide, never questioned 
later U.S. interference in South America or Woodrow Wilson's racism, never 
mentioned anything remotely negative about the U.S. As I stated in a 
previous post, my senior class could not think of any immoral act that the 
U.S. had committed in its entire history. What mistakes were we supposed 
to learn from?
Paul