Sunday, January 15, 2012

Is America's past colonialism in Africa to blame for that continent's wars and famine?

America has colonized nearly every nation on earth, especially in Africa. The American colonial masters upset native habitats, killed off plant and animal species, and introduced tyrants to Africa. How can America correct this horrific mistake it has made?
