I never do what I'm told, and besides, that looked more like an assignment than a quick read. But if it had to do with the way England colonized the world and reaped the riches of the lands it occupied, that's probably how London turned from mud flats into a metropolis. I'm also sure the way China is drugging everyone with fentanyl is payback for the way they were drugged with opium by the British all those years back.
My point is more that because it's never talked about, the general opinion is that whites went inland and rounded them up, as if the white slave owners were running around with big nets and lassos, grabbing all they could, then bringing them back and putting them to work. I remember watching a PBS show a while ago where a group of people were at the Point of No Return in Africa, and it was pointed out that blacks were sold into slavery by other blacks. One American black woman looked like she was about to cry; she had never heard that, not back here in our public education system, and couldn't believe her ears. I'm fine with the history of slavery being taught, as long as all the facts are presented and not just the cherry-picked ones. I also think it's more important to point out that America was where it ended first around the world, paid for in the blood and limbs of a lot of white people, and that it is still practiced today, by blacks in Africa.
Who SOLD them into slavery? Thinking... thinking... oh, that's right, it was other blacks in Africa. Why is THAT never talked about? Just an inconvenient truth?