Why? As I originally said, Jesus wasn't white, and white people ate Christianity right up anyway. Most of Europe didn't even take notice for roughly 600 years, which is around when Islam first appeared. Then there was this weird period where the Christian churches were largely controlled by secular, pagan, or indifferent kings who didn't give a hoot about Jesus but found the church's influence super useful.
I still don't get the joke. Why don't we just ditch the whole idea of a "white man's religion"?