Over the years, many Black actors have voiced their concerns over how Hollywood treats them and their talent. Hollywood often props up Black actors, musicians, and other entertainers without taking their talents seriously, and that shows in who it gives awards to and why.
It seems this has been the pattern for how Black people are viewed in this country. Sure, some of them may have money, but they will never be fully respected or treated with dignity. Look at who has won awards over the years, and ask what roles the Black actors had to play to win them in the first place.
That being said, I've wondered why the actors haven't pooled their money and created a Hollywood of their own, one that centers their talents without getting bought out (cough, BET, cough).
I guess now's the time, according to Jada Pinkett Smith. If so, we've also got to change our standards for what counts as "good." Better late than never, I guess?