It's a dirty little secret that has come to light over the past year: not only is Hollywood less than friendly to people of color, it remains, for the most part, a men's club.
After bad publicity, like the #OscarsSoWhite campaign, the industry promised change. "Inclusion" became a buzzword, and there were seminars and various initiatives designed to level the playing field.
All for naught, it seems.
A new report from a researcher at San Diego State University finds that the number of women in key positions on major films has actually dropped. For instance, only 7% of last year's top films were directed by a woman, down from 9% the year before.
The bottom line: Things for women in Hollywood are about the same as they were way back in 1998.
Vanity Fair's Rebecca Keegan has been covering the gender gap in Hollywood for years, and she joined us for a conversation about the findings.