"Hollywood pushes a liberal agenda to the rest of the country. And, whether we like it or not, Hollywood dictates the culture of the country."
- Stacey Dash