
Hollywood has always been political. Its people consider it their right and duty to tell us what is politically good and right.

Mike Royko
