
Hollywood has always been political. They consider it their right and duty to tell us what is politically good and right.

Mike Royko
