Hollywoodism: The Ideology that has Devoured Western Culture
Western culture has come to define itself through an alien ideology disseminated through Hollywood. It is a superimposed culture, a fantasy structure that we are all living in. True Western European values have been replaced by a new global religion called Hollywoodism. Who is behind this new ruling ideology, redefining Western culture?