I'm going to set aside the fact that the ramblings in the opening post didn't have much to do with the original question, and just answer the question.

Hollywood has come a long way over the decades that movies have existed, but one thing it has always done, and always will do, is move in circles. Early Hollywood had virtually no regulation, so it was pretty much a free-for-all: actors and actresses danced nude on tabletops and did strange things with each other on film. Then everything became over-regulated, to the point where nobody could do much of anything on film except look sexy. Over the years that envelope was pushed further and further, until Hollywood could once again put almost anything it wanted on film.

Is Hollywood satanic? Well... yes and no. Many of Hollywood's directors and producers are Jewish, so they have their own ideas about God and the devil and work accordingly. At the same time, there seems to be a Christian movement in Hollywood right now, as seen in films like Jesus, The Passion of the Christ, God's Not Dead, and Miracles From Heaven. Hollywood has gone through periods of making a plethora of Christian-oriented movies before, so I'd guess that's the stage we're currently at in Hollywood's endless cycle. Meanwhile, horror films keep being made that continually push the envelope of darkness a little further toward the evil side of things.

So again, my answer is yes... and no.