What a great question! I believe it is definitely both. Of course, movies are designed to entertain the audience. After all, the producers, directors, and everyone else involved are seeking to make money from the film, so naturally they want it to be entertaining enough that people will first go to see the movie and then recommend it to friends. But in my opinion there is a darker side to the movie industry, and it comes into play when you consider how films influence their audiences.
Many movies coming out of Hollywood these days are designed to create or promote a particular worldview. They are filled with underlying messages intended to change opinions on various topics, and Hollywood does a great job of disguising this agenda by weaving it into the story so that it appears acceptable and, indeed, normal.
If you love watching movies (and I do!), the best thing you can do is pay attention to any underlying agenda that may be "pushed" in the movie you are viewing, so you don't allow yourself to be unduly influenced.