How does Hollywood change people for the worse?


3 Answers

Arthur Wright answered
Mostly with false dreams and hopes
Maxine Chan commented
I can see that happening, but once they make it in Hollywood they become egotistical and think they're better than everyone else because they're famous.
Arthur Wright commented
Yes, fame and money do ruin people if they let it.
Moo C. answered
Most actors enjoy being in the spotlight (lucky them, it's their job), so they will either get into lots of movies or do something else to get a ton of publicity. It's all about self-restraint and being able to acknowledge that the world doesn't revolve around them (Twitter doesn't help with this).
Maxine Chan commented
Twitter makes them more famous. :)
Moo C. commented
True, but at the same time it also makes them feel more important than they are or should feel.
Lianna Lins answered
It ****s them up. They feed them so much psychological junk that few people come out of the experience without being weirded out in some way. Being famous is really the main thing; it usually makes actors feel they're the best thing on earth. I really don't think it's healthy to venerate ordinary people the way Hollywood does, no matter how talented they may be.
But that's Hollywood's aim in the first place... to mess up perfectly sane people. Sad, isn't it? =(
