12 September, 06:19

Most aspects of foreign culture, such as language, religion, gender roles, and problem-solving strategies, are hard for the casual observer to understand. In what ways do Hollywood movies affect national culture outside the United States? What aspects of U.S. culture do Hollywood films promote around the world? Can you observe any positive effects of Hollywood movies on world cultures?

Answers (1)
  1. 12 September, 08:12
    First, what is culture?

    Explanation:

    In anthropology one learns that "culture is the continuous confrontation of social movements" (Richard Fox).

    Another anthropologist argued that we can safely conclude the United States has Americanized the rest of the world.

    When looking at national culture (outside the U.S.) and the external culture (the U.S.), we see an ongoing confrontation between the two, made visible by Hollywood. Taking into account that people all around the world watch Hollywood movies, they are obviously influenced by Hollywood culture.

    The aspects of U.S. culture that Hollywood promotes are wide-ranging: from clothes to music to cars to counterculture to living the American Dream to Coca-Cola ...

    Positive effects? Let me think. To answer this, we first have to decide what counts as positive. Ah! The spread of the English language.