The history and evolution of Hollywood makes for an interesting subject of inquiry. The origins of the film industry in the United States can be traced back to the 1890s, when the first silent films were opened for public viewing. The movies released in the industry's first decade were largely free of formal censorship and hence contained controversial…