Is CGI ruining movies?
- Computer animation in film began as early as the 1970s, with visual effects and short animations created by layering 2D images.
- The first use of CGI in a feature film was a scene in Westworld (1973).
- Toy Story (1995) was the first full-length CGI feature film.
- Some of the most notable movies to use CGI include Star Wars, Jurassic Park, The Matrix, The Lord of the Rings, Avatar, The Irishman, and Cats.
Because CGI technology is now cheap and routine for filmmakers, its overuse is ruining live-action entertainment. The cost of these effects has fallen from over $73 million per production in 2009 to $33.7 million in 2018. The result is a churn of films in which everything, from clothes, props, and story environments to the main characters themselves, is computerized and manufactured. CGI may have become more affordable, but the story quality of these releases suffers as constant spectacle distracts from the narrative.
Cohesive storytelling, character development, and a strong narrative should take precedence over flashy effects, yet few current releases prioritize these qualities. Even if a film's effects look dated, a plot that holds up will be celebrated by future generations regardless of the CGI. In recent years, studios have released franchise installments in quick succession, such as those in the Marvel Cinematic Universe, which have been hit or miss with fans but financially successful. Many of these films are over-stylized, and the plot suffers as a result; even the action scenes can be disorienting and hard to follow. How many fight scenes, or even the plot, of Transformers do you remember?
Many films that held up well without CGI are now being ruined by franchise treatment. The Alien series has declined steeply since Ridley Scott's original, and there is even talk of a Labyrinth sequel. Studios seem to have few new ideas and are interested only in quick, flashy turnover. And if Jar Jar Binks isn't reason enough to examine our use of CGI, I don't know what is.
We are drawn to movies because we can lose ourselves in the story and the experience. Film is an escape from reality, so why should it be limited by reality's constraints? Movies are continually shaped by cultural demands and altered by technology to meet visual expectations. Storytelling has even expanded into video games and virtual reality (VR), turning viewers into participants in immersive, interactive stories.
CGI (computer-generated imagery) is a digital technology that lets moviemakers create imagery beyond what can be achieved physically. Films like Avatar and The Matrix altered the landscape of filmmaking when they were released by using CGI to create something wholly unlike our reality. The most popular movies of the past few years, such as Avengers: Endgame, would be impossible without it.
CGI can de-age actors, enabling stories that span generations, and can create realistic animal characters. It can even be used to complete the performances of actors who died before finishing an important role, as with Carrie Fisher in Star Wars.
CGI isn't taking anything away from traditional storytelling; instead, it challenges Hollywood storytellers to learn and adapt to the demands of our time. The technology still depends on all the filmmaking crafts that came before it to produce anything admirable. As effects artist Rick Baker said, CGI is "only as good as the artist behind it." These are the same filmmakers, simply working in an altered medium that expands their options. Without CGI, filmmaking would be a far duller landscape, stuck in the limited creativity of the past.