If you think Jennifer Aniston's hair requires a lot of attention for a "Friends" shoot, consider this: Of the four years it took to make the sci-fi film "Final Fantasy: The Spirits Within," nearly a year was spent coiffing the 60,000 hairs on the head of its new digital Hollywood star, Aki Ross.
"Final Fantasy," the first digitally animated movie to feature photorealistic characters like Ross, is just one of many animated movies to challenge Disney's dominance in the brush-strokes and pixels domain this year.
In the past, only a trickle of animated feature films made it to the big screen. Now, studios like Sony, DreamWorks, and Nickelodeon have joined Disney in producing such a rich variety of animated films geared toward kids and adults that Oscar has taken notice. Disney's coming "Atlantis: The Lost Empire" and "Monsters, Inc." are possible Oscar contenders in the new feature-length animation category next year. But they'll face stiff competition from films like "Osmosis Jones," "Jimmy Neutron," and "Shrek."
Animation is experiencing vibrant changes in both style and direction. "I think being able to go to worlds you've never been before, and to places you've never been before keeps people's imaginations alive," says "Shrek" co-director Andrew Adamson. "It's the public wanting to be refreshed with something."
Veteran Disney animation producer Don Hahn says the animation boom started when studios took notice of the success of "The Little Mermaid" (1989) and "Beauty and the Beast" (1991).
"I also believe that [with] movies like 'Men in Black,' 'The Phantom Menace,' and 'The Mummy Returns' ... I think you're seeing a blurring of the lines between what is animated and what is a live-action movie these days," Mr. Hahn says. "All that adds up to a reinvigorated medium."
In particular, people are responding to cartoon movies of the CGI (computer-generated imagery) kind.
"It's sort of the year of the CGI," says Nickelodeon president Albie Hecht. "Between 'Shrek,' 'Final Fantasy,' 'Jimmy Neutron,' and 'Monsters, Inc.,' [there are] probably more movies done in CGI now than [hand-drawn] cel animation."
The first movie to employ computer-generated imagery from beginning to end came in 1995 with Pixar's "Toy Story." It revolutionized animation by allowing computers to create three-dimensional models of characters that could then be manipulated by the artist. Director John Lasseter won a special Oscar for this animation milestone.
But the traditional divide between hand-drawn animation and computer animation is increasingly an artificial one - cel artists are quickly becoming as fluent with a mouse as they are with a sketchpad or paintbrush.
Computers were first used in Disney's "The Great Mouse Detective" in 1986 and were later used to create sequences like the ballroom scene in "Beauty and the Beast." The three-dimensional, deep-focus backgrounds of Disney's "Tarzan" and coming "Atlantis" are also computer generated.
But although traditional animated movies of recent years, such as "The Prince of Egypt," "Mulan," "The Emperor's New Groove," "Anastasia," "The Hunchback of Notre Dame," and "Pocahontas," have performed well at the box office, few have replicated the sheer drawing power of earlier bonanzas like "The Lion King" (1994) or "Aladdin" (1992). Other features, like the acclaimed "The Iron Giant" and Japan's "Princess Mononoke," failed to find a US audience, while 2000's "Titan A.E." sank Fox's animation studio weeks after its dismal release.
At the moment it's the eye-dazzling, fully computer-animated films, such as "Antz," "A Bug's Life," and the "Toy Story" movies that have a "buzz" factor working in their favor.
"Video games have changed kids' aesthetics with film. They are used to deep focus and detailed backgrounds," says Chris Lee, producer of the $100-million-budget film "Final Fantasy." He says that only a gaming company like Square, the Japanese maker of the popular "Final Fantasy" video-game series, would have had the vision to create photo-real characters, something traditional animation studios had once thought impossible. There is so much CGI now in a live-action movie, "it looks half like a cartoon. So we wanted to go the whole way," Mr. Lee says.
But the very concept of realistic-looking human characters in "Final Fantasy" is proving controversial to some.
"I think we're just fascinated with computers right now," says Hahn, who recently completed Disney's hand-drawn adventure epic "Atlantis." "I think the ridiculous conclusion of computer graphics is to take them to re-create reality. You can create a graphic world on-screen that's like an impressionist painting of life ... it's almost more moving and involving that way."
Mr. Adamson says people frequently ask him whether it is possible to create a realistic human character.
"I say to them, 'Why would you want to?' " Adamson says. "There's a general public interest in creating CGI humans that I don't think I understand."
Adamson worries that the scientific challenge of being first to do it could prove too much of an end in itself rather than being merely a means to drive the storytelling.
Decades ago, animators didn't have the tools to focus on dazzling visual effects, realistic textures in hair and clothing, and richly detailed surroundings. Bruce Johnson, a former executive at Hanna-Barbera Productions and executive producer of PBS's "Jay Jay The Jet Plane," recalls that in the 1960s, the dialogue and the story came first.
"The look and animation mattered less," he says. "Hanna-Barbera focused on a very flat style of animation."
While dazzling visual effects are still needed to attract kids, viewers will stay in their seats "not because of style, but because of the stories," Mr. Johnson says.
An argument can also be made that flashy visual effects needn't result in stories that are, well, sketchy. Used in the right way, computer effects can enhance the plots.
Darla Anderson, producer of "Monsters, Inc.," Pixar animation's follow-up to "Toy Story 2," says Pixar didn't start out with the intent to make a technological leap with digital animation. Instead, the team thought carefully about ensuring that the technology first matched the requirements of creating creatures for the story, a tale in which a little girl discovers that there really are monsters underneath her bed.
"The one thing about animation is that because it takes a long time, and it's very expensive to do, we really work on our stories," Adamson says. "We really try to hone them."
While "Final Fantasy" is selling its "gee whiz" technical breakthrough as its major marketing point, producer Chris Lee maintains that the raison d'être of the film's revolutionary animation is to create a new format for telling stories in different ways. "This is not the future of filmmaking," Mr. Lee continues, "just a part of the future of film."
Disney's Hahn predicts that animation will move in a number of directions. For one thing, the lines between live-action and animated films will continue to dissolve. For example, Disney's "Dinosaur" grafted computerized dinosaurs onto real scenery filmed across the world. And stunt actors and extras are getting a reprieve as films like "Pearl Harbor" populate their backgrounds with computer-animated figures.
Will CGI banish hand-drawn animation to the realm of Saturday-morning TV fare, rather than glorious feature films?
Very unlikely, say industry animators. DreamWorks and Disney, among others, remain committed to traditional animation. Japan's anime style of animation in movies like "Akira" and "Princess Mononoke," meanwhile, continues to inspire and influence new generations of animators.
"I hope there's always a place for [traditional animation]," says Adamson, "because it's a beautiful art form, and we all grew up on it."
For "Final Fantasy" producer Lee, however, the future may include the bold move of casting 'Aki Ross,' his computer-generated character, in other animated movies - just as one would cast a real actor in a new role in another movie. "It's not for every filmmaker," Lee says. "We don't see Aki in a Nora Ephron movie. She won't be putting Julia Roberts out of business anytime soon."
(c) Copyright 2001. The Christian Science Monitor