    Machinima Gets a Day Job – The Emerging Use of Game Technology in Feature Films

    By Steve Boelhouwer on 02/10/2007 Cinema Tech
    We’ve all thought about it. Sitting in a dark theater watching the latest Hollywood blockbuster, how many times have we been completely caught up in the onscreen action, only to be brought back to reality by an ill-conceived scene or a confusing turn of events? I mean, what was the director thinking? We would have known better than to put that clunker of a scene in the movie. After all, that’s what makes games so engaging – we have control over the action and the eventual outcome. But what if the people making movies could achieve that same level of interactivity during pre-production, filming, and even editing of feature films? Increasingly, realtime game technology is being asked to achieve that goal.

    Making Hitchcock Proud
    Hollywood’s creative re-purposing of game tech can be mostly summed up in one word: previsualization, or previs for short. In a nutshell, previs is the pre-production process where scenes in a movie are rendered beyond the simple descriptions on storyboards or a script. As a discipline, previs is nothing new. Alfred Hitchcock preplanned every aspect of his productions so methodically that he sometimes considered the actual film shoot pure drudgery. Today, previs essentially means rendering low-resolution proxies of scenes using 3D software, then using these as an aid in planning the actual production. Technical previs has become essential on big-buck effects films that use green screen composites, motion control cameras, and other advanced CGI techniques. Recently, 3D software has also been used to previs non-CGI movies, planning scene blocking, camera rigs, and set design. A great example of this is the work done by Pixel Liberation Front for David Fincher’s Panic Room, as many of the complex camera moves in that film would have been very difficult (and expensive) to create had they not been previsualized prior to any sets being built.

    But it’s always been pretty much a linear, non-interactive process – shots were rendered, reviewed, and filed. And the visual quality of most previs was simple at best – flat shaded, low-resolution models that most gamers would consider pre-Wolfenstein in quality. That’s fine for creating a set of technical instructions for set builders or a camera crew, but what if you also needed previs for test screenings or shopping for additional financing? Through advancements in realtime rendering, it’s now possible for movie creators to interactively plan out and render scenes in the same way that game prototypes help designers build out game levels. And soon, realtime rendered scenes may begin to show up in the final cuts of the films themselves.

    Not surprisingly, the driving force in this shift is the emerging power of the GPU. By combining interactivity with drastic increases in rendering speed and quality, the movie industry has begun to realize what a powerful tool realtime can be. Production costs on feature films are mind-numbingly expensive, so a tool that allows movies to be developed faster and cheaper is a real jackpot. The ability to walk through a set in a realtime simulation and plan out lighting setups, camera locations, and other factors before production ever begins can literally save hundreds of thousands of dollars. Those kinds of numbers get Hollywood excited, and they push previs farther up the movie-making food chain. It allows for better creative decisions, too. A script may call for a photorealistic 300-foot-tall wall of water to swamp New York City, for example, but thankfully no one has ever actually seen, much less filmed, such an event. So what a director wants in those situations is choices, and hardware rendering simply makes it possible to offer a director more choices within a given time frame.

    Global Destruction in Realtime
    On the leading edge of realtime previs is Joshua Kolden of Crack Creative (www.crackcreative.com). Coming together during the previsualization of last summer’s blockbuster The Day After Tomorrow, Josh and his crew developed a realtime engine that runs as either a standalone application or a custom display host within Alias Maya.

    Crack didn’t use an existing game engine, citing different goals in the software design. They didn’t need to optimize for 60fps rendering or allow for an AI interface, for example, but did need advanced control over lighting and shadows. And while frame rates can dip on complex scenes, they nevertheless represent huge speed improvements over traditional software renders. (For gamers, previs rendering may seem slow, but remember: a single final frame of a CGI film can take hours or even days to render.)

    Their engine takes advantage of custom, multiresolution shaders written in Cg, as well as a unique load balancing system that juggles computational tasks against screen rasterization. This allows output quality to be controlled independently of hardware capabilities. The engine’s custom shaders let them quickly switch between various shading models and features, and proprietary tricks for ambient occlusion, reflections and other attributes bring a level of visual quality beyond what’s typically associated with a realtime renderer. Animation data is imported and played back as in a standard game engine, with Maya, Kaydara and mocap libraries being the most popular formats. Next under development is a generic face generator, allowing for lip synch and emotive facial expressions in the realtime environment.
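
    To make the load balancing idea concrete, here is a minimal sketch of how a renderer might trade shading detail against rasterization cost to hold an interactive frame budget. The class names, quality levels, and thresholds below are illustrative assumptions, not Crack Creative’s actual implementation.

```cpp
// Hypothetical sketch of quality-vs.-speed load balancing (not Crack Creative's
// code): measure how long each frame took and nudge shader detail and render
// resolution up or down to stay near a target frame time on any GPU.
#include <algorithm>

struct QualitySettings {
    int   shaderLevel = 3;     // 0 = flat shading ... 3 = full shadows/occlusion
    float renderScale = 1.0f;  // fraction of full screen resolution to rasterize
};

class LoadBalancer {
public:
    explicit LoadBalancer(double targetMs) : targetMs_(targetMs) {}

    // Called once per frame with the measured render time in milliseconds.
    void update(double frameMs, QualitySettings& q) const {
        if (frameMs > targetMs_ * 1.2) {           // over budget: shed work
            if (q.renderScale > 0.5f)
                q.renderScale = std::max(0.5f, q.renderScale - 0.1f);
            else
                q.shaderLevel = std::max(0, q.shaderLevel - 1);
        } else if (frameMs < targetMs_ * 0.6) {    // headroom: restore quality
            if (q.shaderLevel < 3)
                ++q.shaderLevel;
            else
                q.renderScale = std::min(1.0f, q.renderScale + 0.1f);
        }
    }

private:
    double targetMs_;
};
```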

    Because the film industry operates on a variety of platforms, a realtime engine should not be locked to a single OS or hardware vendor. As Josh explains: “We only ever use OpenGL as the interface. Portability is an absolute requirement…we can’t find ourselves or our clients restricted to NVIDIA-only features or Windows®-only applications. We will not knowingly limit our own options to a single platform. As it is, all of our tools use multithreading, plugins, and advanced graphics calls, and can compile without change on Windows, Linux, Irix, and Mac.”
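
    In that spirit, a portable engine detects capabilities at runtime through the standard OpenGL interface rather than assuming a particular vendor. The short sketch below only illustrates that approach; it is not Crack Creative’s code, and the fallback policy is a hypothetical example.

```cpp
// Illustrative sketch: query the OpenGL extension string at runtime and fall
// back gracefully instead of hard-coding vendor-specific features. The
// extension names are standard ARB extensions; the fallback policy is a
// hypothetical example.
#include <GL/gl.h>   // <OpenGL/gl.h> on Mac OS X
#include <cstring>

static bool hasExtension(const char* name) {
    const char* ext = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
    return ext != nullptr && std::strstr(ext, name) != nullptr;  // simplified substring check
}

// Prefer hardware shadow maps where the driver exposes the ARB extensions;
// otherwise fall back to simpler projected shadows.
void configureShadows(bool& useShadowMaps) {
    useShadowMaps = hasExtension("GL_ARB_depth_texture") &&
                    hasExtension("GL_ARB_shadow");
}
```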

    So is the day close when a director or visual effects supervisor will sit down with a joystick or game controller and explore an entire scene in the virtual realm? Josh thinks so. “We expect to provide that in the near future. Right now we’re almost there. We provide a guided tour, but close to full interactivity. Even the most basic of 3D games require some skills to understand how to use them. For example, you often have to do a training level before the game really gets going. Even then the game logic can help by restricting actions to easy-to-control movements: back, forth, right, left, jump, crouch, etc.” Of course, not every director will be that hands-on, just as many today would never actually touch a camera or an editing system. “With previs, control is an even bigger issue because of the absolute freedom. We are designing new ways to interact with the computer to put the tools in the director’s hands in a way they understand and that leverages their talents.”

    The improved visual quality of realtime rendering, such as the ability to render soft shadows, has pushed previs from being strictly a pre-production technical guidebook to a medium that stops just short of being included in the final frames of a feature film. This new technique, called post-vis, involves cutting previsualized scenes into a mostly completed film to give a sneak peek to studio executives or focus group audiences. Previously, when an incomplete film was being screened, a simple text message was often displayed onscreen to communicate what should be happening in a missing scene. Now, realtime rendered scenes can be edited into the actual film to preserve the drama and emotional impact of the story. Imagine the difference between an onscreen card that reads “Giant sea eels now converge on the city and destroy all sushi bars” and seeing a game-quality render of that actually happening. Josh explains, “Previs is all about communication at an emotional level – flat, plastic characters don’t cut it. Visuals need to be rich enough to elicit an emotional response from the viewers, be it a director or test audience.”

    Lights, Camera, Render
    A second application for realtime rendering is during on-set filming and production itself. With so much imagery in today’s movies now being generated in post-production, shooting live action on a green screen set can leave both performances and camera setups at a disadvantage, since no one on the stage can see the final environment. Engine Room studios (www.engineroomvfx.com), a facility that houses a large green screen production stage as well as a visual effects studio, has built a pipeline that integrates live images from a motion control camera with 3D software to produce realtime rendered output on set. This provides a bridge between live action and CGI as the scenes are being shot.

    Dan Schmit, owner, cinematographer and visual effects supervisor at Engine Room, explains that their system imports streaming data from a live camera rig into a PC running Maya. This streaming data controls Maya’s virtual camera, in the same way that mocap data drives an animated character. Another PC then renders a composite of the motion control camera’s live action and Maya’s virtual scene in realtime. Think Drew Carey’s Green Screen, but in realtime. And although the render is visually simple (the Maya scene is flat shaded and has a 75,000 polygon maximum), the ability to view the composite scene on set is a tremendous advance, allowing actors to view the virtual scene they’re about to interact with. Problems with the virtual set can also be recognized, and changes to the Maya scene files can be made right on the production set, saving huge post-production headaches later on. At the end of each shooting day, Engine Room produces a DVD containing the digitally captured live action, the encoded camera motion data, the Maya scene files and the realtime rendered composite scenes. These new “digital dailies” represent another step in Hollywood’s march from celluloid to pixels.
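
    As a rough illustration of the first half of that pipeline – streamed tracking data driving a virtual camera – the sketch below shows the general technique. The packet layout and names are assumptions made for the example; Engine Room’s system feeds equivalent data into Maya’s camera node.

```cpp
// Minimal sketch of driving a CG camera from streamed tracking data so the
// rendered background matches the live plate frame-for-frame. The packet
// layout and field names are illustrative assumptions, not Engine Room's
// actual protocol.
#include <cstdint>

struct CameraPacket {          // one sample from the motion control rig
    float    position[3];      // stage coordinates, in meters
    float    rotation[3];      // pan, tilt, roll in degrees
    float    focalLength;      // lens focal length in millimeters
    uint32_t frameNumber;      // for syncing with the recorded live action
};

struct VirtualCamera {
    float position[3];
    float rotation[3];
    float focalLength;
};

// Apply one tracked sample to the virtual camera each frame.
void applyTrackedSample(const CameraPacket& pkt, VirtualCamera& cam) {
    for (int i = 0; i < 3; ++i) {
        cam.position[i] = pkt.position[i];
        cam.rotation[i] = pkt.rotation[i];
    }
    cam.focalLength = pkt.focalLength;
}
```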

    Machinima Gets a Day Job
    The preceding techniques may sound very familiar to those involved in the emerging art of Machinima, and in fact previs can be viewed as the professional cousin to the hobbyist Machinima movement. By definition, Machinima is the art of creating animated movies using realtime game engines, typically Unreal Tournament or Quake III: Arena (read a review of the new book Game-Based Filmmaking: The Art of Machinima at Gamasutra: http://www.gamasutra.com/features/20041020/kane_01.shtml). Many gamers are familiar with Rooster Teeth Productions’ Red vs. Blue episodic web series (created with Halo), and their new The Strangerhood series (produced with The Sims 2). But machinima is often constrained by storylines too closely based on the games it is built with, limiting its audience to those predisposed to those games in the first place.

    Breaking those boundaries is a small studio in the heart of old Hollywood founded by three game industry veterans whose vision is to bring a realtime workflow to the mainstream computer animation and effects industries. Extra Large Technology (XLT), comprised of principals David Koenig, Yoni Koenig and Robert Knaack (formerly of Gigawatt Studios), has developed a game-style pipeline not for previs or a behind-the-scenes production task, but rather to produce final, audience-ready animation.

    Like Crack Creative, XLT opted to write custom software rather than repurpose a commercial game engine. Their Windows-based system, MachStudio, could best be described as Maya meets Final Cut Pro. The software features a nonlinear animation timeline providing for multiple layers of animation channels, unlimited camera setups that can be animated via splines or keyframes, and a lighting system that uses a proprietary raycasting shadow technique. An interactive materials editor and full-featured particle generator round out the package. Models, animations and textures are imported directly from Maya. Once loaded, MachStudio’s DirectX-based interactive renderer allows for smooth realtime manipulation of scenes in excess of 3 million polys (including shadows) on a sub-2GHz Pentium box. A muscle-based system for animating skinned characters is in development.
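
    The keyframed channels at the heart of such a timeline follow a well-known pattern, sketched below with simple linear interpolation. This is a generic illustration of the technique, not MachStudio’s implementation; a spline-based channel would blend several neighboring keys instead of two.

```cpp
// Generic sketch of a keyframed animation channel: values placed on a timeline
// and interpolated at playback time. A real system would store many such
// channels per object (position, rotation, focal length, light intensity, ...).
#include <algorithm>
#include <vector>

struct Keyframe {
    float time;   // seconds on the timeline
    float value;  // e.g. one component of a camera position
};

class AnimationChannel {
public:
    void addKey(float time, float value) {
        keys_.push_back({time, value});
        std::sort(keys_.begin(), keys_.end(),
                  [](const Keyframe& a, const Keyframe& b) { return a.time < b.time; });
    }

    // Evaluate the channel at an arbitrary time using linear interpolation.
    float evaluate(float t) const {
        if (keys_.empty()) return 0.0f;
        if (t <= keys_.front().time) return keys_.front().value;
        if (t >= keys_.back().time)  return keys_.back().value;
        for (std::size_t i = 1; i < keys_.size(); ++i) {
            if (t < keys_[i].time) {
                const Keyframe& a = keys_[i - 1];
                const Keyframe& b = keys_[i];
                const float u = (t - a.time) / (b.time - a.time);
                return a.value + u * (b.value - a.value);
            }
        }
        return keys_.back().value;
    }

private:
    std::vector<Keyframe> keys_;
};
```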

    XLT’s goal with this system is to offer professional-level CG animation in a fraction of the time (and at a fraction of the price) of traditional animation methods. It also frees animators from the creative restrictions of traditional game engines, allowing for a visual experience that feels closer to Toy Story than Tomb Raider. Their first project, a series of PSAs for the Los Angeles Transportation Authority, features a bright, saturated environment, lip synch animation on multiple characters, soft shadows, and Jimmy Neutron-style camera work that’s dead-on for the target audience. A subsequent comedic short features a babysitting urban ogre and his cranky child.

    But the price of being on the cutting edge in an established industry can involve real financial risk, or at the very least a lot of extra work. “People need to take the leap with you,” says David Koenig, noting that machinima has not really produced commercial content as of yet. “We’re trying to push the envelope in a very specific way, and so not only are you working to get everything done, but you’re also evangelizing, so the challenge is for everyone to see [our work] as it is as opposed to focusing on how it was made.” He adds: “The real win is building a tool and process that allows many people to be included in the development of animated content. Just as Flash® changed web animation, so a tool like ours can change the way 3D animation is done.” A lofty goal, to be sure. But there’s also no doubt that realtime tools have the potential to change CG filmmaking just as affordable digital video cameras have revolutionized independent live-action films.

    Tools Without Artists
    So what does this paradigm shift mean to the games industry? After all, games and films can be uneasy bedfellows, with the offspring of previous unions often being less than the sum of the two parts. And Hollywood has sometimes perceived game artists and programmers as “low res” versions of their own CG hotshots. But realtime is slowly changing that. Both industries are now competing for the same talent pool, a trend sure to continue as toolsets merge. Visual effects studios will now need expertise in pixel and vertex shaders as well as high-end software shading languages such as RenderMan and Mental Ray. As Joshua Kolden concludes, “The technology has been in place for a couple of years now, but missing is the science of the process. We are leading the charge there, but we’re only halfway done. As the process becomes more codified, the next obstacles will be two-fold. On the one hand, we will have tools without artists. As a new field, no one will have any skills in it. We will be looking hard for people who have a talent for the broad set of skills and creativity required. These will be people who like to tell the story more than they like to do CG, but they’ll have to be rooted in the technology.”

    ###

    Contact info for sources quoted:

    Crack Creative
    723 N. Cahuenga Blvd.
    Los Angeles, CA 90038
    Ph. (323) 962-6402
    www.crackcreative.com

    Engine Room
    655 N. La Peer Dr.
    West Hollywood, CA 90069
    Ph. (310) 860-9100
    www.engineroomvfx.com

    Extra Large Technology
    contact@extralargetech.com

    *Steve Boelhouwer is the Vice President of Creative Services for Vendare Media, a Los Angeles-based games and media company. He has been involved in digital content creation since 1994 and is a contributor to Game Developer magazine and other publications. Contact Steve at sboelhouwer@sbcglobal.net
