Sunday, October 26, 2014

BEHIND THE SCENES OF GUARDIANS OF THE GALAXY


Ben Davis, ASC was the cinematographer on “Guardians of the Galaxy” and was tasked with translating the comic book into reality. The majority of the film was photographed on the Arri Alexa XT shooting ARRIRAW (framed for 2.40:1) with Zeiss Master Primes. However, to give the intro a different feel, they employed a set of JDC Cooke Xtal (Crystal) Express anamorphic prime lenses.
James Gunn framing up with a JDC Cooke Xtal (Crystal) Express anamorphic prime lens
The cinematographer mentions that he was incredibly inspired by the concept art for the film and tried to bring that feeling into the photography. Concept art is a somewhat hidden craft in the film industry, but it has the power to inspire the entire crew.

EXT. Morag Temple

The production built a large portion of the Morag Temple set, surrounded it with blacks or chroma blue screen, and MPC later extended it digitally. It is usually best to photograph the character on a physical floor, because dust, footprint, and shadow interactions can be difficult to create in post, especially on uneven terrain.
Morag Temple – Live-Action Plate
If you examine the original live-action plate, you’ll see two large units (18Ks?) suspended by a crane. Each light creates a hot spot on the ground and some atmospheric lighting effects. While the general composition of the frame with regard to the hot spots is preserved, the final CG lighting changes the direction of the light from backlight to three-quarter front light.
Final shot after CG environment extensions
To get perfect crepuscular rays, or “God rays,” on a set that big, you would need the actual sun and then a huge set piece to control how much sunlight came into the scene. In CG, it is much easier to control the amount of atmospheric perspective and to art direct the direction and quality of the rays.
A scale model of the Morag Temple set
On a union feature film, it is common practice for the production designer to build a scale model of the sets. This is a great tool for communicating with the director and, ultimately, for building the actual set. Looking at this particular model, one might instead design the set in a digital sculpting program like ZBrush or Mudbox and then 3D print it, and the previs or postvis team could reuse the same 3D model of the set.
Chris Pratt running full speed while a Libra Head keeps the shot smooth
To shoot FAST running shots, this Picture Car + Libra head combo is pretty popular. For moderate-speed shots a Steadicam may be sufficient, but to handle the extreme bumps and accelerations the stabilized Libra head is required. For the normal Technocrane shots, however, the production used a stabilized Scorpio head.

INT. Klyn Prison

The Klyn Prison was a huge multi-tiered set that took over 15,000 DMX channels to control all of the lights. For the ambient “space light,” there were several 20×20 overhead soft boxes with Panalux FloBank tungsten fluorescents (gelled with ½ CTB) shooting through a full grid (dyed with Lee Filters 728 Steel Green). For architectural accents and oppressive top-lighting, hundreds of par cans were hung above the set. The rest of the lighting was built right into the set and was a combination of LED and fluorescent fixtures.
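To put 15,000 channels in perspective: the DMX512 standard carries 512 channels per universe, so a rig this size spans roughly 30 universes of control data. A back-of-the-envelope sketch (the 15,000 figure is the production's; the per-universe limit comes from the standard):

```python
import math

# DMX512 carries a fixed 512 channels per universe, so a large rig
# must spread its channels across many universes.
TOTAL_CHANNELS = 15_000          # figure quoted for the Klyn Prison set
CHANNELS_PER_UNIVERSE = 512      # fixed by the DMX512 standard

universes = math.ceil(TOTAL_CHANNELS / CHANNELS_PER_UNIVERSE)
print(universes)  # 30
```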
Klyn Prison wide shot
In this wide shot you see the blue soft boxes overhead and the par cans beaming down on the set. The scene is being filmed with a SuperTechno 50 and a Stabilized Scorpio Head. You can also clearly see where the spotlight is focused.
CDB GIF Breakdown: In this establishing shot of the prison, the production employs a SuperTechno 50 and an overslung stabilized Scorpio head. The shot begins wide, pushes into some inmates fighting, and then moves down to reveal Star-Lord looking around.
Klyn Prison – Handheld shot with the Alexa
Klyn Prison, multi-camera fight scene
In this shot you can see three cameras filming a fight sequence.
Wide Shot: Fisher10 Dolly with a fluid head
MCU Drax: Fisher10 Dolly handheld
MCU Drax (low angle): Handheld laying on the floor
And three 1st ACs pulling focus with their Preston FI+Z remote lens control systems.
Klyn Prison – Behind the Scenes

EVERYTHING WE KNOW ABOUT THE RGB+Z ARRI MOTION SCENE CAMERA

In 2013, Arri debuted the Arri Motion Scene Camera at IBC. The system combines a traditional Arri Alexa Studio (with its mirror shutter) and a time-of-flight IR depth sensor. The camera is the result of a European research project called SCENE. The Motion Scene camera can capture traditional RGB color data along with Z-axis depth data from the same entrance pupil (lens/sensor) with the same field of view.
If you want to see the indie version (DSLR and an Xbox Kinect sensor) and what can be done with the new hybrid RGB+Z format, check out The Camera of the Future with Specular.
Add a real-time position/rotation tracking system, like the Google Tango, and you have the makings of a futuristic virtual production camera system. More on that later; let’s jump right into the tech behind the Arri Motion Scene camera.

Time of Flight & Trifocal

Arri Motion Scene Camera at IBC 2012
Arriflex has teamed up with SCENE to create an extended Arri Alexa Studio that records depth information through the same lens as the RGB color data. Two key technologies gather the depth data.
1. Time-of-flight sensors, which send infrared (IR) light into the scene and record the reflections. This is similar to the technology used in the PrimeSense depth sensor inside the first Microsoft Kinect.
Arri Motion Scene Camera with early time-of-flight IR sensors
You will see in the early prototype that they have several small white exposed breadboards with IR emitters and sensors. In the IBC prototype, these are cleaned up and mounted on the sides.
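The time-of-flight principle itself is simple to sketch: the sensor times how long an IR pulse takes to bounce back, and depth is half the round trip. A minimal, hypothetical Python illustration (the timing numbers are invented for the example):

```python
# Pulsed time-of-flight: depth = (speed of light * round-trip time) / 2.
C = 299_792_458.0  # speed of light, m/s

def tof_depth_m(round_trip_s: float) -> float:
    """Depth from a measured IR round-trip time, in meters."""
    return C * round_trip_s / 2.0

# A reflection returning after ~20 nanoseconds is about 3 m away.
print(round(tof_depth_m(20e-9), 2))  # 3.0
```

The nanosecond timescales involved are why these sensors are noisy and low resolution compared to the color image.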
2. Stereo or trifocal camera data. By using multiple cameras with identical focal lengths at known, fixed distances from each other, additional depth data can be extracted via photogrammetry.
Arri Motion Scene Camera with an Alexa M and 4 trifocal cameras
You will see an additional Alexa M camera using the same focal length lens as the primary, forming a stereo pair. In addition, there are approximately four trifocal cameras set up arbitrarily, depending on the scene.
Alexa M used as a stereo pair with the Arri Motion Scene camera
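The stereo side can be illustrated with the textbook rectified-stereo relation Z = f·B/d, where f is the focal length in pixels, B the baseline between the cameras, and d the disparity of a matched feature. A hedged sketch (the numbers are invented; SCENE's actual photogrammetry pipeline is certainly more involved):

```python
# Depth from disparity for a rectified stereo pair: the closer an
# object is, the farther a feature shifts between the two images.
def depth_from_disparity(focal_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Z = f * B / d, in meters."""
    return focal_px * baseline_m / disparity_px

# e.g. 2000 px focal length, 0.2 m baseline, 100 px disparity -> 4 m
print(depth_from_disparity(2000.0, 0.2, 100.0))  # 4.0
```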

Calibration

Arri Scene Camera calibration charts
Getting all of these different cameras to speak the same language is one of the challenges of this approach. First, all of the cameras must share a common coordinate system, which is accomplished by photographing a large checkerboard and calculating their offsets.
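One way to picture that registration step: once two cameras each report 3D positions for the same checkerboard corners, solving for the rigid transform between the two point sets puts them in one coordinate system. A toy sketch using the Kabsch algorithm with invented point data (not SCENE's actual solver):

```python
import numpy as np

def rigid_transform(a: np.ndarray, b: np.ndarray):
    """Find rotation R and translation t so that R @ a_i + t ~= b_i
    (Kabsch algorithm on two matched 3D point sets)."""
    ca, cb = a.mean(axis=0), b.mean(axis=0)
    h = (a - ca).T @ (b - cb)
    u, _, vt = np.linalg.svd(h)
    d = np.sign(np.linalg.det(vt.T @ u.T))  # guard against reflections
    r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
    return r, cb - r @ ca

# Checkerboard corners as seen by camera A, and the same corners
# expressed in camera B's frame (here, simply translated).
board = np.array([[x, y, 0.0] for x in range(3) for y in range(3)])
true_t = np.array([0.5, 0.0, 2.0])
r_est, t_est = rigid_transform(board, board + true_t)
print(np.allclose(t_est, true_t))  # True
```

A real calibration would first recover each camera's intrinsics and pose from the 2D checkerboard images (e.g. with OpenCV), but the end result is the same kind of offset between coordinate systems.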
Second, each camera has a different frame rate and refresh rate, so you’ll see on their slate that they use an LED array to measure the delay of each camera, and then erase that delay in post.
Traditional timecode slate with an LED array to calculate the delay of the different camera systems
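The LED-slate idea can be sketched as a counter ticking at a known rate: whatever count each camera photographs reveals when its frame was exposed, and the difference between cameras is the delay to remove in post. A toy illustration (the 1 kHz tick rate and the counts are assumptions, not the slate's real spec):

```python
# An LED counter ticks at a known rate; each camera photographs
# whatever count was lit when its shutter opened.
LED_TICK_HZ = 1000.0  # assumed tick rate of the LED counter

def camera_delay_ms(count_a: int, count_b: int) -> float:
    """Delay of camera B relative to camera A, in milliseconds."""
    return (count_b - count_a) / LED_TICK_HZ * 1000.0

# Camera A photographed count 4210, camera B count 4223 on the
# same event: B lags A by 13 ms.
print(camera_delay_ms(4210, 4223))  # 13.0
```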
Finally, each camera system has an inherent signal-to-noise ratio that must be respected. Several of these cameras do not have an adjustable iris/shutter/exposure, so the scene has to be lit to a constant light level.

Arri Motion Scene Cinematography

Set lit by KinoFlos and LED lights
One of the major limitations of using time-of-flight IR sensors for measuring depth is that you can’t use tungsten or HMI light sources. Both of those sources emit IR and would confuse the sensor, so as a cinematographer you are left with KinoFlo (fluorescent) or LED lighting.
Arri Motion Camera with an IR-coated Zeiss lens
The primary Motion Scene / Alexa Studio camera uses a lens with a special IR coating to help enhance the quality of the depth capture. The rest of the cameras use “normal,” unmodified lenses.

Processing Depth Data

Currently, TOF/IR sensors are very low resolution and relatively noisy. The software companies that are part of the SCENE project are busy figuring out ways to get the best quality depth data from the Arri Motion Scene camera.
At this point, they can view and work with the color-mapped depth data in real time for virtual production and monitoring.

Applications

There are a lot of possible applications of the RGB+Z format but the two most relevant to cinematographers are as follows.

Eliminating the need for chromakeying (blue/green screens)

Arri Motion Scene camera Z-depth data for isolating elements of the frame
The post-production team would be able to isolate people and backgrounds using the depth data, eliminating the need to light large chroma-key blue/green screens. The mattes are very noisy at this point, but with better hardware and software this technique could very well replace chromakeying.
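Conceptually, depth keying is just thresholding the Z channel instead of a color channel. A minimal sketch (a real pipeline would denoise the depth and feather the matte edges):

```python
import numpy as np

def depth_matte(z: np.ndarray, cutoff_m: float) -> np.ndarray:
    """1.0 where the pixel is closer than cutoff_m, else 0.0."""
    return (z < cutoff_m).astype(np.float32)

# A tiny 2x3 depth map in meters: actor at ~2 m, wall at ~6 m.
z = np.array([[2.1, 2.0, 6.2],
              [2.2, 6.0, 6.1]])
m = depth_matte(z, 4.0)
print(m)
```

Unlike a green screen, nothing in the scene has to be a particular color, and the cutoff can be changed per shot in post.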

Relighting in Post / Virtual Production

LIDAR scan of the set with 360 color / texture data
Using a combination of traditional long-range LIDAR scans and 360 HDR photography, it is possible to create a 3D model of the set along with its lighting. Using the depth data from the Scene Camera, it would then be possible to interactively change the lighting of the set AND the talent in post. Or, on a virtual production set, you could change the lighting of the set in CG instead of using real-world lights.
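A toy version of that relighting idea: given a surface normal recovered from the depth data, re-shade each point with a virtual light using the Lambertian model I = albedo · max(0, n·l). A simplified, hypothetical sketch (real relighting would also need materials, shadows, and the captured HDR environment):

```python
import numpy as np

def lambert(normal: np.ndarray, light_dir: np.ndarray,
            albedo: float = 0.8) -> float:
    """Diffuse shading: albedo * max(0, n . l) with unit vectors."""
    n = normal / np.linalg.norm(normal)
    l = light_dir / np.linalg.norm(light_dir)
    return albedo * max(0.0, float(n @ l))

# The same surface point, lit from the front vs. from behind.
print(round(lambert(np.array([0, 0, 1.0]), np.array([0, 0, 1.0])), 2))   # 0.8
print(round(lambert(np.array([0, 0, 1.0]), np.array([0, 0, -1.0])), 2))  # 0.0
```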
A “Ladybug” 360 camera for capturing the lighting and texture of a scene
SCENE has shown a demo of this technology. The geometry and lighting are very rough, but the concept is clear.
A demo of relighting a scene using the depth data in post

Wednesday, October 15, 2014

A Short Documentary featuring Behind The Scenes footage.

Iron Man, the Movie © 2008 MVL Film Finance LLC.
Iron Man, the Character: ™ & © 2008 Marvel Entertainment.
All Rights Reserved.

Note: I do not own this content. I uploaded this video for entertainment and educational purposes. It is the copyrighted material of Paramount Motion Pictures and Marvel Entertainment.
It is featured in Iron Man Blu-ray.