BEHIND THE SCENES: first feature film shot entirely on Prius backup camera
Green Screen Tips, Tricks and Materials – Chromakey Tutorial
The camera manipulates the elements of the scene, directing viewers to what they need to see and know, and shaping how they feel about what is going on – the story. The composition of a shot affects the mood and determines which elements in the frame are visible and most important.
Film gate setup for photographic matching. In this article we will try to understand how to set the film gate of Maya's cameras in order to virtually recreate the optical conditions of a real camera. This way, no matter what device is used to capture the images, from a smartphone to a professional camera, we will be able to reproduce the same angle of view in 3D.
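The relationship behind this kind of matching is the standard pinhole formula: angle of view follows from the film back width and the focal length. A minimal sketch in Python (the function name and the 36 mm film back are assumptions for the example, not Maya's API):

```python
import math

def angle_of_view(film_back_mm, focal_length_mm):
    """Horizontal angle of view, in degrees, for a given film back
    width and focal length (standard pinhole relationship)."""
    return math.degrees(2 * math.atan(film_back_mm / (2 * focal_length_mm)))

# A full-frame 36 mm wide film back with a 35 mm lens gives roughly 54 degrees.
# (Note: Maya expresses its Camera Aperture attribute in inches, so a
# 36 mm back would be entered as about 1.417 in.)
fov = angle_of_view(36.0, 35.0)
```

Matching this number between the real camera and the CG camera is what lets the two line up regardless of the capture device.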
The Kúla Deeper stereo lens is attached to a camera lens thread for high quality stereoscopic 3D photography.
- Use existing lens features like VR, autofocus and metering.
- View images in 3D on the camera display using the included stereo viewer
- Generate any 3D format using the accompanying image processing software Kúlacode
- Compatible with Nikon, Canon, Sony, Sigma and the rest of the gang.
Kúla Bebe 3D lens is attached to any smartphone with a simple clip. It comes with a paper stereoviewer for smartphones, the CinemaBox for viewing the 3D content right away. To make sure you have the fun you deserve, Kúla Bebe also comes with old school red/cyan anaglyph glasses. Kúla Bebe is in production and the limited first batch will be delivered autumn 2017.
Making of Mad Max: Fury Road from ACS Victoria with John Seale ACS ASC and David Burr ACS
Until they’ve experienced being on the other side of the camera, moviegoers might not realize that they’ve been kept engaged throughout the storyline by the range of camera shots in the film. It’s not just writing and acting that push the narrative forward (don’t get me wrong — they’re very important too!); the actual composition of each frame can mean something and keep viewers interested.
I’d done Wolf Creek 2 with Greg McLean here in Australia, and then I moved to the US. I’ve got two kids, teenagers now, so it was a big call and there was risk involved, but I was lucky. Soon after I got there, within two or three months, Greg rang up and said, I’m doing a Blumhouse movie – are you interested? I did a good job on The Darkness – Greg was happy and they were happy – and then they asked me to do some reshoots, which Blumhouse almost always do. They’ll change the ending or fix something that’s not quite right.
Making his feature debut behind the camera on 1994’s black and white prison drama, Everynight… Everynight, Toby Oliver has worked in a wide variety of genres and modes over the subsequent couple of decades, having shot the likes of Looking for Alibrandi, Tom White, the teen series Lockie Leonard, Last Train to Freo, and Beneath Hill 60. But of late he’s been in demand as a cinematographer of horror, culminating in his work on the most successful and culturally resonant chiller in recent memory, Jordan Peele’s Get Out.
Here’s how shooting with a shallow depth of field can immerse your audience like nothing else. Choosing how you will shoot your project might be just as hard as writing the script. With all the different lenses and cameras available today, choosing the right piece of equipment can be difficult. The creator and director of 2017’s The Handmaid’s Tale is a shining example of knowing exactly how to communicate ideas and tell stories — all through the lens.
Avengers: Age of Ultron Scenes Shot with BlackMagic PCC. Cinematographer Ben Davis used Blackmagic Design Pocket Cinema Camera for much of the footage for super hero blockbuster.
Daniel Smukalla: Director of Photography Seoul, South Korea.
DJI Inspire 1: Everything you need for aerial filmmaking, integrated into an elegant, ready-to-fly system.
Kickstarter: Color 1280×1024 High Speed Video Camera – Up to 17,791 Frames/Sec. – Powerful – Easy to Use – Price Starting Under $5K
GoPro Shrinks the Camera Again: Hero4 Session Review. It’s not GoPro’s best-quality camera, but it’s simple to use and the size of an ice cube; $400 sticker shock
‘BRINDABELLAS | edge of light’ features the sky and landscapes of the Canberra region of Australia – in particular the Brindabella Ranges – captured in monochromatic (near) infrared. This feature-length film (140+ minutes in total) focuses on the interplay of mountain light, air and water as these elements are transformed across the seasons – from clouds to mist, rain and snow – then frost and ice – and onto creeks and rivers. It explores both the wider montane vistas of the Brindabellas and the more intimate details of the natural flows that are created by these mountains and, in turn, shape the very landscapes they arise from.
40 Incredible Examples Of Infrared Photography Because everyday objects reflect infrared in proportions that differ sharply from that of visible light, the tonal relationships are wildly unexpected. Such near-infrared techniques used in photography give subjects an exotic, antique look. Green vegetation becomes white, whereas human skin becomes pale and ghostly. The resulting images look alien.
Exploring Infrared Cinematography opens up a whole new spectrum of light not visible to the unaided eye. This has the potential to give otherwise ordinary scenes a surreal and dream-like appearance. In this article, we explore several of the unique applications and technical hurdles.
How to Interpret Common False Color Images Though there are many possible combinations of wavelength bands, the Earth Observatory typically selects one of four combinations based on the event or feature we want to illustrate. For instance, floods are best viewed in shortwave infrared, near infrared, and green light because muddy water blends with brown land in a natural color image. Shortwave infrared light highlights the difference between clouds, ice, and snow, all of which are white in visible light.
Staples VR is a complete VR production studio providing both end to end solutions and consultation services for a variety of clients and industry leaders. We operate with the newest and highest quality equipment for image capture and 360 degree VR capture such as the Jaunt, Nokia OZO, Red, Arri, Blackmagic, Z-Cam, Sony, Panasonic and custom-built solutions such as our world-first fireproof VR capture system, and our underwater and aerial solutions. We specialise in dynamic camera movement, from aerial drone, cable cam and crane to dolly and underwater, enabling us to add that something special to your production. We pride ourselves on our ambisonic audio capture and offer full spatial post production mixing and sound design.
Jaunt ONE is a professional-grade camera system specifically designed for capturing high-quality stereoscopic 360º cinematic virtual reality experiences. Built from the ground up with visionary VR creators in mind, Jaunt ONE has proven itself in the hands of the world’s top filmmakers, studios, and networks.
Jaunt ONE offers industry-leading stereoscopic capture quality and a suite of tools for camera control and data management. It features 24 camera modules with frame-sync, global shutter, 10-stops of dynamic range, and custom 130º FOV lenses with a fixed f/2.9 aperture. Additional features include support for 120 frames per second capture and a live viewfinder. Jaunt ONE and its complementary workflow present content creators with an end-to-end solution for filming high quality cinematic virtual reality content.
Introducing Jaunt ONE – Cinematic VR Camera – Jaunt One Demo Sydney by Staples VR
The Jaunt ONE is a professional-grade, stereoscopic cinematic VR camera built from the ground up and designed with visionary VR creators in mind. Staples VR will put the camera through its paces and explain the good, the bad and the amazing when it comes to the system. They offer client support for visualisation, post production, timeline costing, testing, consulting and access to equipment, and work well in collaboration.
Staples VR lead the way when it comes to live action 360 video and content creation. What can you do with the technology? They have worked with clients to make high-end quality VR including live action capture, gamified interactivity, photogrammetry, LiDAR scanning, installations, training, equipment resourcing, and live-streamed VR that can be mixed with pre-recorded material while swapping between different camera systems. They have built a huge range of skills and techniques to get the most out of your capture systems and post workflow for the entertainment, medical, architectural, OH&S, forensic training, military training, building installation, airline, factory, education and telecommunication industries, to name a few, and are at the forefront of this rapidly expanding industry.
New Zealand Fire Service today launched a world-first initiative – a 360 degree and virtual reality (VR) experience – Escape My House available online now. For the first time ever, the public can experience a real house fire first-hand and, along the way, learn why they need an escape plan.
LYTRO CINEMA Imagine working on footage in visual effects and having the ability to change the depth of field or focal point of a scene on the fly. Change the frame rate and have a physically accurate shutter angle appropriate for that rate. Reposition the camera in the scene. Drop 3D objects into your scene and have them properly occluded by the live action content. Pull mattes by setting a near depth and a far depth and extracting an object. Effectively, think of having many of the benefits of deep compositing for live action scenes.
Proof of Concept (PoC) is a realization of a certain method or idea in order to demonstrate its feasibility or a demonstration in principle with the aim of verifying that some concept or theory has practical potential. A proof of concept is usually small and may or may not be complete.
Photogrammetry is the science of making measurements from photographs. The input to photogrammetry is photographs, and the output is typically a map, a drawing, a measurement, or a 3D model of some real-world object or scene. Many of the maps we use today are created with photogrammetry and photographs taken from aircraft.
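One of the basic measurements behind aerial photogrammetry is ground sample distance: how much ground one sensor pixel covers, from similar triangles between the sensor and the terrain. A small illustrative sketch (the function name and the numbers in the example are assumptions):

```python
def ground_sample_distance(pixel_pitch_m, altitude_m, focal_length_m):
    """One sensor pixel of size pixel_pitch, at a given altitude and
    focal length, covers pixel_pitch * altitude / focal metres on the
    ground (similar triangles through the lens centre)."""
    return pixel_pitch_m * altitude_m / focal_length_m

# Example: 4.4 micron pixels, 100 m altitude, 35 mm lens
# -> each pixel covers roughly 1.26 cm of ground.
gsd = ground_sample_distance(4.4e-6, 100.0, 0.035)
```

This is why flying lower or using a longer lens yields finer measurements from the same camera.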
There are different camera systems, and camera modules are getting smaller. Some examples along the way are:
- optic-cluster-based systems where VR started: the GoPro rigs and mini bundles
- cluster-based panoptic camera systems – multi-camera face-tracking systems suitable for large wired camera networks
- mirror rigs with parabolic mirrors and larger cameras, improving quality but expensive and difficult to use
- light field systems such as Lytro, recording refractions and reflections across an entire region of space. A light field camera, also known as a plenoptic camera, captures information about the light field emanating from a scene; that is, the intensity of light in a scene and also the direction the light rays are travelling in space. This contrasts with a conventional camera, which records only light intensity. Lytro is building the world’s most powerful light field imaging platform, enabling artists, scientists and innovators to pursue their goals with an unprecedented level of freedom and control. This revolutionary technology will unlock new opportunities for photography, cinematography, mixed reality, and scientific and industrial applications.
- volumetric systems capturing an area; volumetric video requires large amounts of storage and at the moment has no suitable way of being played back.
The Jaunt ONE camera has 24 independent camera modules, with 8K resolution (minimum 4K) in a stereoscopic field of view. It uses every second camera module to give a slightly offset parallax view for the left and right eye, creating a sense of depth, and the result is only playable back through headsets. It can shoot frame rates up to 120 fps for slow motion, although at that rate it can only output 4K. You can preview the camera system, change ISO values, run it on 12 volts, and manage a workflow of changing batteries and camera modules. The parallax information becomes your depth map across the full 360º arc, which you can use to incorporate VFX elements.
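The way parallax between adjacent modules turns into depth follows the standard stereo relation Z = f·B/d: the bigger the disparity between the two eyes' views, the closer the object. A minimal sketch (the function name and the example figures are illustrative, not Jaunt's pipeline):

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Standard stereo triangulation: distance Z = f * B / d, where
    f is the focal length in pixels, B the baseline between the two
    camera modules, and d the measured pixel disparity."""
    return focal_px * baseline_m / disparity_px

# Illustrative numbers: 1000 px focal length, 64 mm baseline
# (roughly human interocular distance), 8 px disparity -> 8 m away.
z = depth_from_disparity(1000.0, 0.064, 8.0)
```

Applying this per pixel across every stereo pair is what produces a 360º depth map for VFX integration.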
Artefacts can appear along the stitch line when something is very close to or far from the camera, so you want to stitch with only one stitch line, or as few as possible. Optical flow stitching analyses every pixel of the frame, comparing it to the frame before and after to track movement, so the system knows what is moving and what is staying static. It is not a perfect system: the algorithms can fail to match pixels from one camera to the next when things get too close to the lenses, within the minimum parallax distance where two lenses cannot see the same material. The other issue is motion blur from objects moving too quickly through the frame; when the camera cannot see an object in its entirety it is harder to find the stitch points. Shooting at a higher frame rate will give clearer pictures, even if you play back at 30 fps.
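The core idea of comparing each frame with the frames before and after to separate moving from static pixels can be sketched very crudely (a NumPy toy using simple frame differencing, not the actual optical flow algorithm; the function name and threshold are illustrative):

```python
import numpy as np

def motion_mask(prev_frame, frame, next_frame, threshold=10):
    """Mark a pixel as moving when it differs from both the previous
    and the next frame; static pixels are the safe ones to blend
    across a stitch line."""
    d1 = np.abs(frame.astype(int) - prev_frame.astype(int))
    d2 = np.abs(frame.astype(int) - next_frame.astype(int))
    return (d1 > threshold) & (d2 > threshold)

# Tiny example: a single pixel flashes bright in the middle frame.
prev = np.zeros((4, 4), dtype=np.uint8)
cur = prev.copy()
cur[1, 1] = 200
mask = motion_mask(prev, cur, prev)
```

Real optical flow estimates a motion vector per pixel rather than a binary mask, which is also why fast motion blur defeats it: blurred pixels no longer match cleanly between neighbouring frames or cameras.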
The camera needs to be level; the image is stereoscopic around the middle and the effect tapers off at the top and bottom. Assume the audience is on the same eyeline as the camera: if viewers choose to look at the skyline, you cannot control the angle at which they view that object and it will not play back correctly. Depth is created through the left and right eye views, and our brain focuses somewhere in the middle.
Some issues to consider when choosing a camera: keep a safe distance of 2.5 metres from the camera; avoid images that move too fast; move the camera so as to minimise haloing, artefacting and morphing; and watch out for repeating textures, semi-transparent surfaces and the moiré effect (large-scale interference patterns produced when an opaque ruled pattern with transparent gaps is overlaid on another similar pattern; for the moiré pattern to appear, the two patterns must not be completely identical – they must be displaced, rotated, or have a different but similar pitch). Use 8 modules out of the camera; it does not like a single orbit.
It pulls up every camera module so you can see its placement and exposure and correct accordingly. Each camera has up to 10 and a half stops, but the system can reach up to 18 stops of dynamic range in a gradient across the rig. In a situation such as the camera placed between a window and the action, you can expose correctly for skin tone on one side and for the window and outside exposure on the other, and the cameras in between will step incrementally so the whole system works together accurately, effectively doing multiple exposures.
BREKEL Tools for Markerless Motion Capture
FACESHIFT STUDIO is a facial motion capture software solution which revolutionizes facial animation, making it possible at every desk. The software analyzes the facial movements of an actor and describes them as a mixture of basic expressions, plus head orientation, and eye gaze. This description is then used to animate virtual characters for use in any situation where facial animation is required, such as movie and game production.
VICON Global leader in Motion Capture Cameras, software and Mo-cap systems for the VFX, life science … Vicon technology used to create groundbreaking AAA game …
Motion Capture Tutorial (Xbox Kinect)
Guardian photojournalist Sarah Lee was granted exclusive access to the set of Coldplay’s groundbreaking, Mat Whitecross-directed music video at the Imaginarium, a digital motion capture studio set up by Andy Serkis and Jonathan Cavendish in London
Cinematography, Film Craft Series by Mike Goodridge and Tim Grierson
Framed Ink: Drawing and Composition for Visual Storytellers by Marcos Mateu-Mestre
Nearmap delivers high-resolution aerial imagery direct to your device, within days of capture.
Professional Aerial Photography, or PAP, can handle all of your photographic needs, specialising in aerial photography using state-of-the-art drones.
Capturing Digital Images (The Bayer Filter) – Computerphile
Lighting with one light.
Information about filming shots, cameras, lighting and composition.
The Changing Shape of Cinema: The History of Aspect Ratio
Robert Rodriguez – Ten Minute Film School
Steady Feathers: LG G2 – The most extreme camera ever “Chicken”
Middle grey is when equal amounts of white and black paint are mixed together.
Reflected and incident
H and D CURVES
Toe and shoulder
Highlights, middle tones and shadows.
Contrast is the tonal difference between the highlight and shadow areas. It is dependent on the amount of exposure of the shadows as well as on development.
Density is the difference between the amount of light striking the film and the amount of light that actually passes through it – how much transmitted light makes it through the film. The differences in density determine its contrast.
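Density is conventionally expressed on a log scale: the base-10 logarithm of the ratio of incident to transmitted light, so each whole unit of density is another factor of ten blocked. A small sketch (the function name is illustrative):

```python
import math

def optical_density(incident, transmitted):
    """Photographic density: log10 of the ratio of light striking the
    film to light passing through it. Density 1.0 transmits 10% of
    the light, density 2.0 transmits 1%, and so on."""
    return math.log10(incident / transmitted)
```

On this scale, the slope of the H and D curve between two exposures is exactly the contrast the surrounding notes describe.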
Each zone receives twice or half the light of the one before or after it.
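That doubling-or-halving relationship means each zone's exposure relative to middle grey is just a power of two. A one-line sketch, assuming Zone V as the middle-grey reference (the function name is illustrative):

```python
def zone_exposure(zone, middle=5):
    """Relative exposure of a zone: each step doubles or halves the
    light, with Zone V (middle grey) as the reference of 1.0."""
    return 2 ** (zone - middle)
```

So Zone VI receives twice the light of Zone V, and Zone IV half of it, exactly one photographic stop per zone.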
Film is 24 fps and video is 25 or 30 fps. Film cameras can be overcranked to higher frame rates so that, when played back at normal speed, the images appear slower, or undercranked, where the footage appears faster when played back. With a standard shutter at 24 fps the exposure is 1/48th of a second; at half speed, 12 fps, the exposure increases to 1/24th of a second; doubled, at 48 fps, the exposure is reduced to 1/96th of a second.
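Those exposure times all follow from the 180º rotating shutter: the film is exposed for half of each frame interval. A short sketch of the arithmetic (the function name is illustrative):

```python
def exposure_time(fps, shutter_angle=180.0):
    """A rotating shutter exposes each frame for (angle / 360) of the
    frame interval, so a 180-degree shutter exposes for half of 1/fps."""
    return (shutter_angle / 360.0) / fps

# 24 fps -> 1/48 s, 12 fps -> 1/24 s, 48 fps -> 1/96 s,
# matching the figures quoted above.
```

Changing frame rate with a fixed shutter angle therefore changes exposure in lockstep, which is why overcranking requires more light.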
Film/light sensor where the light-sensitive material of the camera is located.
Lens: the optical part of the camera, focusing rays of light onto a recording medium such as film, which runs through a film gate consisting of the pressure plate and the aperture plate; the aperture plate contains the film aperture, and in front of the aperture is the shutter. A camera lens contains multiple lens elements for focusing the light onto the plane at the back of the camera, and an iris controlling the size of the opening, or aperture setting, which controls how much light gets into the camera. The film aperture defines the shape of the image on the film.
The shutter controls the amount of light that enters the camera through the speed at which it opens and closes, varying the time of light exposure; it does not control the intensity of the light. Increasing the shutter speed decreases the exposure; decreasing the shutter speed increases the exposure – how long light flows, anywhere from 1/8th of a second to 1/12,000th of a second. It also controls the rendering of motion: slow shutter speeds blur objects moving faster than the shutter. Still cameras use leaf or focal-plane shutters, and movie cameras use a rotating circular shutter with a 180º cut-out.
The aperture, or diaphragm, controls the amount of light entering the lens through a variable opening; changing the size of this opening changes the amount of light that passes through, making it the controller of the intensity or brightness of light. The f-stop is the number that equals the focal length of a lens divided by the diameter of the effective, or relative, aperture. From f/8 to f/11 the light is halved; from f/11 to f/8 it is doubled; f/2.8 is faster than f/3.5, with each change in exposure representing a “stop”. The aperture also controls the depth of field.
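Because light gathering scales with aperture area, the light ratio between two f-numbers is the square of their ratio, which is why f/8 to f/11 is (almost exactly) one stop. A small sketch of both definitions (function names are illustrative):

```python
def f_number(focal_length_mm, aperture_diameter_mm):
    """The f-stop is focal length divided by effective aperture
    diameter: a 50 mm lens with a 25 mm opening is f/2."""
    return focal_length_mm / aperture_diameter_mm

def light_ratio(n_from, n_to):
    """Light gathered scales with aperture area, i.e. inversely with
    the f-number squared; going from f/8 to f/11 roughly halves it."""
    return (n_from / n_to) ** 2
```

The familiar series f/1, f/1.4, f/2, f/2.8, f/4, f/5.6, f/8, f/11, f/16 is just successive powers of the square root of two, one stop per step.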
Critical plane focus is where the actual focus of the lens falls.
Depth of field is the area or region of sharpest focus; the near plane and the far plane bracket the depth of field. This is not the critical plane focus, which is where the actual focus of the lens falls. The depth of field lies before and after the critical plane of focus, functioning as a region of sharpness that compresses or expands as dictated by the aperture opening. Wide apertures (smaller f-stop numbers) have a narrow depth of field because the far and near planes of focus are closer to each other. With a small aperture (larger f-stop numbers) the depth of field increases, the near and far planes moving further apart, resulting in more of the scene being in sharp focus.
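The positions of those near and far planes can be computed with the common hyperfocal-distance approximation. A sketch, assuming a 0.03 mm circle of confusion (the function names and all figures are illustrative):

```python
def hyperfocal(f_mm, n, coc_mm=0.03):
    """Hyperfocal distance H = f^2 / (N * c) + f, the focus distance
    beyond which everything out to infinity is acceptably sharp."""
    return f_mm * f_mm / (n * coc_mm) + f_mm

def dof_limits(f_mm, n, subject_mm, coc_mm=0.03):
    """Near and far limits of acceptable sharpness (common
    approximation) for a lens focused at subject_mm."""
    h = hyperfocal(f_mm, n, coc_mm)
    near = h * subject_mm / (h + (subject_mm - f_mm))
    far = h * subject_mm / (h - (subject_mm - f_mm))
    return near, far

# 50 mm lens focused at 3 m: wide open at f/2.8 versus stopped
# down to f/11 -- the f/11 bracket is much deeper.
wide = dof_limits(50, 2.8, 3000)
stopped = dof_limits(50, 11, 3000)
```

Note how the critical plane always sits inside the bracket, and the bracket widens as the f-number grows, matching the description above.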
The focus control mechanism controls the distance of the lens to the film, moving the lens assembly towards or away from the film plane.
The focal length is the distance between the centre of the primary lens and the film. A shorter, wider focal length gives a more pronounced perspective and a larger sense of depth, with more of the scene visible in the image, at about 24mm or less. A longer focal length produces images that exhibit a flatter look with less sense of depth and less of the scene, at about 50mm or greater. Prime lenses have a fixed focal length; variable or zoom lenses are those where the focal length can change.
When you zoom in it looks as if you walked towards the subject with a prime lens, but it is a two-dimensional move that simply magnifies the image. A dolly move is a three-dimensional move that displaces objects in the frame. Zooming in and dollying in produce different relationships between foreground and background elements; they have different perspectives. Where depth is concerned, parallax dictates that moving a camera forward (i.e. a tracking shot) distorts closer objects more prominently than those further away, whereas zooming in effectively works as a crop, affecting the entire image in equal measure (note the size of the vinyl record in the images below). Check the relationship between the foreground and background: if they do not move independently, then it is probably a zoom.
When an image is in focus the lens is bending the light so that the beams of light converge exactly at the film plane otherwise it will be blurry or out of focus. The light from closer objects enters the lens at different angles than light from distant objects meaning it is possible to have some objects in focus and some out of focus. Having a sharp photograph is dependent on something called “depth of field”. To put it simply, depth of field is the amount of the photographic scene that is in focus, or sharp. It can be greater, with much of the scene in focus, or shallow, where very little is in focus. Depth of field changes with the angle of the camera’s lens, the distance from the lens to the objects in the image, and the camera’s settings. It is especially dependent on the aperture. (Remember aperture is the size of the opening of a camera’s lens, and it controls the amount of light entering into the lens at one time.)
The nodal point, or optical centre, is at the centre of the lens. When a camera moves or translates, the resulting image will exhibit parallax; when the camera pans or rotates around its nodal point, there is no parallax. If the pivot point is the tripod, it is not the optical centre of the lens. 3D cameras are mathematically perfect cameras, and they need to reflect a small amount of parallax. Measure the focal distances and distances to the subject from the nodal point of the lens.
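The no-parallax property of rotating about the nodal point can be demonstrated with a toy pinhole projection: two points on the same line of sight at different depths stay superimposed under pure rotation, but separate as soon as the camera translates. A NumPy sketch (all names and numbers are illustrative):

```python
import numpy as np

def project(point, cam_rotation=np.eye(3), cam_position=np.zeros(3)):
    """Pinhole projection of a world point into a camera at
    cam_position with orientation cam_rotation."""
    p = cam_rotation.T @ (np.asarray(point, float) - cam_position)
    return p[0] / p[2], p[1] / p[2]

# Two points on the same line of sight, at 2 m and 4 m depth.
near_pt, far_pt = (0.0, 0.0, 2.0), (0.0, 0.0, 4.0)

# Pure rotation about the nodal point (10-degree pan):
# both points stay on a common ray -> identical image positions, no parallax.
yaw = np.radians(10)
rot = np.array([[np.cos(yaw), 0.0, np.sin(yaw)],
                [0.0, 1.0, 0.0],
                [-np.sin(yaw), 0.0, np.cos(yaw)]])
a = project(near_pt, cam_rotation=rot)
b = project(far_pt, cam_rotation=rot)

# Translation off the nodal point (0.5 m sideways):
# the two depths land at different image positions -> parallax.
c = project(near_pt, cam_position=np.array([0.5, 0.0, 0.0]))
d = project(far_pt, cam_position=np.array([0.5, 0.0, 0.0]))
```

This is also why pivoting around the tripod head, rather than the lens's optical centre, introduces small parallax errors when shooting panoramas.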
Film backs and CCDs: the film aperture is directly in front of the film, and the size and shape of this opening dictate the size and shape, or format, of the image. The film back and focal length together define the field of view, which represents what section of the scene will be visible with a particular lens and film aperture configuration. The angle of view measures the portion of the 360º visible circle and is expressed in degrees. A longer lens has its lens centre further from the film back, giving a narrower angle of view.
Digital cameras record incoming light as an electrical signal on a CCD, whose shape defines the shape of the image. Smaller digital sensors affect how focal lengths are measured, and because the relationship between the lens and the sensor changes, different lenses are required to create the same image on the digital film back.
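This sensor-size effect is usually expressed as a crop factor: the ratio of the full-frame diagonal to the smaller sensor's diagonal, which scales the "equivalent" focal length. A sketch (the function names and the APS-C dimensions in the example are assumptions):

```python
import math

# Diagonal of a full-frame 36 x 24 mm sensor, about 43.3 mm.
FULL_FRAME_DIAGONAL_MM = math.hypot(36.0, 24.0)

def crop_factor(sensor_w_mm, sensor_h_mm):
    """Ratio of the full-frame diagonal to this sensor's diagonal."""
    return FULL_FRAME_DIAGONAL_MM / math.hypot(sensor_w_mm, sensor_h_mm)

def equivalent_focal_length(focal_mm, sensor_w_mm, sensor_h_mm):
    """A smaller sensor sees a narrower slice of the image circle, so
    the same lens frames like a longer lens does on full frame."""
    return focal_mm * crop_factor(sensor_w_mm, sensor_h_mm)

# A 35 mm lens on a typical APS-C sensor (about 23.6 x 15.7 mm)
# frames roughly like a 53 mm lens on full frame.
cf = crop_factor(23.6, 15.7)
```

The lens itself is unchanged; only the captured portion of its image circle differs, which is why matching shots across cameras requires different focal lengths.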
Lens distortion causes images to become stretched or compressed near the edges of the frame: inwards is barrel distortion, outwards is pincushion distortion. Barrel distortion is a side effect of the round glass elements within a lens, which bend light more near the edges of the lens than near the centre. The smaller the lens diameter, the more drastic the effect of barrel distortion becomes; it occurs more with wide-angle and zoom lenses.
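Both effects are commonly modelled as radial distortion: points are scaled by a factor that depends on their distance from the image centre, so the centre stays put while the edges move in (barrel) or out (pincushion). A one-term sketch of that model (function name and coefficients are illustrative; real calibration uses more terms):

```python
def distort(x, y, k1):
    """One-term radial distortion about the image centre: a negative
    k1 pulls points inward (barrel), a positive k1 pushes them
    outward (pincushion); the effect grows with radius squared."""
    r2 = x * x + y * y
    scale = 1 + k1 * r2
    return x * scale, y * scale

# The centre is untouched; an edge point moves inward under barrel
# distortion and outward under pincushion.
center = distort(0.0, 0.0, -0.2)
barrel_edge = distort(1.0, 0.0, -0.2)
pincushion_edge = distort(1.0, 0.0, 0.2)
```

Because the scale grows with radius squared, the effect is barely visible in the middle of the frame and strongest in the corners, matching the description above.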
GoPro HERO4: The Adventure of Life in 4K
THE HOBBIT, Production Diary 4