‘It is so important that we see things that are about us, that talk to us, that make us think about our lives’
Gillian Armstrong, The Age, 18 September 2017
The potential benefits of a post-production pipeline in the cloud are compelling. As we touched on in the last article in this series, cloud technology can remove that tricky planning stage – where studios are forced into ‘finger in the air’ judgements on big decisions like workforce and compute capacity – while enabling global collaboration in real time.
But Black Panther presented new challenges because the African nation of Wakanda – Black Panther’s home – is far more technologically advanced than any other society in the Marvel Cinematic Universe. For this project, Perception collaborated with Marvel Studios for 20 months, using Cinema 4D, X-Particles, Houdini and Redshift to design, develop, animate and render the visionary technology seen in the film and its end title sequence.
Vary your video composition: 10 types of camera shots used throughout the storyline, illustrated by the wide range of camera shots in the film.
William Hoy, ACE, got into the editor’s seat on feature films back in the mid-‘80s. One of his first major features was Dances With Wolves. Since then he has edited a string of box-office hits and critically acclaimed films: Star Trek VI: The Undiscovered Country, Patriot Games, Se7en, The Man in the Iron Mask, The Bone Collector, Dawn of the Dead, Fantastic Four, 300, Watchmen, and Dawn of the Planet of the Apes. Art of the Cut caught up with William recently to discuss his work on War for the Planet of the Apes.
Do I take the time to do this “the right way,” or is “good enough” going to have to suffice in order to make air? It’s a day of calculating; a typical two- to four-minute package takes a certain amount of time to color, polish and prep. It’s a day of assessing: What’s the likelihood of things happening within the timeframes needed, and what do I do when those timeframes are compromised? I must have backup plans in place.
I come from a background in documentary editing, which, on the surface, is the polar opposite of animation editing. In documentary you will be given two hundred hours of footage and told to go make something. There’s no script and the story is found in the edit. Discovering the power of editorial in that environment, I came to animation with the sense that nothing is set in stone and that perhaps the director and I are Dr. Frankenstein, and all the various components of the episode from storyboards, to dialog, to sound and music are the various organs. It’s our job to knit them together in a way where our monster can come to life.
For Miles, it all began with a love of cinema and television. He cites legendary director David Lean’s movies as a big influence: “I can’t tell you how many times I’ve watched The Bridge on the River Kwai.”
The first Sydney Edit Royale event, hosted by Digistor, is available on-demand here.
Five emerging editors competed to show off their editing artistry and speed, each cutting a complete music video from unseen media in only two hours. Voted best edit of the night was Marlena Ianni’s, cut on Avid Media Composer on PC. Congratulations Marlena!
My brother and I were fortunate enough to stumble upon Machliss’s presentation at NAB 2017 at the Avid booth, where he described Baby Driver as the most difficult edit he has ever had to manage. He also talked about his previous experiences working with director Edgar Wright on Scott Pilgrim vs. the World, The World’s End, and Spaced. You may also recognize his comedic editing style from other television series, such as The IT Crowd.
Joe Walker has established himself as the go-to editor for directors Denis Villeneuve and Steve McQueen. His relationship with McQueen includes cutting 12 Years A Slave (2013) and his short film Ashes (2014), while Villeneuve’s films include Sicario (2015) and more recently, Arrival. He’s currently working with Villeneuve on next year’s release, Blade Runner 2049, and recently took a short break to speak with Post about his career, his work with the director and how he achieved the final cut for Arrival, which has taken in more than $80 million since it debuted in mid November.
The ASE is a national organisation formed in 1996, with members across Australia as well as overseas. The Guild is an association of professional screen editors and assistants working in a wide range of disciplines, including feature films, documentary, television, commercials, corporate video, short films, music video, multimedia, online content, and education and training.
Nashville Filmmakers Guild NLE Debate: three local filmmakers take the stage, each representing their NLE of choice – Adobe Premiere Pro CC, Apple Final Cut Pro X and Avid Media Composer.
Useful Tools for Editors By Scott Simmons 11.21.15
Film and Video Post-Production technology explained – link to videos
HOW TO BURN SUBTITLES INTO VIDEO: If you get your captions as a standard subtitle file (.srt, .ass, etc.) you can burn them in with ffmpeg, or one of the GUI front-ends to ffmpeg. You can combine the compression to whatever output format you’re creating with the subtitle-burning stage. For example, to create a good-quality MPEG-4: ffmpeg -i input.mov -c:v libx264 -preset slow -profile:v main -crf 20 -pix_fmt yuv420p -vf "subtitles=yoursubtitles.srt" output.mp4. SUBRIP: Alternatively, leave the captions as a separate SRT file; most TVs now will play ‘movie’ with a ‘movie.srt’ file in the same folder. Or, as with the vast number of MKV files, use mkvtoolnix to merge the SRT into the MKV – you retain separate video/audio/subtitle tracks, with nothing burnt in.
SUBTITLE EDIT: a free, open-source editor for video subtitles. It has OCR for corrections and can edit and resync any subtitles. If your captions arrive in STL or SCC format, read the file in, edit it much as you would in a text editor, then save it out in any of a number of other formats.
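The resync feature described above boils down to offsetting every cue time by a constant amount. A minimal sketch of that idea in Python (the function and its interface are illustrative, not Subtitle Edit’s actual code):

```python
import re

# Matches SRT timestamps of the form HH:MM:SS,mmm
SRT_TIME = re.compile(r"(\d{2}):(\d{2}):(\d{2}),(\d{3})")

def shift_srt(srt_text, offset_ms):
    """Shift every timestamp in an SRT document by offset_ms milliseconds
    (negative offsets pull subtitles earlier, clamped at zero)."""
    def bump(match):
        h, m, s, ms = (int(g) for g in match.groups())
        total = max(0, ((h * 60 + m) * 60 + s) * 1000 + ms + offset_ms)
        h, rem = divmod(total, 3_600_000)
        m, rem = divmod(rem, 60_000)
        s, ms = divmod(rem, 1000)
        return f"{h:02d}:{m:02d}:{s:02d},{ms:03d}"

    return SRT_TIME.sub(bump, srt_text)
```

Shifting a cue at 00:00:01,500 by +500 ms yields 00:00:02,000; the same transform applies to every cue in the file.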
CHOCOLATEY Software Management Automation: a package manager like apt-get on Linux or Homebrew on Mac. A script will install it for you, and from then on you can add tools like ffmpeg just by typing choco install ffmpeg. Unfortunately ffmpeg doesn’t seem to like .stl or .scc subtitles, but it’s pretty easy to translate STL to SRT using a word processor’s find-and-replace (or sed/awk on the command line).
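That STL-to-SRT translation can also be scripted. A hedged sketch, assuming the plain-text Spruce STL variant (comma-separated HH:MM:SS:FF timecodes) and a fixed frame rate – adjust the assumptions to match your actual files:

```python
def stl_to_srt(stl_text, fps=25):
    """Convert Spruce-style text STL cues ("HH:MM:SS:FF , HH:MM:SS:FF , text")
    into an SRT document, converting frames to milliseconds at `fps`."""
    def tc(timecode):
        h, m, s, f = (int(part) for part in timecode.strip().split(":"))
        return f"{h:02d}:{m:02d}:{s:02d},{round(f * 1000 / fps):03d}"

    # Skip blank lines and // comments, then number the surviving cues.
    lines = [l for l in stl_text.splitlines() if l.strip() and not l.startswith("//")]
    cues = []
    for i, line in enumerate(lines, start=1):
        start, end, text = (part.strip() for part in line.split(",", 2))
        cues.append(f"{i}\n{tc(start)} --> {tc(end)}\n{text}\n")
    return "\n".join(cues)
```

At 25 fps, frame 12 becomes 480 ms, so a cue ending at 00:00:03:12 comes out as 00:00:03,480.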
You need to find out what the actual codec is, which you can do with an application that can query the MXF wrapper. For example: http://www.opencubetech.com/page20/XFReader. Alternatively, if you know which camera the footage came from, you can determine the format that way; if the camera has a plugin for Premiere, you should just be able to download it from the camera vendor’s website.
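If you don’t have a dedicated MXF reader to hand, ffprobe (bundled with ffmpeg) will usually report the codec, e.g. ffprobe -v error -show_entries stream=codec_name input.mxf. As an illustration of what such tools look for, here is a minimal Python sniff for the MXF header partition pack key defined in SMPTE 377M – a crude triage check of my own, not a full parser:

```python
# Stable 13-byte prefix of the MXF header partition pack key (SMPTE 377M);
# the remaining bytes of the key vary with partition status.
MXF_PARTITION_PREFIX = bytes.fromhex("060e2b34020501010d01020101")

def looks_like_mxf(path):
    """Return True if the file appears to start with an MXF partition pack.
    SMPTE 377M permits a run-in before the key, so scan the first 64 KiB."""
    with open(path, "rb") as f:
        head = f.read(65536)
    return MXF_PARTITION_PREFIX in head
```

This only tells you the file really is MXF; identifying the essence codec inside still needs a proper parser or ffprobe.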
A satirical look at the process of video editing that delves into areas not usually covered in your typical instructional video – the cycles of despair, procrastination and accomplishment that many editors experience
The ASE aims to promote, improve and protect the role of the editor as an essential and significant contributor to all screen productions.
Jill Bilcock: The Art of Film Editing. Focuses on the life of Jill Bilcock, one of the world’s leading film artists and an Academy Award-nominated film editor. Features commentary from Cate Blanchett, Baz Luhrmann and Rachel Griffiths.
“Editing is the essence of cinema.” Francis Ford Coppola
“Relating a person to the whole world: that is the meaning of cinema.” Andrei Tarkovsky
“The film editor is the gatekeeper to what the audience gets to see.” Peter Greenaway
- Establishing style is the most important thing – well above being sensational. Wild, innovative, unusual and visually extraordinary work, in the service of making something emotional.
- With hundreds of options, choose what emotionally touches you. Edit for structure and rhythm, guiding the audience’s perspective: where they look, how they look, and how they will feel about that moment.
- How much backstory and character detail do you need? Enough detail for the audience to invest in the character.
- What to do with a chronological narrative, and how that will relate to the audience.
- Making something travel from A to B is about rhythm and delivering a story; the best way to deliver that story depends on the style and on the content that has been supplied.
- A great sense of observation and an extraordinary amount of patience: tune into how a director sees their vision and put together something that surprises even the director.
- A director who has something to tell that no one else has told, or who can tell something in a way that has not been told in that style or context before.
- Create a bit of magic in the workplace in order to create magic. Live the vision.
- Teams only get boring when one member stops growing; keep pushing yourself into uncertainty. Nourish your creative life, and bring together the team’s strengths to create something new.
- A language that is all about storytelling.
- Using style to keep the rhythm, and creating devices to create the rhythm. Creating patterns out of chaos and uncertainty: conducting the orchestra while also playing all the instruments.
- Sometimes overt and sometimes invisible – not only the large explosions but also the internal explosions that people might have. Understand the characters with the same ownership that the characters themselves have.
- Parallel stories, and making you believe them.
- Know when to get out: what do you want to achieve by the end of this, and know when you have got it.
- Creating tension between the visual and the rhythm is what makes exciting cinema. Rhythm can be classical or chaotic; the cutting style should not interfere. People want to communicate – a story that connects and emotionally touches the audience. In the end it is the audience that matters, and the editor is responsible for that.
Mie Moth-Poulsen, ‘Can you Cut It? An Exploration of the Effects of Editing in Cinematic Virtual Reality’
The vast development of Virtual Reality (VR) displays and 360-degree video cameras has sparked interest in bringing cinematic experiences from the screen into VR. However, cinematic virtual reality is a new and relatively unexplored area within academic research. Historically, editing has provided filmmakers with a powerful tool for shaping stories and guiding the attention of audiences. But will an immersed viewer, experiencing the story from inside a 360-degree fictional world, find cuts disorienting? This question founded two iterative studies investigating the application of editing in cinematic virtual reality and whether it causes disorientation for the viewer.
This paper details two studies exploring how cut frequency influences viewers’ sense of disorientation and their ability to follow the story during exposure to fictional 360° films experienced through a head-mounted display. The results revealed no effect of increased cut frequency, which leads us to conclude that editing need not pose a problem in cinematic VR, as long as the participants’ attention is appropriately guided at the point of the cut.
Editing 360° VR to guide the viewer’s attention risks disorienting the viewer – even causing motion sickness – so some kind of adjustment is needed.
Does editing work in 360° films, and will it work in VR?
Remember the Lumière Brothers’ Arrival of a Train, where audiences reportedly thought the train would come out of the screen and hit them, and ran out of the cinema in fear.
Arrival of a Train at La Ciotat (The Lumière Brothers, 1895)
What is the average cut length for the right pacing, and how can this be applied to cinematic VR? How do you join shots together to produce the narrative? With a new medium, it takes time and effort to develop the user experience.
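Average shot length (ASL), the usual measure of cut frequency, is just the running time divided by the number of shots – n cuts produce n + 1 shots. A trivial illustrative helper (mine, not from the paper):

```python
def shot_lengths(cut_points, duration):
    """Individual shot lengths (seconds) given cut times in seconds
    within a film of `duration` seconds."""
    bounds = [0.0, *sorted(cut_points), float(duration)]
    return [b - a for a, b in zip(bounds, bounds[1:])]

def average_shot_length(cut_points, duration):
    """Average shot length: n cuts divide the film into n + 1 shots."""
    lengths = shot_lengths(cut_points, duration)
    return sum(lengths) / len(lengths)
```

Three cuts in a 12-second clip give four shots averaging 3 seconds each, regardless of where the cuts fall.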
Traditional editing – cuts true to the emotion, the story, the rhythm, the eye trace, the 2D plane of the screen and the 3D space. We still need to create smooth transitions and avoid disorientation.
With cinematic virtual reality things start to shift around: the effects of scale change, and staging, setting, lighting and editing all have to be reconsidered in the VR world.
Two films were shot; for 360°, two 180° films are stitched together. Camera placement needs consideration – thinking of the camera as a person and keeping it as natural as possible for testing, so as not to make the participants more disoriented. A third camera was introduced later.
Lighting is another challenge: when the two images are stitched together, differences in exposure affect the user experience.
Editing is no longer done frame to frame; it is done roll to roll, which brings production challenges. Consider the shot length: does cutting on the action still work, and what about cutting from one world to another?
Users look around the world, so don’t guide their attention too much; maintain a sense of presence – of being in another place – and have the cuts match the viewer’s attention.
Spatial sound: 3D audio effects are a group of sound effects that manipulate the sound produced by stereo speakers, surround-sound speakers, speaker arrays, or headphones. This frequently involves the virtual placement of sound sources anywhere in three-dimensional space, including behind, above or below the listener.
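Genuine 3D placement relies on head-related transfer functions and is best left to a dedicated audio engine, but the simplest building block – positioning a mono source left or right on the horizontal plane – can be sketched as constant-power stereo panning (an illustrative simplification, not a binaural renderer):

```python
import math

def constant_power_pan(sample, azimuth_deg):
    """Pan a mono sample to a stereo pair using constant-power gains.
    azimuth_deg runs from -90 (hard left) to +90 (hard right); total
    acoustic power (left^2 + right^2) stays constant across the arc."""
    theta = (azimuth_deg + 90.0) / 180.0 * (math.pi / 2)  # map to 0..pi/2
    return sample * math.cos(theta), sample * math.sin(theta)
```

At centre (0°) both channels carry the sample scaled by √2/2, so nothing jumps in loudness as a source sweeps across the stereo field.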
Walter Murch: multiple-Oscar-winning film editor and sound designer whose distinguished 50-year career reads like a ‘best of’ list of feature films. His work as both editor and sound designer on classic films such as Apocalypse Now, The Unbearable Lightness of Being, Ghost, The Godfather Part II & III, The English Patient and The Talented Mr Ripley means his word is virtually gospel when it comes to filmmaking.
Premiere Pro offers support for viewing VR video in the Monitor panels. It also detects if the clip or sequence has VR properties and automatically configures the VR viewer accordingly. You can publish VR video directly to the web from Premiere Pro to sites such as YouTube or Facebook.
How Star Wars was saved in the edit
3 Mistakes All Beginner Editors Make
State of the NLE: Which Editing Software is Best?
Final Cut Pro vs Adobe Premiere: Best Video Editor?
The Rule of Six
IN THE TRENCHES clip CUTTING ON MOTION
Jump cut examples
Blade Runner: Pan & Scan vs. Widescreen
Testing the new Grease Pencil tools for upcoming Blender 2.73
Red Giant – go to 1 December 2014
Autodesk 2015 Releases
Nuke 8 Recording of Life Digital Event
LTO Archiving for Digital Media
The Great Gatsby – VFX before and after
SOME EDITING SYSTEMS OVER THE YEARS
STEENBECK > VIDEO TAPE > SHOTLISTER > HEAVYWORKS, LIGHTWORKS > FINAL CUT PRO > QUANTEL
Template-based edit decision list management systems: these days, linear video editing systems have been superseded by non-linear editing (NLE) systems, which can output EDLs electronically to allow an autoconform on an online editing system.
An Edit Decision List (EDL) is used in the post-production process of film and video editing. It contains an ordered list of reel and timecode data representing where each video clip can be obtained in order to conform the final cut. EDLs are created by offline editing systems, or can be paper documents constructed by hand, such as shot logs.
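A CMX3600-style EDL is plain text, one event per line, which makes it easy to inspect programmatically. A hedged sketch of parsing a simple cut event (dissolves and wipes carry extra fields that this ignores):

```python
def parse_edl_event(line):
    """Parse one CMX3600-style cut event, e.g.
    '001  TAPE01   V     C        01:00:10:00 01:00:14:00 00:00:00:00 00:00:04:00'
    Returns None for title lines, comments and blanks."""
    parts = line.split()
    if len(parts) < 8 or not parts[0].isdigit():
        return None
    return {
        "event": int(parts[0]),      # event number
        "reel": parts[1],            # source reel / tape name
        "track": parts[2],           # V, A, AA, etc.
        "transition": parts[3],      # C = cut (D/W variants not handled here)
        "src_in": parts[-4], "src_out": parts[-3],   # source timecodes
        "rec_in": parts[-2], "rec_out": parts[-1],   # record timecodes
    }
```

The source in/out pair says where to find the material on the reel; the record in/out pair says where it lands in the conformed sequence.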
Double-system recording is a form of sound recording used in motion picture production whereby the sound for a scene is recorded on a machine that is separate from the camera or picture-recording apparatus.
A flatbed editor is a type of machine used to edit film for a motion picture. Picture and sound rolls load onto separate motorized disks, called “plates.”