On the latest episode of CookeOpticsTV, cinematographer Peter Suschitsky ASC talks about his work on The Empire Strikes Back. Peter explains how he lit Darth Vader, and how he used smoke to hide the fact that the set had no background. As a long-time Star Wars fan, it is fascinating hearing about the challenges …
Over the past few months, I’ve had the opportunity to test out storage arrays of various speeds and sizes. Invariably, when I write about my experience, I include an obligatory sentence about the increasing size of the footage folder created from a shoot and, once it’s ingested, the size of edit projects. Over the next few posts, I’d like to present an editor’s perspective on the impact of increasing project sizes.
Project sizes are growing. It’s true that some of that can be attributed to the increasing resolution of cameras and their ability to output raw files. But there’s another reason for that—the director.
The director calls the shots—literally. Certainly, a director’s vision can have an impact on how much footage ends up on my desk. And I’m all in for as much coverage as I’ll need to tell the story. So, I’m not really talking about fewer shots.
What I’m talking about is cutting every once in a while. It used to happen all the time, but now? Not so much. When shooting film, calling “cut” was important because it saved both film stock and time.
Film camera loads are limited to about 11 minutes. Once the roll is getting down to the end, you either make sure the take will be short enough—so the film doesn’t run out mid-take—or you stop everything and wait for a camera reload. Reloading film isn’t as quick as swapping out a memory card.
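As a quick sanity check on that 11-minute figure: 4-perf 35mm film runs 16 frames per foot, so at 24fps a standard 1,000 ft magazine lasts about 11 minutes. The back-of-the-envelope math:

```python
# Runtime of a standard 35mm film magazine at sync-sound speed.
FRAMES_PER_FOOT = 16   # 4-perf 35mm
FPS = 24               # standard sync-sound frame rate
ROLL_LENGTH_FT = 1000  # a common 35mm magazine load

feet_per_second = FPS / FRAMES_PER_FOOT            # 1.5 ft/s, i.e. 90 ft/min
runtime_minutes = ROLL_LENGTH_FT / (feet_per_second * 60)
print(f"{runtime_minutes:.1f} minutes per roll")   # → 11.1 minutes per roll
```

That hard ceiling is exactly what made “cut” a habit on film sets.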
With digital capture and the aforementioned larger and larger storage options, saying “cut” isn’t top of mind for some directors. So “takes” are getting longer and longer.
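To put rough numbers on how quickly those longer takes fill storage, here’s a back-of-the-envelope sketch. The 280 MB/s data rate is a hypothetical figure chosen for illustration, not any particular camera’s spec:

```python
# Rough footage-folder growth for long takes at a hypothetical raw data rate.
DATA_RATE_MB_S = 280  # hypothetical raw recording rate, MB per second

def footage_size_gb(minutes: float) -> float:
    """Approximate size in GB of one continuous take of the given length."""
    return DATA_RATE_MB_S * minutes * 60 / 1000

# An 11-minute film-length take vs. a 30-minute "never say cut" digital take:
print(f"{footage_size_gb(11):.0f} GB vs {footage_size_gb(30):.0f} GB")
```

At any plausible data rate, letting the camera roll triples or quadruples what lands on the editor’s desk.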
I get that directors have a lot on their plate. And I also get that many have learned their craft completely on digital—they never had to be concerned about cutting the roll. But not cutting can have an impact on their project.
Next time, how “never having to say cut” impacts the workflow.
Panasonic announces a big firmware update that will improve performance on its Lumix Digital Single Lens Mirrorless cameras.
Launching on July 9th, Panasonic’s firmware updates will provide significant improvements to the Lumix DC-S1 and S1R and incremental improvements to Micro Four Thirds cameras like the GH5 and GH5S. The full-frame updates focus primarily on image stabilization and video enhancements.
Editor Evan Schiff shares his timelines for “John Wick: Chapter 3” and “John Wick: Chapter 2”.
We all know that editing is a painstaking, tedious job…even those who are not editors are well aware. However, it’s often hard to quantify or even describe the sheer amount of work it takes to edit, say, a feature-length Hollywood blockbuster.
But editor Evan Schiff gives a rare glimpse inside his Avid post-production timeline of John Wick: Chapter 3 to see what it takes to edit a cult phenomenon and box office sensation.
Here’s Evan’s locked timeline for John Wick: Chapter 3.
The Art of the Cut podcast brings the fantastic conversations that Steve Hullfish has with world renowned editors into your car, living room, editing suite and beyond. In each episode, Steve talks with editors ranging from emerging stars to Oscar and Emmy winners. Hear from the top editors today about their careers, editing workflows and about their work on some of the biggest films and TV shows of the year.
Recently, Steve had a chance to talk with Todd Miller about his award-winning documentary feature Apollo 11. You can listen to the full podcast below:
If you’re wondering whether to treat yourself to Sony’s new 600mm f/4 GM lens, you probably want to check out this hands-on video from Jared Polin. If a $13,000 lens isn’t for you, you might simply want to see how much of a difference the autofocus tracking on the Sony a9 can make when shooting sports.
Ever wondered how the filmmakers of probably the best-looking Star Wars movie managed to light Darth Vader’s blacker than black costume without seeing the light fixtures in the helmet? The answer to this and a few more nuggets of film history and cinematography tricks are revealed in this short but fascinating interview.
Canon took the photography world by surprise this week when the imaging giant announced that it would try to fund one of its upcoming cameras through the crowdfunding website Indiegogo. The mysterious IVY REC will be a “clippable, go anywhere camera” that Canon has deemed interesting enough to try, but not necessarily promising enough to pay for ahead of time.
Details about the upcoming campaign are sparse. For now, the only info available on the IVY REC exists on an Indiegogo landing page where you can sign up to receive a notification when the campaign launches. What we do know is that the IVY REC will take the form of a tiny point-and-shoot camera that looks kind of like a USB thumb drive with a plastic clip on one end.
According to the landing page, the camera will feature a 13MP 1/3-inch CMOS sensor, the ability to capture 1080p Full HD video at up to 60fps, and Bluetooth and wireless connectivity that will allow users to capture and transfer images to a mobile device via the CanonMini Cam App.
It also seems to boast an IP67 waterproof rating, making it waterproof to a depth of 1m/3.3ft for up to 30 minutes, and a dial that will allow users to switch between photo, video, multi shot, and wireless modes.
Finally, if you don’t want to use the companion app to frame your photos, the landing page claims that the square clip on one end of the diminutive camera “doubles as a viewfinder.”
If you’re wondering why a camera company with a $31.375 billion market cap is using a crowdfunding website to finance their latest mass-market idea, you’re certainly not alone. But the announcement certainly seems to have generated a lot of buzz, and the campaign will allow Canon to make a strange little concept camera without taking on any major financial risk.
This idea is either silly or brilliant, and we can’t quite make up our minds as to which it is. Let us know your thoughts in the comments, and if you’re interested in learning more about the clippable IVY REC, head over to the Indiegogo landing page here.
Canon managed it. Nikon managed it. And, thinking about it, the new mirrorless cameras from Panasonic would feel weird if they didn’t have it. If I could change one thing about my otherwise awesome Sony a7 III, this would be it.
The eye level shot can be used in many different ways within film and television. Let’s go over some examples together so you can add this camera angle to your shot lists!
One of the most important things to keep in mind when making a film or television show is the point of view of the audience. Since your goal is to connect with people, you have to find new and unique ways to get the audience’s attention. One of those ways is inserting an eye-level shot.
Today we’re going to go over that camera angle, look at some examples, and beef up our point of view, medium shots, and other angles by adding them at eye-level.
Let’s start by asking a question…
What is an eye-level shot?
Eye-level shot definition
An eye-level shot is a camera angle where the point of view is set at the eye-level of the subject you are capturing. It should feel like we are actually within the scene, observing the actor’s face as if it were close to our own. The head of the subject or the object in focus should be level with the camera.
‘Our Planet’ cinematographer Doug Anderson reveals what it took to get some of the series’ most iconic shots.
Netflix’s Our Planet is one of the most spectacular nature series ever attempted. For one, each episode ends with a sobering call to action that implicates humanity in climate change. Each episode’s awe-inspiring scenes play almost as if they are elegies to a disappearing natural world. When we learn that almost all of the biodiversity onscreen is threatened, if not irrevocably damaged, by global warming, the series assumes a tenor that distinguishes it from Planet Earth and other wildlife shows.
But the series is also incomparable when it comes to the sheer craft and artistry of the filmmaking. Our Planet was famously four years in the making, filmed in 50 countries with more than 600 crew members. Each unit boasted some of the most skilled professionals in the industry—including Doug Anderson, one of the most renowned underwater wildlife cinematographers in the world.
The THANOS gimbal support vest with dual arm is supposed to take the weight off of operators’ hands when using a one-handed gimbal like the Ronin-S, Zhiyun Crane 2, or MOZA Air 2. The vest’s size and the tension of the springs in the arm are adjustable. THANOS is available for pre-order now.
THANOS Gimbal Support Vest with Arm. Source: Digitalfoto
One-handed gimbals are becoming heavier and capable of carrying larger payloads. The Ronin-S, for example, can weigh up to 5.5kg when loaded to its maximum payload (watch our extensive video review of the Ronin-S here). That is not very pleasant to work with on longer shoots. Even holding the gimbal with two hands can cause back or hand pain after a while.
For handheld operation of heavy camera setups, we already have ergonomic back-relieving tools like the Easyrig or Ready Rig. For one-handed gimbals, there is not much out there yet. One of the few solutions is the recently announced Steadimate S from Tiffen, which should come out soon.
The Chinese company Digitalfoto has now presented its own solution: a support vest for one-handed gimbals. Their new gimbal support vest with a dual arm is called THANOS. Am I the only one who feels a little Avengers inspiration here? Let’s take a look at its specs.
The technology behind THANOS is not entirely new. Steadicam operators have been using vests with spring-loaded arms to distribute weight onto the shoulders and waist for years. Digitalfoto has simply tweaked the design to accommodate the lower weight and different mounting system of one-handed gimbals.
THANOS Gimbal Support Vest with Arm – Examples of Use. Source: Digitalfoto
The whole system weighs 2.5kg and can handle payloads of 3.2 – 5kg, so it is compatible with the Ronin-S, the Crane series, the Feiyu AK series, the MOZA series of gimbals, and more. The round baseplate that comes with the support system can mount almost any gimbal, as long as the gimbal has a 1/4″ thread on the bottom: insert the baseplate into the arm’s clamp, and it offers a standard 1/4″ screw on top. An adapter for thinner handles also comes with the device.
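Since the arm only balances loads within that range, it’s worth double-checking your rig’s total weight before ordering. A quick hypothetical helper (the 3.2 – 5kg limits come from the spec above; the function name and example weights are my own, not Digitalfoto’s):

```python
# Check whether a gimbal + camera rig falls within THANOS's stated
# 3.2 - 5 kg payload range (per Digitalfoto's spec sheet).
MIN_PAYLOAD_KG = 3.2
MAX_PAYLOAD_KG = 5.0

def fits_thanos(gimbal_kg: float, camera_kg: float) -> bool:
    """True if the combined rig weight sits inside the arm's working range."""
    total = gimbal_kg + camera_kg
    return MIN_PAYLOAD_KG <= total <= MAX_PAYLOAD_KG

# e.g. a fully loaded ~5.5 kg Ronin-S rig would be over the limit:
print(fits_thanos(1.9, 3.6))  # → False
```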
The triple-jointed aluminum spring arm has two knobs for adjusting the spring’s tension according to the weight of the load used. At the end, it has a clamp to mount the gimbal handle.
THANOS Gimbal Support Vest with Arm – Vertical Rail on the Vest. Source: Digitalfoto
On the front side of the vest, there is a large aluminum element with a vertical rail. The arm can be fixed on the rail and moved up or down, depending on how high you want your camera to be. The shoulder and waist straps are padded, and thanks to the velcro design, the vest size can be adjusted.
There is a good chance that (with some practice) this support device can also slightly reduce the up-and-down movement of the camera when walking with the gimbal.
THANOS Gimbal Support Vest with Arm – Complete Kit. Source: Digitalfoto
To be honest, I am not entirely sure if a vest with an arm is the right solution for heavier one-handed gimbals. As opposed to traditional Steadicams, the majority of the weight sits above the arm’s clamp. That’s why you’ll need to hold the gimbal firmly in your hand at all times to prevent any damage.
Shooting a Micro Budget Film in a War Zone with Benjamin Gilmour. Today on the show is one of the craziest and bravest indie filmmakers I’ve ever had the pleasure of meeting, writer/director Benjamin Gilmour. His film Jirga was shot with a two-person crew, a camera he purchased at a local camera shop and an ever-changing…
We were already impressed with the Sony a9 when we reviewed it, giving it a score that put it on par with its two very capable rivals, the Nikon D5 and the Canon EOS-1DX II. In April this year, nearly two years after the camera’s launch, Sony introduced a significant firmware update that largely revamped the autofocus system of the camera, adding a new ‘real-time tracking’ AF mode that works seamlessly with face and eye detection. Sony also updated face and eye detection algorithms by using machine learning to understand human subjects and features more accurately.
We’ve spent some time shooting with the updated a9 in a variety of situations, and have previously written an in-depth look into what the new AF system brings. After further testing, we’ve re-scored the a9 with the boosted autofocus in mind, and it brings the score up to 90% (from 89%). This makes the a9 the highest-scoring camera in its class, out-ranking the Nikon D5 and Canon EOS-1D X II.
The increased score reflects the precision of the updated a9’s subject tracking system, as well as its ease-of-use that makes it valuable for nearly all types of photography. Click ‘Read our review’ above to jump to our full review (originally published in 2017), and read on for a description of the new real-time tracking mode, with some examples and videos of the system in use.
Real-time tracking in use
‘Real-time tracking’ refers to the ability of the a9 (and a6400) to understand the subject you initiated focus on, and track it in three dimensions, much like 3D Tracking on Nikon DSLRs and the respective subject tracking modes on various mirrorless cameras. What sets the a9’s system apart are both its performance (we found it to be reliable enough to be useful for portrait, event, candid, sports and even landscape photography) and its ease-of-use.
To pick a target, you can simply reframe your composition to place your AF point over your subject, half-press the shutter, and real-time tracking will collect color, brightness, pattern, distance, face and eye information about your subject so it can use it to keep track of your subject.
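Sony hasn’t published how these cues are combined, but conceptually it resembles weighted multi-cue matching: score each candidate region in the frame against the reference captured at the half-press, and follow the best match. A toy sketch under that assumption (the cue names and weights are purely illustrative, not Sony’s actual algorithm):

```python
# Toy multi-cue subject matcher: each frame, pick the candidate region whose
# features best match the reference captured when tracking was initiated.
# Cue names and weights are illustrative only.
WEIGHTS = {"color": 0.3, "pattern": 0.3, "distance": 0.2, "face": 0.2}

def similarity(reference: dict, candidate: dict) -> float:
    """Weighted sum of per-cue similarities; each cue value is in [0, 1]."""
    return sum(w * (1 - abs(reference[k] - candidate[k]))
               for k, w in WEIGHTS.items())

def track(reference: dict, candidates: list) -> dict:
    """Return the candidate region that best matches the tracked subject."""
    return max(candidates, key=lambda c: similarity(reference, c))

subject = {"color": 0.8, "pattern": 0.5, "distance": 0.4, "face": 0.9}
frame = [
    {"color": 0.2, "pattern": 0.9, "distance": 0.1, "face": 0.0},  # passer-by
    {"color": 0.7, "pattern": 0.5, "distance": 0.5, "face": 0.8},  # our player
]
print(track(subject, frame) == frame[1])  # → True: the closer match wins
```

The point of combining several cues is robustness: if one cue momentarily fails (a face turns away, say), the others keep the match on the original subject.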
It’s robust enough that it will even, again reliably, switch in and out of Eye AF as necessary if a face or eye is detected on the subject you are tracking, as you can see in the video above.* Collectively, this means you can concentrate on the composition and the moment. There is no longer a need to focus (pun intended) on keeping your AF point over your subject, which for years has constrained composition and made it difficult to maintain focus on erratic subjects.
In practice, the system excels. While many professional sports photographers that know their sport, and can anticipate the action, have successfully used Single Point or Zone AF for years, real-time tracking can help both the amateur and the pro achieve potentially better results. First, it frees up the photographer to compose freely, as composition is no longer constrained by having to keep an AF point over the subject. But perhaps more importantly, not having to keep a fixed AF point or zone over a fast moving subject is a boon when it comes to fast, erratic subjects shot using long telephoto lenses, where framing is increasingly difficult. The sequences below were shot with the 600mm F4 GM lens at a soccer match:
Unpredictable motion combined with a 600mm focal length makes it difficult to keep a fixed AF area over your subject. Here, real-time tracking tracked our players even as others passed in front of them, switching in and out of Eye AF and reverting to generic subject tracking as necessary, so as not to lose the original subject.
In the shot below, tracking the red player meant it was easy to capture him in focus right at the moment he fell after kicking the soccer ball – which you see here approaching the camera.
Photo credit: Barnaby Britton
Away from sports and burst photography, we found the performance of Sony’s ‘real-time tracking’ to be beneficial for even more stationary subjects, as it frees you up to try different poses and framings quickly, as we’ve done below.
Most of the 20 shots above were captured in under 19 seconds, without ever letting off the AF-ON button. The camera never lost our model, and the seamless transitioning between Eye AF and general subject tracking allowed the AF system to remain on our subject throughout the series. By not having to think about focus, you can work faster, and come home with a greater variety of images to choose from.
*This video demonstrates ‘real-time tracking’ on the a6400, but the principle is the same on the a9.
In January, Insta360 unveiled Titan, an 11K 360-degree cinematic camera featuring eight lenses with Micro Four Thirds sensors. The camera is designed for VR video production, offering shooting modes ranging from 5.3K/120fps through 11K/30fps with 10-bit color. The model is now available to order from Insta360.
Titan offers a number of high-end features for professional productions, including an integrated 9-axis gyro with FlowState stabilization for smooth shots without a gimbal. Insta360 boasts that its Titan camera offers superior low-light performance, color depth, and clarity compared to ‘conventional’ VR cameras.
The 360-degree camera produces stitched 2D videos at up to 10,560 x 5280 pixels and 3D stitched videos at up to 9600 x 9600 pixels. Stitched 2D images are processed at 10,560 x 5280 and stitched 3D images at 10,560 x 10,560 pixels in JPEG and DNG formats.
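To put those resolutions in perspective, a quick bit of arithmetic using the figures above shows just how much data each stitched frame carries:

```python
# Megapixel counts for Titan's stitched output resolutions (from the spec).
def megapixels(width: int, height: int) -> float:
    return width * height / 1_000_000

print(f"2D video: {megapixels(10560, 5280):.1f} MP")  # ~55.8 MP per frame
print(f"3D video: {megapixels(9600, 9600):.1f} MP")   # ~92.2 MP per frame
```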
Titan supports capturing in-camera HDR images, 10-shot continuous bursts, and time-lapses in addition to single-shot images. The model offers multiple exposure modes (auto, manual, etc.), a 12-stop exposure range, ISO 100 – 6400, and records data to nine full-size SD cards.
When the camera was announced in January, interested customers could reserve a unit with a $150 deposit. Titan is now available to directly purchase from Insta360 bundled with the Farsight live monitoring device for $14,999 USD; there’s also a bundle that includes memory cards with Titan and Farsight for $15,339 USD.