Peak Design is responsible for running some of the most successful Kickstarter projects to date. The company has now launched updates to its existing series of popular Everyday bags while adding a few new ones into the mix. This time, they’re even skipping Kickstarter: everything ships today.
Hive Lighting has released the new CX line of lights, starting with the Wasp 100-CX and Hornet 200-CX. These omni-color LED lights are new and improved versions of the original Wasp 100-C and Hornet 200-C and come in at a lower price point. The Wasp 100-C uses 75 watts to create 320 FC at …
This Black Friday deals post features interesting lenses from Canon, FUJIFILM, Olympus, Panasonic, Rokinon, Sigma, and Sony. Additionally, there are a couple of Black Friday offers from Apple and one Lenovo laptop. This is the second of three Black Friday posts. In the next post, we will take a look at more filmmaking gear: gimbals, tripods, monitors, and so on.
I browsed the current deals on our partners’ websites and selected the best offers that I think could be interesting for you. Some offers are only available at the American store B&H, others only at the European store CVP, and some are available globally from both partners. By shopping at our partners’ stores through our buy links, you are supporting cinema5D, as we get a small affiliate commission when a purchase is completed.
This year we decided to divide the Black Friday deals into three posts, which we will publish in the days leading up to Friday, November 29:
- Cameras, Drones, FilmConvert, MZed
- Lenses, Computers, Laptops, Tablets
- Gimbals, Tripods, Monitors, Wireless Video
Today’s post is the second of three, and it features only lenses, computers, laptops, and tablets. Most of these deals should remain active until Cyber Monday on December 2nd, but there might be exceptions, so please always check the link to see when a deal expires.
Now, let’s take a look at the top Black Friday lens and computer deals for filmmakers:
First, let’s take a look at Canon EF full-frame lenses and then Canon EF-S lenses for APS-C cameras.
Canon EF 16-35mm f/2.8L III USM Lens – $300/€313 Discount
B&H Buy link: Canon EF 16-35mm f/2.8L III USM Lens
Was: $2,199.00 Now: $1,899.00 (Savings: $300.00)
CVP Buy link: Canon EF 16-35mm f/2.8L III USM Lens
Was: €1,860.38 Now: €1,546.52 (Savings: €313.86)
Canon EF 24-70mm f/2.8L II USM Lens – $300/€247 Discount
B&H Buy link: Canon EF 24-70mm f/2.8L II USM Lens
Was: $1,899.00 Now: $1,599.00 (Savings: $300.00)
CVP Buy link: Canon EF 24-70mm f/2.8L II USM Lens
Was: €1,632.08 Now: €1,384.49 (Savings: €247.59)
Canon EF 70-200mm f/2.8L IS III USM Lens – $300/€254 Discount
B&H Buy link: Canon EF 70-200mm f/2.8L IS III USM Lens
Was: $2,099.00 Now: $1,799.00 (Savings: $300.00)
CVP Buy link: Canon EF 70-200mm f/2.8L IS III USM Lens
Was: €1,944.48 Now: €1,690.47 (Savings: €254.01)
Canon EF 100-400mm f/4.5-5.6L IS II USM Lens – $400/€299 Discount
B&H Buy link: Canon EF 100-400mm f/4.5-5.6L IS II USM Lens
Was: $2,199.00 Now: $1,799.00 (Savings: $400.00)
CVP Buy link: Canon EF 100-400mm f/4.5-5.6L IS II USM Lens
Was: €1,887.93 Now: €1,588.67 (Savings: €299.26)
Canon EF 17-40mm f/4L USM Lens – $300 Discount
B&H Buy link: Canon EF 17-40mm f/4L USM Lens
Was: $799.00 Now: $499.00 (Savings: $300.00)
Canon EF 50mm f/1.4 USM Lens – $100 Discount
B&H Buy link: Canon EF 50mm f/1.4 USM Lens
Was: $399.00 Now: $299.00 (Savings: $100.00)
Canon EF 85mm f/1.8 USM Lens – $150 Discount
B&H Buy link: Canon EF 85mm f/1.8 USM Lens
Was: $419.00 Now: $269.00 (Savings: $150.00)
Canon EF 100mm f/2.8L Macro IS USM Lens – $200 Discount
B&H Buy link: Canon EF 100mm f/2.8L Macro IS USM Lens
Was: $899.00 Now: $699.00 (Savings: $200.00)
Canon EF-S 24mm f/2.8 STM Lens – $50 Discount
B&H Buy link: Canon EF-S 24mm f/2.8 STM Lens
Was: $149.00 Now: $99.00 (Savings: $50.00)
Canon EF-S 17-55mm f/2.8 IS USM Lens – $330 Discount
B&H Buy link: Canon EF-S 17-55mm f/2.8 IS USM Lens
Was: $879.00 Now: $549.00 (Savings: $330.00)
Canon EF-S 10-22mm f/3.5-4.5 USM Lens – $250 Discount
B&H Buy link: Canon EF-S 10-22mm f/3.5-4.5 USM Lens
Was: $649.00 Now: $399.00 (Savings: $250.00)
Canon RF 24-105mm f/4L IS USM Lens – $200/€175 Discount
B&H Buy link: Canon RF 24-105mm f/4L IS USM Lens
Was: $1,099.00 Now: $899.00 (Savings: $200.00)
CVP Buy link: Canon RF 24-105mm f/4L IS USM Lens
Was: €1,001.64 Now: €826.42 (Savings: €175.22 – CashBack)
Next up, FUJIFILM APS-C lenses for the X-mount.
FUJIFILM XF 16-55mm f/2.8 R LM WR Lens with UV Filter Kit – $200/€175 Discount
B&H Buy link: FUJIFILM XF 16-55mm f/2.8 R LM WR Lens with UV Filter Kit
Was: $1,199.00 Now: $999.00 (Savings: $200.00)
CVP Buy link: FUJIFILM XF 16-55mm f/2.8 R LM WR Lens
Was: €972.43 Now: €797.22 (Savings: €175.21 – CashBack)
FUJIFILM XF 50-140mm f/2.8 R LM OIS WR Lens with UV Filter Kit – $200 Discount
B&H Buy link: FUJIFILM XF 50-140mm f/2.8 R LM OIS WR Lens with UV Filter Kit
Was: $1,599.00 Now: $1,399.00 (Savings: $200.00)
FUJIFILM XF 80mm f/2.8 R LM OIS WR Macro Lens with UV Filter Kit – $250 Discount
B&H Buy link: FUJIFILM XF 80mm f/2.8 R LM OIS WR Macro Lens with UV Filter Kit
Was: $1,199.00 Now: $949.00 (Savings: $250.00)
Now for Olympus and Panasonic lenses for the Micro Four Thirds mount.
Olympus M.Zuiko Digital ED 9-18mm f/4-5.6 Lens – $150 Discount
B&H Buy link: Olympus M.Zuiko Digital ED 9-18mm f/4-5.6 Lens
Was: $699.00 Now: $549.00 (Savings: $150.00)
Olympus M.Zuiko Digital ED 12-40mm f/2.8 PRO Lens – $200 Discount
B&H Buy link: Olympus M.Zuiko Digital ED 12-40mm f/2.8 PRO Lens
Was: $999.00 Now: $799.00 (Savings: $200.00)
Olympus M.Zuiko Digital 25mm f/1.8 Lens – $150 Discount
B&H Buy link: Olympus M.Zuiko Digital 25mm f/1.8 Lens
Was: $399.00 Now: $249.00 (Savings: $150.00)
Olympus M.Zuiko Digital 45mm f/1.8 Lens – $150 Discount
B&H Buy link: Olympus M.Zuiko Digital 45mm f/1.8 Lens
Was: $399.00 Now: $249.00 (Savings: $150.00)
Panasonic Leica DG 8-18mm f/2.8-4 ASPH. Lens with UV Filter Kit – $200/€320 Discount
B&H Buy link: Panasonic Leica DG 8-18mm f/2.8-4 ASPH. Lens with UV Filter Kit
Was: $1,097.99 Now: $897.99 (Savings: $200.00)
CVP Buy link: Panasonic Leica DG 8-18mm f/2.8-4 ASPH. Lens
Was: €1,021.11 Now: €700.27 (Savings: €320.84 including CashBack)
Panasonic Lumix G X Vario 12-35mm f/2.8 II O.I.S. Lens with UV Filter Kit – $200 Discount
B&H Buy link: Panasonic Lumix G X Vario 12-35mm f/2.8 II O.I.S. Lens with UV Filter Kit
Was: $997.99 Now: $797.99 (Savings: $200.00)
Panasonic Lumix G X Vario 35-100mm f/2.8 II O.I.S. Lens with UV Filter Kit – $200/€68 Discount
B&H Buy link: Panasonic Lumix G X Vario 35-100mm f/2.8 II O.I.S. Lens with UV Filter Kit
Was: $1,097.99 Now: $897.99 (Savings: $200.00)
CVP Buy link: Panasonic Lumix G X Vario 35-100mm f/2.8 II O.I.S. Lens
Was: €860.49 Now: €791.57 (Savings: €68.92)
Panasonic Leica DG Nocticron 42.5mm f/1.2 O.I.S. Lens with UV Filter Kit – $400/€298 Discount
B&H Buy link: Panasonic Leica DG Nocticron 42.5mm f/1.2 O.I.S. Lens with UV Filter Kit
Was: $1,597.99 Now: $1,197.99 (Savings: $400.00)
CVP Buy link: Panasonic Leica DG Nocticron 42.5mm f/1.2 O.I.S. Lens
Was: €1,167.12 Now: €868.47 (Savings: €298.65 including CashBack)
Cine lenses offer some nice deals too.
Rokinon 24, 35, 50, 85mm T1.5 Cine DS Lens Bundle for Canon EF Mount – $800 Discount
B&H Buy link: Rokinon 24, 35, 50, 85mm T1.5 Cine DS Lens Bundle for Canon EF Mount
Was: $2,346.00 Now: $1,546.00 (Savings: $800.00)
Rokinon Xeen 50mm T1.5 Lens for Canon EF Mount – $1,000 Discount
B&H Buy link: Rokinon Xeen 50mm T1.5 Lens for Canon EF Mount
Was: $2,495.00 Now: $1,495.00 (Savings: $1,000.00)
Sigma EF lenses are next.
Sigma 18-35mm f/1.8 DC HSM Art Lens for Canon EF – $160 Discount
B&H Buy link: Sigma 18-35mm f/1.8 DC HSM Art Lens for Canon EF
Was: $799.00 Now: $639.00 (Savings: $160.00)
Sigma 105mm f/2.8 EX DG OS HSM Macro Lens for Canon EF – $500 Discount
B&H Buy link: Sigma 105mm f/2.8 EX DG OS HSM Macro Lens for Canon EF
Was: $969.00 Now: $469.00 (Savings: $500.00)
Sigma 35mm f/1.4 DG HSM Art Lens for Canon EF – $250 Discount
B&H Buy link: Sigma 35mm f/1.4 DG HSM Art Lens for Canon EF
Was: $899.00 Now: $649.00 (Savings: $250.00)
Last but not least, a Sony zoom lens.
Sony FE 24-105mm f/4 Lens with UV Filter Kit – $200 Discount
B&H Buy link: Sony FE 24-105mm f/4 Lens with UV Filter Kit
Was: $1,398.00 Now: $1,198.00 (Savings: $200.00)
Apple 27″ iMac with Retina 5K Display (Mid 2017) – $700 Discount
B&H Buy link: Apple 27″ iMac with Retina 5K Display (Mid 2017)
Was: $2,299.00 Now: $1,599.00 (Savings: $700.00)
Apple 27″ iMac with Retina 5K Display (Early 2019) – $250 Discount
B&H Buy link: Apple 27″ iMac with Retina 5K Display (Early 2019)
Was: $2,399.00 Now: $2,149.00 (Savings: $250.00)
Apple 15.4″ MacBook Pro with Touch Bar (Mid 2019) – $400 Discount
B&H Buy link: Apple 15.4″ MacBook Pro with Touch Bar (Mid 2019)
Was: $2,399.00 Now: $1,999.00 (Savings: $400.00)
Apple 15.4″ MacBook Pro with Touch Bar (Mid 2019) – $450 Discount
Configuration: 2.3 GHz Intel Core i9 Eight-Core, 16GB of 2400 MHz DDR4 RAM | 512GB SSD, 15.4″ 2880 x 1800 Retina Display, AMD Radeon Pro 560X GPU (4GB GDDR5)
B&H Buy link: Apple 15.4″ MacBook Pro with Touch Bar (Mid 2019)
Was: $2,799.00 Now: $2,349.00 (Savings: $450.00)
Apple 13.3″ MacBook Pro with Touch Bar (Mid 2019) – $230 Discount
B&H Buy link: Apple 13.3″ MacBook Pro with Touch Bar (Mid 2019)
Was: $1,809.00 Now: $1,579.00 (Savings: $230.00)
Apple 12.9″ iPad Pro (Late 2018, 256GB, Wi-Fi Only) – $100 Discount
B&H Buy link: Apple 12.9″ iPad Pro (Late 2018, 256GB, Wi-Fi Only)
Was: $1,149.00 Now: $1,049.00 (Savings: $100.00)
Apple 11″ iPad Pro (Late 2018, 64GB, Wi-Fi Only) – $100 Discount
B&H Buy link: Apple 11″ iPad Pro (Late 2018, 64GB, Wi-Fi Only)
Was: $799.00 Now: $699.00 (Savings: $100.00)
HyperDrive DUO USB Type-C Hub for MacBook Pro/Air – $40 Discount
The HyperDrive Duo USB Type-C Hub from HYPER converts the Thunderbolt 3 ports on MacBook Pro (2016-2019) or MacBook Air (2018 & 2019) into additional connections. The slim hub provides one HDMI port, two USB 3.1 Gen 1 Type-A ports, and two USB Type-C ports to access printers, external drives, external displays, and other compatible peripherals.
The top USB Type-C port, which is closest to the HDMI port, supports 40 Gb/s Thunderbolt 3, 5K video output, and 100W of power delivery. The second USB Type-C port supports 5 Gb/s USB 3.1 Gen 1 and 60W of power delivery. There are also SD and microSD card slots. Please note that this hub does not support Apple’s SuperDrive and the USB ports will not charge an iPad.
B&H Buy link: HyperDrive DUO USB Type-C Hub for MacBook Pro/Air
Was: $89.99 Now: $49.99 (Savings: $40.00)
Lenovo 15.6″ IdeaPad S340 Laptop – $400 Discount
B&H Buy link: Lenovo 15.6″ IdeaPad S340 Laptop
Was: $949.00 Now: $549.00 (Savings: $400.00)
Did you like any of our deals? Did we miss something interesting? Let us know in the comments underneath the article.
The post Top Black Friday Deals for Filmmakers – Part 2: Lenses, Computers, Laptops, Tablets appeared first on cinema5D.
The Moment Fisheye 14mm lens is the newest addition to Moment’s line of smartphone lenses. It features a 170° field of view and a bi-aspherical design for edge-to-edge sharpness, and its distortion can be corrected with the new Moment Pro Camera app. It is available now for $89.99 for the first 48 hours, and for $119.99 thereafter.
The US company Moment is well known for its smartphone lenses and accessories. A look at Moment’s webshop shows that they already offer a wide range of lenses for their system: a 1.33x anamorphic lens, an 18mm wide lens, a 58mm telephoto lens, a macro lens, and even an existing 170° fisheye lens. Now the company is announcing another fisheye lens. Why? Let’s take a look at its features and try to find the answer.
Moment Fisheye 14mm Lens
The new Moment Fisheye 14mm Lens (full-frame equivalent focal length) features an entirely new optical design created for the latest camera phones (like the Pixel 4, iPhone 11, OnePlus 7, and Galaxy S10). According to the company, the lens features a unique bi-aspherical design that uses 15% more of the image sensor than Moment’s Fisheye 15mm (previously called the Superfish Lens).
Thanks to the bi-aspherical optical design, the image should be sharp from edge to edge. The front of the lens is slightly curved to support this new aspherical element.
The field of view is the same as the previous Moment fisheye lens at 170°, a 30% wider image than the 120° found on ultra-wide phone lenses. The company claims that this lens is sharper than the built-in ultra-wide phone lens and works in Night mode (the built-in ultra-wide does not).
Moment is marketing this lens toward users who are considering a GoPro, saying it will deliver similar results for a third of the price (provided you already have a Moment phone case to mount the lens).
The Moment Pro Camera app introduces distortion correction for this new lens: a slider lets you correct the bending of the image. The feature is currently available for iOS only and is coming to Android in the next few weeks.
The lens is compatible with Moment phone covers, which are available for most smartphone flagships.
Pricing and Availability
The new Moment fisheye 14mm lens is available now. The price has been set to $119.99, but Moment offers this lens exclusively at a reduced price of $89.99 for the first 48 hours.
What do you think of the new Moment fisheye 14mm lens? Do you use Moment gear with your phone? Let us know in the comments underneath the article.
The post Moment Fisheye 14mm – Sharper 170° FOV Lens for Phones Introduced appeared first on cinema5D.
Peak Design is at it again – this time, updating their iconic lineup of bags and backpacks for creative professionals on the go. Indie, doc, and indie doc filmmakers, you should pay attention to the Everyday Line v2.
What We Liked in the Peak Design Everyday Line v1
The Peak Design Everyday Backpack was very well received when it first landed, and it has become a popular bag in the photo and video communities. The original bag hit a sweet spot of ergonomic quality and stylish design that made it appealing to independent professionals everywhere. The key feature is the flex-fold dividers paired with full side access to the bag’s inner workings, which allows easy, customizable access to your camera, lenses, and more in a hurry. All of that is still here in the Everyday Line v2, but with a few key updates.
What We’re Excited for in the Peak Design Everyday Line v2
The Everyday Line v2 has a few cosmetic updates, for sure. More colors and the slightly updated design can’t be ignored. But you’ll find the more important updates are in functionality and in the new bag models.
The Peak Design Everyday Backpack v2, the workhorse of the lineup, now features expandable side pockets, which promotional materials show holding everything from water bottles to travel tripods (like Peak Design’s very own tripod), and presumably anything in between. All of the bags also feature lots of hooks and mounting points for external carry: places where you can clip on an extra pouch or strap on a larger tripod. Add in upgrades to the zippers and the “MagLatch” magnetic top clasp, and you have a solid improvement on an already well-designed bag that can carry almost anything.
Peak Design also added the Everyday Backpack Zip, a smaller and cheaper model that lacks the MagLatch top access of its bigger brother but makes up for it with 270-degree zipper access. It still carries many of the same exciting design features, like the mounting points and durable “ultrazips,” but at a reduced price tag. This smaller bag also comes standard with two dividers instead of three.
New Bag Types
In addition to all of the new features, Peak Design has also created new bag types: the Everyday Tote, the Everyday Sling, and the Everyday Totepack, a bag somewhere between a backpack and a tote (as you could probably have guessed from the name). These bags all share the same basic features as the backpacks: durable zippers, clever compartments, and clean designs. They bring the Peak Design touch to a range of formats and sizes to fit many diverse use cases.
Who is This For?
As I hinted at the top, the side-access and mag-release top compartment make this bag great for people traveling light and shooting quickly. In just a few seconds, you can unzip the side and pull out a pre-built camera and lens combo to start shooting. The whole setup is even more modular than before, with lots of mounting points for extra gear like tripods or monopods. And the chest and hip straps, as well as the luggage-mounting straps, make this a bag for all occasions. Add the durability and flexibility, and Peak Design is fighting to provide your one true bag, the one that fits every occasion. We’ll have to see if they live up to that expectation when they start shipping.
The Everyday v2 bags are available for order now and expected to arrive in time for the holidays. They also have holiday deals through December 2nd, or until they run out of bags – whichever happens first.
How do you carry your camera gear? What features do you look for in a bag? Would you use one of the Everyday Totes or Slings? Let us know in the comments!
The post Peak Design Everyday Line v2 – Great Bags Deserve Great Upgrades appeared first on cinema5D.
The Xiaomi Mi Note 10 is now available for preorder, and it boasts a total of five rear-facing cameras plus a 32 MP front-facing camera. It seems like this mobile phone opens a new era of mobile photography. For just $549, the phone includes a 108 MP primary camera, a 20 MP ultra-wide-angle camera (117° FOV), a 5 MP long telephoto camera (10x hybrid zoom, 50x digital zoom), a 12 MP short telephoto camera with 2x optical zoom, and a 2 MP macro camera. Shipping has started, or starts soon, in most countries.
And we’re still just talking about a phone with 128 GB of storage and 6 GB of RAM that costs a bit more than one-third of an iPhone 11 Pro Max 128 GB.
Let’s take a closer look at the main camera features and decide whether this is a feasible product for your purposes.
Xiaomi Mi Note 10 – Primary Camera Features
108 MP sounds more like a medium-format Hasselblad spec. In the case of the Xiaomi Mi Note 10, the 108 MP primary camera has a 1/1.33″ sensor, a maximum aperture of f/1.69, an 82° field of view, 1.4μm 4-in-1 Super Pixels, and optical image stabilization (OIS).
The rear camera video features include recording at up to 4K @ 30fps and slow-motion capabilities of 120 fps and 240 fps at 1080p or a maximum of 960 fps at 720p. Aside from the missing 4K @ 50/60 fps recording options, the specifications listed above still sound a bit surreal to me.
What to Look for When Testing These Cameras
Key aspects when assessing the actual quality of pictures taken with the phone would be the levels of chromatic aberration and lens distortion, and whether the camera has focusing issues. It will also be necessary to inspect the levels of sharpening, denoising, and other processing that affect the final 108 MP files and determine how much real detail is actually available. Although the sensor is fairly large for a phone, each individual pixel is only about 0.8μm in size, which will most certainly impact light capture overall. By default, the camera uses a technique known as pixel binning to combine four adjacent pixels into one, resulting in a 27 MP image with an increased effective pixel size (the 4-in-1 Super Pixel).
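The pixel-binning step described above is easy to sketch. The example below is a deliberately simplified model, not Xiaomi's actual pipeline (real sensors bin on-chip within a Quad Bayer color filter array, before demosaicing): it just averages each 2x2 block of a monochrome frame, turning four small pixels into one larger effective pixel.

```python
import numpy as np

def bin_2x2(frame: np.ndarray) -> np.ndarray:
    """Average each 2x2 block of pixels into one (simplified 4-in-1 binning)."""
    h, w = frame.shape
    # Split the frame into 2x2 blocks, then average within each block.
    return frame.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# A tiny 4x4 frame bins down to 2x2; each output pixel is the mean of one block.
frame = np.array([
    [10, 20, 30, 40],
    [10, 20, 30, 40],
    [50, 60, 70, 80],
    [50, 60, 70, 80],
], dtype=np.float64)
print(bin_2x2(frame))  # [[15. 35.] [55. 75.]]
# At full scale, a 12000x9000 frame (108 MP) would bin down to 6000x4500 (27 MP).
```

Averaging four noisy samples into one also reduces noise, which is why binned output tends to look cleaner in low light than the full-resolution mode.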
The cameras can also draw on the phone's processing power: a Qualcomm Snapdragon 730G with its Kryo 470 octa-core architecture (up to 2.2 GHz) and 6 GB of LPDDR4x RAM, which enables a wide range of advanced and AI features.
Xiaomi Mi Note 10 – Extras, Modes & AI Functions
Aside from a lot of expected features like the usual panorama mode, face recognition or HDR, there are some more advanced options available to the end-user.
Examples include “Moon Mode,” which allows for both handheld and stabilized night photography, a smart ultra-wide-angle mode, and ultra-wide-angle edge distortion correction. Group pictures can now be corrected automatically by enabling the AI group photo face correction feature. And there are some features that sound appealing to influencers and beauty bloggers: “AI Beautify” and “AI smart slimming”.
First Thoughts & Questions
Without being able to test the phone and its cameras, it’s tough to tell what this spec powerhouse can accomplish in real life. There are two aspects, though, which seem to be interesting about this release:
1) The camera, features, and pixel-war in mobile phones seem to have just entered a new era.
2) In the future, perhaps even fewer people will be able to discern the difference between high-end, genuinely professional photography and photos that are AI-supported, pixel-boosted, digitally enhanced, and automatically improved and intensified by sophisticated apps (at least in some cases, like nature or landscape photography).
For more information, head to Xiaomi’s website.
Finally, this release leads to a thought and a question that I would like to pass on to you: will this again raise extreme megapixel expectations for regular DSLR/DSLM cameras? Is less than 50 MP in a DSLR/DSLM still acceptable? Please leave your comments below.
The post Xiaomi Mi Note 10, a 108 Megapixel Phone – A New Era? appeared first on cinema5D.
If you’re using multiple-image uploads on Instagram, perhaps you could be using them more effectively with seamless carousel collages.
Family can make or break you. But no matter what, they’ll always give you something to write about.
It’s that time of year again.
The holiday season has similar connotations for people around the world. We gather with our families, or surrogate families, to celebrate the holidays. These times can be the culmination of year-long drama, periods of mourning for those lost, and, once in a while, the happiest memories that last a lifetime.
No matter what you feel in regard to the holiday season — it always offers something to write about.
Let’s talk about how you can use the next few weeks to inspire what you’re going to write next, whether it’s a movie, an hour-long pilot, or a half-hour TV spec. Maybe even a comic to be turned into a movie or show.
Sometimes it’s not so bad to be home for the holidays!
I love the holiday season, and not just because of all the amazing food. It marks a time when you can sit and binge on some of the best movies about other families, the kind that make your own feel a little less crazy.
Julian Clarke, ACE was nominated for a BAFTA, an ACE Eddie and an Oscar for his editing of District 9. He was nominated for an ACE Eddie for Deadpool and won an ACE Eddie for the pilot of The Handmaid’s Tale.
His other work includes Skyscraper, Chappie, Project Almanac and Elysium. He also cut episodes of Love, Death & Robots and Altered Carbon.
Today, we discuss his work on Terminator: Dark Fate, which, like Deadpool, was cut on Adobe Premiere.
I’ve previously interviewed Julian when he edited Deadpool.
This interview is also available as a podcast, though, because of technical difficulties, only the first 30 minutes (out of 60) are in the podcast.
HULLFISH: You’ve done a lot of sci-fi. Do you feel like you’re being pigeon-holed? Or are you just happy that you’re getting the work?
CLARKE: Nobody wants to be a one-trick pony. But it becomes comfortable for you and it becomes: “Well, he’s the guy who does this.” So it becomes easy to carry on the pattern. I also carry on the pattern because I like that stuff. I’m attracted to highbrow genre stuff — whether it be horror, fantasy, sci-fi, or a crime movie. I like stuff that’s smart, adult-oriented, R-rated, but immersed in genre. A lot of that tends to happen in science fiction, but I’m interested in it across the board. I’m also interested in comedy and drama as well. Deadpool was a fun project because it was so multifaceted. I got to dip my toes into comedy more on that one, and I think I’ll do a bit more of that in the future too, for sure. After I worked on Deadpool, I worked on the pilot episode of The Handmaid’s Tale, which is technically called science fiction but was almost like working on a period piece. It was dealing with the real minutiae of what the actors are doing with their eyes, and what’s said and what’s unsaid. So it was a very different type of editing than robots and explosions.
When I worked on The Handmaid’s Tale, it was so nice to work on something that’s not effects-driven and is all about real subtle stuff happening. The club of people making well-budgeted dramatic movies is very small. It’s a very small number of movies happening in Hollywood, and it’s a very small club of people working on those movies. It’s a very exclusive little niche, the well-funded, interesting dramatic movies out there. I’d like to join that club at some point.
HULLFISH: It also has to do with the directors you work with, right? Those guys are working on those types of projects.
CLARKE: Sure yeah.
HULLFISH: Something I noticed going through your IMDb page was that you’ve done a bunch of shorts. What’s the value? What’s the purpose? Obviously District 9 started as a short and became a feature. Why is Neill making those shorts, and why are you working on them?
CLARKE: I was working on them because I like Neill and he does cool stuff, so why wouldn’t I? For him, the money he’d make or the pragmatics of things really don’t play a major factor.
He got a bunch of money from Valve — the video game company — which was interested in exploring making some content. For him, it allows him to do a bunch of ideas without having to spend a year on each of them. So it just became kind of an “idea lab.” He could do a bit of a Vietnam sci-fi thing. He could do another one in South Africa.
When you’re the director it means committing multiple years if you want to do a feature. There’s the development and the writing and then there’s the shooting and then the editing and then you’ve got to go out and tour around with it.
HULLFISH: Does editing those shorts stretch you in different ways or allow you to use different muscles?
CLARKE: They’re a lot more impressionistic. They’re less narrative. You don’t really need to have this beginning-middle-end thing or pay off a big character arc. You can have something that is more of a fever dream when it’s short. A fever dream that’s two hours long? People make them, but that has a very niche audience. But people will watch a cool 10-minute fever dream on YouTube or something like that. You’re less beholden to this sort of one-plus-two-equals-three kind of storytelling.
HULLFISH: Can you pull that stuff into your feature work? The things that you learn on those shorts?
CLARKE: Maybe in a particular type of sequence where you’re kind of doing something less linear.
HULLFISH: You’re talking about being less linear, and with this movie — Terminator: Dark Fate — if you’ve got a storyline that is based on time travel, does that allow you to change structure or be less linear in the storytelling?
CLARKE: You would think that maybe the Terminator movies would be like that, but it’s not Back to the Future II. Genisys was trying to be more of a time-bending movie, like Looper or something like that.
But time travel is just a means to set the table; there’s not a lot of bouncing around. Aside from a couple of flash-forwards that we could shuffle around in the movie, since it’s a chase, it really is: “this leads to this leads to this,” right? So in a way, even though it is a time-travel movie, it was quite linear.
The only structural stuff you could do with it would be: “Where do we check in with the villain?” “Where do we put the flash-forwards?” Those were the elements that could be moved around. The rest is just what you could move around within the sequences — like getting out earlier.
Deadpool was a much more structurally-malleable movie than this one was.
HULLFISH: So you’ve kind of jumped back and forth between directors you’ve worked with several times. Do you have to get into “Neil mode” or “Tim mode” when you switch?
CLARKE: I’ve worked enough with both of them that it’s in the muscle memory, so I can just slide right into it. They’re also not a radically different way of working.
Tim likes to sit in the room a lot, though he didn’t get to sit in the room as much on this one as he did on Deadpool because of the sheer quantity and scope of the VFX work. He would just be in six hours of VFX reviews a day. So he didn’t get to hang out with me as much as he likes to hang out with the editor. Some directors are a bit too fidgety to sit in the edit room for too long. The process is all a bit too glacial for them.
But Tim likes the glacial process and going through the minutiae with you when he has the time, though he didn’t have quite as much time to do that on this one. I’m sure he would’ve happily edited it for several more months and gone through the dailies with me more.
HULLFISH: In The Conversations, Walter Murch mentions that each director has specific foibles or things that they hate, like, for example, cutting in the middle of a word, or wanting to wait for action to stop before cutting. What are some rules that Tim and Neil have that you need to remember?
CLARKE: Tim Miller hates double-cut action, which is tricky because when stuff happens really fast, double-cutting is something that you often go to just so you can kind of slow the action down long enough that the shots are long enough that you can kind of read what’s happening. But Tim really likes it to not be double-cut. Sometimes he lets you get away with a little double-cut, but VERY double-cut — like you have the thing explode three times in a row — he’s not a fan of that.
HULLFISH: Murch mentions a director that hates pre-lapping dialogue over the establishing shot of the building, like “What? Is the building talking?”.
CLARKE: Yeah. I’m not a big fan of pre-laps either but they sure help you speed up the pace or if a transition feels kind of sloppy, they can smooth it, so sometimes I do it, but I don’t love it.
HULLFISH: You’re back on Premiere again. Are you on Premiere for good?
CLARKE: I’m not a disciple of one platform. On Deadpool, Tim said, “We’re editing on Premiere,” and I was like, “Okay. Cool!” There were a lot of challenges to that one, so once we did it we talked to Adobe and told them, “These are the things that would be great to work on.” Tim’s definitely sticking by Adobe for sure.
So when we came on to Terminator Adobe had basically made a whole new version for us — which I think is probably going to be publicly released pretty soon. They basically did a bunch of stuff that lets you do kind of Avid Unity-style networking and shared project stuff. Basically, it’s like bin-locking. It’s not a very sexy feature because it’s been around for a long time with Avid, but it is unique in that the other NLEs haven’t bothered to make that happen, and it is integral to a movie of this size that you have that functionality, because we had four assistant editors, a VFX editor with two assistants, so we have eight or nine people like in the project file updating VFX, and pulling lists.
On Deadpool, we were doing it all at the Finder level. We got away with it on Deadpool because we were smaller, but this one had 2,300 VFX shots. We really needed that functionality, and it really worked. So even though this movie was a lot bigger than Deadpool, it was actually a much smoother process. And Adobe is going to work on it and evolve it further, so on the next Miller project, it’ll be even more souped-up.
HULLFISH: The last time we talked about using Premiere, I remember there were issues and we talked about what some of the challenges were but that was quite a number of years ago.
CLARKE: On Deadpool, that was 2015.
HULLFISH: So in four years it’s come a long way. Would you say bin-locking or project sharing is probably the biggest thing?
CLARKE: I told them to essentially focus on the non-sexy stuff, because really the non-sexy stuff is often the stuff that causes your assistant editors to work till 2:00 in the morning, right?
They worked on getting the project files to open way faster. On Deadpool, we had really big project files and they would take like four minutes to open and now — because of the combination of working on how fast they open with the way that they are now splitting all these things in separate projects — essentially a bin is a project — so now they open up in 10, 15 seconds.
That combined with the networking and then having it integrate more successfully into ProTools so that turning over to sound was a much smoother process. There’s still a little bit more work to do there in terms of change lists.
Across the board, all that not-particularly-exciting, having-the-whole-machine-churning-efficiently-and-non-frustrating stuff advanced light-years.
HULLFISH: So you really are using essentially a different project for every scene?
CLARKE: Yeah. If you’re an Avid user, think of a project as a bin. Then there’s a new type of project which holds all those projects: a master project. So you create a project file for scene one, for scene two, and a project file for reel one and two, et cetera. So it’s a project within a project, basically.
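The project-per-scene sharing Clarke describes works like classic Avid bin-locking: one editor takes an exclusive lock on a scene’s project before writing to it, so eight or nine people can live in the same master project without stepping on each other. As a rough sketch only (this is not Adobe’s actual mechanism; the class and file names are hypothetical), a file-based version of that lock might look like:

```python
import os

class ProjectLock:
    """Illustrative file-based lock, modeled on Avid-style bin locking.
    Each per-scene project file gets a sidecar .lock naming its owner."""

    def __init__(self, project_path, owner):
        self.lock_path = project_path + ".lock"
        self.owner = owner
        self.held = False

    def acquire(self):
        try:
            # O_EXCL makes creation atomic: only one editor can win the lock
            fd = os.open(self.lock_path, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
        except FileExistsError:
            return False  # someone else has the scene open for writing
        with os.fdopen(fd, "w") as f:
            f.write(self.owner)  # record who holds the scene
        self.held = True
        return True

    def release(self):
        if self.held:
            os.remove(self.lock_path)
            self.held = False
```

A second editor who fails to acquire the lock would open the scene read-only, which is the behavior Clarke credits with making a 2,300-shot show run smoothly.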
HULLFISH: Got it. Is there anything that you really find that you love in Premiere that when you go back to Avid you miss?
CLARKE: There’s a lot of stuff in Avid that just feels ancient. AudioSuite makes me want to kill myself. Why the hell is that thing still around, right? I mean, the way that you just say, okay, here’s a filters tab and here’s a stack of audio filters: that makes way more sense. It’s just simple. Intuitive. AudioSuite is horrendous. And the same thing with stepping in to use filters. All that stuff is so clunky and slow and feels like it was designed 20 years ago — which it was.
Avid’s sort of the beat-up car that keeps running forever and will get you there, but there’s some pretty creaky stuff that’s been in it for a long time.
HULLFISH: Let’s talk a little bit about your approach. Has it changed any since the last time we talked? When you’re looking at a blank timeline what’s your approach to viewing dailies and then starting on a scene?
CLARKE: It kind of depends on how much has been shot and how much time I have. If you have five hours of dailies you could watch all of them and then start cutting and you’d basically be nowhere, because you’d have sunk a huge portion of your day into watching everything. If they’ve shot a little bit, then you watch it all; if they shot a ton, then you kind of have to prioritize — what’s the stuff I’m going to watch all the way through? What’s the stuff I’m going to kind of scrub through? So it’s kind of just time management in terms of what you’re up against.
It’s a nice luxury to get to watch everything, but sometimes watching every single frame is not possible based on the quantity they’re shooting. If you have multiple editors then you can split it up. But I was on my lonesome on this one.
HULLFISH: If you listen to many of the Art of the Cut interviews, people will say, “I watch everything EVENTUALLY.” When you’re cutting dailies, it’s a different story.
CLARKE: When you come back and you’re revisiting the scenes, then you definitely finish seeing what you didn’t see in the first go-round.
HULLFISH: So, you were saying that you have to decide what to watch completely during dailies. What do you watch completely? What do you scrub through? Glancing at wide shot masters and diving into the closeups?
CLARKE: Yeah. Wide shot masters are often — depending on how the director works — that’s often where he’s kind of warming up the actors. So if they did 15 of those I might not watch all of those. Often the actors are saving their good stuff for the close-ups as well as the fact that sometimes the director’s trying to find it in those first masters. Sometimes you can find some stuff that is quite different in those early masters that is interesting but more often than not they’re more disposable. I’m definitely going to watch the circle takes, and a bunch of the close-ups probably. Less of the masters and if there’s boring stuff like inserts of phones and stuff like that, that’s something you can definitely scrub through.
HULLFISH: Once you know you have the insert of the phone, that’s all you really need to know.
CLARKE: Yeah. And the close-ups are interesting because often the last takes are the circled takes and those are the ones that are closest to what the director’s vision is. But then sometimes those are the ones that are the most polished, in a way that can be slightly artificial too. So sometimes you find the earlier takes are more off-the-cuff and more naturalistic. They kind of hit the beat harder than the later takes. So that’s a toss-up between a performance that sells it really hard but has a little bit of artifice to it, and something from the earlier takes that’s more naturalistic.
HULLFISH: Do you learn the working habits of your actors? Like, Ryan Reynolds is always great on take one and two but by take 12 and 13 he’s bored but his co-star is always rough on takes one and two and is on point in the final takes.
CLARKE: There are certain actors who are very consistent and then they modulate a little bit, or they improvise, so you have to watch everything, then the circled take really means that the director prefers this improvisation the most. And then there are other actors that are totally erratic and you have to kind of piecemeal the performance together.
HULLFISH: When you start piecing that stuff together, do you use selects reels or are you just going through bins or do you put locators in? How do you remember all those great moments?
CLARKE: I find with action stuff, they do it so many times and it’s often very quickly cut and so when you’re trying to figure out what’s the best punch, then I’ll get my assistants to break it down to each beat. So here is that punch from every single camera angle and take all in a run and then you can decide, “This is the best. This is the best. This is the best.” And then even make a mini-string-out between those and then pick and do that for all the action stuff.
That gives you a good starting point for finding your favorite pieces because the action stuff can kind of be overwhelming with how much they shoot and how many angles — and sometimes the differences between the takes are very subtle: like this punch is 10 percent more credible than this one. The ability to decipher that from memory is something I don’t have. So the string-outs are kind of essential with that stuff.
Then with the dialogue stuff, it really becomes more of a question of how much they shoot. I can kind of hold the dialogue performance stuff in my head if it’s not a horrendous amount of takes, but if it’s a huge amount of takes then you need to start putting in locators or stringing that out as well.
I kind of think it would be great to have string-outs of the dialogue stuff as well, but I’ve kind of felt like that’s inflicting a lot on the assistant editors on top of the action string-outs, so I only ask for it when I’m desperate. It is actually very helpful. In some scenes, where the director is resetting a lot or doing a lot of alts or jumping around in the scene, it becomes very hard to know where stuff is in your bins, and stringing it out helps, because when they’re not making a take be a take, and they’re jumping around the scene within a given take, locators only help you so much. It becomes pretty confusing and you can be pretty inefficient in finding stuff, so string-outs can definitely help that kind of situation.
HULLFISH: How are you organizing things in bins? Do you use the same workflow or the same methodology in Premiere as when you’re working in Avid?
CLARKE: Actually, yeah. That was another feature that Adobe made for us. I don’t think it’s been generally released yet, but they basically made a custom bin setting, so you can move stuff around like in Avid instead of having this retiling thing that has been the Adobe standard. So, I like to do that layout where you’ve got an A cam and B cam and C cam below and then the group clip of all of them on top. Then you have your masters and then your close-ups and then your inserts at the bottom.
HULLFISH: I haven’t seen that feature yet.
CLARKE: That might be part of the next release.
HULLFISH: The Premiere folks I spoke to before this interview said they’d done a bunch of new features just for this movie.
CLARKE: That was one of the ones they did for me. Well, me and probably a bunch of other editors.
HULLFISH: Oh I’m sure of that.
CLARKE: When your NLE makes your organizational system very intuitive and well dialed-in, whether something takes five seconds or 30 seconds makes a really big difference in terms of the vibe of the room.
HULLFISH: I love that idea about how important the vibe of the room is, because — as good of an editor as you can be — if all of a sudden something goes wrong in that room that’s not your fault necessarily — it makes a difference in the way your work is perceived.
CLARKE: Absolutely. The negative energy can be carried into how people look at your work and how you’re pitching your ideas and stuff like that. The psychology of how people perceive things is weird. For a long time I had a don’t-pitch attitude when doing changes, because sometimes people kind of talk themselves out of the new idea before you even show it. They think, “That’s not going to work because of this, this and this,” and then you’re going rogue by doing it anyway. So it’s better to show them first than to talk about it and pitch the idea and THEN show it.
HULLFISH: I really am fascinated by that concept because I’ve heard different people say different things about how they pitch a new idea or when to pitch a new idea. So let’s talk about that a little bit. Especially in an assembly, if you’re cutting something that a director is going to see for the first time, do you make sure that you deliver it as the script is, or the way that you think is best? Or do you deliver it in a way that you think the director would want it?
CLARKE: I think you need to have a read on the director’s personality, because if the director is going to be horrified by you deviating, then you take that on board. And if they’re not going to be horrified, then you kind of feel a little more free to experiment.
I definitely feel obliged to try to execute the plan. Sometimes it feels like the plan is not going to work. And then you can show them a version of the plan that works. I try to follow their approach as closely as I can.
SPOILER ALERT
For instance, in the scene where Grace dies, in the script there was a very different order to when the Terminator comes out, and when it sees them. In the original, scripted version the Terminator came out way earlier and then it was kind of meandering around, not finding them. It felt like the tension was inert. I felt like it needed to show up and immediately kind of go for them. So I cut it that way right off the bat because otherwise it felt really wrong. We edited that in a million different ways, so it kept evolving, and Tim was fine with that.
SPOILER ALERT OVER
HULLFISH: Usually the problem with NOT doing the plan, and just going straight into your own idea is that you’re not giving the director a chance to understand that his way won’t work.
CLARKE: I definitely like to show the director what their plan is. If you’re going to leave Plan A you’ve got to kind of carry them along with you. Or if you have the time, you cut multiple versions. You cut the Plan A version, then you cut the Plan B version. Then it gets complicated because eventually those start having different structural destinies as well and then you don’t just have two different scene versions, but two different structural cuts of the movie.
You want to spend as little time as possible on multiple structural versions of the movie because it just becomes really labor-intensive to update the VFX and stuff like that.
HULLFISH: When you finally get into the director’s cut, are you cutting in reels? Or are you cutting single scenes? Do you attempt to have the entire movie, like a screener?
CLARKE: No. On any NLE I’ve ever used, the performance is terrible if you have the whole movie.
Despite the fact that we’re no longer projecting 35mm, reels remain a functional way to break up a movie. Once I have a third of the movie shot, I start to build the thing and put in cards where the missing scenes are, so I start to build the structure as it’s being shot. If you’re looking at it purely on a scene level, you’ll miss things like pace or structure or transitions: big-picture stuff that you can catch if you’re building the structure as you go.
HULLFISH: That happened to me on a film that I cut recently, where I cut two scenes separately, but while we were still on location I put them together and realized that the two scenes needed a transition shot to bridge them. If I hadn’t put them together, I wouldn’t have known that something was missing.
CLARKE: It can be a great scene on its own, but that doesn’t mean anything. How does it behave in the context of the movie, right? There’s a lot of stuff that kind of changes once you kind of have it all in a row. To me, that’s the whole benefit of having an editor assembling as you’re shooting. Hopefully, there are not too many of those things, but there can be little things you can grab.
HULLFISH: Any specifics on how you’re collaborating with the director as you’re going through the director’s cut?
CLARKE: We screen it fairly regularly and he loves to show people so we’re always pulling people into our screening room. We’re always generating feedback. So there’s always something to work on. Despite coming from a VFX-animation background, he really is like an actor’s director. He is very focused on performance. He has an incredible BS detector. When something seems a little bit artificial or forced or whatever, he can sniff it out.
HULLFISH: You mentioned the screening room. Did you actually have a nice little multiseat projected screening room? And how were you monitoring audio coming out of Premiere?
CLARKE: We monitored in 3.1. I think I did that for the first time on Skyscraper. I haven’t done 5.1. I feel like it might be a little too much effort. I love having really great sound. 3.1 lets me push the music off the dialog and have a kind of a boom track that just makes it feel much more like an action track, cinematic sound. And so you can definitely create a much better, immersive sound experience. I’m all about trying to simulate the finished product as much as possible. 3.1 is enormously helpful in terms of doing that.
In terms of our screening room, it was basically a little space that we shared with VFX. They brought in a nice projector that they used for VFX reviews. It wasn’t massive. It could probably fit maybe 15 or 20 people.
HULLFISH: So for those who have not heard the term 3.1, that’s really like LCR with a subwoofer.
CLARKE: Yeah, exactly. Yeah.
HULLFISH: Did being in that screening room and watching screenings affect you in any way differently than being in your cutting room?
CLARKE: Well, for sure. Wide shots play so much differently on the big screen. I think you notice different things between the wide shots and the close-ups. When I watch the movie in IMAX there are certain edits that I feel could be longer now because there’s something about having it fill up your whole field of vision and it becomes more overwhelming. It takes slightly longer to fully take in all the information. So then, what screen size are you perfectly calibrating your edit for?
For sure it’s useful seeing it on a different size — closer to movie size. I wish I could screen it on a REALLY big screen.
HULLFISH: Another thing I thought of when I watched a big VFX movie that probably had a ton of shots is that when you’re cutting those shots in with previs wide shots, they probably aren’t very interesting or compelling, but then when you get the finished VFX shot in, you probably wish you’d cut it in longer because now it looks so amazing and you want more time to take it all in.
CLARKE: Yup. I learned that on Deadpool. You can read pre-vis really fast because it doesn’t have motion blur or particles. But once you add those things, then the shot becomes infinitely more complex to read, and it almost needs to be double the length. Shots that I had cut into action scenes at 10 frames needed to be lengthened to 36 frames because the motion blur would just completely transform your ability to comprehend what you’re looking at. So now whenever I think of cutting in an incredibly short previs shot, I make sure to pad it.
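At 24 fps (an assumed theatrical frame rate; the interview doesn’t state one), the arithmetic behind Clarke’s example is easy to make explicit. This tiny helper is purely illustrative:

```python
FPS = 24  # assumed theatrical frame rate

def frames_to_seconds(frames, fps=FPS):
    """Convert an edit length in frames to seconds of screen time."""
    return frames / fps

# A 10-frame previs cut is under half a second of screen time; padded to
# 36 frames it becomes a full second and a half, giving the eye time to
# parse motion blur and particles in the finished shot.
previs = frames_to_seconds(10)   # ~0.417 s
final = frames_to_seconds(36)    # 1.5 s
```

So the padding Clarke describes more than triples the shot’s screen time, not just "almost doubles" it, which is why such short previs cuts rarely survive finishing.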
HULLFISH: Also the previs doesn’t have textures and there’s no facial expression or emotion.
CLARKE: Also, you can do amazing, impossible camera moves in previs and it looks really cool and dynamic. But if you do that and then see the final shot in high res, it’s like, “Oh! CG camera.” It just seems fake. It’s too cool and perfect, so you actually need to make the camera work sloppier. When you do the effects, you need the camera to overshoot and pan back. You need to intentionally have imperfections to ground it and make it seem like a real camera.
HULLFISH: When you were talking about editing dailies and delivering them to your director on set, were you doing that through something like PIX? And were you getting feedback during the shoot? I’ve talked to several editors who deliver scenes to the director to view during production and they’re just too busy to respond.
CLARKE: I think Tim loved watching those scenes during production. He was so excited to see a cut scene of something that he’d shot two days ago. And some directors — when they get those scenes — they hoard them… they keep them to themselves, but Tim would show them to everybody, like, “Hey, craft services! Look at this!” As for actual feedback, I only got that erratically. This was such an intense shoot that even on his days off he was having meetings and was engaged in other stuff.
HULLFISH: A director I worked with would bring in a few members of the crew — mostly camera crew — and show them stuff at the end of the day. And when he figured out that I could Airdrop the scenes from the Avid onto his phone, that was like a miracle. There’s probably some I.T. security company that’s flipping out.
CLARKE: Yeah. Our editing computers weren’t even connected to the internet. It’s interesting that you say the director came in to you. So often the director is in another country and you’re just sharing cut scenes over PIX. But editors like the director to actually come into the cutting room so they can show them the scene in person: the director reacts, AND they can’t re-watch the scene a thousand times. It’s much more like watching a movie in a theater, where you see it once and give your honest, initial reaction instead of picking it apart by watching it over and over again. You don’t sweat the small stuff.
HULLFISH: Tell me a little about that shooting experience. They were shooting where? You were editing in L.A. and using PIX to get dailies?
CLARKE: Yeah. We would send dailies and cuts that way. In the old days, you’d do a DVD burn, but that was a nightmare because you had to put a unique burn on each DVD. (Adding a watermark of the name of each person over the image.) That’s a full-time job for one person. So PIX adds the watermark to the stream and you only have to upload the one version.
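The streaming workflow Clarke contrasts with DVD burns hinges on the watermark being applied per viewer at playback, so only one master upload is needed, whereas the old approach required a unique burn-in per disc. As an illustration only (this is not PIX’s implementation, and the function name is hypothetical), a burn-in of the old kind could be done with ffmpeg’s drawtext filter; a helper that builds that command per recipient might look like:

```python
def watermark_cmd(master, viewer, out):
    """Build an ffmpeg command that burns a per-viewer name watermark.

    Illustrative only: this mimics the one-burn-per-DVD era in software.
    Services like PIX instead overlay the name on the stream server-side,
    so a single master covers every recipient."""
    # Semi-transparent white name, centered near the bottom of the frame
    vf = (f"drawtext=text='{viewer}':fontcolor=white@0.4:"
          f"fontsize=48:x=(w-text_w)/2:y=h-2*text_h")
    return ["ffmpeg", "-i", master, "-vf", vf, "-c:a", "copy", out]
```

Running one encode per recipient is exactly the full-time job Clarke mentions; the streaming model collapses it to one upload plus a cheap overlay per view.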
HULLFISH: And was your team getting dailies via PIX?
CLARKE: No. We were getting Aspera downloads. I think the lab was delivering the dailies to the studio with PIX.
HULLFISH: On my last bigger film, we were using PIX to actually deliver cut scenes to the studio instead of all of the dailies. It was a faster way to show that the work was getting done and the progress and quality.
CLARKE: Hmmm. For me the cut scenes were only going to Tim and if he wanted to show people, those cut scenes were only going to the people he wanted to show. I wouldn’t be sending cut scenes to the studio.
HULLFISH: The director definitely has to make that decision. The director doesn’t want the studio to see something that they don’t approve as a cut scene.
CLARKE: Yeah. Exactly. Especially with a cut scene, the editor’s cut of a scene might not be exactly what the director wants it to be, and then you’re entering into the politics of what the movie should be very early on. That shouldn’t happen until after the director’s cut.
HULLFISH: I wouldn’t send the cut scenes on PIX until the director approved the scenes and the studio definitely knew that it was just the editor’s cut. But for upload and watching purposes, it cuts the amount of footage that gets sent to the studio from maybe six hours a day down to 4 minutes.
CLARKE: If the director is on really good terms with the studio and feels really confident in their power, I’m sure that would work great. Information is power. So the question is: does sharing the information in this way empower the director or not? It probably really depends on the specific film and relationship.
HULLFISH: True. The relationship would count. And this director had absolute final cut.
CLARKE: So in that case, there’s nothing to lose. And with a cut scene, the studio could air their concerns while there’s still time to do something if the director decided to listen. That seems like an ideal situation.
HULLFISH: Do you remember how long the assembly was? And then where you got down to for the final movie?
CLARKE: It becomes a bit approximate, because there were some scenes that were really long, and those were cut down as part of the assembly. So those scenes do exist as longer versions, but those versions never made it into the assembly. If we counted them, the mega-cut would have been about three hours, but because I started slashing and burning during the assembly phase it came in around 2:43 as a first cut. Then the director’s cut got down to 2:20 and by the end of the process it was about 2:05.
HULLFISH: Were there any big challenges in cutting this film?
CLARKE: The biggest challenge in this movie was handling the protagonists. In the first movie you had Linda Hamilton, then in T2 you had Linda Hamilton and Arnold. In this one we have Grace and Dani and Linda and Carl, so we have four protagonists, plus this alternate Skynet storyline, so in a way we’re doing an alternate villain as well. It becomes a whole lot of information and character arcs, and the challenge was finding the right amount so that each character arc felt serviced, real and satisfying, without having the middle of the movie bog down.
We were kind of emulating T2, which had this dramatic middle section followed by a relentless third act, so we mirrored that structure, but with so many characters it was easy to bog down in the middle. We didn’t want to feel like we’d left the action movie behind for too long. That became the major challenge of the movie.
HULLFISH: Did that take a lot of full-length viewings of the movie to get the sense of that? Or could you sense it on a more micro-scale than that?
CLARKE: Well, I think both. The hardest character was the Dani character because she changes the most. She goes from being this naive innocent to being on the road to being this future leader character. That’s sort of similar to the Linda Hamilton character’s journey in the first film, but in a way, this needs to go even further because in the first movie Linda Hamilton was just the mentor of the future leader. And on top of that expansive character arc we have to give Linda and Arnold their due since they’re franchise characters. So giving Dani the space to grow into that character was actually quite delicate work.
HULLFISH: As many other editors I’ve interviewed have said about other action films: the action doesn’t really work if you don’t care about the characters.
CLARKE: Yes, and you also have to have that character stuff in the action scenes too. To me, the first half of the movie is a little more effective than the second half in that regard. In the freeway chase, I feel very anchored in the characters. I feel very anchored in the characters in the detention center. Those are scenes that are exciting and dynamic and you’re also very connected to what the characters are feeling and the jeopardy.
That becomes a lot more challenging in the scenes with the Rev-9, when it becomes so VFX-driven and spectacular. That’s where it’s harder to keep your hooks firmly tethered into the characters.
HULLFISH: You were talking about the process of getting yourself from an almost three-hour long first cut to an almost two-hour long final cut. I’m assuming that was a combination of big macro cuts and also finer trimming of each scene?
CLARKE: Yeah. There were a couple of big lifts and then a lot of snipping out dialog and stuff like that. It was definitely the right way to do it, but a lot of scenes in the middle were a bit over-written because they wanted to explain how things worked, and it was just too much, so we had to kind of boil and reduce each scene down to the essential information along with a thread of character. A lot of that happened in the middle of the movie. Then there was a big chunk that came out quite late in the editing process: we ended up doing a pretty drastic reduction on the border-crossing sequence, so it became a little more transitional. It used to be a set-piece, but it didn’t quite measure up to the other set pieces.
HULLFISH: It all depends on the movie. Sometimes it’s chipping away at smaller pieces and sometimes it’s more structural. Sometimes the scenes feel like they’re as tight as they can be, so you have to go for big chunks.
CLARKE: Sometimes you realize that there are whole scenes that are kind of redundant. Maybe it’s a beat that reinforces something that’s already been said. No editor likes it where you feel like the scenes are great but the overall pace of the movie is bad. So then you’re yanking out content of a well-paced scene just to kind of get the overall number down. In those cases, it’s nice to find a whole lift that can go so you’re not overly tightening the dialogue or pacing of good scenes.
HULLFISH: Or sometimes, it’s even an awesome subplot or B story, and as cool as that subplot is, it’s got to go.
CLARKE: In this movie, there was a piece with Linda Hamilton after they stole her car and she’s left with the Rev-9 reforming. There was a scene of her stealing another car. She orders a guy out of his car and takes it and drives off. Then there’s a section where she surreptitiously follows them. But eventually, you realize that you don’t really need that. You know that Sarah is a resourceful warrior. She just shows up with another car? You get it. She’s not going to be stopped by having her car stolen.
HULLFISH: And did some of those friends-and-family screenings or more formal screenings prove that out? You could see that a general audience could make the leap?
CLARKE: The thing about friends-and-family screenings: you can never trust them. You can never trust the general enthusiasm level, but what you can trust is confusion. You can definitely tell if they’re not tracking. You can get some sense of pacing, but they’re generally pretty charitable with pace as well. That’s something you definitely want to gauge before you deliver a director’s cut because you don’t want to do a bunch of big lifts and feel like you’ve created something that’s incoherent. So for sure, we’re testing out those experiments in screenings.
HULLFISH: Got it. Julian, thank you so much for your time today. Really appreciate talking to you.
The first 50 interviews in the series provided the material for the book, “Art of the Cut: Conversations with Film and TV Editors.” This is a unique book that breaks down interviews with many of the world’s best editors and organizes them into a virtual roundtable discussion centering on the topics editors care about. It is a powerful tool for experienced and aspiring editors alike. Cinemontage and CinemaEditor magazine both gave it rave reviews. No other book provides the breadth of opinion and experience. Combined, the editors featured in the book have edited for over 1,000 years on many of the most iconic, critically acclaimed and biggest box office hits in the history of cinema.
The post ART OF THE CUT with “Terminator: Dark Fate” editor Julian Clarke, ACE appeared first on ProVideo Coalition.
There has been a trend toward bigger and bigger lenses, sacrificing weight savings and portability for ultimate image quality and mind-blowing apertures. And while few will argue against improved image quality, sometimes, you just do not want to lug around a huge chunk of glass. This great video review takes a look at a lens that embraces portability: the Sigma 45mm f/2.8 DG DN Contemporary.
As we wrap up 2019 and the 50th anniversary of the moon landing in 1969, Sotheby’s has one more NASA-themed auction up their sleeve. Launched yesterday, the Space Photography auction includes over 100 original NASA “red number” prints, including some of the most iconic images to come out of the US space program.
This lot of 140 prints comes from the collection of photography collector and dealer Philip Kulpa, and they’re not just anyone’s old prints of NASA public domain photographs. These are “Red Number” prints, meaning that they are the original chromogenic color prints direct from NASA, complete with ‘NASA’ and the mission name or number stamped in the margin in red ink.
Included in the collection are some of the most iconic images to come out of both the Gemini and the Apollo programs, including Buzz Aldrin at Tranquility Base, man’s first footprint on the moon, a view of the Earth rise as captured from the lunar surface, and Apollo 16 Commander John Young’s famous jumping flag salute.
Here are just a few examples:
This “Space Photography” auction is online only, and runs from the 25th of November (yesterday) through the 2nd of December. And while we don’t expect anyone will be able to snag any of these for under a grand — each of the 10 x 8-inch prints pictured above is expected to go for between $2,500 and $5,000 — none of the items seem to have a reserve, and as of this writing some bids are as low as $50.
Who knows… put in a bid and you might get lucky.
To see all of the “red number” prints currently up for sale—as well as some other great items like a signed print of Buzz Aldrin on the surface of the moon that’s expected to fetch up to $9,000—head over to the Sotheby’s website.
Washington Redskins quarterback Dwayne Haskins missed the last play of his first NFL victory this past Sunday. The reason? He was taking selfies with fans in the stands.
Hive focuses on its essential features with the new, more affordable CX line of lights.
We’re a pretty big fan of the Hive Wasp line. Why?
It’s a single-source, punchy, “omni-color” (RGB diodes plus more!) line of LED lights, available in 50, 100, and 200 “strengths.” In fact, if you’ve read a camera review here on NFS over the last few years and maybe saw photos with crazy backlight colors… odds are it was coming from a Hive, powered by the shot app.
The American Civil Liberties Union (ACLU) has sued the United States government on behalf of five photojournalists who allege their rights were violated ‘on multiple separate occasions’ while reporting on conditions at the US-Mexico border. According to the ACLU, the Department of Homeland Security made a database of journalists and photojournalists who were reporting on US-Mexico border conditions and used this database to target, detain, and interrogate them.
The lawsuit was filed on behalf of Bing Guan, Go Nakamura, Mark Abramson, Kitra Cahana and Ariana Drehsler, all of whom are professional photojournalists and U.S. citizens, according to the ACLU. The lawsuit alleges these individuals were among the journalists included in Homeland Security’s secret database.
The database allegedly contained the photojournalists’ names, birth dates, headshots, and information about whether they’d been interrogated. An ‘X’ was allegedly used to cross out the individuals who had already been interrogated, indicating that the ‘random’ secondary screenings and interrogations they were subjected to weren’t actually random.
Bing Guan said in a statement to the ACLU:
‘I was being targeted by my own government for reporting on conditions at the border.‘
The ACLU explains that photojournalists were detained when they attempted to reenter the United States, at which point they were allegedly interrogated about various matters ranging from their observations of the condition of the border and shelters to whether they could identify people from a series of headshots. Multiple photojournalists claim they were forced to reveal the images they had taken and that at least one officer had used a phone to snap images of the photos.
Forcing the photojournalists to disclose details about their sources and observations was a violation of the First Amendment, according to the ACLU, which calls the ‘disturbing actions’ a potential deterrent that may prevent other journalists from pursuing similar work.
The ACLU said in its announcement of the lawsuit:
‘That the government’s actions occurred at the border makes them no less unlawful … When the government tries to circumvent constitutional protections, we must hold it accountable. No journalist should have to fear government interference for having the persistence, courage, and commitment to expose the truth.’
The lawsuit can be read in full on the ACLU’s website. The plaintiffs seek an official declaration that their First Amendment rights were violated; they also want the records related to their interrogations expunged, among other relief.
Every day there is an unreadable amount of content that purports to tell you how to achieve success in everything from Instagram to business, but one important ingredient is often left out. And it’s an inconvenient one.
Last year, I used a custom-built setup to shoot a timelapse of an eternal terrarium in my kitchen. Over the course of 10 months, a camera took two photos every hour of the day, while the plants inside the terrarium grew on their own without disturbance.
I’d like to show you what I did for this project, what kind of technology I used and also what obstacles I faced in the process.
A while ago, I saw a video by YouTuber SlivkiShow (he has channels in Russian, German and English) where he built an eternal terrarium and documented it on several discrete occasions over a period of a few months.
An eternal terrarium is a sealed ecosystem (e.g. it exists in an air-tight glass box): there is no exchange of oxygen or carbon dioxide with the outside, and the plants have to work with the amount of water present at the moment of sealing. As it evaporates, the water gathers at the top before later falling down like rain.
I found the idea quite fascinating, but given my background in timelapse photography (mainly in the mountains) I thought it would be interesting to document the growth of the plants inside a glass box in a timelapse video with a fixed camera and seemingly continuous light. I figured I would do this for maybe six months… maybe even a year.
Camera & Lights
But how do you light a scene continuously for one year and, furthermore, who is going to press the shutter every half an hour or so? I realized quickly that this project was going to be wildly different from the timelapses I had done before.
I knew from the very beginning that if I were to embark on this project, I couldn’t risk using a camera that I wanted to use again afterwards, so I settled on an old Nikon D5000 that I bought used at a steep discount a few months before.
For lighting, my first idea was to set up a continuous source: an LED panel, for example. Since the changing daylight would be brighter than the panel, I could just put a cardboard box over the scene to block out daylight completely. But if I did that, the plants would probably not get enough light to grow.
I decided, instead, to use one of my Nissin flashes, because then the plants could get as much daylight as they wanted, and the flash would ensure the same brightness in the photos at any time of day, provided, of course, that the camera’s exposure settings were aggressive enough to block out the daylight. I would just need to trigger the flash from the camera, which I did via cable, because a wireless transmitter would run out of battery at some point.
As a power source for the camera, I ordered a dummy-battery insert connected to an AC/DC converter, so that I could power it from the mains. The flash was powered the same way: I attached alligator clips to the pins inside the flash’s battery compartment and connected them to the outputs of a 6V power adapter.
For optics, I used a 50mm Nikon lens and taped down the focus ring to avoid any possible change in focus.
To trigger the camera, I didn’t want to use the internal interval timer, because it would stop after 999 pictures, so I decided to use an Arduino board. The code I wrote for the Arduino set a digital output pin to “high” at a given time interval, shorting the contacts of a trigger cable (connected by alligator clips). Each time the contacts were shorted, the camera and flash would fire.
As a power source for the Arduino, I used an old laptop that was constantly plugged in. This didn’t just provide power; it also meant I could update the code on the fly, if need be, without moving anything around.
If you’re interested, you can go to my website to view the Arduino code for this setup.
I took an aquarium that I had used before (for ‘fruit falling into water’ photos), put a layer of gravel and sand inside as water drainage, and then some soil that I had gathered from a nearby forest. I was hoping that the soil I picked up would include seeds that would then start growing something on their own. I finished with a thin layer of dark potting soil to have a more even image and so I could see better when something started to grow.
Before sealing, I also added a few seeds from the kitchen (cress and basil) because I knew that at least they would grow easily. (read to the end to get my later opinion on that…)
I poured in some water (how do you know if it’s enough? Hint: at first it wasn’t enough), cleaned the glass on the inside carefully, and then sealed the top off with the glass from a picture frame and some silicone sealant, making it air-tight.
I built a practical setup out of spare wood to hold the terrarium and the camera in the same relative position to each other for the whole time, and coated parts of it black to cut down on reflections.
I screwed the camera to a block of wood with a standard tripod screw and then glued the wood to a base plate (no glue on the camera). I also built a holder so that the flash was always in the same position on top of the terrarium, and painted it black as well.
Because the evaporated water could not escape the terrarium, a layer of haze built up on the glass, softening the image until, at some point, one couldn’t see anything anymore.
Warming the glass from the outside with a hairdryer all the time proved inefficient and quite disruptive… so I installed a heatlamp (like the one you would use on a terrarium with reptiles inside) to warm up the glass. It cycled on and off every 15 minutes, timed against the photos so that it was never on when a photo was taken.
Unfortunately, the glass on top also warmed up, and no water droplets were gathering or falling down any more. So I had to install a small fan from an old PC to cool the top glass, which finally worked to keep the ecosystem in balance.
Fast forward 10 months…
With this setup, the plants grew for a few months. The cress was the first to come out, grass followed and then various other small plants as you can see in the video.
When they started to dry up after about 8 months, I broke open the top glass to let them dry out further. Then, in the end, I poured ethanol inside and set the plants on fire, leaving behind only “burnt soil.”
I changed the SD-card about once a month and backed the material up regularly. In those 10 months, about 200GB of images were gathered.
For the post-production step, I used the ever-trusty LRTimelapse by Gunther Wegner, which I used to import the material in batches of about 2,000 images, calculate a preview, and then delete the frames that had too much haze, no flash, or the heatlamps in them.
I ended up using exactly 14,275 frames for the video. Taking two pictures every hour for just over 10 months would produce a little more than that, but after accounting for the frames I lost or had to delete, I ended up using about 96% of the pictures I captured. In the graph above, you can see the phases where I lost or had to delete pictures: for example, a lightning strike in August, or problems with the lamps in December and February.
After rendering the timelapse video, I imported the sequence into Adobe Premiere and sped it up to fit to around 5 minutes. I also changed the play speed a bit, to lengthen the more interesting parts where plants come out of the soil and speed up the latter parts where all of the action was dying down.
Although most of this project worked out quite well, there were a few obstacles that I had to sort out:
- Not enough water in the terrarium – After the first week, everything that had grown until then fell over and dried up. So I drilled a small hole in a top corner and added some water via an injection syringe.
- Plants leaning toward the light – As the position of the terrarium was near a window, the plants tended to lean to the right and would potentially tip over, so I added an LED panel on the left side that was switched on only during the daylight hours, to straighten up the plants.
- Lightning strike – A lightning strike hit the house, and the voltage spike wiped the Arduino’s program and caused the Nissin flash to fire continuously for a day or so until I noticed. Fortunately, the flash did not take any permanent internal damage (at least as far as I can tell). I could easily re-upload the program to the Arduino, but I lost a day of footage, which accounts for a minor jump in the final video.
- Corrupted files on the SD-cards – A month after the lightning strike, I noticed that some files on the card could neither be opened nor copied. The problem occurred with only one other card at a later point, and not with the remaining cards. I never determined whether the fault lay with the SD-cards or whether the camera had taken damage, but the problem did not recur in the remaining five months.
- The timer for the heatlamps was not exact enough – Every three or four months I had to reset the timer for the heatlamps; otherwise their glow would show up in the pictures.
- Plants did not look as natural as I wished – On the one hand, because I added seeds from the kitchen, the main plant inside the terrarium was basil, so it looks somewhat cultivated rather than wild-grown. On the other hand, without those seeds the terrarium might have looked empty, so I guess it was quite alright.
- No animals – At the beginning there were ants and worms crawling around (the plants must have produced enough oxygen for them), and I was hoping they would contribute to a diverse ecosystem in the terrarium, but after the not-enough-water problem they all perished.
All in all, even with countless hours of trying, failing and trying again until it worked, and many obstacles that popped up along the way, this was still a fun and interesting project.
It ended up being a great learning experience on many subjects, ranging from photography to electronics to a bit of gardening. And should I ever get an idea for a similar project, I bet I’d do it again.
About the author: Philipp Rameder is an IT student from Austria who enjoys photography when he’s not busy studying. His main focus is portrait, landscape, and timelapse photography, but he loves to try out new concepts and just be creative with what comes to his mind. You can find more of his work on his website, or by following him on Instagram, Facebook, and YouTube.
Fstoppers is starting our Black Friday sale early with some of the biggest discounts we’ve ever offered, and they’re lasting all week. Check out today’s spotlight of $200 off “How to Become a Professional Commercial Wedding Photographer” and $50 off “Photography 101: How to Use Your Digital Camera and Edit Photos in Photoshop.” See the discount details below or visit the Fstoppers Store for discount details on all our tutorials.