Imagine Products has released a ShotPut Pro update and a new Imagine HQ iOS app. ShotPut Pro has been the preferred offload, backup, and verification program for many major motion picture companies, and it is used by more than 50,000 people worldwide. The new features and capabilities in ShotPut Pro Mac 2019 support handling of …
A new Kickstarter campaign is seeking funding for I’m Back, a relatively low-cost digital back for medium format cameras. The digital back supports a number of analog cameras, including the Mamiya RB67, 645, and C330, Rolleiflex Automat, Bronica Etrsi, and others. I’m Back GmbH, the company founded by Samuel Mello Medeiros, previously introduced a digital back for 35mm analog cameras.
The I’m Back MF features a 16MP Panasonic MN34120 sensor and touchscreen display, a 4200mAh lithium-ion battery, and a 128GB microSD card. The model supports capturing 16MP JPEG and black-and-white raw images and 1080/60p video, plus there’s built-in Wi-Fi for wirelessly transferring content.
In addition to running off its internal rechargeable battery, users can power the digital back using a standard micro USB charging cable or external power pack. I’m Back MF will ship with an adapter for the chosen camera, flash sync cable, and battery charger.
The resulting content, as evidenced by the following sample shots and videos captured by the I’m Back MF, is fairly low-quality, making this digital back a low-cost alternative to pricey models for photographers who just want to have fun with their old analog medium format camera. The Kickstarter campaign even states the digital back ‘may not be suitable for professional photographers who wish to maximise the number of pixels in an image ;-).’
Assuming the Kickstarter campaign is successful, I’m Back MF will be offered for the following cameras:
- Hasselblad 500/1000f/1600f/2000/200/SWA/SWC
- Bronica S
- Bronica Etrs
- Mamiya 645
- Mamiya C330
- Mamiya RB 67
- Kiev 88
- Rolleiflex Automat
If enough orders for these models are placed, the company also plans to offer I’m Back MF for the Pentax 6X7 and Pentax 645. The Kickstarter campaign is offering the digital back for various cameras, as well as bundles with adapters, at prices ranging from 399 CHF ($404) to 1,795 CHF ($1,818); shipping to backers is estimated to start in April 2020.
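As a quick sanity check on the campaign's listed conversions (the CHF figures and USD equivalents come from the campaign; the implied exchange rate is a back-calculation, not a published figure), both price points work out to roughly the same rate of about 1.01 USD per CHF:

```python
# Price points quoted in the Kickstarter campaign: CHF amount -> USD equivalent.
prices = {399: 404, 1795: 1818}

# Implied USD-per-CHF exchange rate at each price point.
rates = {chf: usd / chf for chf, usd in prices.items()}
for chf, rate in rates.items():
    print(f"{chf} CHF -> implied rate {rate:.4f} USD/CHF")

# The two implied rates agree to within about 0.1%, so the listed
# conversions are internally consistent.
assert max(rates.values()) - min(rates.values()) < 0.001
```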
Disclaimer: Remember to do your research with any crowdfunding project. DPReview does its best to share only the projects that look legitimate and come from reliable creators, but as with any crowdfunded campaign, there’s always the risk of the product or service never coming to fruition.
Submit your best real estate image for a chance to win a free Fstoppers tutorial.
Real estate photography is a thriving business opportunity but it takes skill to shoot a home. Do you think your images are good enough to meet the industry standards?
‘The Plagiarists’ was shot entirely on Betacam.
You simply have to take a gander at the IMDb page for The Plagiarists to get an idea of the scale of this micro-budget production. The film, directed and written by James N. Kienitz Wilkins and Robin Schavoir—”Peter Parlow,” who is credited as the director, is their combined pseudonym—boasts a total of seven crew members, including the co-directors.
Just don’t call The Plagiarists an indie movie. As Wilkins and Schavoir told No Film School in a recent interview, “Indie is like the modern equivalent of Hollywood now.” They believe that the “indie” label has evolved such that it no longer refers to independently made movies, but rather to a “scene” that “espouses mainstream values.”
The ubiquitous Director’s Monitor Cage from Wooden Camera goes on a diet.
A director’s monitor, also known as a clamshell, is one of those tools that, when used once, makes it very difficult to go back to not having one. Even more so when the cage in which your monitor is housed is light and versatile, like Wooden Camera’s new offering.
With all-carbon-fiber construction for the handgrips and rod supports, the new Director’s Monitor Cage V3 from Wooden Camera is 33% lighter than its predecessor, making this integral piece of production gear even easier to use.
Just like they say in the product demo video, “simple changes make a big difference.”
According to a report from Nikkei Asian Review, Canon’s operating profit is set to drop 40 percent this year, double the decline Nikkei said Canon had anticipated earlier this year.
Back in April, Nikkei reported Canon was planning to lower its profit forecast for the 2019 fiscal year by 20 percent—approximately 50 billion yen—due to ‘shrinking camera sales.’ The translated report from April reads:
‘Canon will lower its forecast for the fiscal year ending December 2019. Consolidated operating profit (US GAAP), which indicates the mainstay of the business, is likely to decrease by 20% over the previous fiscal year to just over 270 billion yen. This is about 50 billion yen lower than the previous forecast. The shrinking of the digital camera market and the deterioration of the semiconductor market due to the functional improvement of smartphones will hit hard.’
The new report from Nikkei says the publication ‘has learned’ Canon’s fiscal year profit is set to decrease by twice the amount initially reported in April, due to ‘a slowing European economy and slumping chip market.’ Nikkei writes in its report:
‘Canon’s operating profit is on track to sink 40% this year to slightly over 200 billion yen ($1.85 billion), Nikkei has learned, amid a slowing European economy and slumping chip market.
The Japanese company’s profit for the year ending in December is seen falling roughly 60 billion yen short of its downgraded guidance in April. Sales likely will shrink 6% to a figure above 3.7 trillion yen, off about 100 billion yen from April’s forecast. The full-year projections are expected to be lowered again when Canon presents first-half earnings next week.’
The report goes on to say Nikkei expects Canon will further drop its profit projections during its first-half earnings presentation next week and ‘report a first-half operating profit of around 80 billion yen, down 50% from a year earlier, with sales slipping 10% to roughly 1.7 trillion yen.’
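The figures in the two Nikkei reports are arithmetically consistent with each other. Taking the April forecast ('just over 270 billion yen, down 20%') as the baseline, the implied prior-year profit (a back-calculation, not a figure published in either report) lines up with the new 40% estimate:

```python
# April forecast: operating profit down 20% from the prior year,
# to just over 270 billion yen.
april_forecast = 270.0  # billion yen
prior_year = april_forecast / 0.8  # implied prior-year profit: 337.5 billion yen

# New report: profit on track to sink 40% from that same baseline.
new_estimate = prior_year * 0.6  # 202.5 billion yen

print(f"Implied prior-year profit: {prior_year:.1f} billion yen")
print(f"40% decline lands at: {new_estimate:.1f} billion yen")

# Matches "slightly over 200 billion yen" ...
assert 200 < new_estimate < 210
# ... and falling "roughly 60 billion yen short" of April's guidance.
assert 60 <= april_forecast - new_estimate <= 70
```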
If you don’t have control of the lighting in which you’re shooting, you may end up with an otherwise great image that has a harsh, distracting shadow somewhere on the subject. This helpful tutorial will show you everything you need to know to remove such shadows using Photoshop.
In 2011, Gore Verbinski, fresh off one of the largest modern franchises in “Pirates of the Caribbean”, set his sights on a western and an animated talking-animal movie that were actually the same movie: “Rango”. We look at this unheralded masterpiece that delicately balances multiple genres in entertaining ways.
The western genre has gone in and out of fashion every decade since the 1940s. Movies like Unforgiven breathe new life into it, but that movie came out over twenty-five years ago. For the modern western, you can look at No Country for Old Men and even the Coens’ remake of True Grit. But I pose that maybe one of the best modern westerns is actually a talking-animal movie with multiple references to Fear and Loathing…
Yeah, let’s talk about Rango.
Rango was released in 2011 by Paramount. It cost roughly $135 million to make but drew in a worldwide box office of $245,724,603. The film won the 2011 Academy Award for Best Animated Feature. It was the first non-Disney or Pixar film to win since Happy Feet in 2006.
Roger Barton has edited many of the biggest and most iconic films in Hollywood history. As an associate editor, he worked on Titanic and Armageddon. His films as editor include Gone in 60 Seconds, Pearl Harbor, Star Wars: Episode III – Revenge of the Sith, Eragon, Speed Racer, all four Transformers movies, Pirates of the Caribbean: Dead Men Tell No Tales, Terminator Genisys, and Godzilla: King of the Monsters, among many others. He’s currently editing Michael Bay’s 6 Underground.
HULLFISH: You and I have been trying to connect since you edited Godzilla: King of the Monsters. I’m really glad we finally were able to make it happen. I was talking to the editor of Chernobyl recently and he was telling me about this service for remote editing called Evercast, and then I found out that you not only used it on Godzilla but actually started working with the two founders to develop it and implement it at some of the studios.
BARTON: I became involved in Evercast while I edited the last Godzilla film. At that time, I was about six months separated from my wife and I had a 13-year-old son who was trying to navigate this new scary world. I was approached by Legendary Pictures to do Godzilla, and though I really wanted to cut the film, I couldn’t leave my son behind to go to Atlanta for four months.
HULLFISH: God bless you for that.
BARTON: So as chance would have it, a buddy of mine who lived across the street from the house I was renting looked me up on IMDb and said, “Oh my God! I didn’t know you edited these kinds of movies…” (whatever that means…), “…you should reach out to my friends who are developing this new remote collaboration tool called Evercast.” As he described it to me, I couldn’t believe it, because it sounded exactly like what I was looking for, and what I think the industry has been hoping for, for many years.
They organized a demo for me the next day that began when I received an email invite containing a URL that you open with Google Chrome. As long as you have Chrome and a webcam, you can join an Evercast room. (iPad, Windows PC, and Linux support are all being released very soon.) I connected with the two creators of Evercast, Brad Thomas and Alex Cyrell, from my home while I was on my laptop using a standard wireless DSL connection. They were connecting to me, each from their own homes in Arizona. I was completely blown away by the lack of latency, the quality of the image, and the forethought put into the platform, specifically with regard to creative collaboration. I’ve used several other platforms and/or combinations of platforms, which is why Evercast really impressed me — they basically took the best ideas of each and built them into a platform that is live, and doesn’t require you to spend lots of money on proprietary equipment.
So, back to Godzilla. I desperately wanted to stay in Los Angeles and try using Evercast to collaborate with the director, Michael Dougherty, who was already in Atlanta. I approached Valerie Flueger Veras, the head of post at Legendary, who supported me (thank you Valerie!) and introduced me to their head of content security, Dan Meacham. After vetting the platform up and down, Dan gave us provisional approval to use it, and that was a huge hurdle because obviously no studio is going to allow their very expensive IP to run over the internet unless it’s absolutely bulletproof.
The biggest hurdle still remained: a test with the director. By this time I had a couple of weeks of footage to play, and the expectation was that I would be traveling soon to Atlanta for an unknown period of time. It was a Sunday afternoon when we decided to do a test, so I sent Michael an invite and shortly afterward he popped into the Evercast room with no tech support required. I was in my cutting room in Burbank, streaming out of my Avid to him in his apartment, where he was on his laptop connected wirelessly. What started as a test effortlessly turned into a working session where we just stayed on the platform and worked for three hours. I was able to show him cuts, make trims, and change takes, all with him live, as if I was making the changes with him sitting right behind me.
Here’s the best part: I recorded the entire session. This enabled me to relax, knowing that I could replay it all the next day to make sure all the notes were done, and done accurately. As I said, many of the small notes or experiments I did live with Michael, but the more complex ideas that required time and thought I would do later by playing the recorded session back while I followed along in my Avid. Evercast records each stream individually, so I could switch the main screen to view either his recorded webcam or the content that was being streamed at the time. This meant that nothing was lost in translation.
Anyway, at the end of the three-hour session, I said to Michael, “Do you still want me to come out to location?” And he said, “God no! Why would you want to come to Atlanta for three months? Let’s continue using this.” That was a life-changing moment because it started me on this new path as I try to promote something I believe can improve the quality of our lives. I’ve always shied away from the spotlight, so in many ways, this is uncomfortable for me, but if I can make a small difference or help other families stay together — it will have all been worth it.
HULLFISH: That sounds fantastic. Did the technology of it kind of melt away? Because human interaction — you and I are on Skype right now: I can see your face, I can see whether you’re getting antsy or whether you’re engaged. Did you feel like you were really engaged with him like you would be if you were in the room?
BARTON: Very much so, and not always in a good way (both laugh). I think a lot of editors will understand this: I’m used to running first cuts for directors, and usually they’re behind me where I can’t see their reactions… but on Evercast I’d see his facial expressions as he’s watching the cut live — just like I’m watching you right now. And so whether it’s a laugh, a smile, a grimace or whatever, I could make note of it… good and bad. Thankfully I could mute his camera from my side if I had clearly F’d up a scene (laughs).
Before this experience, what I’m used to is being sent to location for months at a time, trying to pull the director into the cutting room after a 14-hour shoot, where literally hundreds of people are nipping at them all day for answers. I’m usually the last person they want to be with at the end of the day, which is never a good time to be showing your first cut.
On Godzilla, by staying in Los Angeles and using Evercast, I literally got more collaboration time with the director than I ever have — even when the production sent me and my crew to location. In most instances it was Michael who initiated the Evercast sessions, whether it was a five-minute quick question, 30 minutes between lighting setups or an hour session at lunch, all he had to do was open up his laptop, hit his bookmark in Google Chrome and we would be in the room together. We took advantage of all those moments on set where he had downtime and in most cases, he was totally relaxed because I wasn’t trying to force him to be there.
Michael shot 3 million feet of film so I averaged between 8 and 12 hours of dailies a day which meant I had to adapt my workflow in several ways, but using Evercast enabled me to collaborate with him and get his eyes on sequences early. In fact, by the time he returned from shooting, he had a pass on many of the scenes and certainly all the big set pieces because we had early VFX turnover obligations. Getting this early feedback translated to far less anxiety when Michael returned from shooting – because he already knew what movie he had shot, and any missing footage we needed I could easily ask for by streaming the cut to the set — rather than exporting QuickTimes, uploading them on other platforms and waiting for a response that often never came.
HULLFISH: That’s interesting.
BARTON: It takes away that impersonal element of sending a cut to the director and then waiting for feedback days later. And in the case of visual effects reviews, it allows eight to ten groups from anywhere in the world to come together live in this protected space — without the need to distribute files ahead of time.
HULLFISH: What are some of the tools that you have? Can you draw on the screen, for example?
BARTON: So we already covered the recording, which for me was one of the biggest features, but soon we’ll have the ability to take pieces of those recordings and allow other people to have access to them. For example, let’s say that you and I are in a visual effects review and the director says, “By the way, I have a music note for this scene. This part of the cue isn’t quite hitting the tone I want.” So at the end of that session, you can go back to that recording and select any portion of it, then give access to any authorized users of that room which could include the Music Editor or Composer. In the world I tend to work in, there are hundreds of notes to distribute to several different vendors after a visual effects review. Often this requires translation and interpretation. Imagine if the animator (who wasn’t part of the review) could watch the Director give notes on their shot — flipping between watching the Director’s webcam and the content being streamed. We’re attempting to remove all that stands between a Director and the person who’s doing the work so there’s absolute clarity on what the note is, which we think increases efficiency by reducing the number of iterations.
HULLFISH: Oh yeah! Absolutely!
BARTON: Then, of course, you can draw on the screen, but unlike the other platforms, you won’t have to upload all the files first, since we’re a live collaboration platform.
HULLFISH: One of the things you kind of alluded to was the security issues. Some people might not realize it, but unless you’re working at a studio, the studios are really strict about requiring all of their vendors and crew to be in a facility that has these kinds of door locks, no access to the Internet, and other security precautions. Facilities have to get certified for these specific security protocols. Tell me a little bit about the security and a little bit more about how that got approved by a bunch of studios.
BARTON: When I finished my tour of duty on Godzilla, the impact Evercast had on me and my life was profound. I saw an opportunity to not only become part of a new company and exercise a different part of my brain but more importantly, I saw an opportunity to make a positive impact on the lives of my friends, my colleagues — who go through many of the same challenges that I do. I hear time and time again how much we all love this craft, but the lifestyle that comes with it is just getting too much…or maybe I’m just getting too old (laughs). Footage counts are skyrocketing, budgets and schedules are shrinking, studios are chasing tax rebates around the globe requiring us to leave home for months at a time, or longer. So I’m on board with any technology that pushes back against this trend to help us reclaim a little bit of our lives.
After working with Brad and Alex for six months, discussing ideas that would improve the platform, I approached them and said, “I really feel like I can open some doors for you guys with people I’ve worked for during the last twenty years.” Not long after, I invested money and took nine months off from cutting to introduce Evercast to the studios, becoming part of the team, and was made a co-founder. My primary goal at that time was getting Evercast approved by the studios from a content security perspective. So I walked Evercast into each of the studios, beginning with Disney.
HULLFISH: It’s always good to start small.
BARTON: (laughs) Well, the film I had done before Godzilla was Pirates. (Pirates of the Caribbean: Dead Men Tell No Tales (2017)) And of course Disney is conquering the world so why not? They were the first to see Evercast and their immediate reaction reinforced my own instinct on what this could potentially be. I remember this boardroom meeting we had. Their production technology department was there; post-production; visual effects; IT — there were probably twenty-five people in that boardroom when we demoed Evercast for the first time. At one point, the boss stood up and said to their team, “Guys, you may all have your own security issues, but this is clearly the way of the future so figure it out and get it done.” I just looked at my partners and thought, Wow! This really has an opportunity to change a lot of lives…and by the way, we now have a world-wide master agreement in place with Disney and all their affiliates.
HULLFISH: I love the idea of just the quality-of-life issues. It looks like you’re in your home. Were you cutting with Michael in your home? Or did you just come back from the studio?
BARTON: One of the reasons I’ve worked for Michael on so many projects is that he never asks me to travel — so when I sign up, I know I’ll be in town for the next calendar year. Don’t get me wrong, I absolutely love traveling, just not for months at a time without my family. Not to mention, Michael’s cutting room is in Santa Monica, so it’s a 10-minute drive from my house.
HULLFISH: Do you know Alan Bell? He has cut some very big films and decided at the top of his game to move to New Hampshire, but he still wants to cut movies — just not in LA.
BARTON: Alan is my hero because he’s attempting to create the life he wants without sacrificing his career. Total trailblazer. I did a demo for him nine months ago — before he moved back east — and he was excited about the possibility of using Evercast to edit movies from his house. We’re continuing the discussion and it’s just a question of finding the right fit.
This raises the issue of our pricing model, which is currently based on a flat rate — and it enables anyone on the entire production to use Evercast for as long as they want, with no restrictions on usage or recordings. We provide a virtual room that anyone you approve can enter. For this room, the flat rate is $999/month, which breaks down to about $250 per week, and for that price, you can literally have anyone in visual effects (not including vendors), editorial, music, sound or casting using the platform. We have people doing production meetings, table reads, editorial and VFX reviews, and sound spotting on the platform. In fact, just today there is a music video being shot where the director is literally directing on one coast while the shoot is happening on another, as the live camera feed is streamed to the director.
HULLFISH: The production of movies, including VFX, editorial, music, and graphic design, does seem to be increasingly international. It’s been that way for a while, so it seems that no matter what, you’ve got somebody in L.A., somebody in New York, and somebody in London on almost every production.
BARTON: Yeah, especially with regard to visual effects. The vendors are now spread so wide across the globe. You’ve got WETA in New Zealand, ILM up north, while MPC and Framestore have hubs in London, Vancouver, and Montreal. It’s all over the place. So Evercast is a way to keep everyone on the same page. In fact, on the film I’m cutting for Michael right now, 6 Underground, our visual effects supervisor Rich Hoover is using Evercast about four or five hours every day reviewing shots with all of our vendors, including ILM.
HULLFISH: I love the transparency of letting all those things flow directly from the director to the actual end person because normally it is through layers of other people. The VFX supervisor and you are sitting with the director and then you guys get all those notes and those go to the local VFX producer and down through supervisors to the end animator or compositor and somewhere along the line something gets lost in translation.
BARTON: It happens to me all the time in visual effects as shots come back wrong, but I’m guilty of it as well. In the past I would scribble down notes on a yellow pad and then I’d try to read them the next day and the pages would be full of sentence fragments because sometimes the directors are moving so fast, I don’t want to stop them for clarification because they’re in the flow. So I do my best guesswork the next day, which is not always accurate especially when I don’t have a photographic memory.
HULLFISH: You think of something like a director giving notes and he makes a hand gesture or uses his hands to show the interaction of two objects. There’s no way you can translate that to the VFX guys — especially directly to the end worker who’s executing the shot — but if you can send that little video to him, he’ll totally understand what he’s supposed to do.
BARTON: Right. And so that probably removes two or three iterations to get you to where the director wants to be. I see that time and time again. If I had a nickel for every time I’ve heard, “That’s not what I meant,” I’d be a rich man. Even in editorial, often I’ll make a change convinced that this is what the director had in mind only to run it for them and hear them say, “That’s interesting, but no. That’s not what I meant. This is what I meant.” Having a recording of the session ensures everything is done right.
HULLFISH: It sounds like a great platform. I don’t do gigantic movies like you do — the biggest of the big — but $250 a week seems incredibly reasonable even for the lower budget films I work on.
BARTON: Well, when you spread it across all the departments who have access to it, we feel we’re practically giving it away. And I get that there are smaller-budget projects where there might be a reaction to our flat rate, but I think those are the projects that benefit the most by avoiding all the travel for the editor and the assistant editor, which includes flights, per diem, hotels, cars — not to mention the lifestyle issues that we’ve been talking about.
We also realize as a company that there’s a huge market to reach with all the individual content creators out there — the YouTubers, the people who don’t have a studio behind them. So what we intend to do, in the months ahead, is release an Evercast Lite version that removes some of the security protocols that we’ve spent a tremendous amount of resources on. And by removing those, we think we can lower the price to reach all of those individual content creators, making the same user experience available to just about anybody.
HULLFISH: You mentioned that with a VFX guy, it might save him two or three iterations. That’s a week of work for one person depending on the shot.
BARTON: It certainly is.
HULLFISH: I go on location sometimes, but a lot of times a director will deliver the entire dailies “dump” at the end of the movie, or it may be delivered in two gigantic chunks, and directors just say, “Hey, you’re on your own.” And I would love to be able to offer this solution to them.
BARTON: Do you do your work from home?
HULLFISH: I do minor editing work at home, but I have an office about 20 minutes away with a big sound system, a big monitor, a standing desk, and an Avid NEXIS Pro. I have two systems there, so I can have an assistant there. When I’m at home, I’m working off a MacBook Pro and headphones.
BARTON: I’ve tried to work from home and what I find is that I get so distracted during the day that I end up working until 3 or 4 in the morning to make up for all the time that I just goofed off!
HULLFISH: I can relate. Since you were committed to being home with your son before you really knew about Evercast, were you planning — on Godzilla — to try to use some other system?
BARTON: I didn’t have a horse in the race. I just wanted to find whatever technology existed out there to improve my lifestyle and cut a movie I wanted to do, but do it from Los Angeles. That’s what started me down this path.
HULLFISH: I’m going to bring up a touchy subject, which is that this leads us down the road to outsourcing editing to India and China, or at least some English-speaking country with a lower cost of living. Obviously, I believe that the editor’s position requires things that can’t be outsourced — it’s a relationship that is important to the director — but can we talk about that elephant in the room?
BARTON: I don’t look at what we do — what we have to contribute — as something that can be outsourced so easily, like a rotoscope artist or a compositor who doesn’t have that intimate relationship with the filmmaker. For that reason, I think we’re completely protected as editors because of what we contribute to the process. That’s my personal opinion.
HULLFISH: You could say this allows me to get Alan Bell who just wants to cut a movie in his home and as long as I’m willing to do it through this medium, I can get this great editor.
BARTON: And that makes me really happy. I think that’s to be celebrated — that people can make personal decisions about whatever is best for their family and still have a career.
HULLFISH: With the cost of living in California or New York, I’m sure more and more of us are probably thinking, “Alan’s got a good idea.”
BARTON: Oh, it’s been on my mind for a long time. I’ve seen other people do it with varying degrees of success. I’m wearing my Oregon baseball hat as you can probably see. I’m from the Pacific Northwest and I have a love-hate relationship with L.A. I love the people I’ve met here — my close friends and family are here — but if I didn’t have to live here, I’m not sure I would.
HULLFISH: I’m interested in the transition that happens once the director comes off the production end and is free to be with you side-by-side.
BARTON: I’ll give you a good example of how it continued to be used after production on Godzilla. I got into a rhythm with the director where he would check in with me from his house in the morning and I would show him what I was working on. Then he would give me notes, which I’d churn through so that by the time he showed up at noon, I already had them implemented. Evercast continued to allow us to push the ball down the field, even while we were back home, and it also meant that the director didn’t have to be in the cutting room for 12 hours a day over my shoulder, which I greatly appreciated… just sayin’.
HULLFISH: Since this interview, I got a demo of the Evercast system. Here’s the video.
HULLFISH: I’d love to switch topics. I was looking through your IMDb page and maybe it’s missing some early years, but it looks to me like you went straight to working on the world’s biggest movie. Your IMDb page looks so improbable.
BARTON: (laughs), I literally did make the leap from That Darn Cat to Titanic. My friends make fun of me all the time.
The editor of That Darn Cat was actually Richard Harris, who was one of the principal editors for Jim on the Terminator films and True Lies, so when Jim was looking for someone to run the cutting room on Titanic, Richard was one of the people he reached out to. And because we were finishing That Darn Cat, Richard said, “Oh my God! You’ve got to meet Roger… he’s a bit of a geek,” which is exactly what Jim needed at the time to manage that film, and so I was given this incredible opportunity to make this quantum leap into a world I probably had no business being in. But I think my career is marked by those moments where an opportunity presents itself and I’ve decided to jump and hope the net appears below me.
I constantly put myself in situations where I’m uncomfortable because I feel like that’s when we grow as people or as artists. That was certainly a big opportunity for me.
I’ll give you another example on the same film where I took that leap. At that time, I was simply hired to run the cutting room because it was Jim’s intention to cut the movie by himself. As the weeks went by, I’m working at his house in Malibu and the footage is just piling up, mountains of it. Three or four weeks into the process, I cannot imagine Jim catching up to the amount of footage he’s shooting, so I see an opportunity. I called the producer, Jon Landau, and said, “Listen, I’m not sure how you’re going to make your release date if Jim cuts this himself, so, if you’d like, I can start assembling this into some shape that Jim can at least work from, so he’s not starting from scratch. I can find someone to replace what I do, running the cutting room.” Afterward, there was a long silent beat on the phone and I’m sitting there with gritted teeth, waiting for his answer. And he says, “Ok kid, let me talk to Jim about it and I’ll get back to you.”
So the next day Jon calls me and says, “All right. It’s a green light. Go ahead and start cutting, and find your replacement.” So, for the next month, I was alone at Jim’s house, cutting together many of those scenes that were recut of course by Jim, Conrad, and Richard. But, to be sitting in there for that month, cutting that movie on my own, I kept looking over my shoulder saying, ‘How the fuck did I get here?’ Thankfully, it seemed to work out okay, because I had worked under some really talented editors up to that point, and I paid attention, so when the opportunity came, even though there was some fumbling around and experimentation, I felt like I could at least show Jim something that reflected the material.
And ultimately that’s how I met Mark Goldblatt. Because at Jim’s Christmas party, he ran a few scenes for his friends and amongst those cut scenes were things that I had assembled, and Mark took notice of it, asked Jim who had cut it, and Jim sort of reluctantly turned his head to me standing in the corner. That was my introduction to Mark Goldblatt. He was looking for an assistant while he was cutting Armageddon for Michael Bay, so I literally jumped from Titanic to Armageddon, where I assisted Mark, and had the best film-school experience you can possibly imagine by sitting behind him for twelve hours a day while watching him cut. Mark Goldblatt is one of the most incredible, generous human beings I’ve ever met — he’s also the most talented editor I’ve ever met and is responsible for my career. Mark taught me not only how to edit, but how to be an editor.
HULLFISH: That’s a really unusual position for an assistant editor to be in. What were your responsibilities besides acting as a second pair of eyes in the cutting room? That’s just not what any assistant editor I’ve ever heard of does — to solely give feedback on edits “live.”
BARTON: I was a sounding board for him.
HULLFISH: I’m shocked that someone pays an assistant editor to act solely as a sounding board. Many people use their assistant editors as sounding boards but it’s only after they’ve done the syncing and all the other stuff.
BARTON: Well, some of the other assistants were a little pissed off that my entire job was to sit in there with Mark while they were out syncing dailies and doing a host of other things to run the cutting room. I had no idea what I was walking into but quickly learned I was there to support Mark in whatever way he wanted me to. It created a little tension in the cutting room, but Mark also worked twelve hours a day, from 11:00 a.m. until 11:00 p.m. And when he left, he would typically leave me two to three hours of sound work that I had to have ready for him in the morning. So not only did I have to be there before he showed up, but I had plenty of work to do once he left, after I’d been focused in his room for twelve hours a day.
HULLFISH: So you weren’t just sitting in the back, but that is still amazing. Did he teach you by saying, ‘Here’s what you do?’ Or were you just in the room and you got it through example?
BARTON: Everything he taught me was by osmosis. I was lucky enough to be in that position — to sit with Mark and provide an opinion — for whatever that was worth. But the benefit to me was understanding his workflow and watching him work the footage over and over and over again – each time elevating the film. Mark accomplishes this in part by building out select rolls that allow you to frequently revisit the raw footage in an organized and efficient manner. Building these reels allowed him to understand the director’s intentions and how they might have changed over the course of each setup. Revising these reels allowed him to constantly come up with new solutions to support new ideas. It really spoke to both my right and left brain and I’ve used it ever since, whether I’m cutting an intimate dialog scene or a gigantic action set piece.
HULLFISH: You also mentioned that you do a continuous process of select rolls. What are you doing once you’ve made a selects roll for a scene? What changes are occurring to the selects roll through the process?
BARTON: Well, the reel is so important to me that as new footage comes in — before I do anything else — I will fold that into the existing select reel as if it was all shot on one day. So going back to that reel means going back to the entire scene.
For me personally to load every clip individually whenever there’s a new idea — it’s really arduous and not creatively what appeals to me. By having done the heavy lifting early on and organizing the footage in a sequence that represents all the usable footage — I feel like I can go back into a select roll and very quickly come up with some options rather than going to hunt for it in this enormous amount of dailies that might be spread across 50, 60, 70 clips.
HULLFISH: The reason I asked is because some people will make multiple selects reels where they’ll whittle it down even more, or they’ll have an assistant cut a selects reel for the scene of only a certain element: “Make me a selects reel of nothing but reactions of Joe.”
BARTON: To each his own. Everyone has their own process and I would never criticize someone else for theirs, but for me I feel like I need to run the footage through my hands as often as possible. So when it comes to finding material — whether it’s a master, a reaction, or a close-up or a dynamic effects plate, I feel like that’s where the creativity lies. And I think in that process of scrubbing the footage, a whole host of other questions might be answered or a possibility might be presented.
HULLFISH: And how granularly are you breaking your selects rolls up?
BARTON: I wish I could tell you there’s one rule, but it’s an intuitive process that changes each time I seem to do it. Also, I don’t want to break apart the film too much, otherwise I won’t see the possibility of just sitting on a shot that works great without cutting.
HULLFISH: That’s exactly my point.
BARTON: Often, if I’m looking at a master that tells the story or reveals geography through an interesting camera move — I’m paying particular attention to those and making sure that they exist in a long section even though they might encompass several lines of dialogue. In fact, it might encompass a quarter of the scene. The length of the select roll doesn’t matter to me at all. So if it’s a four-hour select roll, I have no problem with that, because normally I’m scrolling the reel to at least park myself on the section I want to run. Then I sort of break it down into the medium shots where I might play two or three lines of dialogue — still looking for when I can hold it longer — but when I get into the close-up coverage that’s when I start to cut it up more so than I would otherwise. It’s really important at that point — when I’m breaking up individual lines of dialogue — that I am also cutting into my select roll any reaction shot I feel might tell the story in another way — in a more powerful way. I’m a big fan of reaction shots.
HULLFISH: You mentioned that one of the things you learned from Mark Goldblatt was the process of going over and over and over and redoing things. And if you weren’t in the room with him you might just think, “Oh he’s a brilliant editor and he cut it like that the first time he laid it out.” The only way you knew how he ended up at a scene was because you were there the whole time to see that it was a process, and not “one and done.” It reminds me of a story from a documentary about the Eagles where Jackson Browne lived upstairs from the Eagles and so they heard him writing songs and they could hear that the songs didn’t come out fully formed. He needed to work them and revise and edit and it was a PROCESS.
BARTON: I had a similar experience on a film where we considered folding elements of one of my favorite bands’ new album into our score. On paper, this was the coolest idea ever! We flew to meet the band and they handed over sketches for their new album which I was SO excited about. The next day I plugged the drive in, listened to the tracks and thought…”you’ve got to be kidding me?” Unfortunately we couldn’t see past these sketches so we passed on the opportunity, then months later the album came out and guess what…there were some amazing tracks! It’s a process.
How each of us gets to a locked picture may vary wildly, but in my experience it’s usually a very long, drawn-out process where you feel at times, “this scene can’t get any better” and then the pizza delivery dude sees what you’re working on and offers a new idea that makes you think, “why the hell didn’t I think of that!?” (laughs) I’ve worked on lots of films requiring multiple editors and sometimes you lean out your door to hear someone else working on the same scene as you. We all need strong opinions as editors, but I do my best to separate my opinion from my ego — it’s a work in progress, but I’m working on it because even without multiple editors, we’re ultimately there to support someone else’s vision: the director’s.
I really appreciate the way that Kevin Feige, Jeff Ford and everyone at Marvel works because they really seem to support and promote “the best idea wins” approach because it really does take a village to make the best film. As much as filmmaking is about presenting a strong point of view, what we do is inherently subjective. I believe editors do their best work when they make strong choices, but that doesn’t mean those choices are the only way the scene or sequence works — and to top it off, because the entire experience is so fluid, what works one week may be undone by a change in the previous reel this week. This multi-faceted puzzle is why I love what we do as editors.
HULLFISH: You have worked on more massive projects than just about anyone I’ve spoken with. How different is it in terms of process to work on these enormous films than something smaller? What kind of project management is layered on top of your creative skills? Or is all of that stuff on post producers or assistant editors? What would surprise someone like me, whose films are smaller, about working on something massive?
BARTON: With regard to the crew, it’s a matter of scale. On the big films you’re describing, just about everything is amplified, including expectations — so the pressure is real, but no more real than for those working on smaller films who are doing more with less.
For me, I try to stay focused on the storytelling and hire people who are far more adept at running the room than I am. I used to be that guy so I have pretty intimate knowledge about workflows, but the technology has ironically passed me by. Now I try to stay in my lane — focused on the creative end of things while keeping an eye out for the warning signs that a problem is heading our way.
HULLFISH: You’ve been fascinating to talk to. Thank you so much for your time — plus your wine’s almost done. I’ve been watching your wine glass get lower and lower and I figured, “When he hits the bottom it’s over.”
BARTON: (laughs) I love working with Michael. He’s been incredibly loyal to me over the years, but sometimes, when I come home, this glass of wine is absolutely essential. And by the way, he might feel the same way about working with me all day!
The first 50 interviews in the series provided the material for the book, “Art of the Cut: Conversations with Film and TV Editors.” This is a unique book that breaks down interviews with many of the world’s best editors and organizes them into a virtual roundtable discussion centering on the topics editors care about. It is a powerful tool for experienced and aspiring editors alike. Cinemontage and CinemaEditor magazine both gave it rave reviews. No other book provides the breadth of opinion and experience. Combined, the editors featured in the book have edited for over 1,000 years on many of the most iconic, critically acclaimed and biggest box office hits in the history of cinema.
The post ART OF THE CUT with mega-movie editor, Roger Barton appeared first on ProVideo Coalition.
I’ve been fascinated with the idea of incorporating the moon into photos whenever possible. And so, with the 50th anniversary of Apollo 11’s moon landing on July 20th, I was excited by the possibility to shoot something special for the occasion: Putting a man “on” the moon.
The man here is Ty Johnson, a paramotor pilot in Cedar Rapids, Iowa. As NASA will tell you, getting a man to the moon is harder than it looks. This is how we did it.
I first spotted Ty’s paramotor—a powered-parachute—in the air when I was shooting Iowa’s amazing, magical fireflies. I chased him down as he landed with a simple request: let me shoot you in flight. He agreed, and in a few days we got to work.
I wanted two sets of shots: him passing in front of the moon, which would be shot from the ground, and a set from the air, with me following him with a drone camera.
As with NASA, we had our own set of restrictions facing us, and safety was a priority. Paramotors, because they are so light (a two-stroke engine, a prop, a parachute and a chair for the pilot), have wind restrictions in order to fly safely. We had to take these into consideration just for Ty to take off, not to mention getting him lined up in front of the moon. The wind was our enemy. And for the actual moon shot, so were the clouds.
We tried first on July 11th, the moon a Waxing Gibbous at 73.9% full. We coordinated on the ground with a planned series of shots and, after he took off, I called him on his cell to provide direction. He could hear me through ear buds but the noise of the propeller and motor washed out much of his replies. It was one way communication.
Paramotor pilots face numerous flight rules. The big one we were dealing with: no flying over congested areas. With a subdivision to the south, we were boxed in, making lining up with the moon difficult in the short amount of time we had before darkness.
Another FAA paramotor flight rule: no flying 30 minutes after sunset.
Time was ticking and the winds were stronger at altitude. Here’s a shot from our first attempt, below:
I got him going over the moon, but that was it. It was the best we could hope for that night. It was time to land.
The next few nights were a no-go. Winds were just too high to fly. Then, on July 14th at 8:17pm, with the moon a waxing gibbous at 97.1% full, Ty texted me: the winds had finally died down a bit. Was it too late to shoot?
I’d meet him at the field, I told him. Sunset was at 8:40pm.
The seconds were ticking by, the sun was setting and we had to get to the field. He then had to set up his aircraft and get into position. And had the winds even died down enough to take off?
Ty put his machine together and put on the propeller as we discussed the shot and flight, and then I drove to my shooting position. The moon was lower on the horizon and further east than on July 11th, which would allow him to fly over fields, avoiding the subdivision to the south and giving him more maneuvering room.
I put the camera on a tripod this time and autofocused on the moon rather than following him and shooting by hand like I had on the 11th. I then switched to manual focus. It was getting darker by the moment — I would still have to change exposure and ISO settings by the time he got into position.
Ty laid his wing out on the ground. The winds were unsteady. Would he be able to take off?
I watched from a short distance. A first attempt and a gust took the parachute in the wrong direction. He re-positioned again into the wind. The chute went up, caught the wind. A short run and he was in the air.
I let him climb and then made the call.
I gave directions by phone: more altitude… head east… southeast, away from me, toward the moon.
The winds were coming out of the south and he was headed south southwest toward the moon, giving him a stability that he didn’t have a few days earlier. I needed him further away from my camera, so he and his aircraft were smaller than the moon and would fit inside it.
He was on track. A finger’s length away. Closer…
Turn southeast… climb…
Almost. Not too much altitude. Ease off on the throttle a little. Don’t over shoot it. Further east.
That was it. The parachute’s in the moon. Keep going.
Climb… That’s it.
Later we estimated he was about 1.5km from my shooting spot. Below is an estimate of our positions using The Photographer’s Ephemeris web app (mine the red pin, his the grey pin).
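A rough geometry check (my illustration, not the author’s calculation) shows why that distance works: the moon subtends about half a degree of sky, so at roughly 1.5 km a pilot and wing only a dozen meters across can fit comfortably inside the lunar disc.

```python
import math

def apparent_width_m(distance_m, angle_deg=0.5):
    """Physical width that spans a given angular size at a given distance.

    The moon's apparent diameter is roughly 0.5 degrees.
    """
    return distance_m * math.tan(math.radians(angle_deg))

# At ~1.5 km, anything under about 13 m across fits inside the moon:
print(round(apparent_width_m(1500), 1))  # ~13.1
```

A paramotor wing is typically around 10–12 m wide, which is why the pilot could sit "inside" the moon at that range.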
I clicked away as the light faded. Ty could hear the camera shutter as I guided him across the moon.
I checked the shots and he swung back across the moon a second time. Some shots were in focus, some were not. But we still had time.
I adjusted the exposure settings from 1/200 sec to 1/320 sec and bumped up the ISO from 160 to 400. It was getting darker still. I directed him to make another pass.
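The arithmetic behind that adjustment can be expressed in stops (a sketch of standard exposure math, with function names of my own choosing): the faster shutter loses about two-thirds of a stop, while the ISO bump gains about a stop and a third, for a net gain of roughly two-thirds of a stop to fight the fading light.

```python
import math

def stops_from_shutter(old_denom, new_denom):
    # A faster shutter (larger denominator) admits less light: negative stops.
    return math.log2(old_denom / new_denom)

def stops_from_iso(old_iso, new_iso):
    # Higher ISO brightens the exposure: positive stops.
    return math.log2(new_iso / old_iso)

shutter = stops_from_shutter(200, 320)  # about -0.68 stops
iso = stops_from_iso(160, 400)          # about +1.32 stops
net = shutter + iso                     # about +0.64 stops brighter overall
```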
We got it for sure this time:
A few more passes for good measure:
The sun had set 13 minutes earlier, so these would be silhouette shots so as not to overexpose the moon. But we had it. Time to return to earth.
Later, when I showed the moon transit shot to my daughter, her first thought was: hey, that’s from the film E.T. the Extra-Terrestrial. So in addition to celebrating NASA’s Apollo 11 accomplishment, I guess we were inadvertently paying homage to Steven Spielberg’s 1982 film as well.
Now, I thought as Ty landed, we have 2 days until the full moon. If we could just get a shot before sunset.
The next day, July 15th, came and went. The winds were between 9 and 15 mph. Too windy to fly.
Today is July 16th. Full moon. Sunset is at 8:39pm. Moonrise is 8:45pm. We’ll try again tonight.
Because you can’t just go to the moon once.
Update: Storms moved in. It rained. No flying for Ty and no moon visible this evening.
About the author: Christopher Sherman is a commercial and fine arts photographer who travels the world taking pictures full-time. You can find more of his work on his website, or by following him on Instagram, Twitter and Facebook.
This article was also published here, and is being republished with permission.
A couple of months ago a rogue wave destroyed my Nikon D850. Today I got it back from Nikon and it’s been fully repaired.
These golden hour film tips can help you save time on set and get the beautiful shots your movie or TV show deserves.
Some of the most beautiful shots in cinematography history are shot right at dusk, when the light is lush and perfect. These shots are not easy to pull off, due largely to the small window of time you have to capture them. Get them right and you go down in history; get them wrong and no one can see your characters’ faces.
The stakes are real!
Today we’re going to go over some tips for shooting at golden hour from Lewis McGregor, a.k.a. UglyMcGregor, (the video’s an oldie but a goodie) and cover techniques that will help make you a better filmmaker during the most beautiful time of day.
5 Golden Hour Tips and Techniques
First things first, let’s take care of a few definitions.
Golden Hour Meaning
Golden hour, or magic hour, refers to the period of daytime shortly after sunrise or right before sunset, when we get a soft red or orange glow.
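Scheduling around that window is the hard part. A minimal sketch, assuming golden hour is approximated as the hour after sunrise and the hour before sunset (the sunrise/sunset times below are made-up examples, and real golden hour length varies with latitude and season):

```python
from datetime import datetime, timedelta

def golden_hours(sunrise, sunset, length=timedelta(hours=1)):
    """Return the (start, end) windows for morning and evening golden hour."""
    morning = (sunrise, sunrise + length)
    evening = (sunset - length, sunset)
    return morning, evening

# Hypothetical mid-July times for a mid-latitude location:
morning, evening = golden_hours(
    datetime(2019, 7, 16, 5, 52),
    datetime(2019, 7, 16, 20, 39),
)
print(evening)  # evening window starts at 19:39, ends at sunset 20:39
```

In practice you would pull the sunrise/sunset times from an ephemeris app or service rather than hard-coding them.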
In part 1 and part 2 on this topic, I said project sizes for edit are getting larger and larger due to never saying “Cut!” I have tried to emphasize that this isn’t just a rant about my needing to manage storage.
This time, I’d like to detail a concrete example of the impact of long takes on a project—how a lack of “cutting the roll” meant that what the director desired didn’t end up in the finished project. (Some details have been changed to protect the innocent.)
This was a project with many moving pieces. The production was shot on location. Post went from offline to grading and then to finishing, all at separate locations. I was doing the finishing.
The director had his hands full both directing talent and appeasing clients during the shoot. For a series of scenes, the camera rolled constantly while directions were given, positions reset and takes redone. Some takes were 10 minutes long.
(After the project was delivered, I was able to watch some of the clips. The recording was stopped probably because the media was almost full. I never heard anybody stopping the roll.)
Offline proceeded as normal. A proxy workflow was used, which certainly helped with the amount of footage being used. The director supervised after the first rough cut and followed all the way through picture lock and on into grading.
As grading wrapped, there was a hint of problems to come. When you render out graded footage, there are two options. You can either render out the full length of each clip or just the length that’s used in the edit plus some extra frames before and after: “heads and tails.”
Heads and tails give the finishing editor leeway to trim shots a few frames here and there if needed. But only as far as the length of the heads and tails—typically a second or two.
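The heads-and-tails idea can be sketched as a simple range calculation (illustrative only; the function and parameter names are mine, not any NLE’s API): pad the used portion of each clip by a handle length, clamped so you never ask for frames that were never shot.

```python
def render_range(used_in, used_out, src_in, src_out, handles=48):
    """Frames to render: the used range plus handle frames on each side,
    never extending past the bounds of the source clip.

    48 frames is two seconds of handles at 24 fps.
    """
    start = max(src_in, used_in - handles)
    end = min(src_out, used_out + handles)
    return start, end

# A 100-frame edit pulled from a 10-minute take (14,400 frames at 24 fps):
print(render_range(1000, 1100, 0, 14400))  # (952, 1148)
```

Rendering only these padded ranges instead of full-length takes is what kept the colorist’s render time within budget.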
If possible, I prefer to get the full clips back. It gives me more options during finish in case there are any fixes I need to do. It’s not a deal-breaker if I don’t get it, but I always ask.
In this case, because of the extreme length of the clips, the colorist wasn’t about to render out full-length clips. The amount of render time would have been extreme—exceeding the budget and the time allotted for the session. Shorter clips with heads and tails were delivered.
And here’s where it all fell apart. The edit included several scenes with time remapping of clips. A clip would play at normal speed, then would run at a high speed for several frames, and then play at normal speed again.
Since picture was locked and mix was already done, my job was to match, frame for frame, the offline. I used all my tricks to verify that the XML I got back from color matched. Everything was perfect, except for most of the time remapping shots.
The clip frames leading into the effect were fine, the frames trailing were fine, but the actual sped-up footage didn’t match. This wasn’t a consistent error.
I noticed those clips that had a constant speed change in the original edit (180 percent, for example) had fractional speeds of 179.7 percent in the XML edit. Try as I might, no amount of adjusting the clip — either by speed or by changing its location in the sequence — would solve the problem.
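One plausible mechanism for that drift (an illustration, not a diagnosis of any particular NLE): if the interchange format stores a retime as whole source frames over whole record frames, truncating the source range to an integer frame count yields a recomputed speed slightly off the nominal value.

```python
import math

def recomputed_speed(record_frames, nominal_speed=1.8):
    """Speed (as a percentage) recovered from integer frame counts,
    assuming the source range was truncated to whole frames."""
    source_frames = math.floor(record_frames * nominal_speed)
    return 100.0 * source_frames / record_frames

print(recomputed_speed(100))  # 180.0 — exact when 1.8 * n is an integer
print(recomputed_speed(167))  # ~179.64 — not the 180 the editor set
```

Clip durations that happen to divide evenly survive the round trip; the rest come back with fractional speeds, which matches the inconsistent errors described above.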
Next time, figuring out the problem and why it was caused by never saying “Cut.”
ARRI’s extremely compact, powerful M18 daylight lamphead has been used on film sets worldwide for the past ten years. A limited special edition, available in August, celebrates the anniversary.
To mark the tenth anniversary of the M18 daylight lamphead, the film technology company ARRI is offering a special edition of the fixture, limited to 500 units. The lampheads, printed with the anniversary logo on the housing, will be ready for shipment starting in August. Get yours now if you want a collector’s item that is also a key tool for lighting movies.
The M18 is an 1,800 W open face lamphead, combining the Academy Scientific and Engineering Award-winning lens-less optical technology of the ARRIMAX with the innovative True Blue design. The result is an extremely powerful lamphead, as small as a 1,200 W PAR but with a 70% higher light output. The M-Series M18 fixture is adjustable from 20° to 60° without requiring spreader lenses.
For indoor and outdoor use
Since its launch, the M18 has enjoyed great popularity and has become an integral part of almost every film set in the world. Recently, it provided excellent light for the feature film productions “Dunkirk” and “Sauerkrautkoma” as well as for the series “Dark,” “Parfum,” and “The Alienist.” “What I like most about the M18 is its versatility combined with its compact size and incredible performance. The M18’s easy handling and its ability to be used with household plugs make it a good choice for my shoots,” confirms Michael Matuschek, a commercial filmmaker and gaffer.
Even after ten years, no significant changes have been made to the certified system. With the same size as a 1,200 W lamp with parabolic reflector, the M18 provides around 70 percent more light output while maintaining the same color quality. The M-Series M18 fixture is adjustable from 20° to 60° and, like all M-Series luminaires, does not require any drop-in lenses. Operation is possible without a generator almost anywhere in the world using household plugs. The weatherproof, easy-to-maintain M18 meets IP23 requirements and is therefore ideally suited for outdoor use.
Today the ARRI M-Series consists of the ARRIMAX, M90, M40, M18, and M8. For all its daylight systems, ARRI offers an extended warranty of five years.
The post ARRI M18 lamphead: limited edition to celebrate the tenth anniversary appeared first on ProVideo Coalition.
The Ronin-S was a hit, but now we have the Ronin-SC catering to the growing mirrorless camera market.
DJI packed a lot of features into the original Ronin-S last summer, but one of the few quibbles we had with the original design was that in some ways it felt too big, especially the battery.
While 12 hours of battery life was great, we would have happily bought a smaller single-handed battery grip that gave us only 6 hours of life in exchange for the weight savings.
Well, DJI has just announced the Ronin-SC, which has that smaller battery grip we were hoping for, but it’s actually smaller all over, with a reported 30% size reduction and a 40% weight reduction, making it an exceptionally appealing stabilizer.
Aurora Aperture has introduced Adapter Mount Format (AMF), a filter system designed for mirrorless mount adapters. Unlike filters that are positioned over a lens, the AMF filters are small and rectangular with a design that drops into mount adapters, positioning the filter behind the lens. The AMF filter product line includes the company’s PowerUV, PowerND, PowerGND, and PowerDusk filters.
Aurora Aperture presents its drop-in filters as having multiple benefits over traditional filters, which are mounted on the front of the lens. By positioning the filter in the mount adapter, a single set of filters can be used with each adapter rather than needing the same filter in different thread sizes for each lens. As well, the AMF filters are smaller and therefore easier to carry, and they cost less.
Any lens that supports any of the four mirrorless mount adapters can be used with the AMF drop-in filters, including ultra-wide-angle lenses that don’t feature a front filter thread.
With the exception of the PowerDusk filter (which is based on neodymium glass), the AMF filters are made from Schott B270 glass. The drop-in filters also feature multi-layer nano-coatings for light reduction, as well as a PFPE-based nano-coating for easily removing dust, oil, water, and dirt. Compared to the 6000 series aluminum alloy used in competing lens filters, Aurora says the 7051 aluminum alloy used on the drop-in filters is aerospace-grade and twice as strong.
Aurora Aperture is funding the AMF drop-in filters on Kickstarter, where it is offering a single filter for pledges of at least $44 USD, two filters for $86 USD, all the way up to 10 filters for $348 USD or a full kit for $372 USD.
Disclaimer: Remember to do your research with any crowdfunding project. DPReview does its best to share only the projects that look legitimate and come from reliable creators, but as with any crowdfunded campaign, there’s always the risk of the product or service never coming to fruition.
If you’ve ever wondered what plant is in the foreground of your last magical landscape image, there are a few apps out there that can help. Those apps can also keep you from trampling rare and endangered species and habitat to get that epic Instagram shot, and along the way you’ll end up learning a few Latin names… maybe.