ART OF THE CUT with Walter Murch, ACE on 7 of his films

Walter Murch, ACE, has been editing for more than six decades. His IMDb page lists 66 films, and he has won more Oscars, Emmys, Eddies, and BAFTA awards than I can mention.

I sat in a screening room in San Francisco with Walter before watching the documentary he edited, Coup 53. In two previously published interviews, we discussed that film, along with follow-up questions about his book, “In the Blink of an Eye,” and his book with Michael Ondaatje, “The Conversations.” In the coming weeks, I’ll post an interview in which the questions were asked by members of the Blue Collar Post Collective.

This interview is about seven of his films, which we discussed after he gave me “homework” to watch five specific films. The films we discuss here include The Conversation, Julia, Particle Fever, Romeo is Bleeding, The Talented Mr. Ripley, Tomorrowland, and The Godfather. The podcast of this interview was released in mid-February. This interview has some minor edits for clarification that make it slightly different from that podcast.

(This interview was transcribed with SpeedScriber. Thanks to Martin Baker at Digital Heaven)

This interview is available as a podcast.

Walter Murch, ACE with Steve Hullfish
Walter Murch, ACE, and Steve Hullfish. Photo taken the evening of this interview in San Francisco, CA.

HULLFISH: Apocalypse Now had a LOT of footage, but compared to most film-acquired projects, digitally-acquired projects have generally meant a LOT more footage. Does all that footage make it harder or easier for the editor?

MURCH: It’s easier for the director, largely. And it’s somewhat easier for the script supervisor, but it’s harder for the editor. I should just qualify by saying that the way Francis (Ford Coppola) shoots certain scenes — and The Conversation is one of them — is, he makes it as if it’s a documentary. So people have identities, they have dialogue. Sometimes they’re allowed to improvise. The scene is set and he just launches the actors into it and covers it with multiple cameras. And at the end of a take, he decides — this is all before video tap (in other words, no monitor) — so he has to kind of intuit what’s going on. He decides, “OK, I’ll do another take” or “I’ll move the cameras into a secondary position and ask them to do it again.” And this generates, as you can imagine, a huge amount of material.

So when I received the dailies on The Conversation from “the conversation” — which is to say the people walking around the square at the beginning — it was a little overwhelming for me. This was the first feature film that I had picture-edited. I don’t know what I had expected, but I wasn’t expecting a fusion of documentary and fiction.

So I had to do a grid map of all of the dialog and all of the different cameras on X-Y coordinates — say this line of dialog is covered on this camera and this line of dialog isn’t — and then I had a grading system: “it’s covered and it’s very good” or “it’s covered and it’s a little sketchy.” So it was kind of like a Sudoku game.

But it was very valuable because the only films I had edited before that were documentaries, so I kind of flipped my mind into that mode. Francis did the same thing earlier on the wedding scene of The Godfather, which was simply a wedding, and everyone had their identities and multiple cameras rolled. And the same thing on Apocalypse Now with the “Ride of the Valkyries” helicopter attack, which was much, much more daunting physically because the cameras were mounted in helicopters and you had much less control, so you would shoot ten minutes — which is all the camera could hold at that time — and then go back to base camp, and decide where and whether to move the cameras. So early on I got inoculated to a lot of footage for a scene, and that kept me in good shape editorially. I was challenged early and adapted my techniques to deal with that.

Because of the cost of film in those days, you would have two streams: there would be “print” and what would be called “b-negative,” meaning, “Don’t print this take, but hold onto it and maybe we’ll use it, but only if we get into trouble putting the scene together with the printed takes.” The ratio would probably be: out of seven takes, maybe three would be printed.

Gene Hackman from The Conversation

There’s a rule of thumb regarding that, which is that the next-to-last take is usually the best take, even though the director prints the last take. That’s due to the psychological interaction of actors and directors where the director is fishing for something and he sees the progress of the actors. If it’s a scene with multiple actors, he sees that it’s getting good; no, no, the fourth take wasn’t so good; now the fifth take is good. Sixth take is even better. Let’s do another take. And if the seventh take is not so good, what the director finds impossible to say is, “Okay. Print take six, but don’t print take seven.” Because the actors will say, “What was the matter with the last one?” So, you naturally print take seven and you say, “Great, let’s move on! Next setup.” And either the director will imply or leave it unstated that the sixth take — which is where they peaked — is the best one.

I told this story to (director) Fred Zinnemann when I was working on Julia. He was 70 years old at the time and was very traditional; he shot a low amount of footage. Francis, in The Conversation and other films, would keep going if the actors went off script, and sometimes shoot or print 15 or 20 takes. His justification was: first the actors are good, then they get confused, and if we keep going, out of boredom they finally come up with something unique, and I want to get that unique thing.

So with digital, none of that really applies. There are all kinds of different methodologies that different directors use, but in my experience, almost everything gets printed. You have to cope with everything. They may indicate different streams — that was the good take and this is the not-so-good take — but they all get “printed,” and the editor has to deal with them, whereas on film those takes would just not get printed, and so you would have a clean deck, as far as that goes.

slate from Francis Ford Coppola’s The Godfather.

But the unique thing about digital is the idea of resets within a take. So the director calls “action,” you’re going along and then at a certain point the director will say, “Okay, keep rolling but go back two lines and start again from there and make it a little angrier.” Then you go forward to line 10 and then he’d do the same thing. You go back to line eight and then move forward. So it’s kind of like cross-country skiing across the scene.

HULLFISH: Great analogy.

MURCH: The most resets that I’ve coped with was in the mid-thirties: 32 different resets. And that’s difficult to keep track of. I think there’s technology now that copes with that, but this was five or six years ago.

HULLFISH: The Conversation had a very conservative use of close-ups, I thought. Is that something that you feel is true of your editing style in general?

MURCH: Yeah, I think so. It went through different kinds of geological epochs in the shooting. The conversation itself (the conversation that starts the film) is shot documentary-style. Nobody really knew what the cameras were going to get. What Francis was after were serendipitous moments between the actors, and between the actors and the background people. The background people were not actors; they were just ordinary people. They weren’t even extras. The actors we were following were in the world of reality, which added this wildcard aspect to it all.

Gene Hackman in The Conversation.

Then the section where Gene Hackman is analyzing the tapes is very two-dimensional. It’s just a medium close-up of him. And then looking at a kind of flat image of the tape recorder and the VU meters and the knobs.

And then there is the three-dimensional space of the party where Gene Hackman invites everyone back to his laboratory and they have a little party and that’s staged in a deep space.

It’s not abnormally free of close-ups, but it certainly isn’t over close.

HULLFISH: Some inexperienced editors I’ve seen tend to go to a close-up right away, or they use close-ups a lot. That overuse removes their power.

MURCH: Exactly.

HULLFISH: So I thought in your case, you were very careful about where you chose to put those close-ups.

MURCH: Right. When I was editing Julia, my assistant editor had worked with Fred Zinnemann before, but it was the first time I had worked with him and my assistant made a point — whispering to me — “Don’t go to close-ups too soon, because Mr. Z — which is what everyone called him — hates that,” for exactly the reason you pointed out — that the close up is kind of the nuclear button on the scene. When you go to a close-up, you want it to mean something.

From the 1977 movie, Julia.

And of course, that makes it difficult. If you have matching problems in a wider shot, then where do you go? I remember the scene in Julia where they’re at Albert’s, a cafe; it’s a scene between Jane Fonda and Vanessa Redgrave. This was a real cafe in Strasbourg. After dailies, the script supervisor said, “It’s a nightmare of mismatching. You’re gonna have your hands full with this one.” Their hands were not always in the same position when they said a line, and I wanted to hold off using the big close-ups until the apex moments of the scene, so I had to choose my moments carefully. But I found a way to cope with it.

Films are full of mismatches. It’s just like any magic trick: it’s how visible those moments are or are NOT for the audience.

HULLFISH: You mentioned Fred’s desire not to go to close-up too early. In the book “The Conversations,” you talked about your preference to cut before action begins, not mid-action. Do you find that certain directors tell you, “I love cutting mid-action,” and then you have to change? Or do you have to explain that you prefer not to, and then debate why?

MURCH: The only time that’s happened was with Brad Bird on Tomorrowland. Otherwise, nobody’s ever mentioned it to me. With Brad, I think it was because he’s an animator, and animators by nature think in shots and in how you join one shot to the other.

Almost naturally, they think of ending the shot with an action with which they will join up to the next shot. So the fact that I didn’t do that got his attention and he wanted always to have matching action. I tried to explain sometimes what I was after, but that didn’t really interest him.

HULLFISH: And we’re in the job of servicing our director, right?

MURCH: My rule of thumb — especially if it’s an idea that you really believe is important for the film — is to say your idea. What you’re being paid for is to contribute ideas, not just sit there and “tell me what to do.”

If the director doesn’t like it and you still really think it’s important, wait a while and then find a good opportunity to once again say, “You know, that thing we were talking about last week? Now that we’ve…” And if the director says, “no,” you say, “OK.”

If it’s REALLY important to you, try it a third time. If the director still says, “I don’t want to do that,” then shut up, because you don’t want to become a pest. On the other hand, if something is really important, find a diplomatic way to get the idea across. And everyone has different methodologies for that.

HULLFISH: Let’s talk about Particle Fever. I loved this documentary. I loved the cut as Nima finishes erasing a chalkboard, then there’s a big slash drawn on the board, and that motivates the cut to the next shot.

I thought of your theory from “In the Blink of an Eye” that the shot has a life and after a certain moment, that life is over-ripe and it’s time to move on.

MURCH: The closest analogy is what happens in music. That gesture was kind of like the blare of a trombone or something. What you generally want to do is cut just before the moment overstays its welcome. If somebody is moving towards a door and about to go through the door, unless you’re making a point of the door itself, you don’t let them go through and then cut. But you don’t want to cut too soon either, so you have to feel the moment where it is inevitable that the person will go through the door and not do a Columbo — which is to stop and turn and say something else.

It’s kind of like the moment when a plane is taking off. The moment where the wheels are up. You’ve run out of runway and you have to take off. And that’s the moment to cut.

HULLFISH: To go back to The Conversation: I don’t know whether this is something you remember specifically, but I thought there were numerous times when there were empty frames at either the beginning or the end of a shot — frames absent of people, even though the shot wasn’t over.

MURCH: That was part of the aesthetic of the film. Francis wanted this unsettling aspect — sort of the DNA of video surveillance — to infect the visual style of the film, as if there wasn’t somebody behind the camera but rather a motion detector attached to a servo motor: as long as the actor is moving in the shot, you hold. If the actor leaves, you wait, wait… It’s been programmed, and if he doesn’t come back into the shot, then it goes looking for him. He exited left, so it will pan left, very robotically. Those pans look as if they don’t have a human behind them. They do, of course, but the operator was told to make it look artificial.

You see that throughout the film, particularly at the beginning when Harry’s in his apartment.

Gene Hackman in The Conversation.

HULLFISH: Back to Particle Fever again. There are lots of pre-laps. What does a pre-lap get you? (A pre-lap starts the audio from the next scene while you’re still on the previous scene, or vice versa.)

MURCH: It’s a way of blowing a little smoke across the moment of the cut in that the moment that the new sound enters in, you’re asking the audience, “Can you make sense of this?” And so you alert the audience to a new development.

Again, you have to judge the right moment. Then, at the right moment, you cut to the reality that is producing that sound, whatever it is. It’s the opposite of what in screenplays is called a smash cut. With a smash cut you just want to hit (he smacks his fist into his palm) with full force at the moment of the cut. Whereas a pre-lap is a way of bleeding some of the DNA of the incoming shot into the outgoing shot. And so you’re sometimes making a poetic simile: what is this person thinking? You end on a close-up of a man, you’re looking at his face, and you hear a woman say, “What did he mean?” Cut. And now it’s the woman, and she says, “I told you to go to the store BEFORE dinner,” or something. But, by implication, “What did he mean?” is something that the character in the outgoing shot is thinking or imaginatively hearing.

HULLFISH: There’s a series of jump cuts in the documentary of people reloading browsers on their Macs to watch the collisions. Do you remember building that sequence? It’s all these people all over the world, waiting for the Large Hadron Collider to run for the first time, and you did this great little montage of jump cuts between them.

MURCH: This was an unscripted documentary. It was shot over a six- or seven-year period as the Large Hadron Collider was being built. And we had access not only to the footage that Mark Levinson directed but to all of the archival footage that CERN had collected in all the years of its existence since the late 1940s. So there were about five hundred hours of material that we had access to.

At a certain point in the evolution of a documentary like Particle Fever or like Coup 53, the decisions that you make editorially with the director are, I would say, almost identical with what you were dealing with in fiction film.

The film is perhaps still too long, but you’re trying to compress it, to tell the story in the clearest and most emotionally engaging way you can in the shortest amount of time, and you will make discoveries about how to do that.

Getting to that point, though, is very different, because with a fiction film that is shot with multiple takes, you obviously have a script, and editorially you have an abundance of interpretation and a paucity of events, meaning the only events are what’s in the screenplay.

The screenplay may have — let’s say — 125 scenes. You don’t have another 125 scenes that you can go to. What you have is what’s in the screenplay, but you have many different interpretations. Some of them with small differences, others with large differences in performance. And for a scene, the editor might have 50 different line readings of a line of dialog from seven or eight different camera positions. How do you thread that needle? That’s the challenge.

Coup 53 Director Taghi Amirani with Walter Murch, ACE in Murch’s editing room.

If that line of dialog is said with THAT attitude, what’s the best interpretation of the NEXT line of dialog? And should we be on the person who is speaking or should we be on somebody listening? All of those kinds of questions.

Whereas the opposite happens in a documentary: you have a paucity of interpretation. Things generally only happen once. But you have a multiplicity of events, because you could potentially have five or six hundred scenes for a documentary, squeezed out of five hundred hours of material.

And that’s the question. What are the building blocks of the film? So you are writing the story, basically deciding those questions. And then because you only have one line reading of that line of dialog, you have to find a way to make that particular line reading — in the context of the larger film — work at the optimum value that it can achieve. And you try to minimize what seem like mistakes and maximize the potential of everything else.

HULLFISH: Another scene that I loved in Particle Fever was APPARENTLY the opposite of your lesson that you only take a scene so far, and once it has outlived its usefulness, you move on.

There’s a great scene that ends with a guy missing his exit. Do you remember? Obviously it’s a very funny moment, and a great place to end. You’d think, “Oh, we don’t need to have the guy miss his exit; it’s a mistake,” but it does reveal something.

MURCH: Yeah. He misses his exit while he’s talking, and we stay on him because it’s a character moment. It’s the classic absent-minded professor. Here’s somebody who’s so excited about finally getting to see the Higgs boson announcement that he misses his exit.

It was real. It wasn’t concocted. The camera person on that shot was David’s wife. She was just there and it really happened.

HULLFISH: That’s very funny.

MURCH: So those are the nice, spontaneous moments that you want to make the most of. To get the best character revelation. It’s a light moment at a kind of heavy period of the film where it’s finally coming to fruition.

HULLFISH: That’s an interesting thing to talk a little bit about: trying to regulate the tone of a documentary so that you aren’t at 11 or at zero or sad or happy too long. Talk to me about your sense of that as you’re a little further into cutting a documentary film — feeling that.

MURCH: These are sort of what you might call the peristaltic motions of digestion of the audience. You are, in a sense, almost force-feeding the audience because they can’t control the speed of the film. And if you feed people too fast, they feel like the stuff is being rammed down their throat and they’re going to choke.

Walter Murch, ACE editing
Walter Murch in the Coup 53 editing room. Photo by Taghi Amirani.

And it’s at that point that they either check out of the film or actually get up and leave. So you need to have these moments where the digestive tract kind of relaxes for a moment and you can have a moment to digest what you’ve been fed. And humor, such as it is, is a very good digestive agent.

It indicates to the audience that the filmmakers are kind of on their side without being overt about it. It’s saying, “We know you need a break right now. And so we’re gonna give this to you.”

You’ll see moments of that in Coup 53 — which is also a heavy film because of the subject matter. But there are light moments within it. And we knew that from the beginning. When we did the card structure for the film, we had a purple card that just had the word “grace” on it, like a grace note in music, and we would put these cards in, almost arbitrarily, at speculative moments.

Not all of them wound up at exactly that place, but it was a reminder to us — Taghi and me — that this was important. Otherwise, the film gets too dense.

HULLFISH: And what ended up being used for those “grace” card moments?

MURCH: Moments where there’s just no dialog, only thoughtfulness. Something very heavy has happened, and we’re just looking at somebody’s close-up. We just stay there. Things like that.

HULLFISH: I’ve talked to other editors about the use of shoe leather — which, of course, is a derogatory term for us editors — but sometimes it’s that shoe leather that’s that moment of grace.

sea anemone, courtesy Adobe Stock.

MURCH: Right. Exactly. If you look at a sea anemone, it opens up and waits for something to float by. A little fish comes by, it grabs the fish, and then it closes and digests the fish. When it’s done digesting, it opens up again. The trick with a film is that the logic component and the emotional component are sometimes on different tracks, and they have different moments of digestion.

Sometimes you can digest an emotion while you’re receiving information. The danger is trying to load up both of them at the same time. So you have to kind of alternate how you feed the information.

HULLFISH: Let’s talk about Romeo is Bleeding. You asked me to watch that. Out of your considerable filmography, why that one?

MURCH: It’s a good example of when an editor comes in to help a film in trouble in post-production. There was a traffic jam of conflict between the director, the producer, and the author of that film, and as usually happens, the editor got caught in the middle. So the studio shut down the film, looking for a new editor to come in and calm things down. I was at loose ends at the time and intrigued by the script by Hilary Henkin, who wrote Wag the Dog. Gary Oldman – a great actor – played the lead, Jack Grimaldi. And it was beautifully shot by Dariusz Wolski (it was his first credit as DP – he later shot Pirates of the Caribbean). The music was composed by Mark Isham.

Also, I really liked the films of the director, Peter Medak (The Ruling Class, Let Him Have It, The Krays) though I hadn’t worked with him before. So when I took over the editing of Romeo, I had to identify what the trouble was and then find a solution for it and calm everything down.

The problem seemed mainly to concern a disagreement about the character of the female lead, Mona, played by Swedish actress Lena Olin. Lena, whom I had worked with on The Unbearable Lightness of Being a few years earlier, was cast as a last-minute replacement in the role of Mona Demarco, a Mafia assassin. The part required an Italian-American accent, which Lena attempted but just could not master. My suggestion, which was accepted, was to make Mona into a Russian mobster – think Little Odessa – and as a result, she became Mona Demarková. Her accent – a crazy blend of Swedish trying to be Italian-American – turned out to fit this new character perfectly.

Unfortunately, Romeo was not a success at the time, but it has its adherents now, more than 25 years later. Lena, in particular, has been singled out for her powerful and amazingly athletic, menacing performance.

HULLFISH: In The Talented Mr. Ripley, talk to me about the dynamics of the climactic murder scene on the boat.

MURCH: It was a difficult scene to shoot: it took place in a rowboat out in the ocean and there were weather problems, the sun on one day, no sun on the other.

And the fatal hit itself is a complicated thing to pull off: Dickie’s head is bashed in with an oar. Makeup and VFX and everything. We tried to make everything as intense as possible.

This is where Ripley (Matt Damon) reveals to Dickie (Jude Law) that he wants to live with him. This is crazy from Dickie’s point of view. The murder itself comes out of the rage and anger and disgust that Dickie feels, with Ripley reacting violently to it. So you just try to make it as intense as possible, physically and psychologically.

But then you want to — this is one of those grace moments at the end — you want to find a way to relax after the murder and find some sort of grisly poetry in the aftermath of the violence. As horrible as that is, it can be horrible and transcendent at the same time.

HULLFISH: So the anemone releases?

MURCH: Right.

HULLFISH: Structurally, that scene happens almost perfectly in the middle of the movie. Do you remember trying to have it land like that, or was the structure already set?

MURCH: That was how it was in the screenplay. It varied maybe five percent one way or the other, depending on the overall structure.

Anthony’s first assemblies were always long (Director Anthony Minghella). The first assembly of The Talented Mr. Ripley was four and a half hours. So we had to find a way to cut at least two hours out of the film to get it down to a releasable length.

The fact that it is right in the middle and the fact that Jude is so charismatic is also a problem for the film because: “Wait a minute! I loved looking at that guy and now he’s dead? And we’re left with Ripley?” Matt’s a wonderful actor, but Ripley is a kind of creepy guy. So the whole film — like The Conversation — is told strictly from Ripley’s point of view. It’s a single point of view film, which is a technique that both Francis (Ford Coppola) and Anthony used to compel the audience to identify with a character that they would normally tend to resist.

If you’re making a film about somebody who is not the normal kind of hero, then giving the film a single point of view is a good strategy. In a sense, you’re forcing the audience into a Stockholm syndrome. The audience has no alternative. There’s no moment where two of the other characters go off and discuss Ripley or discuss Harry Caul: “What is it with that guy?” That just doesn’t happen. There’s no relief from the single point of view. Everything you’re looking at is either Ripley or something that Ripley is looking at.

HULLFISH: As if you’re stuck on a desert island with one other person…

MURCH: Right.

So after Jude dies, not only does the film still stick with the single point of view, but now Dickie (charismatic Jude Law) is gone. The film recovers, but it takes a while. It is a shifting of gears where the audience has to deal with the sudden loss of Dickie.

It’s what Hitchcock did in Psycho, where Janet Leigh is killed 40 minutes into the film. Her death is not exactly at the halfway point, but almost; Psycho is about 100 minutes long, so it’s a similar kind of thing.

HULLFISH: Talk about the cut — if you can remember — from when Peter Smith-Kingsley was describing Tom Ripley’s characteristics to Tom, and then it cuts to Ripley walking back into the stateroom as Peter’s audio continues.

MURCH: The scene — as shot — went all the way through the killing of Peter. And after Peter is strangled, the shot tilted up to the porthole window, then went out the window. For various reasons, that did not work. I mean, it was shot perfectly well, but emotionally, it was hard for people to deal with.

So using that flash-forward to Tom entering his cabin, while keeping the dialog in the past, was something that I came up with one night to try to help that moment. We see the two guys together, Ripley dealing with some very complex emotions. He’s distressed, but he’s trying with his voice to be happy. And we’re looking at his face. At the same time, he’s tightening a cord around his hand, which is ominous.

And then you cut to a doorway, Ripley comes into his next-door cabin, and we’re still hearing the dialog with Peter. And then the audience realizes what has happened. So it’s a post-lap. And it’s devastating, but not brutally so because you don’t actually see the murder itself. It takes the audience a moment to catch up with events: “What the hell just happened?” And that slow realization both softens and in some strange way intensifies the moment because now — on the other side of that cut — the audience realizes that Ripley just killed the one person in his life who seemed to understand and love him for who he was. All because of this fluke of meeting Meredith (Cate Blanchett), he had to kill Peter.

So it was a very complex thing.

HULLFISH: There’s so much effort and so much talent in writing a script, as you know — so much development process — and yet all the answers don’t come with that script.

MURCH: And that’s one of the great mysteries of writing a good screenplay: it has to command the attention and emotions of the reader and be logical within a certain framework, yet you can’t quite predict what’s going to happen moment by moment. But when it happens, it seems to be the right thing. Déjà vu, in a sense.

And yet this same screenplay has to somehow — in its DNA — acknowledge the fact that it doesn’t have all the answers to all of the problems that might happen during shooting, but it has to provide the raw material for answers to those unknown problems, at a whole different level of creativity – not with words on paper, but with images and sound in time.

A really great screenplay has to be almost indifferent to what happens to it. It’s like those performers who make animals out of twisted balloons. They can make a giraffe, and then suddenly something twisty happens and the giraffe becomes an elephant. But it’s still the same balloon.

And in a sense the screenplay is the “balloon” that is able to manifest itself as a giraffe or an elephant, depending on necessity, and not care. The danger — in a paradoxical way — is a screenplay that is too perfect, which HAS to stay exactly the way it is to be perfect. And the cinematic world just isn’t that way, most of the time. It frequently requires this ability to transform itself. Which is another kind of danger!

Every film is a mixture of different strengths and weaknesses. Nobody expected Jude Law to be as charismatic as he was. We knew he was a good actor, but we just didn’t know what was really going to happen. It turned out very different from the first Ripley film (Purple Noon, 1960) with Alain Delon.


In that film, Dickie is not especially attractive. It’s Alain Delon, as Ripley, who is the beautiful one. In a sense, what Anthony did in casting is flip that relationship. And so The Talented Mr. Ripley has to cope with that twist.

HULLFISH: Most shots are relatively short, but occasionally, an editor uses a much longer shot — 60 seconds or 90 seconds. Does the potential of that edit point keep growing at every moment until you make that edit?

MURCH: Yes. Absolutely.

HULLFISH: Is it more dangerous to have to walk that tight wire?

MURCH: I like to say that the editor is making 24 decisions a second, whether to cut or not: “no, no, no, no, no, no, no, no, no, no, no, no, no. YES.”

If the shot is going to be long it has to be well-written and well-acted and have a complex evolution within the shot to allow it to sustain itself for a minute or more as a single shot. It will have its own internal dynamics built into it.

If the director is going to do that, he or she must put a lot of horsepower into that decision. Under those circumstances, it’s not up to the editor to say, “I want to cut this shot at the midway point” unless something disastrous happened in the shooting, and you have to then figure out what you’re going to do.

Every good scene has its own spine, and each of the moments internal to the scene is like a vertebra and you’re just watching it go from vertebra to vertebra and then toward the coccyx — toward the end — you do have to make a decision. “Shall we let it go all the way? Or can we cut a little sooner than was originally intended?” This is something you’ll only find out when you put the whole film together and see that undigested single shot and how it works within the context of the whole film.

HULLFISH: In “In the Blink of an Eye” you described a shot as having branches. You can’t really cut along the limb, but you can cut anywhere that there is a branch that shoots off.

MURCH: Yes, that would definitely apply – what I call “nodal” editing. Just like the branch of a tree has a linear grain, and then suddenly a ‘knotted’ grain where a new branch emerges. But a master shot that lasts for a long time is something that must have its own unique internal quality that the director and the actors have developed through staging and rehearsals. It’s almost a tree unto itself. And usually, there’s not a lot of extra coverage because the filmmakers have committed to this decision. And so there may not be anything an editor can cut to.

HULLFISH: Just in general, what makes you want to play a scene on a reaction shot — specifically quite a bit of a scene on a reaction?

MURCH: The quality of the acting: the thoughts that are visible moving across the character’s face. You’ll begin to feel this when you watch dailies.

It also will depend on the characters having their own arc within the film. Is this a moment when that character realizes something about his relationship with the other characters, something that’s going to change everything? That would be a reason to cut to that character in a reaction shot.

HULLFISH: I’ve talked to editors who discuss reaction shots in terms of empathy: they want to see how a character feels about a question or a statement, which can be a reason, right?

MURCH: Right. Sure.

HULLFISH: But you’re putting it in the context of story — “Does that person have a reason later on in the story that you need to know this moment?” Or acting performance — “This is such a great performance I want to be on him because he’s so much stronger than this other person.”

MURCH: Imagine the situation not in a film, but in life. You’re sitting with two other people and they’re having a conversation. The first person will talk. Then the second person. A, B, A, B. But you will not just look at the person who’s talking, and only when they stop talking, look to B.

In “In The Blink of an Eye” that’s what I called the “Dragnet” style of dialogue cutting. Where we’re always on the person talking, and we cut to the other person only when he has his lines.

But pay attention to what happens in the real world: it’s a much more fluid thing. As soon as you ‘get’ the idea of what the first person is saying, you will look at the other person to gauge their reaction. So that would produce a post-lap cut – the first person is still talking when you cut to the reaction shot of person B, the listener. And then you might stay on person B while person A starts talking, which would be a pre-lap cut.

HULLFISH: Sound is also a specialty of yours. In The Godfather, there’s a great scene with sound design where Michael goes to an Italian restaurant to avenge his father’s death. There’s great use of a train in the distance when Michael kills Sollozzo and McCluskey.

MURCH: It’s an elevated train. The script was not written with that in mind, but Francis wanted to hold any music off until after the murder. Interestingly, that moment after the murder was where the film was originally going to have an intermission.

Godfather is the first long film not to have an intermission, but when we were still working on it, in 1972, the plan was to have an intermission as soon as Michael runs out of the restaurant. But Bob Evans, the head of Paramount at the time, nixed the idea: “We don’t want to let the audience off the hook!” A good decision. His other good decision was to allow the film to be almost three hours long, overriding the contractual obligation to have it be no longer than two hours and twenty minutes.

So Francis and Nino (composer Nino Rota) had designed a big operatic moment there, and Francis didn’t want to dilute it by having music underneath the preceding restaurant scene. And yet, it’s a fairly long scene and almost half of it is in Italian with no subtitles, which is a risky thing to do, because it throws the audience back to watching body language and voice tone to figure out what’s going on — unless they understand Italian.

So it seemed to need something to underpin Michael’s emotions during the scene. And that’s when I had the idea of using an elevated train sound, as a kind of mechanical string section. I grew up in New York, not far from where that restaurant was supposed to be. And I knew that part of the Bronx was a rat’s nest of many lines of the elevated train system. The off-screen train sound comes and goes perhaps five times during the scene, getting louder each time. Finally, just before Michael stands up and pulls the trigger, the sound screeches with a metallic braking sound. Very Bernard Herrmann.

HULLFISH: And it’s great emotionally. It’s like the teakettle in so many scenes in movies, where, as the tension goes up, the squeal of the kettle intensifies.

MURCH: Yes, exactly.

Art of the Cut: Conversations with Film and TV Editors

The final interview with Murch will be coming in a few weeks. It is a series of questions from the curious minds at the Blue Collar Post Collective Facebook group. Stay tuned!

To read more interviews in the Art of the Cut series, check out THIS LINK and follow me on Twitter @stevehullfish or on imdb.

The first 50 interviews in the series provided the material for the book, “Art of the Cut: Conversations with Film and TV Editors.” This is a unique book that breaks down interviews with many of the world’s best editors and organizes them into a virtual roundtable discussion centering on the topics editors care about. It is a powerful tool for experienced and aspiring editors alike. Cinemontage and CinemaEditor magazine both gave it rave reviews. No other book provides the breadth of opinion and experience. Combined, the editors featured in the book have edited for over 1,000 years on many of the most iconic, critically acclaimed and biggest box office hits in the history of cinema.

Ideas for editors and post-production people while in isolation or quarantine

It may seem like a no-brainer to learn something new while you’re isolated in downtime. NAB has been canceled. Your local restaurants and businesses may or may not be shuttered. Your downtime may be a self-imposed (or government-imposed 🤨) quarantine, but it might also be because your jobs were canceled or your place of employment has temporarily shut down. What seems like an eternity of free downtime can easily get taken up by endless binge-watching of any number of multi-season television shows, but I’m going to suggest a few things that might be more productive. Some of these are my own ideas, some I’ve overheard and some come to us via Twitter.

With that, here are some ideas for editors, students, artists and post-production people of all kinds while in isolation or quarantine.

Learn something new for your post-production career

This is an obvious and easy one, but so often we don’t sharpen our skills in the new areas we know we need. It’s hard to come home from a long day of editing and sit in front of the screen for another few hours. Our brains and eyes and fingers need a rest. Take this time to do some of those tutorials you’ve bookmarked and read some of those articles you’ve saved.

While I hesitate to recommend YouTube tutorials, as so many of them are full of bad advice, there is much that can be learned there. I honestly think YouTube is better for learning how to fix your toilet than how to properly use Adobe Premiere. Your public library might have Lynda / LinkedIn Learning access for free if you have a library card. EditStock is offering 30% discounts, so if you’re in need of new or different kinds of footage to edit, check them out.

Make use of those free 30-day learning trials

While we’re all used to (most) software having a free trial period, many learning platforms also have free trials. I’ve linked to some of those below. If you sign up and get something really good out of them consider staying on once the trial is over. Unlike the free YouTube world, these sites have staff, marketing and infrastructure costs that they must pay to stay alive.

Lynda / LinkedIn Learning and their free 30-day trial but check your library for free access!

Pluralsight has a 10-day free trial and offers a lot of different skills outside of the creative space

Mixing Light has a 7-day free trial and there is no better place to learn color grading

Some training sites don’t have quite the same model as the ones above, like fxphd or Ripple Training so it’s worth browsing their offerings and signing up if there is something you wish to learn.

Our sister site is full of over 3,000 filmmaking videos that are all free, all the time, so you can always start there.

Just go ahead and download DaVinci Resolve

If you haven’t yet given Resolve a play, you know you want to, as God knows you’ve heard enough about it over the last few years. It’s free, it’s available on both Mac and PC, and it’s an incredible tool that can do pretty much everything you need to do in post-production. I’ll even give you the download link right here. I don’t think Resolve is going to take over all of post-production, as there are still things some of the other NLEs do better and, damn it, it’s okay to just prefer the way one NLE works over another. But Resolve is so full-featured it can be a great free tool to supplement many parts of the post-production process. And you know you’ve been wondering about it … and it’s free.

Blackmagic has produced an incredible array of training materials for Resolve including video and PDFs ready to go. There’s not really any indication they are going to slow down.

For the Media Composer and Premiere Pro editor

Apple offers a free 30-day trial of Final Cut Pro X. If you’re on a Mac, download it, install it and give it a play with an open mind. Leave your preconceptions at the door and don’t listen to what others have said if you haven’t kicked the tires yourself. It doesn’t work like Media Composer or Premiere, so don’t try to treat it that way. Invest in some good training that will teach you how to use it properly and you might find you don’t hate it. After you invest a bit of time in FCPX, you might indeed determine you don’t like how it operates. That’s okay, but hopefully you’ll come out with an understanding of why others do like (even ❤️💕) it, and you can respect them for it.

For the Final Cut Pro X editor

If you’ve never taken the time to download (Mac App Store link) and take a look at Apple’s Motion motion graphics application, then you should do so. At $50, I’ll make the proclamation that, dollar for dollar, Motion is the best bang for the buck you can spend in post-production. You might be happy with all of the effects plug-ins and templates you can already get for FCPX, but Motion will allow you to create your own should you so choose.

Better than that, you can use Motion for all kinds of motion graphics and effects work, in a more traditional motion graphics keyframing interface rather than FCPX’s horrid, frustrating keyframe interface. You’ll be happy with the $50 spent and if, at some point in the future, Apple adds a “send to Motion” option back to FCPX, you’ll be ahead of the curve.

Avid’s FREE Firsts

Today’s massive media creation atmosphere means many editors and media creation professionals will never come across Avid’s Media Composer or Pro Tools. They are industry standards for video editing and audio editing, respectively. All that TV you’re going to binge-watch was most likely edited and mixed on these two Avid products.

Avid offers full-featured, free versions of both Media Composer and Pro Tools. These “First” products are the full application, just limited in things like the number of tracks you can use in a timeline and what you might be able to output. The benefit of this approach is that if you learn the First version very well, you can easily step into the full version, as it’s the same piece of software. I’m also a firm believer in, at the very least, knowing what others are often arguing about so you can talk intelligently on the subject.

Download Adobe Audition

Ok, video editors working in the Adobe ecosystem … you know you’ve seen it in your Creative Cloud app and probably never hit the install button for Adobe Audition. You should, and a work hiatus is a perfect time to learn how to use a good digital audio workstation. Adobe Audition is incredibly powerful and one of the most underrated post-production tools out there (kind of like Apple Motion).

Audio is crucial to a good video experience, and while Premiere has some of the best NLE audio tools out there, Audition can take your audio finishing to another level. Don’t fear the conform process of moving an edit from Premiere to Audition, as there is a command to easily send it over. You can also open an Adobe Premiere timeline right from within Audition without generating any new media.

What you do to fill your time during isolation or quarantine doesn’t have to revolve around post-production software. In fact, it shouldn’t: you should take some time to get away from the screen entirely.

Make a family documentary

While we’re supposed to practice social distancing during this time, it’s okay to be around your family, and even so, if you choose to make a family documentary, you can do so without close contact. I thought about this when I saw this tweet from Philip Grossman.

This is something that I desperately wish I had done with my parents and grandparents before they passed away. Most of us in this business have the gear to do this, and even if you don’t, you’ve got a phone that shoots great audio and video. Bonus points if you learn a new NLE to edit the thing. And remember, your family documentary doesn’t have to be about aging parents. Make a family documentary short about your kids and have them do the same about you. That’s some media literacy learning during a prolonged school closure. Grab iMovie for free on your phone; they can figure it out.

Digitize your family archives

Many of us have a few boxes of old family photo albums, 8mm film, VHS tapes, scrapbooks, audio cassettes and the general family history that accumulates over the years as parents age and pass away. Often those things end up in a box in the attic. Like the family documentary above, these memories are timeless, but the media holding them (photo paper, magnetic tape, regular paper) can break down over time. Or a mouse might get into that box and go to town.

I’ve got three bins of photo albums, photo prints, slides, scrapbooks, diaries and VHS tapes that all need to be digitized. It’s the kind of thing you put off and put off.

While there are numerous services that will digitize these things, and will even send you a postage-paid box to ship them out, if you have time on your hands you can do a lot of it yourself. A flatbed scanner, an iPhone app, or a working projector and a bedsheet means there are tools at your disposal to archive these memories yourself. That is time-consuming, so if you’d rather not do it yourself, take this downtime to use a service like Everpresent or Legacybox to get those archives digitized.

Finally set up your cloud backup service

Once you’ve got a ton of new family archives digitized you’ll want them (and all your other digital life stuff) in a secure off-site location should the worst happen. If you haven’t yet signed up, set up and uploaded to a cloud backup service like Backblaze, Carbonite or IDrive then this might be the opportunity.

Since most home internet upload speeds are slow, take some time to sort through and determine what you want protected in the cloud. These consumer services aren’t meant for terabytes of client video media but rather personal cloud backups, and they should be treated as such. Most of them archive by hard drive, and if those drives aren’t mounted every 30 days or so (check the service you’re looking to use), they will delete that media. I recommend a dedicated “family” drive and a repeated reminder to mount and check it every so often.
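If you want to automate that reminder, a few lines of shell can do the nagging for you. This is just a sketch: the volume path below is hypothetical, and the 30-day retention window varies by service, so adapt it to your own setup.

```shell
#!/bin/sh
# Sketch: warn if the dedicated "family" archive drive isn't mounted,
# so your cloud backup service doesn't age it out of the backup set.
# The volume path is an assumption -- substitute your own drive's path.
DRIVE="/Volumes/FamilyArchive"

if [ -d "$DRIVE" ]; then
  echo "OK: $DRIVE is mounted and visible to your backup client."
else
  echo "REMINDER: mount $DRIVE before the retention window lapses."
fi
```

Run it by hand, or schedule it (cron on most systems, launchd on macOS) so it nags you once a week.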

Spend some time away from the screen

Glowing screens consume the editor’s life. One, two, three or more screens stare back at you for 8 to 10 hours (or more 😩) every working day of your life. We then look at phone screens throughout the day for text messages, social media, and various work-related apps. Often we browse the iPad or tablet in the morning over coffee, or during lunch or dinner, for more social media, reading the news or playing some games. To wind down we Netflix and chill, Hulu a series or endlessly browse Amazon Prime Video to find the 🏆 amongst the 💩. We spend so much time staring at screens, so use this time to get away from them.

I used to love building models and always wanted to build a model of each cool car I’ve had over the years, many of which I’ve already started. The thought of impending isolation made me dig out the box. Extra points if you can tell me what the green one is (and no, I’ve never had that one).

How about:

  • Read a dead tree paper book that you can hold, smell and turn the pages of.
  • Draw, be it with some sketch paper and a marker assortment, or printer paper with pencils and Sharpies. Just draw like you used to when you were a kid.
  • Paint for creativity. Do you by chance have the oils or watercolors from that college art class? Get them out and use a part of your brain you haven’t used for years. Don’t have any painting supplies? Amazon is still (most likely) delivering and delivering quickly.
  • Paint your house. Not the exterior, but I bet many of us have a room or wall or some touch-up we’ve always wanted to do. I’m sitting here typing, looking at bare wood on an office door. Unless there’s a mandated shutdown on business, your Lowe’s or Home Depot will be open. Better yet, visit the local hardware store, as they might be hurting for business right now.
  • Do you have kids? Play. Play Lego, Hot Wheels, Barbies, tea time, hide and seek, the floor is lava, wrestle, paper airplanes, board games, whatever you can do with them. Go outside in the yard for some 🏈, ⚽️, 🥏 or 🏸. This might be (hopefully) the only pandemic you get to spend with them.
  • Do some yard work if weather permits. You don’t have to actually stay inside this whole time and if you own a home you know there is yard work you want to do.
  • Cook. Yes, the Costco has been overrun and missing 🧻 as well as a lot of food but many of them still have stock. Check the organic section as it’s probably stayed pretty well stocked. I saw some advice on Twitter that your local Asian market is probably full of products without many people shopping there.

These ideas might seem like no-brainers, but I hope it’s helpful to see them written out. Even a quick breeze through them might lead to an idea, help stave off some boredom, or get you out from in front of the TV for a while. I need to heed a lot of the above advice myself.

I also asked this question on Twitter and got some great responses.

Any other good learning links you’d like people to know about (as I know there are tons of them not listed above)? Any software that every post-production professional needs to know about? Is there a game that every family should have and love and play often? What’s a good book that you’ve had on your reading list but haven’t had time to dive into?

Please let us know in the comments.

COVID-19: entertainment workers should not be collateral damage, says IATSE


The presidents of both IATSE and the MPTF aim to help entertainment industry professionals survive what is one of the worst crises for an industry whose workers cannot work from home.

That special film you want to watch or the next episode of your favorite TV series may be delayed because productions are grinding to a halt, but the worst part of the story is that thousands of people are being forced out of work, many without pay. Think about this when you sit comfortably at home in front of your TV set.

Job losses across the United States are growing and economists worry more layoffs are coming as businesses see plummeting sales, writes The Washington Post. The entertainment industry is also being severely affected: dozens of productions have been halted, in both TV and film, theaters are closed all around the world, and a variety of sectors of the entertainment industry are suffering financial hardship as a result.

The Motion Picture & Television Fund is ready to help those in the industry in need of medical, financial and emotional assistance, writes Deadline, citing MPTF president and CEO Bob Beitcher, who said in a message to the industry: “Nearly 100 years ago, one of our founders, Mary Pickford, said about MPTF, ‘We see a need and we fill it.’ The same is true today. MPTF remains fully committed to its mission of helping our entertainment community in their time of need.”


Our “new abnormal”

“The need we see today,” he added, “is supporting those members of our industry workforce who are undergoing hardship due to COVID-19 related issues – productions getting shut down or pushed back, staffs being trimmed as companies anticipate financial downturns from our ‘new abnormal.’ We have been at this place before during the 2007-2008 work stoppage and supported 630 industry members with financial assistance and case management in a challenging time. Thanks to the generosity of our community, MPTF is still here for you today.”

Another effort to help industry professionals comes from IATSE, whose International President, Matthew D. Loeb, calls on Congress to pass a relief package that includes displaced entertainment workers. He said: “As social distancing measures are enacted and events and projects across all sectors of the entertainment industry are cancelled, it’s become clear that the COVID-19 crisis requires decisive action from our Federal Government to support displaced entertainment workers.”

“Right now, thousands of our members across all sectors of the entertainment industry are suffering financial hardship because of government mandated cancellations. Entertainment workers shouldn’t be collateral damage in the fight against the COVID-19 virus.”



An alliance started in 1893

The International Alliance of Theatrical Stage Employees, Moving Picture Technicians, Artists and Allied Crafts of the United States, Its Territories and Canada was founded in 1893, when representatives of stagehands working in eleven cities met in New York and pledged to support each other’s efforts to establish fair wages and working conditions for their members. The union has evolved to embrace the development of new entertainment mediums, craft expansion, technological innovation and geographic growth.

Today, its members work in all forms of live theater, motion picture and television production, trade shows and exhibitions, television broadcasting, and concerts as well as the equipment and construction shops that support all these areas of the entertainment industry. The IATSE represents virtually all the behind-the-scenes workers in crafts ranging from motion picture animator to theater usher.


An urgent relief package

It’s that huge number of professionals that the IATSE wants protected. “But this isn’t just about us,” says Matthew D. Loeb, as “economic studies demonstrate that entertainment spending reverberates throughout our communities nationwide. Film and Television Production alone injects $49 Billion into local businesses per year, and the overall entertainment industry supports 2.1 Million jobs in municipal and state economies.”

“Along with the other entertainment unions and the labor movement at large, we call on the Federal government to pass a relief package that prioritizes workers whose incomes have been lost as a result of this crisis. Strong measures like ensuring continuity of health benefits, providing enhanced and extended unemployment, disability, and workers compensation insurance are necessary for ensuring the financial stability of entertainment workers and their families. Additionally, the government should enact a special emergency paid leave benefit geared to include our members,” he said.

“It is vital that these measures are enacted as soon as possible to provide effective emergency relief for workers who have felt the economic consequences of the Coronavirus the hardest,” added Loeb.

“A Cow At 8:00 AM is Different Than A Cow At 2:00 AM”: Animal Trainer Lauren Henry on First Cow

Kelly Reichardt peppers her 19th Century Oregon Territory with warm cakes and endearing fauna. Eve, the “first cow” in the territory, is a symbol of opportunity to everyone but its natives, the hinge of the film’s plot, a romantic proxy to its protagonist, “Cookie,” and one of animal trainer Lauren Henry’s best behaved cows. The secret to contriving “wild” and natural animal behavior in the preposterous habitat of the movie set? Patience. Putting the time in to normalize the set for the animal and any action he or she might have to perform in it. But Henry’s work goes beyond […]

Cubic Motion joins Epic Games: digital humans are inching closer


Cubic Motion, a longtime Epic Games partner that helped create the photorealistic digital humans first revealed at GDC 2016, is now part of the company behind the Unreal Engine.

Cubic Motion is a longtime Epic partner and has been integral to numerous notable Unreal Engine real-time demonstrations, including the first “Hellblade: Senua’s Sacrifice” live character performance at GDC 2016, followed by the expanded “From Previs to Final in Five Minutes,” which earned Best Real-Time Graphics and Interactivity at SIGGRAPH 2016. Epic and Cubic Motion have continued to collaborate, showcasing high levels of quality and believability in photorealistic digital humans in “Meet Mike” at SIGGRAPH 2017 and “Siren” at GDC 2018.

Cubic Motion’s facial animation technology has also been used in the production of many notable AAA titles, including Sony Interactive Entertainment’s “God of War” and Insomniac Games’ “Marvel’s Spider-Man,” titles where the frontier between videogame and cinematic animation is blurred to an extent that was unimaginable a few years ago.

The company also worked on HBO and Survios’ new virtual reality experience, Westworld Awakening, among many other titles. Cubic Motion’s intricate facial capture and animation technology was also used to let League of Legends fans see the character Akali dance, talk, and react like never before: for China’s League of Legends Pro League (LPL) finals in September 2019, the K/DA heroine Akali was brought to life in a live broadcast, both for a dance number and a real-time interview.


Persona, a new era for digital characters

A leading provider of automated performance-driven facial animation technology and services for video games, film, broadcast, and immersive experiences, Cubic Motion is also the company behind Persona, which paves the way for a new era in live digital facial performance, one where human actors, digital characters and audiences can co-exist and interact. Persona captures and translates an actor’s performance onto their digital counterpart in real time. Designed from the ground up for live performance, Persona enables immediate character animation in real-time engines such as Epic’s Unreal Engine 4.

Announcing the acquisition of Cubic Motion, Epic Games says that “by joining forces, our teams are solidifying our commitment to advancing the state of the art in the creation of believable digital humans for all Unreal Engine users.” Cubic Motion’s talent will work hand in hand with 3Lateral, developer of innovative technologies that enable digitization of human appearance and motion at unprecedented levels of realism. 3Lateral joined the Unreal Engine team in January 2019 to lead development of the state of the art in real-time capabilities for the creation of virtual humans and creatures.


Crossing the uncanny valley

“We are delighted to be joining Epic Games and look forward with excitement to this next chapter in our story,” said Cubic Motion CEO Dr. Gareth Edwards. “Together, we are uniquely positioned to push the boundaries of digital human technology, bringing ever more realism and immersion to all forms of visual entertainment.”

“Digital humans are not only the next frontier of content creation, but also the most complex endeavor in computer graphics. With Cubic Motion bringing their computer vision and animation technology and expertise to our digital human efforts, Epic along with our team at 3Lateral are one step closer to democratizing these capabilities for creators everywhere,” said Tim Sweeney, founder and CEO of Epic Games.

“Facial animation that conveys the slightest nuance of human expression is essential to crossing the uncanny valley. We believe that holistically combining Epic’s Unreal Engine with 3Lateral’s facial rig creation and Cubic Motion’s solving technology is the only way to answer this challenge, and ultimately, to reach the pinnacle of digital human artistry with Unreal Engine,” said Epic Games CTO Kim Libreri.

Lume Cube 2.0 review: rugged, portable lights for stills and video

Way back in 2014, Lume Cube Inc. sailed past its fundraising goals when it launched its brand-new, eponymously-named LED lighting system on Kickstarter. Aimed both at still and video shooters, the original Lume Cube offered 1,500-lumen light output and Bluetooth wireless control in a remarkably compact package. And despite measuring just 4 x 4 x 4.5cm (1.6 x 1.6 x 1.75″), it was also both rugged and waterproof.

Enter the Lume Cube 2.0: as portable as ever with some key improvements under the hood. In short, we think it’s an incredibly handy little light with a robust ecosystem of accessories. See our condensed findings below and read on for the full analysis.

Key takeaways:

  • Compact size and rugged, waterproof construction inspire confidence and use in a variety of situations
  • Excellent quality of light, with daylight white balance and a high CRI value
  • Optical sensor to act as a slave flash for stills shooting
  • Fine control over brightness
  • Impressive battery run-time, but longer charge times
  • Good max brightness for size, but still struggles under sunlight
  • Excellent, if pricey, accessory system

What’s new?

Although it’s exactly the same size as its predecessor and has the same light output as before, the updated Lume Cube 2.0 now runs for three times as long at full power, and is easier to recharge too. And the light it produces has improved noticeably in terms of its color rendition, coverage and evenness.

The new version also offers an added low-light mode with a 1-10% power range in 1% increments, controllable either from the smartphone app or the Lume Cube’s own physical controls. And the product bundle has been expanded, with a modification frame and two filters included as part of the base product bundle.

The standard Lume Cube 2.0 kit (left, $90) now includes the modification frame with diffuser and warming filters. The Pro Lighting Kit (right, $300) includes a carry case packing two lights with modification frames, a dozen filters, two grids, and a full set of barn doors, snoot and diffusion bulb for each light.

Who is it for?

As before, the Lume Cube 2.0 is aimed both at still and video shooters, regardless of whether they’re using a sizable interchangeable-lens camera rig, GoPro, smartphone or even a drone.

For standalone cameras, most functionality can be used without a smartphone at all, and the Lume Cube works either as a continuous light source or as an optically-triggered slave flash. Alternatively, multiple lights can be controlled individually or as a group using an Android / iOS app.

Smarter physical controls and a better-placed optical sensor

The Lume Cube 2.0 looks quite similar to its predecessor, and its size and weight are unchanged. The biggest difference is that the optical sensor has moved from the front of the light to its top, sharing its location with the status LED. In its new home, it’s now easier to trigger from a wide range of angles.

It sits in between a clearly-labeled pair of opaque black buttons on the top deck which replace the translucent buttons of the original model. These together provide access to power control, brightness adjustment, optical slave flash, and low-light mode.

The non-swiveling hot shoe mount shown in this image is included in the standard bundle. The Pro Lighting Kit doesn’t include any mounts; the ball head mount shown atop this page costs $25.

Charging is quicker and easier, but waterproofing suffers

Around back, the charging port is now a modern, reversible USB-C connector, and is covered by a soft rubber flap. While it’s still a bit fiddly to pull open with recently-trimmed nails, it closes securely, stays in place, and it’s really nice not to have to worry about the lights being affected by rain. (And to be able to take them underwater too, if that’s your thing.)

While less bothersome than the screw-in cover of the original Lume Cube, the flap is also easier for water to circumvent. As a result, the new model is waterproof to a maximum depth of 9.1m / 30ft, down from 30.5m / 100ft for its predecessor. That’s still plenty for snorkeling and probably a pretty significant proportion of recreational scuba too, but if you’re planning on deeper dives you may want to stick with the earlier version.

Wider, more even and daylight-balanced light

On the inside, everything is new. The battery, LED and optics have all been replaced, and while it still has a light output of 1,500 lumens (750 lux at 1m), the Lume Cube 2.0 now has a 5600K daylight color temperature, down from the 6000-6500K of its predecessor.

At the same time, the quality of its light has improved, with a Color Rendering Index score of 95. The Lume Cube 2’s new lens is also less prone to hot spots, and has a wider 80-degree coverage, up from its predecessor’s 60-degree beam angle.
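The quoted output of 750 lux at 1m can be extrapolated to other working distances with the inverse-square law. This is my own back-of-the-envelope model, not a Lume Cube spec: it treats the light as a point source and ignores the beam-shaping effect of the 80-degree lens, so real-world falloff will differ somewhat.

```python
def illuminance_at(lux_at_1m: float, distance_m: float) -> float:
    """Rough illuminance estimate at a given distance, assuming a
    point source obeying the inverse-square law. Actual optics
    (the 80-degree lens) will alter the exact falloff."""
    return lux_at_1m / distance_m ** 2

# Lume Cube 2.0 is rated at 750 lux at 1 m:
print(illuminance_at(750, 2.0))  # 187.5 lux at 2 m
print(illuminance_at(750, 0.5))  # 3000.0 lux at 0.5 m
```

The quartering of illuminance with each doubling of distance is why the review recommends keeping the light close to the subject.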

Here’s a view from the rear, with the Lume Cube on its optional ball head mount atop my personal Pentax K-3. This mount is much better than the basic one included in some kits, as it lets you aim the light in almost any direction.

Smarter firmware and new features

The Lume Cube 2.0 has also received some smart updates in the firmware department. Perhaps most importantly, it now requires a three-second long press of the power button to switch on. This ensures that unlike its predecessor, it won’t switch itself on in your camera bag and drain its battery right before it’s needed.

The new firmware also allows brightness to be adjusted in either direction using the Lume Cube 2.0’s physical controls, unlike the original version which could only increase brightness to its maximum before looping back around to its minimum brightness setting on the next step.

And a new mode accessed with a long press of both buttons at once allows a much narrower 1-10% brightness range with a more precise 1% step size, rather than the full range in 10% steps as is the default. It’s handy if you’re shooting long exposures but still need just a little illumination.

Same accessories and mounting system, but now it’s in the bundle

A wide selection of filters and accessories can be attached to the Lume Cube 2.0 using the exact same modification frame attachment as before, allowing owners of the original Lume Cube to upgrade their lights or add new ones while keeping the rest of their gear.

A wide variety of optional mounts are available, including this spring-loaded smartphone clip ($20) to which I’ve attached the ball head and a Lume Cube 2.0 with bulb diffuser.

The frame itself now ships even in the base product bundle along with a pair of warming and light diffusion filters. Each of these uses name-brand LEE filter materials from the company’s LED-specific Zircon line. The level three warming filter drops the color temperature to 4,500K, and the diffusion filter is the lowest strength available.

Incredibly portable yet decent battery life too

The Lume Cube 2.0 is very solidly built, with not a hint of creak or flex anywhere. It’s also impressively small. Even with a modification frame attached it’d fit in looser pants pockets, and you could easily bring two or three in a jacket pocket and almost forget they were there until you needed them.

Given the compact size and relatively powerful output, I was really impressed by battery life, which is a huge improvement on the previous iteration’s 25 minutes. At 100% brightness, I could manage anywhere from 62 to 90 minutes on a charge, depending on whether or not Bluetooth was enabled, meeting the manufacturer spec precisely.

Battery life impresses, and you can charge the lights while using them

And by dropping to 50% brightness, I managed an average of three hours, 38 minutes per charge with Bluetooth active. That absolutely demolishes not only the original Lume Cube’s runtime, but also the spec sheet, which promises only 2.5 hours!
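Those runtime figures suggest battery life scales at least inversely with brightness. The sketch below is my own naive model (an assumption, not anything from Lume Cube's documentation): it treats power draw as linear in brightness, and as the review's measurements show, the real light actually beats this estimate at lower settings.

```python
def estimated_runtime_min(full_power_runtime_min: float, brightness_pct: float) -> float:
    """Naive runtime estimate assuming power draw scales linearly
    with brightness. The reviewer measured ~218 minutes at 50%
    against a ~76-minute full-power average, better than this
    simple model predicts."""
    return full_power_runtime_min * 100.0 / brightness_pct

# Using the reviewer's ~76-minute full-power average:
print(estimated_runtime_min(76, 50))  # 152.0 minutes predicted at 50%
print(estimated_runtime_min(76, 10))  # 760.0 minutes predicted at 10%
```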

I must say it surprised me that controlling the lights via Bluetooth decreased the battery life as much as it did, though. The good news is that the Bluetooth radio does eventually go to sleep if the light is left inactive for a while. Once fully asleep, it needs to be woken back up with a physical button press before it’ll respond via Bluetooth again.

Fast charging requires a fast, modern USB-C charger and cable

The only place I didn’t come near the manufacturer spec was recharging. Lume Cube’s documentation promises around 45-60 minutes for a full charge, but using the supplied USB-C to USB-A cable and a wide variety of different chargers capable of up to a maximum of 18 watts per port, I was never able to recharge in less than two hours, 49 minutes.

I used three Lume Cubes for this shot. One was unfiltered on a mini-tripod near the bottom front of the cage, and another with bulb diffuser was pointed straight down from a bit above and in front of the birds. Finally, a third was just out of frame right and, in turn, was aimed to bounce off a small folding reflector just out of frame left.

To get near the claimed time you’ll likely need a recent, high-powered USB-C charger with Power Delivery support, as intended for charging laptops and the like, plus a Power Delivery-compatible USB-C to C cable. Unfortunately, I don’t have one to test with myself, so I can’t confirm the claimed charging time.

Recharge and use your lights at the same time

The good news is that you can charge and use the lights at the same time if you’re within reach of a power outlet. This can potentially extend run times a lot, especially if you can switch off entirely or dial the brightness down significantly while setting up and between groups of shots.

I did just that while working to get the shot of my pet parakeets above, avoiding having to stress the birds any more than was necessary.

Note, though, that it’s not recommended to go above brightness level 80 while also charging. This is likely due to heat concerns, as after extended periods at 100% power, the Lume Cubes can get uncomfortably hot to the touch even without charging at the same time. (Not enough to burn instantly, but enough that you couldn’t persuade me to hold my finger on it.)

So long as you’re not in direct sunlight, the Lume Cube 2.0 is sufficiently bright for daytime use at shorter distances. Compared to the unlit shot (left), a Lume Cube at arm’s length (right) not only fills in shadows, but is strong enough to cast its own.

Best for smaller subjects or in lower ambient light

The Lume Cubes’ small size is great in terms of portability, but it comes at the expense of daytime usability. For what they are, these are pretty powerful lights but they’re simply no match for full sunlight, where they struggle to fill in shadows even at full power from just a couple of feet away.

A small reflector would be a better choice here, using the sun’s own power to provide light where it’s needed. In full shade or even indirect sunlight, though, even just a single Lume Cube can make quite a noticeable difference, so long as it can be kept fairly close to the subject.

That makes it quite well-suited to things like head-and-shoulders portraits, selfies and talking head video capture, and so on. And once you take the sun out of the equation, shooting indoors or at night the Lume Cubes really shine, if you’ll pardon the pun.

This image required just a single Lume Cube. I positioned the camera directly above the cash, then put a sheet of glass at a 45-degree angle in between. I bounced the light from a single unfiltered Lume Cube off this, and shielded the subject from direct lighting.

Up next, let’s take a look at the Lume Cube 2.0’s accessory mounting system, and its Android / iOS app experience, before wrapping up with a final conclusion.

EIZO ColorEdge CS2740: first CS series monitor with 4K UHD


The new EIZO ColorEdge CS2740 offers the EIZO quality in an accessible 27-inch monitor versatile for photos, video editing, design, and other types of creative work in resolutions up to 4K.

Announced last October and presented as a new solution to make “video production comfortable”, the EIZO ColorEdge CS2740 takes the place of the ColorEdge CS2730, offering more pixels, 10-bit input and more connections in what the company says is an easy-to-use 27-inch 4K UHD high-resolution display that combines a large production space with compactness.

The ColorEdge CS2740 is one of the monitors EIZO planned to show at NAB 2020, as well as other events, but due to the COVID-19 outbreak, those shows were cancelled or postponed. Still, the monitor is now shipping, so check availability in your country if this is a solution that suits your needs.

With the release of the ColorEdge CS2740, a 27-inch, 4K UHD (3840 x 2160) monitor for creative work, complete with USB Type-C connectivity, EIZO expands its CS series. The ColorEdge CS series is a range of monitors that include many of the advanced features of EIZO’s professional ColorEdge CG series, while also meeting the varying needs and budgets of hobbyists and prosumers.


Working area and toolbars

The ColorEdge CS2740 is the first in the CS series to implement 4K UHD resolution, which is four times the size of Full HD (1920 x 1080). The detail allowed ensures that high resolution content is displayed crisply. The monitor also boasts a pixel density of 164 ppi for 4K image display that has never looked smoother. Alphanumeric characters and contours are distinguished with excellent sharpness, so users can check even the finest details without needing to zoom in.

The resolution and screen size mean that users can fill the screen with both an area equal to A3, and the tools of the app being used. In fact, says EIZO, “the 27-inch screen provides ample space for easily displaying Full HD content in full, with enough space to show toolbars, palettes, and other windows or applications simultaneously. This makes the monitor versatile for photos, video editing, design, and other types of creative work in resolutions up to 4K.”


Effective for editing and checking video and photo

The monitor’s wide color gamut reproduces 99% of the Adobe RGB color space, ensuring images are reproduced faithfully. To counter fluctuations in brightness and chromaticity characteristic of LCD monitors, EIZO’s unique digital uniformity equalizer (DUE) corrects deviations in every tone across the screen for stable display.

The ColorEdge CS2740 supports EIZO’s ColorNavigator 7 color management software, so users can regularly calibrate and quality control their monitor quickly and reliably for predictable color results. It also supports EIZO’s Quick Color Match software, which simplifies the screen-to-print color matching process in just a few steps.

Able to display 4K UHD (3840 x 2160) high resolution, the monitor supports 10-bit input (approximately 1.07 billion colors) for richer color reproduction, meaning there is no gradation loss or color cast, making the ColorEdge CS2740 effective for editing and checking video and photo data. Users should note that 10-bit display requires a graphics board and software that support 10-bit output. EIZO also says that for 10-bit display over an HDMI connection, a Deep Color compatible device is required.


ColorEdge CS2740 is now shipping

The now common USB Type-C connectivity is present, and it allows the monitor to display video, transmit USB signals, and supply power (60 W delivery) to a connected device, such as a smartphone or notebook PC, using a single cable. Users can simply plug in and start getting creative without worrying about additional cable clutter.

Additional Features

  • Smooth gradations with 10-bit display from a 16-bit look-up table (LUT)
  • USB Type-C, HDMI, and DisplayPort inputs
  • Optional light-shielding hood
  • 5-year manufacturer’s warranty

The ColorEdge CS2740 is now shipping. Date of availability varies by country so contact the EIZO group company or distributor in your country for details.