Fotodiox has a new Canon to FUJIFILM X smart autofocus adapter. The EF-FXRF Fusion Smart AF Lens Adapter provides full electronic functionality when using Canon lenses on a FUJIFILM X Mount camera body, giving users autofocus, in-camera aperture control, powered lens image stabilization, and EXIF data transmission.
A Texas appeals court has ruled that the University of Houston does not have to pay the photographer of a picture it has been using in online and print promotional materials. Houston photographer Jim Olive says the university removed copyright markings from an image downloaded from his stock library, failed to credit him when it was used and wouldn’t pay when he sent a bill, but the university claims it has sovereign immunity and that it can’t be sued.
The case centers on an aerial image Olive shot from a helicopter hired specifically for making pictures for his library. In an online image search, he found the university was using it on its website and then in printed materials. When it failed to pay an invoice he sent for the usage, Olive tried to sue the university, but it claimed that, as a state institution, it couldn’t be sued under the Eleventh Amendment.
In an attempt to get around this, Olive sued the University of Houston for taking his property, a claim under which even government agencies must compensate the owner. The Court of Appeals, though, has ruled that the university’s actions did not constitute a ‘taking’ and that Olive must pay the university’s legal costs.
According to a report in the Houston Chronicle, which described the success of the university as ‘a big win’, Olive said ‘It just doesn’t seem fair to me.’
If this ruling is allowed to stand, it would seem that any state institution can use images and other intellectual property without paying the originators, a precedent that would be damaging to photographers across the country: if that’s the case in Texas, it may well hold in every other state that can claim Eleventh Amendment immunity.
As an architectural photographer, the main types of lenses I use are tilt-shift lenses. These prime lenses are unique in how they operate because they allow you to move the internal elements parallel to the sensor. This can be extremely useful for perspective control and ideal for shooting architecture. The question is, are they vital or just overpriced?
Amy Dotson, who recently departed her position as Deputy Director and Head of Programming at IFP, Filmmaker’s publisher, is headed this fall to Portland, where she will step into the role of Director of the Northwest Film Center and Film and New Media Curator at the Portland Art Museum. Today she gave a speech at the day of industry talks at BAMcinemafest and kindly offered the text to Filmmaker to publish below. Lotta change in the air, y’all. So much has happened of late. As some of you may know, I’m on the precipice of new adventures. That said, I’m […]
If you’re planning to do any drone photography in Japan, make sure you stay away from alcohol. The country has just outlawed drunk droning, making it an offense that’s punishable by up to a year in prison.
Japan’s parliament passed the law on Thursday in an effort to exert more government control and oversight over drones, which have exploded in popularity around the world in recent years.
If you’re caught flying drunk and your drone weighs over 200g (7 ounces), you could be fined up to ¥300,000 (~$2,750) and face some harsh time behind bars.
“We believe operating drones after consuming alcohol is as serious as (drink) driving,” a Japanese transport ministry official tells AFP.
Japan’s other drone regulations include flying below 150m (492ft), staying away from airports and crowds, flying only during the day, and keeping the drone in sight at all times; failure to follow any of these guidelines could earn you a hefty fine of up to ¥500,000 (~$4,600).
So when visiting the “Land of the Rising Sun,” just remember: don’t drink and drone, or always have a designated droner.
Over the last few years there have been a few pieces of photographic equipment that have either sped up my workflow or turned awkward, finicky techniques into simple and swift processes. But there are two specific tools that have made my life so much easier, especially when used in conjunction with each other.
Fake photos are a rampant issue in our digital age, but researchers are working hard to restore a greater degree of trust to photography. One team has created a new AI that can detect when faces in photos were manipulated using Photoshop.
First, the researchers trained a convolutional neural network (CNN) on thousands of portraits scraped from the Internet to which Photoshop’s Face Aware Liquify had been applied, both images edited automatically by a Photoshop script and images retouched by a human artist.
“We started by showing image pairs (an original and an alteration) to people who knew that one of the faces was altered,” says Adobe researcher Oliver Wang. “For this approach to be useful, it should be able to perform significantly better than the human eye at identifying edited faces.”
While humans were only able to detect the edited faces 53% of the time, the AI managed to correctly catch 99% of them.
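To make the idea concrete, here is a minimal, pure-Python sketch of what the forward pass of such a convolutional detector computes: one convolution with a ReLU activation, global average pooling, and a logistic output that can be read as an “edited” probability. The kernel, weights, and single-layer design are illustrative assumptions only; Adobe’s actual model is far larger and its details aren’t described in this article.

```python
import math

def conv2d(image, kernel):
    """Valid 2-D convolution (cross-correlation) of a grayscale image with a kernel,
    followed by a ReLU activation on each output value."""
    kh, kw = len(kernel), len(kernel[0])
    ih, iw = len(image), len(image[0])
    out = []
    for y in range(ih - kh + 1):
        row = []
        for x in range(iw - kw + 1):
            s = sum(image[y + j][x + i] * kernel[j][i]
                    for j in range(kh) for i in range(kw))
            row.append(max(0.0, s))  # ReLU: keep only positive responses
        out.append(row)
    return out

def detector_score(image, kernel, weight, bias):
    """Global-average-pool the feature map, then apply a logistic unit.
    The result is a number in (0, 1) interpretable as an 'edited' probability."""
    fmap = conv2d(image, kernel)
    pooled = sum(sum(row) for row in fmap) / (len(fmap) * len(fmap[0]))
    return 1.0 / (1.0 + math.exp(-(weight * pooled + bias)))
```

A real detector stacks many such layers and learns the kernels and weights from labeled original/edited pairs; this sketch just shows the shape of the computation.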
What’s even more impressive is that in addition to figuring out whether and where a photo was manipulated, the AI could also undo those edits and bring that photo back toward its original state.
“This is an important step in being able to detect certain types of image editing, and the undo capability works surprisingly well,” says Adobe Research head Gavin Miller. “Beyond technologies like this, the best defense will be a sophisticated public who know that content can be manipulated — often to delight them, but sometimes to mislead them.”
Here’s a video that explains how the technology works:
Adobe says that this project is just a piece of a larger effort to create better technologies for detecting digital manipulations in photos and other forms of media.
NASA has announced an unusual new discovery: the iconic Star Trek Starfleet logo has been found on the surface of Mars.
The insignia was captured by the agency’s Mars Reconnaissance Orbiter that’s orbiting the red planet, and the MRO HiRISE (High-Resolution Imaging Science Experiment) camera team at the University of Arizona explains that the curious chevron shapes are footprints of old sand dunes:
These curious chevron shapes in southeast Hellas Planitia are the result of a complex story of dunes, lava, and wind.
Long ago, there were large crescent-shaped (barchan) dunes that moved across this area, and at some point, there was an eruption. The lava flowed out over the plain and around the dunes, but not over them. The lava solidified, but these dunes still stuck up like islands. However, they were still just dunes, and the wind continued to blow. Eventually, the sand piles that were the dunes migrated away, leaving these “footprints” in the lava plain. These are also called “dune casts” and record the presence of dunes that were surrounded by lava.
You might notice that these features look conspicuously like a famous logo, and you’d be right, but it’s only a coincidence.
Caption Spotlight (12 Jun 2019): Dune Footprints in Hellas
Last year I joined my local photography club. The club holds regular competitions and I was amazed by the quality of the bird and wildlife photographs. I’ve never been much of a natural history photographer. So it’s not surprising that my own photographs did very poorly in competitions.
In particular, a judge criticized a woodpecker photograph that I submitted because it was clearly on a bird feeder. “Hand of man!” he said as he dismissed my attempt.
But this piqued my interest. I wondered what makes a good wildlife photograph. It occurred to me that wildlife photography has more than a passing resemblance to street photography. This is a genre in which I feel more comfortable.
One of the secrets of street photography is to first find your “stage” and then wait for the subjects to appear. So I thought I would apply this to photographing garden birds. I decided to set up a stage and then wait for the birds to arrive.
I have a bird table in my garden, but I knew that I needed to create a stage that didn’t reveal the “hand of man”. So I hunted down a moss-covered log and wedged it onto the bird table. I then baited the back of the log to encourage the birds to sit on it.
I wanted to make the photographs in my own style, so I decided to add off-camera flash. I then set up my camera and triggered it remotely.
Here are some of the best examples.
Photographing birds on a mossy log was beginning to feel a little like stamp collecting, so I thought I’d change things up.
As I watched the birds, I realized that many of them used the log as a kind of queuing post as they waited to get on the bird table. This gave me an idea. I clipped some flowers and blossom and attached those to the other side of the bird table.
I then captured this image. It was like someone turned on a lightbulb in my head.
I loved the storytelling feel of this image. Time for more of the same!
I was getting a little addicted and wondered what other scenes I could create. So my next step was to create a reflection pool.
I decided to do it on a miniature scale by using a plastic tray from a garden center.
When I first set it up it was raining, so I took some photographs from inside my conservatory.
The reflection pool is so small that there’s really only one angle you can shoot from. Once the rain stopped, I set up my camera outside and got these.
It’s interesting to compare the following image with the woodpecker photograph at the top of the article. This one certainly appears more natural even though it was taken within 2 meters (~6.6 feet) of the original.
My aim in doing this was to end up with two photographs worthy of a club competition. I think I achieved that. What I didn’t expect was for it to be such fun. I now know a lot more about birds than I did a couple of months ago and it’s turned out to be really enjoyable watching (and predicting) their behavior.
I still don’t think I’m much of a natural history photographer, but I’ve definitely got some insight into why other people do it.
About the author: David Travis is a travel photographer based in the Midlands in England. The opinions expressed in this article are solely those of the author. You can find more of Travis’ work (and buy prints) on his website, Facebook, Twitter, and Instagram. This article was also published here.
Your footage was shot with lighting that already looks “cinematic,” and it’s time for color grading. You apply a teal-and-orange LUT preset, and although it’s a step in the right direction, your video is still missing something that would give it that blockbuster look.
Tiffany Hillkurtz started working in animation in 1997 as an assistant in the animation department. She continued through the ranks of editorial on films like Space Chimps, Astro Boy, Madagascar 3: Europe’s Most Wanted, Free Birds, Penguins of Madagascar, Minions, and The Grinch. (My favorite IMDb note on Hillkurtz is that she played a Jawa in Star Wars: Episode IV – A New Hope, complete with an adorable photo of her in costume.)
Art of the Cut spoke with Hillkurtz about her work as the editor of The Secret Life of Pets 2.
HULLFISH: I’ve done a couple of interviews with animation editors so some people that have read those might know something about animation, but let’s really talk about the differences between live action and animation editing and just kind of fill me in on where the process starts for you.
HILLKURTZ: I like to think that editing live action and animated films are both akin to creating a sculpture. With live action, you have a block of marble, all this footage that’s been shot, and you’re chipping away, trying to find the movie. Whereas animation is more of an additive process, like working with clay or plaster, starting from nothing, building something from the ground up. In the beginning, there’s nothing but a script, and sometimes not even that, sometimes you start when there’s only an idea. You’re creating an animatic of the entire film, layer by layer, continually fleshing it out, adding in complexity and elements from each department as they’re created, in order to build a film.
So, in animation, you come in at the beginning and you record the dialogue with just people in the office — this is called “scratch dialogue”. You edit that together while the storyboard artists are drawing boards. Then you time out those boards with the dialogue to see how it plays, and to see if that scene kind of works. It still has to “play” as a real scene, so you add sound effects and temp music.
Then you work with the director to implement any changes he wants and show a pass to the executives and whoever needs to approve it, and then eventually you send it to Layout. Layout places the cameras for the scene in the set that has been built in the computer (this applies to CG animation; the pipeline is different for hand-drawn animation). The story guys don’t always know what the location looks like, so the boards are not like live-action boards, where it’s about how you’re going to shoot it. Animation storyboards are more about acting, characters, developing the story, or coming up with gags that aren’t in the script.
When the layout shots come back, they’ve “shot it” but then we might want to change it. Sometimes I have to ask for a new shot, or I extend a shot with a freeze frame or reframing, or I might realize I need a wide shot, so I’ll just ask for it. So it’s more malleable than live action because a lot of times in live action you’re just working with whatever they’ve shot, and if they didn’t shoot it, you don’t get it unless you’re able to do reshoots.
HULLFISH: A lot of people may not understand what the layout process is. But before we get to layout, let’s talk a little bit more about the process that happens even before layout, which is that the story is so much more malleable while you’re still in storyboards. What is your job or what are you bringing to the process while you’re still in storyboards?
HILLKURTZ: Basically, I’m timing out the boards and the dialogue to create the film. Once you get the dialogue from the script, even if it’s scratch dialogue, you’re still trying to decide if it’s sounding right. Sometimes you’ll go back to the writer and ask for a different line, or once it’s all laid out, you’ll discover it’s too long. You tune it as you go, depending on how it feels. In the beginning, you’re just trying to have it play as well as it can, figuring out the characters and how they work with each other.
It changes when you get the actors’ voices cut in, because that’s truly the character and that significantly affects the timing. Also, as an animation editor, you’re free to pick any take you want. The actors are by themselves in the booth recording anywhere from three to — depending on the actor — 40 takes of the same line, and you listen to them all. You choose which one you want. But you don’t have to use the whole line. You can use the first half of this take and the second half of that take, or one word in the middle. You can shape the performance of each character. Then, when you put that performance with the other actors’ performances, you create the pace and the tone of the interaction. There are many things to consider: Is it a sarcastic response or a serious response? Is there a really long pause, which could change the performance between two characters? Or are they talking over each other, which can also change the performance?
When the actor records their dialogue, I note the takes the director and I respond to the most. I will use those first when assembling, but they don’t always work in the context of the complete scene, so I like to re-listen to every take. Because of this careful process of working with the dialogue, a character’s performance is basically a combination of the voice actor, the animator, and the editor.
HULLFISH: I’m interested in the malleability of the script at that stage because when you edit the whole movie together, you can then watch it and see structural and story problems in context and in TIME (instead of just on the page) and STILL have the chance to add things, which you rarely get in live action. Or you can delete whole groups of scenes without the pressure of how much effort it had taken to shoot them.
HILLKURTZ: That’s the great thing about doing this all in storyboards – it’s easier to just move things around. A lot of times the script isn’t even completely finished. So you’re working at the beginning and it informs how the script will end. A lot of places I’ve worked, you work in chronological order. You start at the beginning of the movie and you go through as much as you can.
Obviously, there are some sequences that bounce around, but even if you have a full script, when you’ve done act one and act two, you might think, “Wouldn’t it be better if THIS happened in act three?” And you can — at that point — just change everything without huge consequences. After the first nine months or so we pretty much have a screening every eight to ten weeks just so you can see how everything feels, and how it’s playing together, and how it looks. It is very malleable at almost every stage.
HULLFISH: From a technical standpoint, when you’re in storyboards, how much are you simply cutting from one storyboard to another and how much motion work are you doing WITHIN the frame? After Effects type stuff? Or trying to pan or tilt or zoom on the storyboards? I’m assuming you’re editing in Avid — so how much keying and motion work are you doing in the Avid?
HILLKURTZ: Yes, definitely Avid. It depends on how much time you have. Some studios do a lot of After Effects. On Pets 2, we didn’t do any. I’ll do some moves (push-ins, crops, etc.) in the Avid. Quite often I’ll re-draw eyes or mouths if I have storyboards where the character is looking in one direction and I want them to look in another. Instead of getting a new board, I’ll just erase the eyes and draw them in the other direction with the paint tool. It’s amazing the difference changing a mouth shape can have.
The thing to remember is, the whole movie is in different phases throughout the process. You don’t finish all the storyboards and then move everything to Layout.
HULLFISH: Because the studio’s trying to keep everybody so you don’t dump everything off onto one department at the same time.
HILLKURTZ: Exactly. So when we have enough inventory of storyboarded scenes that have been approved, then we send that out to Layout. But then you keep going with the storyboards for the next part while Layout is working on that first part. Then layout shots start coming back into editorial, too. Eventually, you start getting animation shots in, and then you’re in three stages for different parts of the movie.
Editorial is like the hub of a wheel in animation because everything comes through our department. The storyboards come in, you cut them, you send them back out to Layout, who does their thing, then sends that back to you. Then you work on the layout material and send that back out to animation, and then animation sends their stuff back to you. It’s this ongoing process of importing, editing, and then exporting.
HULLFISH: Before we talk about what layout is, you mentioned the transition from editing with scratch tracks to editing with the real actors doing the voices. What is the schedule? When do you start getting actors in to record their lines? I’m assuming they come in a couple of different times.
HILLKURTZ: Depending on the character and the size of the part, there are anywhere from two to six recording sessions per actor, and then an ADR session at the end for any extra bits that we need to fill in. I always like to have the production dialogue, with the actors’ voices, for when I cut layout because the actors talk at different speeds than the scratch, or they might improv and say some different lines. For storyboards, it’s great if you have the real actor but we don’t always have that luxury because we’re still working through the story and the script, and you don’t want to bring them in for no reason, and then throw everything away. Usually when you have a pretty solid storyboard pass, then you’ll have actors come in and do a recording session.
HULLFISH: Sometimes you’ll go into layout with the real voices, but you can’t go to animation without them. How widely spread out are those recording sessions?
HILLKURTZ: It depends on the availability of the actor. It takes a few years to make an animated movie, so they’re off doing other movies or tours. And then, when we’re ready, we schedule the next session. So it could be like a month or it could be six months in between. It also depends on the quality of the scratch. Every movie I’ve worked on, it’s different.
HULLFISH: So it’s got to be after the story is fairly settled, because they don’t want to be wasteful, but before animation, because they need to animate the mouths and physicality to match the performance.
I worked for an animation studio for two years, but explain to my readers what the term “layout” means and what it entails and what kinds of decisions are being made at that phase.
HILLKURTZ: The actual term comes from the 2D animation days, where the artists did a drawing of the scene’s background. In CG animation, it’s translating a 2D drawing into 3D space. Basically, they’ve built the location for the scene in 3D space in the software, and Layout places the virtual cameras in the scene to create what in live action would be a set-up. The characters are added in their place in the set. In storyboards, you might not have really been able to see how that works. So with a “real” space and a real camera, you can now see things like, “Oh, he can’t get through that window, so we might need to change a character from moving left to right to moving right to left. Or this character has to come in through this door, so now we have to change the eye-line to be different than it was in storyboards.” They’re basically looking at how to shoot it: How do we block the cameras and the characters?
Layout has gotten significantly better from when I first started. Back then, the characters might have been represented by tubes or frozen grey versions of themselves, but now they actually look like the characters. They’re moving around, but they’re not animated per se. There’s not a lot of lip sync and they’re not doing their subtle character gestures, but they’re moving around the room in time to the dialogue that we’ve given them to try and figure out how it’s going to be shot. This is another place where you can change things. It’s easy to decide “This should be a closer shot.” Or “This should be a two-shot.” And you don’t have animators spending time animating gestures or things that won’t end up in the shot. It’s the best time to figure out how it all is going to look.
HULLFISH: You mentioned layout has gotten a lot better in recent years, but it’s still a stage where the storyboards have a lot of character and personality and acting to them and then you get to the layout stage and a lot of the emotion kind of gets sucked out of it doesn’t it?
HILLKURTZ: A little bit, yes, but certainly not as much as before. I remember on a project years ago we would get notes about the character’s eyebrows or something and I’d think, “But that’s not what we’re looking at! We’re supposed to be looking at the shot.” But we were getting acting notes because the layout stage was starting to look so much more like animation.
It is harder to see emotion or feel emotion in the layout stage, so sometimes for a screening, we’ll actually go back to storyboards instead of showing layout, just to get the emotion and the character moments in the scene. You just have to conform the boards to wherever you are in layout.
HULLFISH: Is that conforming something you’ve got assistants doing or is it something you’ve got to do?
HILLKURTZ: Usually an assistant or an associate editor will do the conforming. I had three different Associates through this film because of scheduling issues: Tom Walters, Rachel Brennan, and Nico Stretta. All of them were super helpful to me and the whole process. I like to always carry the boards on a lower track, so if we need to conform storyboards back to layout timing, they’re right there. Eight or nine years ago, there were a lot more conforming boards than now, because layout has come so far. We would probably do that conform a lot more for screenings for people outside the studio. Whereas inside the studio, people can pretty much envision it.
HULLFISH: As the project becomes more locked, what are your responsibilities at that point, since the conforming is mostly left to others?
HILLKURTZ: That would be nice (laughs) but it actually gets so much busier towards the end, because, if there’s time, people will use it. And so we’re still getting everything in pretty much toward the very end. Also, I’m kind of a control freak — which is not very healthy — so I like to look at everything. I’ll have my associate cut in the lighting shots and I still want to look at it before we’re moving on. And, after you have a preview screening, things might change again. So sometimes you go back to boards again, or you’re moving scenes around to see if they work better in a different order.
In Pets 2, there are basically three storylines and the biggest challenge was figuring out how to intersperse those storylines so you weren’t spending too much time with one character and also not too much time away from a different character. So moving scenes around to get that proper placement was really important. So my work definitely continues to the very end.
HULLFISH: I’m really interested in that idea of cutting between A, B, and C story. Did that intercutting change past the storyboard phase?
HILLKURTZ: Oh yes. It changed throughout. I have a big whiteboard in my room with all of the sequence names and numbers on it, and sometimes I would even put scenes on pieces of paper in front of me on my desk, and just move them around and play it back in my head before even trying it in the Avid. Then the director and I would talk about it. Then we’d try it in the Avid and see how a big, long 10-minute segment would work. Then we could see if moving a scene or cutting a scene in half would work. You’re just playing with it pretty much until you can’t anymore. Especially with three storylines, because you really want to get it right. You really want to get it so that it’s the right amount of time with each storyline and with each character, so you still care about everybody and you’re not missing anybody.
Also on that board, I would write notes about what I’m waiting for: Am I waiting for production dialogue? Am I waiting for layout? Or have I received the layout, but I haven’t cut it in yet? I’d keep little notes of what the status of that sequence is.
HULLFISH: Did you color code those status points?
HILLKURTZ: Yes I did: red if I was waiting for it, green if it was in and ready to cut, blue if I had to show it to the director. That way I could glance across the room and see, “Ooh, there’s a lot of red. Where is that?” Or if there’s a lot of green: “I should be getting on that!”
HULLFISH: Did stuff happen where the animation changed the timing of what you had in layout or was there some other way that you had to lock that down or some other reason why you had to lock down those timings?
HILLKURTZ: You don’t really begin “locking” until you start to hand over to the composer or the sound effects team towards the end. Animation changed a lot because they’re coming up with really fun things for the characters to do, or some funny body movement. Then when we get the animation back, my associate will cut it in, but if there are timing changes, they’ll put it on a different track and point it out to me so I can make sure that it will still work. Sometimes if the animators change animation timings, it changes the timing of a joke because the dialogue has now changed. So then I see if it still “works,” and if I think it isn’t working, I’ll point it out to the director and then we have to decide whether to change the animation or move the dialogue so the joke still works. We can basically re-take any shot. Production doesn’t like it, but when the animation changes timing, you really have to look at it to make sure things are still playing. Especially with a comedy, you just want to make sure that the jokes are all still landing. Or, is the action funnier, and now you don’t actually need the line of dialogue?
HULLFISH: Talk to me a little bit about track management. You were talking about how you are carrying storyboards along with you. Can you just describe what your tracks are looking like?
HILLKURTZ: I’m an archivist and I carry everything. (Meaning that as she moves from one phase to another, the previous phases exist under the latest tracks in video.) Once the boards are there and I get layout in, I’ll just carry the boards on a track below, and when layout is approved I just have one track of layout, and then when animation comes in — I learned this years ago — I carry the latest two takes of animation because sometimes you want to go back to the last take. But once we get lighting in, we just carry one track of animation.
Closer to the end of the process, when we turn over to music or sound effects, we make a dummy track, so track one doesn’t really have anything on it until we add the dummy track. Then there’s a layer of boards, and then layout, and then animation, and then lighting.
HULLFISH: What’s the purpose of the dummy track?
HILLKURTZ: When you turn over a locked sequence and then you change timing, a dummy track is a quick way to see if there was a cut. If there is space, a break, that’s where you changed the timing. So you can let them know, or you know when you get something back if it’s different. For me, it is just an easy visual way to immediately see that the timing has changed. Because if the dummy track is full, then nothing has changed, but if it has splices in it or spaces, then things have changed since you turned it over last time.
HULLFISH: So what’s in that dummy track? Is it just filler?
HILLKURTZ: It’s just a video mixdown of whatever we turned over.
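Conceptually, the check Hillkurtz describes can be sketched in a few lines: treat the dummy track as a list of (start, end) segments in frames and flag any gap, internal splice, or length change relative to the turned-over mixdown. The function name and frame units here are hypothetical illustrations of the idea, not part of any real Avid API.

```python
def timing_changes(segments, turnover_len):
    """segments: sorted (start, end) pairs, in frames, describing the dummy-track
    mixdown after further editing. turnover_len: length of the original mixdown.
    Returns a list of events showing where timing has changed since turnover."""
    events = []
    pos = 0
    for start, end in segments:
        if start > pos:
            # A space in the dummy track: material was removed or slipped here.
            events.append(("gap", pos, start))
        elif pos > 0 and start == pos:
            # A splice inside the mixdown: an edit was made at this frame.
            events.append(("splice", pos, pos))
        pos = end
    if pos != turnover_len:
        # The sequence got longer or shorter overall since turnover.
        events.append(("length change", pos, turnover_len))
    return events
```

An untouched mixdown, one unbroken segment of the original length, returns no events, which matches the “if the dummy track is full, then nothing has changed” rule in the interview.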
HULLFISH: Ok that makes sense. It’s very interesting that you carry two different animation tracks — the two latest versions of animation. Let’s explain what the difference is between a lighting track and an animation track.
HILLKURTZ: When the animation shot comes in, it’s the animator’s performance of the character(s) to the voice, location, and actions, but the lighting and the color in that shot are all kind of drab. Then it actually goes through a few departments before we get it back. The shot will go through VFX, CFX (character), OCC, Lighting, etc., then finally it’s all composited. Those departments put all the textures on any cloth, costumes, or things in the background, and add the lighting. What lighting rigs would provide in live action, they add in the lighting stage. And then they composite it all together into a polished final shot. To me, it’s just magical. You see the animation and it works great, and then you get Compo back and it’s like, “Wow!” With just the animation pass, you can’t really tell if it’s day or night. You can’t tell if the room is dark except for one spotlight. We know in theory that that’s what it is going to be, but when you get it back you see that it actually works, and it’s kind of magic.
HULLFISH: The difference is that the lighting stage makes the animation look photo-realistic compared to the animation stage.
HILLKURTZ: Yeah, pretty much. The lighting (compo) stage is what goes up on screen in the end.
HULLFISH: What about your audio tracks? How are you managing audio tracks?
HILLKURTZ: I do 1 through 6 for dialogue. That will depend on the size of the cast. But having worked on Pets 2 or Madagascar 3 type films, there are a lot of characters. Then maybe 7 through 16 for sound effects. And then about eight tracks for music, because I do a lot of music editing with temp music.
Long ago I started to color code each character’s dialogue a different color. Scratch dialogue is all one color – white – because then we know if there’s scratch in there at the end when it shouldn’t be. So, say, Max is blue, and Snowball is yellow. My assistant and I just pick whatever colors feel right for the characters. That way, in the timeline, at a glance, you can see who’s talking, which I also find helpful, because you can look at it and think, “Oh wow, they’re talking a lot.” In my head, when I see the timeline with the different colors of the character’s dialogue it just makes it make more sense.
HULLFISH: Do you do that with Avid’s timeline local clip color or by setting the color in the bin and using source clip color?
HILLKURTZ: Source Clip color. Same thing with sound effects – we make those green – and I personally use blue for temp score and light blue for needle drop temp music. Then, when we get music in from the composer, it’s a different color. If something comes from the music editor, it’s a different color, so I can just see what I need to work on, what’s still temporary and what’s final.
HULLFISH: Do you have different types of audio tracks? Like, I would think dialogue would all be mono. Do you do stereo? Do you do left-center-right? Do you do 5.1?
HILLKURTZ: All the dialogue is mono. For the project, we just stick with stereo. SFX are mono, and music is dual mono. When we do internal screenings, there aren’t a lot of ways to play left-center-right, so we play stereo.
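The track layout and source-color scheme Hillkurtz describes can be sketched as plain data. This is purely an illustrative Python sketch, not anything Avid exposes programmatically; the exact track ranges, color names, and the `clip_color` helper are assumptions based on the examples she gives in the interview.

```python
# Illustrative sketch of the audio track layout and source-clip color
# scheme described in the interview. Ranges and names are examples only.

TRACK_LAYOUT = {
    "dialogue": {"tracks": list(range(1, 7)),   "format": "mono"},
    "sfx":      {"tracks": list(range(7, 17)),  "format": "mono"},
    "music":    {"tracks": list(range(17, 25)), "format": "dual mono"},
}

# Scratch dialogue stays one color (white) so leftover scratch is easy
# to spot at the end; each character gets their own color.
SOURCE_COLORS = {
    "scratch": "white",
    "Max": "blue",
    "Snowball": "yellow",
    "sfx": "green",
    "temp_score": "blue",
    "needle_drop": "light blue",
}

def clip_color(source, is_scratch=False):
    """Return the bin color a clip would get under this scheme."""
    if is_scratch:
        return SOURCE_COLORS["scratch"]
    return SOURCE_COLORS.get(source, "gray")  # default for unlisted sources
```

The point of the scheme is that the timeline itself becomes a readable document: who is talking, and what is still temp, is visible at a glance.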
HULLFISH: You mentioned how important music is to you. Let’s talk a little bit about temp music and what you used and since there was a Secret Life of Pets 1, was a lot of the temp music from that score or from that composer?
HILLKURTZ: Well I started out using a lot of temp from Pets 1, as it’s so great, but in this movie, there are a few locations other than New York City, so I started using other stuff. Alexandre Desplat did the first one, so I looked through his scores, but his scores are all so different. There are some composers where you can listen to a track and know the composer, but with Desplat, they’re all so different, so I basically just used whatever worked for the scene. We pulled a lot from Planes: Fire and Rescue, which is a fabulous score, and from things like Night at the Museum and Shaggy Dog that work well for animation. And I also temped with Avengers, and Eight Legged Freaks. There was some Maleficent in there, probably somewhere between 20-25 different scores in total. I do pay a lot of attention to music, so in a four-minute stretch I could have pieces of ten different scores in there. You wouldn’t necessarily know, because they all work together. I like to score the emotional moments so I spend a lot of energy doing that.
Because animation is such a long process, and we’re carrying this along for two or three years with a screening every couple of months, it has to play like a final movie, basically, so we have a lot of sound effects in there, filling it out: footsteps and dog collars and things that you wouldn’t necessarily put in if you were just going to hand it off soon to the sound effects team. But it’s got to play for two years so you can’t really leave that empty. For sound effects passes, I have my assistants do that a lot. Then I’ll listen to it and say, “I want a twing here instead of a twang.” So we’ll look for something else. I get really great sound effects passes from my assistants.
HULLFISH: What Avid resolution were you working in, since you were doing so many screenings?
HILLKURTZ: The project was HD 1920 x 1080. The boards were imported as DNxHD 36, and the shots (Layout, Animation, and Compo) were imported as DNxHD 175.
HULLFISH: When I’m cutting live action, I’m always pretty diligent to add room tone, if it was recorded, or to fake it, if not. It just makes the room seem alive and helps smooth out the other audio edits. Do you do anything to provide sonic atmosphere for animation — other than sound effects?
HILLKURTZ: Oh yes. That’s one of the first things that I like to put in, because, as you said, it just keeps everything alive — if you’re in the city it’s nice to hear horn honks, if you’re in the country it’s nice to hear crickets and birds. It gives you a sense of place. Even if you don’t have time to add footsteps or any of that other stuff, I would certainly at least have the benefit of that in there.
HULLFISH: Do you have a bunch of “fake Max’s Apartment room tone?”
HILLKURTZ: I did have the timeline of Pets 1 that I had access to, so I could use that and it was very helpful. And, yes, as you go, you create a library of each location’s atmos track so for the next scene you don’t have to even think about ‘finding’ it. They replace it all when the film goes to the Skywalker sound team, but I carry it until we get to that point.
HULLFISH: But, as you mentioned, you’ve got two years before you get to that point where you still want it to sound good. What are some of the things your team has to do to handle turnovers?
HILLKURTZ: This is accomplished by my assistants. When we lock a stage — when the director and I decide that we’re ready to move on — it’s actually a different process for every studio. Usually, in our process, the term is “to publish.” We “publish” to another department. So the assistants tell the department how many frames each shot is and if it’s changed from the last time we published. The very first time we publish it from storyboards, that’s when we come up with the shot numbers and how long each shot is, and then it’s forever in the pipeline as that shot number. So if we have a new shot, we give it a new number. But every time we publish it back or turn it over to the next department the assistants basically have a huge list of each shot and what has changed — or not — since the last time.
HULLFISH: And is that kept in like a codebook for a live action film? Is it some kind of cloud-based database so that everyone has access to it?
HILLKURTZ: Yes, we use Shotgun software for tracking, so everybody can see the current version and current notes. Also, the important notes are different for each department. Every department needs to see different information, so that’s all out there in the system for everybody to see.
It’s the department managers’ responsibility to update the tracking system if any shot needs to be put “on hold” for any reason. Say the dialogue will be changing, or the wardrobe is changing color, or the shot is in a scene that is being re-written, etc. Then the rest of the crew know that this is a shot that should not currently be worked on.
My associate Tom says:
As well as supporting the editor creatively and technically, the editorial department is responsible for communicating all cut changes to the other departments. In order to do this efficiently, the film is broken down into sequences of a few minutes each. A sequence can contain one longer scene or several short ones. Within each sequence, every shot has a unique number, much like a VFX workflow in a live action film.
Once the animatic of a sequence is ready, it can be published to go into production.
In the first instance, the assistant editors assign a unique shot number to each shot and send the information to our pipeline via an XML and QT movie.
With each subsequent stage of production – layout, animation, VFX, lighting, etc. – updated versions of those shots are delivered to editorial. In layout, it is common to have timing and shot changes. As the sequence progresses through the stages of production, changes become less frequent and by the time composited lighting shots are sent, the shot timings are fairly stable.
When a sequence moves to the next stage of production, and whenever the editor makes changes to a sequence in production (dial changes or timing changes, etc.), it is the assistant editors’ responsibility to communicate the changes so everyone working on those shots is kept up to date. As with the first publish, this is done using an XML and a Quicktime to update our in-house database, which tracks the status of every shot in the movie and which everyone in the production pipeline can reference to be sure they are up to date and to answer any questions. Also, you cannot overstate the benefits of a face-to-face conversation with the relevant heads-of-department if the changes are subtle or complex!
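The publish workflow Tom describes — every shot keeps one number for life, and each publish is accompanied by a list of what changed — is essentially a diff between two frame-count manifests. Here is a minimal Python sketch of that idea; the shot-ID naming, the frame counts, and the `publish_report` function are all invented for illustration and are not the studio's actual pipeline code.

```python
# Hypothetical sketch of the per-publish change list described above:
# compare shot durations between the previous and current publish and
# report what is new, omitted, retimed, or unchanged.

def publish_report(previous, current):
    """Map each shot ID to its status relative to the last publish."""
    report = {}
    for shot_id, frames in current.items():
        if shot_id not in previous:
            report[shot_id] = "NEW"
        elif previous[shot_id] != frames:
            report[shot_id] = f"RETIMED ({previous[shot_id]} -> {frames} frames)"
        else:
            report[shot_id] = "UNCHANGED"
    for shot_id in previous:
        if shot_id not in current:
            report[shot_id] = "OMITTED"
    return report

# Example manifests: shot IDs and frame counts are made up.
last_publish = {"SQ0100_SH010": 96, "SQ0100_SH020": 48, "SQ0100_SH030": 120}
this_publish = {"SQ0100_SH010": 96, "SQ0100_SH020": 60, "SQ0100_SH040": 72}

for shot, status in sorted(publish_report(last_publish, this_publish).items()):
    print(shot, status)
```

In the real pipeline this information travels as an XML plus a QuickTime reference movie into the tracking database, but the underlying logic is this kind of shot-by-shot comparison.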
HULLFISH: Care to share any misperceptions that you think live action editors have about animation editors?
HILLKURTZ: I’ve been amazed at the people who ask me about animation editing and say, “So, all of the shots are there. Do you just have to put them in order?” People don’t think through the process. Years ago, I went to a Q and A with the editors of Where the Wild Things Are and they were describing the steps where they were figuring out the process of working through the effects in the movie and I asked them, “Did you ever think about talking to an animation editor? Because this is basically what we’ve done for years.” And they said that they hadn’t because it never dawned on them that that’s really what it was because they thought of it as mostly visual effects as opposed to animation. They didn’t think about doing an animation pipeline. People just don’t think it through. Maybe because it’s like two different languages, or it’s maybe like the difference between English and American: You think it’s the same, but there are really so many different words and so many different nuances that it can be even more confusing. But I was in live action production for 10 years before I started editing (and in my childhood), so I think I’m bilingual in that way. It’s such a different beast altogether.
HULLFISH: I still laugh at all the people who think that the animation outtakes are real at the end of a PIXAR movie. There are NO outtakes in animation!
HILLKURTZ: Right!? Everything is created.
HULLFISH: I also had a friend who was a live-action director and for some reason, someone gave him a job as an animation director, and he was asking me all these questions about animation and one of the things he couldn’t wrap his head around was that there is no such thing as “coverage.” You don’t have multiple angles of a shot in animation. You have the ONE angle that you choose in storyboards and layout.
HILLKURTZ: Yeah. You don’t have six cameras rolling at the same time. I’ve also talked to live action editors who come into animation and they’re kind of shocked that you don’t have handles.
HULLFISH: Do you have any handles at all on a shot? Like with VFX, you commonly get 8 frames.
HILLKURTZ: We usually have zero handles. Whatever number of frames that we send through boards is the exact number of frames we get back from layout. I worked on Astro Boy and that was different. There was a big battle scene and the layout was much more primitive, but the head of layout did shoot it with four different cameras from different angles, and I had four layers of the whole scene, and I could cut it more like a live action, and that was actually pretty awesome. I haven’t had that since then, though. Animation shows generally don’t do that.
HULLFISH: As long as the render didn’t take too long, it should be easy, right? It’s the same animation, just different “virtual” camera placements. We chatted a bit about sound effects — did you just pull stuff from the previous movie, or do you have a library?
HILLKURTZ: Yes. I’ve just been collecting a library since day one as an editor, and every studio has other libraries.
HULLFISH: Talk to me a little bit about working with an animation director.
HILLKURTZ: It’s three years of togetherness in the editing room, so it’s usually a pretty close relationship. It does depend on the people and personalities, as does any movie cutting room. I’ve worked with directors that just absolutely need to hear every take of every dialogue – they just want to control more. And I’ve worked with directors who basically let me do a pass and watch the scene, and then they ask about just one line, “Do we have a more emphatic take on this line?” We’ll go through and listen to some and maybe replace it or maybe not. It’s very collaborative.
I also like for editorial to be a place where the director can just come and “be.” They don’t have to be “on” all the time: just hang out and talk about Star Wars for a few minutes, or whatever. I think that’s important, too. You have to build that relationship so they trust you, and your work. I also like to really learn the director’s language. My first editing job was in TV at Sony, and so every week or every few days I’d be working with a different director on a different episode. Each one of those directors had their own language. Like, “Nudge that a smoodge” for one director meant six frames but for another director it meant two frames. You really had to tune in to that particular director. I learned very early on to learn the language of the director and what they were really saying and what they’re feeling and thinking. Ultimately it’s my job to get their vision on the screen right? There are times when I might disagree, but my rule is that if I bring something up three times and it doesn’t pass, then I’ll let it go.
HULLFISH: The other difference between collaborating with a live-action director and an animation director is that with live action, the director is completely involved during production with shooting and the editor is left largely alone for the initial edit because the director has more important things to do. Then later, the director is kind of all yours after the assembly. But in animation, it’s almost the reverse. You’re building the story with them side-by-side from the beginning, then as they get more involved with layout and animation, they have other things they have to attend to.
HILLKURTZ: That’s very true. And sometimes you have to fight for director time because they’re off in other departments doing other things, but it’s also interesting because they have a global vision of what’s in process and what can be done or not within the parameters of the production. But sometimes the director can be distracted by other things and other departments so they need the editor to have that global vision for them. There might be things that they approve in another department and then you get footage into editorial and you say, “I don’t think this is really working,” but the director couldn’t see that in the other department because it was more specifically about a particular shot or particular scene. Whereas when we have it in editorial, in the grand scheme of things, we need to make sure it all works together for the whole film and not just in individual meetings or reviews or approvals.
HULLFISH: Talk to me about the screenings that you mentioned — every few months for years. I’m assuming that you wanted to be at all those screenings. How much were you learning from your personal interpretation or feeling as the audience watched it, and how much of it was response cards and screening notes?
HILLKURTZ: The every-few-months screenings are just for internal crew and execs, and they are very informative. Then, towards the end of the project, we will have a couple of public ‘preview’ screenings. It was really interesting at the first preview to see it with “real” people. You could ‘feel’ the room and could feel if something was falling flat, or working way better than you thought, or the feel of the structure of the three storylines. Then we would all go talk about it and discuss what we thought worked and what didn’t and how much should change or not.
HULLFISH: So you were in Paris for this edit, but there is also an LA office?
HILLKURTZ: Yes, there’s an LA office, and I have an assistant editor there which is nice because I can time out the storyboards and then go home for the night while he can work on — for example — a sound effects pass. They do a lot of the scratch dialogue recording in LA for the American voices. And he would also attend all the voice records in LA which is very helpful. Also, the writer is there in LA, and the producer is there, and quite a few production people, etc.
HULLFISH: It’s interesting that you’ve got an assistant in L.A. in addition to your Paris assistants. That’s pretty cool.
HILLKURTZ: I had an amazing team on this show, and everyone worked so well together. Christophe Ducruet, my 1st assist in Paris, Gian De Feo, my 2nd assist in Paris, and Sam Craven in LA. It’s really useful because Sam can hand stuff off to the LA crew or marketing, and when we have reviews, he runs it on the LA side. The Paris and LA assistants can split up the work when there is production dialogue, so quite a bit will be ready to go when we arrive in the Paris office after a record or such. It’s very helpful to have him there to liaise with whatever the L.A. office might need from us. He would deal with most of the turnovers as well. To the composer, sound, marketing, etc.
HULLFISH: Tiffany, thank you so much for a really informative interview. Wonderful talking to you.
The first 50 interviews in the series provided the material for the book, “Art of the Cut: Conversations with Film and TV Editors.” This is a unique book that breaks down interviews with many of the world’s best editors and organizes it into a virtual roundtable discussion centering on the topics editors care about. It is a powerful tool for experienced and aspiring editors alike. Cinemontage and CinemaEditor magazine both gave it rave reviews. No other book provides the breadth of opinion and experience. Combined, the editors featured in the book have edited for over 1,000 years on many of the most iconic, critically acclaimed and biggest box office hits in the history of cinema.