5 Screenwriting Tips You Can Learn From ‘Mindhunter’

Mindhunter brought serial killers into our homes. Joe Penhall and David Fincher scared the pants off us in Season One. How?

Serial killers seem to be all the rage on podcasts, in movies, on TV, and on streamers. But few shows capture what it’s like to understand these people, to know why they attack, and thus learn how to catch them. Mindhunter is a singular achievement for Netflix. It successfully dramatizes the FBI’s work while still maintaining the horrific edge that brings people in.

I mean, a naked guy blows his head clean off within the first minutes of the pilot. And that’s not even the most gruesome part of the series. Since the show is grounded in reality, a lot of work goes into creating each season. Now, as Season Two premieres, I wanted to look at this video featuring David Fincher, Jennifer Haley, Cameron Britton, and the original “Mindhunter,” John Douglas, on how they worked with show creator Joe Penhall to make such an incredible show.

Check out the video from Behind the Curtain and let’s talk more after the jump!

Read More

Is Your Faster Lens Really Worth Paying Lots of Money For?

Wider apertures aren’t really the selling point of a more expensive lens, but we know what is.

All too often, our Gear Acquisition Syndrome is fueled by specs we read online. Take faster lenses, for instance. Lenses that can squeeze out an extra stop for better low-light performance and delicious bokeh may be tempting, but is getting that extra stop of light worth paying up to a thousand dollars or more? Well, in the case of lenses, it isn’t always about the specs. It’s about what’s under the proverbial hood.

Lenses aren’t just sold because of how fast they are. You’re paying for the quality of the glass. After all, you get what you pay for. And a good lens is going to last you, and that’s what you’re paying for. – Daniel Norton

Read More

Here’s What Canon’s DualPixel AF Can Do For You

Not only can dividing pixels boost autofocus sensitivity, it can also increase dynamic range.

When it comes to technology development, I love happy accidents. The features that appear, not because they were designed in, but because they just so happened to be a bonus. Apollo Flight Director Gene Kranz would say he didn’t care what something was designed to do, but what it CAN do. And thanks to another patent application this week, we’ve come to learn a bonus to Canon’s Dual Pixel AutoFocus … and that’s the potential for increased dynamic range.

Read More

The Best Filmmaking Deals of the Week (8.16.19)

Highlighting our Deals of the Week, save hundreds on a 15″ Dell XPS laptop that will serve all of your mobile editing needs.

This week in filmmaking deals: Dell offers its 15″ XPS 9570 laptop for $400 less than retail, and SanDisk offers its 256GB Ultra microSDXC memory card for its lowest price yet. Also, the CLAR Illumi bi-color LED gets a $300 price cut and the popular Nikon 35mm f/1.8G AF-S DX Nikkor lens is $45 off.

Read More

Virtual Production: It’s the future you need to know about

In an age of dizzying technological churn, it’s hard to know which emerging fringe tech is worth paying attention to. Not so with virtual production: anyone who’s experienced it can instantly see where things are headed. So what exactly is it, and when will you—the indie filmmaker—be able to get your hands on it?

 

Bringing the virtual into the physical

Most people—if they have an awareness of virtual production at all—think of it as some kind of previsualization system, or picture Peter Jackson on a greenscreen set looking at CG trolls through a VR headset. While all of this is true, and part of the history of virtual production techniques that has brought us to today, the virtual production of tomorrow will reach far deeper into the physical realm.

Lux Machina, Profile Studios, Magnopus, Quixel, and ARRI, in collaboration with Epic Games, have been showing a prototype virtual production set based on real-time projection of background environments via massive LED panels. At its most basic level, this is an updated version of the old rear-projection systems used for shooting in-car scenes, where moving footage of the background was placed behind the actors. That is, however, a gross oversimplification: the system is superior to its historical predecessor in every way.

The benefit of using the LED panels is that they actually provide a physical lighting source for the live-action actors and props.

An example of a game engine skybox unwrapped.

Essentially, we’re dealing with the physical equivalent of what game developers call a “skybox,” a cube containing the HDR lighting environment intended to illuminate the game set. Full-scale LED panels project the background image from stage left, stage right, the main rear projection, and the ceiling. Presumably, foreground lighting could be provided by either more traditional stage lighting sources or perhaps a matrix of LED PAR cans that can simulate the portion of the HDR coming from the front of the scene. (An alternative option is a curved background screen configuration.)
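
If the skybox idea is new to you, a minimal sketch may help: given a view direction, a renderer picks the dominant axis to choose one of the six cube faces and converts the other two components into texture coordinates on that face. This is illustrative Python using the common cubemap convention, not Unreal Engine’s actual mapping.

```python
# Minimal illustration of skybox sampling: map a 3D view direction to a
# cube face and (u, v) coordinates on that face. Not engine code, just
# the standard cubemap convention in rough form.

def sample_skybox(direction):
    x, y, z = direction
    ax, ay, az = abs(x), abs(y), abs(z)

    # The dominant axis decides which of the six faces the ray hits.
    if ax >= ay and ax >= az:
        face = "+X" if x > 0 else "-X"
        u, v, m = (-z if x > 0 else z), -y, ax
    elif ay >= ax and ay >= az:
        face = "+Y" if y > 0 else "-Y"
        u, v, m = x, (z if y > 0 else -z), ay
    else:
        face = "+Z" if z > 0 else "-Z"
        u, v, m = (x if z > 0 else -x), -y, az

    # Remap from [-1, 1] to [0, 1] texture space.
    return face, 0.5 * (u / m + 1.0), 0.5 * (v / m + 1.0)

print(sample_skybox((0.2, 0.1, -1.0)))  # looking roughly toward the -Z face
```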

What’s even more significant is the ability for the background to update its perspective relative to the camera position. The camera rig is tracked and its ground-truth position is fed into Unreal Engine, which updates the resulting projections to the LED panels. The specific region of the background directly within the camera’s frustum is rendered at a higher resolution, which looks a little strange on set, but obviously makes sense from a processing efficiency standpoint (the entire setup requires three separate workstations running Unreal Engine: one for the two side stage panels, one for the background panel, and the third for the ceiling).
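
To make the “inner frustum” idea concrete, here is a toy plan-view sketch: given a tracked camera position, yaw, and horizontal field of view, it computes which span of a flat rear wall the camera actually sees, which is the only region that would need the high-resolution render. The geometry is real, but the wall layout and numbers are invented for illustration; none of this is Unreal Engine or production code.

```python
import math

def frustum_span_on_wall(cam_pos, yaw_deg, hfov_deg, wall_y, wall_x0, wall_x1):
    """Return the (x_min, x_max) span of a rear wall at y = wall_y that falls
    inside the camera's horizontal frustum, or None if the wall isn't seen.
    Plan view only: yaw 0 means the camera faces straight at the wall (+y)."""
    cx, cy = cam_pos
    hits = []
    for edge in (-0.5, 0.5):
        ang = math.radians(yaw_deg + edge * hfov_deg)
        dx, dy = math.sin(ang), math.cos(ang)   # unit direction of a frustum edge
        if dy <= 1e-9:                          # this edge ray never reaches the wall
            continue
        t = (wall_y - cy) / dy
        hits.append(cx + t * dx)
    if len(hits) < 2:
        return None                             # toy code: ignore partial cases
    lo, hi = min(hits), max(hits)
    lo, hi = max(lo, wall_x0), min(hi, wall_x1)  # clamp to the physical panel
    return (lo, hi) if lo < hi else None

# Camera 4 m from a 10 m wide wall, 40 degree horizontal FOV, panned 10 degrees right:
print(frustum_span_on_wall(cam_pos=(0.0, 0.0), yaw_deg=10, hfov_deg=40,
                           wall_y=4.0, wall_x0=-5.0, wall_x1=5.0))
```

In a real system this calculation runs every frame from the live tracking data, and the renderer supersamples only inside that span while the rest of the wall is rendered at lower resolution purely for lighting.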

The perspective shift means that backdrops feel much more natural with sweeping jib and crane moves. The real-time nature of the projection also means there’s the potential for secondary movement, like a breeze blowing through trees, or last-minute directorial changes, like literally moving a mountain to improve the framing of the shot.

Lighting can also be changed in an instant. Switching from sunset to sunrise to midday is as simple as dialing up a different sky. Changing the direction the sun is coming from is as simple as rotating the sky image. Try doing that in the real world.
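
Because sky environments are typically stored as equirectangular (latitude/longitude) images, rotating the sun around the set really is, at its simplest, a horizontal pixel shift of that image. A minimal NumPy sketch, assuming a hypothetical HDR loaded as a (height, width, 3) array:

```python
import numpy as np

def rotate_sky(hdri, degrees):
    """Rotate an equirectangular sky image about the vertical axis.
    A rotation of N degrees is just a horizontal roll of the pixel columns."""
    height, width, _ = hdri.shape
    shift = int(round(width * degrees / 360.0))
    return np.roll(hdri, shift, axis=1)

# Hypothetical example: swing the sun 90 degrees around the stage.
sky = np.random.rand(1024, 2048, 3).astype(np.float32)  # stand-in for a real HDR
evening_sky = rotate_sky(sky, 90)
print(evening_sky.shape)
```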

The beginning of the end for exotic travel?

One of the most exciting aspects of this virtual production stage is the potential to take a script with globetrotting locations and bring the budget from the nine-figure realm down to an indie level. Imagine being able to shoot in Paris, New York, the Everglades, and the deserts of the Middle East from the same soundstage, all in the course of a three-week production schedule. We’re not there yet, but it seems likely that over the next three to seven years the GPU compute power, the sophistication of the software, and the display tech will get us there. Crowd simulation software could even provide digital extras for the mid- and far-ground.

And then there are the exotic sci-fi and fantasy scenes. Such sets are extremely complex to build, and greenscreen composites are often unsatisfying. The CG often feels a little too ethereal. With the CG set illuminating actors and foreground props on the virtual production stage, there’s an immediate improvement in the way the real and computer-generated scene elements mesh.

No doubt Tom Cruise will still be filmed hanging from real skyscrapers in Dubai, but for indie features and episodics hungry for a “bigger” look, the virtual production soundstage will almost certainly kill the exotic production budget. Film crew members will need to find themselves a cosy travel magazine show to work on if they want any jet-setting perks.

Limitations

I’ve been careful to talk about this as the future; there are a couple of limitations that hold it back from being 2019’s solution to location shooting. Part of this is simply the nascent state of the tech; everything is still very much in the prototype stage. This is not to say that it can’t be used for production today, just that there are certain compromises that would need to be accepted.

I was extremely impressed with David Morin, who is spearheading the virtual production initiatives at Epic, and his cautious approach to promoting the venture. He’s clearly mindful of the way overhype has caused the bubble to burst too early on many VR and AR technologies (along with the entire stereoscopic industry) and is thus careful not to reach for hyperbole when talking about the current state of the art.

To me, the biggest present limitation seems to be the display technology. Don’t get me wrong: the panels look pretty. But they simply lack the dynamic range to replace the 14-16 stops a modern cinema camera can capture in the real world. That range is probably adequate for most television and OTT content, but with the industry pushing hard into HDR standards, a reduced contrast range will have limited appeal.

A possible compromise is the system’s ability to instantly convert the background to a greenscreen. This allows directors to visualize the scenery on-set, while still performing the final composite as a post process with floating point precision. Of course, at that point you lose some of the benefit of the interactive lighting provided by the panels. The scene can still be lit from above and the sides, but much of the background illumination will obviously be replaced with greenscreen illumination. Still, a significant improvement on a traditional greenscreen.

(I can’t help feeling that by pulsing the refresh rate of the LED panels the engineers could extract some kind of segmentation map, which would allow a higher dynamic range background to be composited in post, while still using the LED panels to illuminate the set…)

The solution to this problem mainly depends on future advances in the display hardware. Another limitation that’s more intrinsic to the system is the directionality of the light sources. This is an issue that also affects HDR lighting in CG scenes. The panels can simulate the general direction from which a light source is coming, but they can’t angle that light to illuminate only a specific object or set of objects in the scene. You couldn’t, for example, simulate a light raking across one actor’s face without it also affecting another actor at the same time.

This is the kind of granular control DPs and gaffers expect on a shoot. That’s not to say traditional lighting hardware can’t augment the virtual production lighting, of course. Additionally, the virtual production stage lends itself to outdoor scenes. Exteriors favor the sort of ambient lighting the virtual production system excels at, so directional lighting control may not be as important for these shots as it would be for interiors. This isn’t so different from the way many outdoor scenes are shot now, with bounce cards and reflectors being used to control lighting rather than high-powered practical lighting.

Baking for speed

Obviously, an essential component in the adoption of virtual production is the realism of the computer-generated scenery. There are three main ways to produce realistic background scenes in CG: baking, Unreal Engine’s dynamic lighting system, and real-time raytracing.

In baking, lighting and shadow information is rendered at offline speeds, then applied as a texture map to the scene geometry. Think of it like shrink-wrapping artwork onto a car, only in this case the artwork you’re shrink-wrapping includes shadows and shading detail. And instead of wrapping a car, you’re potentially shrink-wrapping an entire landscape.

The benefit of this method is that you can take as long as you like to render the imagery, because the render doesn’t need to happen in real-time. The real-time renderer then has very little heavy lifting to do, since most of the lighting is already calculated and it’s simply wallpapering the scene geometry with the pre-rendered images.
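
Here is a toy NumPy sketch of the idea, with the offline “bake” reduced to a single Lambert term for a flat floor patch and the “runtime” reduced to an array lookup. Everything here (the light position, the grid size) is invented for illustration; it is not how any particular engine stores lightmaps.

```python
import numpy as np

# --- Offline bake: compute diffuse lighting per texel of a 1x1 m floor patch ---
RES = 64
light_pos = np.array([0.5, 0.5, 1.0])                # hypothetical point light
xs, ys = np.meshgrid(np.linspace(0, 1, RES), np.linspace(0, 1, RES))
texels = np.stack([xs, ys, np.zeros_like(xs)], axis=-1)

to_light = light_pos - texels
dist = np.linalg.norm(to_light, axis=-1)
cos_theta = to_light[..., 2] / dist                   # floor normal is +Z
lightmap = np.clip(cos_theta / dist**2, 0.0, None)    # Lambert term with falloff

# --- Runtime: no lighting math at all, just sample the baked texture ---
def shade(u, v):
    return lightmap[int(v * (RES - 1)), int(u * (RES - 1))]

print(shade(0.5, 0.5), shade(0.05, 0.05))   # bright under the light, dimmer at the corner
```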

The downside of this method is that it locks down your set. As soon as you move a background set piece, the lighting and shadows need to be recalculated. It’s also unable to account for specular highlights and reflections. These change with the camera’s POV at any given time, and so need to be calculated in real-time rather than baked into the scene geometry’s textures.

Baking, then, can work for situations where the background set design is locked down and you’re not dealing with highly reflective objects, like metallics.

The next alternative, Unreal Engine’s Dynamic Lighting system, uses the traditional “cheats” animators have been using for years to simulate the results of bounced light without actually tracing individual rays. Using shadow maps and techniques like Screen Space Ambient Occlusion (SSAO), real-time changes to lighting can look quite realistic, again depending on the subject matter. The more subtle the shading and scattering properties of background scene surfaces, the harder the resulting real-time render is to sell to an audience.
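
As a rough illustration of the kind of screen-space “cheat” involved, here is a crude, depth-only ambient occlusion pass in NumPy: for each pixel, it samples a few nearby pixels and counts how many are meaningfully closer to the camera, darkening the pixel accordingly. Real SSAO also uses surface normals and proper view-space reconstruction; this is only a sketch of the principle, on a synthetic depth buffer.

```python
import numpy as np

def crude_ssao(depth, radius=4, samples=16, bias=0.02, rng=np.random.default_rng(0)):
    """Depth-only ambient occlusion: neighbours that are closer to the camera
    than the centre pixel (beyond a small bias) count as occluders."""
    occlusion = np.zeros_like(depth)
    offsets = rng.integers(-radius, radius + 1, size=(samples, 2))
    for dy, dx in offsets:
        neighbour = np.roll(np.roll(depth, dy, axis=0), dx, axis=1)
        occlusion += (depth - neighbour > bias).astype(depth.dtype)
    return 1.0 - occlusion / samples          # 1 = fully open, 0 = fully occluded

# Synthetic depth buffer: a near box (depth 0.3) in front of a far wall (depth 1.0).
depth = np.full((64, 64), 1.0)
depth[20:44, 20:44] = 0.3
ao = crude_ssao(depth)
print(ao.min(), ao.max())   # darkest values appear on the wall along the box's silhouette
```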

What we really need is real-time raytracing.

RTX in its infancy

Raytracing is—as the name suggests—the process of tracing virtual rays of light as they bounce around the scene. Due to the way microfaceted surfaces scatter light rays, it requires a crazy amount of calculation work by the computer. Last year, Nvidia announced a new generation of graphics cards with silicon dedicated to real-time raytracing, dubbed RTX.
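
To give a sense of where the computation goes, here is a deliberately tiny ray tracer in NumPy: one sphere, one directional light, primary rays only, Lambert shading, no bounces. The cost explodes once you add the secondary rays, soft shadows, and glossy reflections a real scene needs, which is exactly the work RTX hardware is built to accelerate. This is a sketch of the basic math, nothing more.

```python
import numpy as np

WIDTH, HEIGHT = 160, 120
sphere_c, sphere_r = np.array([0.0, 0.0, -3.0]), 1.0
light_dir = np.array([1.0, 1.0, 0.5]) / np.linalg.norm([1.0, 1.0, 0.5])

# One primary ray per pixel, shot from the origin through a simple image plane.
xs = np.linspace(-1, 1, WIDTH) * (WIDTH / HEIGHT)
ys = np.linspace(1, -1, HEIGHT)
px, py = np.meshgrid(xs, ys)
dirs = np.stack([px, py, -np.ones_like(px)], axis=-1)
dirs /= np.linalg.norm(dirs, axis=-1, keepdims=True)

# Ray-sphere intersection (rays start at the origin): solve |t*d - c|^2 = r^2.
b = -2.0 * (dirs @ sphere_c)
c = sphere_c @ sphere_c - sphere_r**2
disc = b**2 - 4.0 * c
hit = disc > 0
t = (-b - np.sqrt(np.where(hit, disc, 0.0))) / 2.0

# Lambert shading at the hit point.
points = t[..., None] * dirs
normals = (points - sphere_c) / sphere_r
shade = np.clip(normals @ light_dir, 0.0, 1.0)

image = np.where(hit, shade, 0.1)            # dim grey background where rays miss
print(image.shape, image.max())
```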

Right now, artists and developers are still coming to terms with how to best use the new hardware. I have no doubt that we’ll see some very clever ways of squeezing more realism out of the existing cards, even as we anticipate enhancements in future generations of the hardware.

All that to say: with today’s tech, there are certain virtual environments that lend themselves to in-camera real-time production, while others would not pass the same realism test. As RTX evolves, we’ll see just about any environment become viable as a virtual production set.

Too expensive for the independent filmmaker? Perhaps not.

Before you tune out and decide that this is all out of your price range, consider that a day rental for a virtual production studio may not end up being significantly more than a traditional soundstage. Obviously the market will set the price, but there’s no reason to assume that the cost of renting a virtual production stage in the near future will be stratospheric.

Even the expense of creating the environment to project can be quite reasonable. Companies like Quixel allow access to a massive library of scanned objects and backgrounds for just a few hundred dollars a year. Putting together a set can be as simple as designing a custom video game level.

And if you don’t want to create your own set? Stick on a VR headset and do some virtual scouting of predesigned environments, then simply pick the one you want to license for your shoot.

Even more affordable “in-viewfinder” solutions will soon be available for everyday use. These systems are more akin to the early virtual production systems used by Peter Jackson and James Cameron, but they will allow filmmakers to see a representation of the final composited shot through a monitor or viewfinder as they shoot.

Virtual production in your hands today, thanks to Epic’s free Virtual Camera Plugin

Is there any way to get your hands on virtual camera technology today? For free? Well, if you’re interested in virtual production for previsualization, the answer is yes. Last year Epic released a Virtual Camera Plugin for their Unreal Engine editor. You can download both the editor and plugin for free from their website.

With the plugin and an iPad (and a computer running Unreal Engine) you can physically dolly and rotate a virtual camera to experiment with framings. This is a great way to previs upcoming shoots.

Check out moviola.com’s free series on Previs using Unreal Engine to see how you could quickly build an entire replica of your shooting location, and then use the Virtual Camera Plugin to block out your shoot.

For more information on the Virtual Camera Plugin, check out this guide by Epic.

Virtual production is the future

Virtual sets have already become a major part of filmmaking. And not just for your Game of Thrones-style fantasies; plenty of primetime dramas rely on CG backdrops to enlarge environments far beyond what their budgets would permit them to shoot in camera.

Bringing the virtual elements onto the live-action set is the next logical step. Ultimately, it’s poised to democratize the use of computer-generated environments the way digital cameras and affordable NLEs have democratized the rest of the filmmaking process. If nothing else, this is definitely a space to watch.

The post Virtual Production: It’s the future you need to know about appeared first on ProVideo Coalition.

What Happens When Four Photographers Shoot the Same Model?


The neat thing about photography is how different photographers can have vastly different creative ideas and interpretations of the same subject. This awesome video follows four photographers as they shoot the same model; see what they came up with!

[ Read More ]

How ‘Easy Rider’ Changed Everything About Indie Film

The 1969 counterculture film “Easy Rider” shook the studio system to its core and helped usher in one of Hollywood’s most creative eras.

The film is a revolutionary road movie, starring Peter Fonda as Wyatt and Dennis Hopper as Billy, plus a star-making turn from then-newcomer Jack Nicholson as ACLU lawyer George Hanson.

Wyatt and Billy are drug-dealing hippies traveling from Los Angeles to the American South, including New Orleans, during the height of the tumultuous Vietnam War era. They meet other hippies, do drugs, pick up hitchhikers, and get into mischief before meeting their violent ends.

Criterion has called it “the definitive counterculture blockbuster.” The film was a surprise cult hit and led to a seismic shift in many areas of the filmmaking process.

How Easy Rider Pioneered Indie Filmmaking

Fonda and Hopper partnered on the film, co-writing the script, at a time when the movies making money were generally happy, big-budget, shiny projects.

Read More

Real-Time In-Camera VFX Could Be the Green Screen Future

During the Unreal Engine User Group at SIGGRAPH 2019, the team at Unreal Engine showed an impressive real-time in-camera VFX technology that could change how films are made. While the term is often overused, this technology could be a genuine game-changer and kill the green/blue screen in the near future. Let’s take a closer look.

Unreal Engine

Unreal Engine is a game engine developed by Epic Games that has been around since 1998. You can use it to build video games like Fortnite, but you can also use Unreal Engine’s real-time technology for architectural visualization, VR and AR, and even cinematography.

The Unreal Engine is an ecosystem of tools and applications. The entire source code is available for free, so anyone can modify it and extend the various engine features. If you are not a “code guy,” don’t worry. It’s easy and intuitive to create whatever you want in the Unreal Engine without having to write a single line of code.

The photorealistic graphics and videos you can pull out of the Unreal Engine are stunning.


Image credit: Unreal Engine.

Real-Time In-Camera VFX

The team at Unreal Engine has revealed an exciting collaborative project with Lux Machina, Magnopus, Profile Studios, Quixel, ARRI, and Matt Workman. I like to see people like Matt Workman involved in this kind of project. In short, Matt is a cinematographer and the founder of Cinematography Database. He is also the creator of Cine Designer, one of the most potent pre-production and lighting tools for DPs available on the market.


Matt Workman during the Unreal Engine User Group at SIGGRAPH 2019. Image credit: Unreal Engine.

During their Unreal Engine User Group event at SIGGRAPH 2019, the whole team experimented around real-time in-camera VFX. Here is the setup:

  1. A studio with four giant LED panels (three walls and a ceiling) to act as a background.
  2. All four LED panels are powered by nDisplay to render the Unreal Engine project simultaneously on all four displays.
  3. Various engine-controlled ARRI SkyPanels all over the studio. That way, the team can match the lighting on the real subject to the lighting of the background in real time (see the sketch after this list for the rough idea).
  4. A couple of sensors on the camera that are linked to the Unreal Engine. That way, the Unreal Engine 3D background moves according to the camera’s position (and focal length) in the space.
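
Purely as a hedged sketch of what step 3 might involve conceptually (this is not the actual pipeline; the function, frame region, and scaling below are all invented for illustration): sample the rendered background around the subject, average it, and derive a color and intensity that could be pushed to a fixture.

```python
import numpy as np

def ambient_light_from_background(frame, region, max_intensity=1.0):
    """Average a region of the rendered background frame (H, W, 3, linear RGB)
    and return a normalised color plus an intensity to drive a stage fixture.
    Purely illustrative; a real system would handle color science properly."""
    y0, y1, x0, x1 = region
    patch = frame[y0:y1, x0:x1].reshape(-1, 3).mean(axis=0)
    intensity = float(patch.max())
    color = patch / intensity if intensity > 0 else patch
    return color, min(intensity, max_intensity)

# Hypothetical 1080p background frame; sample the area directly behind the actor.
frame = np.random.rand(1080, 1920, 3).astype(np.float32)
color, intensity = ambient_light_from_background(frame, region=(300, 800, 700, 1200))
print(color, intensity)
```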

The results in the video at the beginning of this article are impressive. The actor and the props are perfectly blended into the environment/background. All of that happens in real-time in-camera without the need for green screens.


Image credit: Unreal Engine.

The End of Green Screen Shooting?

The beauty of this system is that everything in the background is CG. If you don’t like something in the scene, or something is not in the right place for a specific camera angle, just move it out of the way. In Unreal Engine’s example, you can see how easily they pick up a rock and move it in real time.

However, you could also do it in post-production if you shoot your film on green/blue screen, right? It’s true, but it means you would have to modify your 3D scene and render it again.


Image credit: Unreal Engine.

The main advantage of this technology is that everything is happening in real-time in-camera. It can save you a tremendous amount of time in post-production. However, the major drawback of this technology is that once the shooting is over, you can’t change your mind on what’s happening in the background.

At this “prototype” stage, I think this Unreal Engine technology could be helpful for commercials or short-form projects. Of course, it’s not ready for multi-million-dollar feature films yet. But the VFX future looks exciting for sure.

What do you think of this Unreal Engine real-time in-camera VFX technology? Do you think it could replace green/blue screen shooting in the coming years? Let us know in the comments below!

The post Real-Time In-Camera VFX Could Be the Green Screen Future appeared first on cinema5D.

Pinnacle Studio 23 Update For Prosumer Editors

Corel Corporation has introduced an update to its popular video editing software: Pinnacle Studio 23. This new version features loads of new tools typically found in much more advanced NLEs, such as multicam editing, video masking, 360 video editing, color grading keyframing, and so on. Let’s take a closer look.

The Easy-To-Use Video Editor

If you’re not familiar with Pinnacle Studio, it’s a Windows-only video editing application initially designed for beginners and prosumers. The idea behind Pinnacle Studio is that everybody should be able to do simple video edits.

Previous Pinnacle Studio versions are easy to use and cover the fundamental video editing tools: import, cut, tweak your footage a bit, and hit export. The software is straightforward and doesn’t feature a ton of sophisticated tools made for professional video editors.

In the end, if you want to edit a family holiday video, you want to get the job done as fast as possible; you have other things to do in your life. Most of my friends who need a short edit for a birthday video don’t want to mess with tools and features they don’t need, want, or understand. They want a video that looks great, they want to get it done quickly, and they want to be able to edit it in their spare time.

To do so, you need an affordable NLE that features intuitive tools: this is what Pinnacle Studio is all about.


Pinnacle Studio 23

The new Pinnacle Studio 23 update comes in three different versions: Studio, Studio Plus, and Ultimate. Each tier adds more tools than the last, and Pinnacle Studio 23 Ultimate is the flagship video editor.

Pinnacle Studio 23 Ultimate features a lot of new powerful editing tools like:

  • Over 2,000 effects, titles, and templates included, among them effects from NewBlueFX.
  • Three- and Four-Point Editing for more flexibility and precision over your edit.
  • Compatible with HD, 4K, and 360 video footage.
  • You can now have an unlimited number of video/audio tracks.
  • It now has a video masking tool. You can easily enhance or remove elements in your video, blur faces, clone subjects, selectively apply effects to any portion of a clip, layer footage with text or shapes, and so on.
  • Clip Nesting allows you to group multiple clips. Also, you can use the new Multicam tool.
  • You can export the Alpha Channel only.
  • They improved the color grading panel. It’s possible to apply LUTs to your footage and even keyframe your color adjustments. With the “selective vectorscope,” you can also key a specific area in your video, to adjust skin tones only, for example.
  • And finally, it supports more formats and resolutions.


For a complete list of all the Pinnacle Studio 23 features, you can visit Pinnacle’s website.

Pricing and Availability

Pinnacle Studio 23 is available right now at Pinnacle’s website.

One of the significant advantages of Pinnacle Studio 23 is that it’s relatively inexpensive: the base Pinnacle Studio 23 retails for €59.95, Studio Plus is €99.95, and Studio Ultimate is €129.95. There’s no subscription plan; you pay once, and it’s yours.

Have you ever used Pinnacle Studio to edit a video? What do you think of this new Pinnacle Studio 23 upgrade? Let us know in the comment section!

The post Pinnacle Studio 23 Update For Prosumer Editors appeared first on cinema5D.

Learn How to Create This Awesome Day and Night Shot in Camera


It is pretty amazing just how much you can accomplish in post-processing software nowadays, but there is something particularly satisfying about creating an image that looks like it took lots of Photoshop work entirely in camera. This excellent video tutorial will show you how to recreate this fascinating day and night image all in camera.

[ Read More ]

“Move Behind the Fence or You’ll Be Arrested”: Roberto Minervini on What You Gonna Do When the World’s On Fire?

Since moving to the United States in 2000, Italian-born director Roberto Minervini has become one of the foremost documentarians of the American South. His fifth feature, What You Gonna Do When the World’s On Fire?, marks a departure in focusing, for the first time, on African-American lives in the region. Shot between Mississippi and Louisiana, the film weaves together three parallel threads: a pair of young brothers, Ronaldo King and Titus Turner, whose fierce bond is evident from the jump; a musician/singer/bar owner named Judy Hill, who conducts community meetings aimed at consciousness-raising; and members of the New Black Panther Party, seen […]

How the Ways We Simulate Sex on Screen Are Evolving


It is a strange thing when you think about it: films have long had fight and stunt coordinators to make the simulation of violence and dangerous situations look realistic on screen and to protect the actors involved in those scenes. Yet simulating sex scenes is often done without expertise, and often in very inappropriate ways that can and have seriously affected those involved. Here is how that is changing for the better.

[ Read More ]

The FAA is asking for input for its recreational drone test

Recently, the Federal Aviation Administration (FAA) granted recreational drone pilots access to Low Altitude Authorization and Notification Capability (LAANC). This removed a huge bottleneck for these pilots, who had been extremely restricted on where they could legally fly. Understandably, the public wanted to know if they’d be held to the same standards as Part 107-certified commercial remote pilots, who are required to pass a knowledge exam.

The FAA has officially responded by issuing a Request for Information (RFI) this week. They are currently looking to identify and work with stakeholders in the industry on the administration of a new aeronautical knowledge test for recreational drone pilots. Thanks to significant technological advancements over the past few years, operating a drone is relatively easy, to the point where one can be flown safely with minimal knowledge. Under Section 349 of the FAA Reauthorization Act of 2018, the agency plans to educate current recreational pilots and bring them into the fold of safe, responsible small unmanned aircraft system (sUAS) culture.

The amended law will require recreational pilots to pass this newly constructed aeronautical knowledge and safety test, to demonstrate they understand the rules. The FAA is currently developing the testing material with stakeholders.

They are currently looking for third-party entities, testing designees, to collaborate with on administering the knowledge training and test content across various FAA-approved platforms.

Testing designees should have the ability to reach the widest audience possible and also develop a standard electronic record that will be issued immediately to the pilot upon successful completion of the test. They will also provide necessary documentation, similar to what a newly-minted Part 107 remote pilot receives, that can be shown to the FAA or local law enforcement if required.

Those interested in participating are encouraged to review the RFI and respond by September 12, 2019.

‘Write What You Know’ is Bullshit

“Write what you know” is the most popular piece of advice given to aspiring writers all over the world. And it’s complete bullshit.

I love working at No Film School because I get paid a nominal fee to share my opinions on screenwriting and I get to examine popular theories and tips without having to serve any overlords. We’re not selling a product, we’re just aggregating advice. That means, if I say something that’s bullshit, I get called out on it in the comments.

Sometimes it’s great, sometimes it sucks.

But today, I want to focus our column on calling out the biggest piece of bullshit you’ll hear when you’re trying to break in as a writer…

“Write what you know!”

So, let’s jump in and keep it quick.

Read More

You Shouldn’t Be Using Lightroom for That


I’m the first to admit I love Lightroom. Sure, it has its issues, like occasionally slow performance on good hardware and an admittedly aging interface, but I’m comfortable with it. There are a number of tasks, though, that you just shouldn’t be using Lightroom for. Want to know what they are?

[ Read More ]

How to Trick Instagram’s Algorithm for Higher Engagement (Maybe)

In mid-2016, Instagram started using an algorithm to order the photos you’re shown, a big change from the simple chronological feed that had been used since the beginning. If you’re not happy with the reach and engagement your photos are getting on Instagram, there’s a rumored trick you may want to try.

Here’s the rumor: Instagram’s algorithm is said to favor photos that have had Instagram’s own filters applied to them, even if the filters are applied so weakly that you can’t even tell the difference.

Illustrator Mariana Avila tweeted the trick last week after seeing influencer Courtney Quinn (@colormecourtney) share it with her 563,000+ followers:

Here are screenshots of the original post (which has since been deleted) and story by Quinn:

If you want to give this trick a shot, make sure you apply an Instagram filter to your photo before posting it. But if you don’t actually want the filter to affect the look of your photo, tap the filter a second time to bring up the filter strength slider and set it to something like 2% (per Quinn’s suggestion).

By doing this, you’ll trick Instagram’s algorithm into thinking you’re using an official filter, even if it had no noticeable effect on the photo, and it’ll be shared to a wider audience — according to the rumor, at least…

This trick hasn’t been verified, but people have responded to both Avila and Quinn’s Tweets reporting that the trick seems to have worked for them. We’ve reached out to Instagram for comment and will update this post if/when we hear back.

(via Mariana Avila via PDNPulse)

Landscape Photography with an Old $79 Camera from 2007

Photographer Toma Bonciu of Photo Tom recently decided to experiment with doing landscape photography with an outdated digital camera. He bought a Canon PowerShot S5 IS from 2007 for $79 from Amazon and took it out into the great outdoors to see what he could get.

The 8-megapixel camera only shoots JPEGs and has a focal length of 36-432mm (in 35mm terms). It does have full manual shooting, though, so that’s handy for photographers wanting more control.

“The question is, will you be able to see the difference?” Bonciu asks. “Will the photos look that bad with this camera? […] I’m really curious to see if you really need all that expensive gear.”

Here are the photos Bonciu ended up creating:

There were some downsides to the camera, of course — Bonciu had issues with glare/flare when shooting into sunlight. But considering the fact that it’s an old $79 camera that can be worn around the neck without having to carry an arsenal of lenses, Bonciu was impressed with the results.

“If you are a photographer who does photography as a passion or as a hobby, imagine walking only with something like this and not a big backpack on your back,” Bonciu says. “You can do beautiful photos [with just] a small camera, with a consumer camera.”

Learn From a Cinematographer: How to Shoot a Low Light Scene

Learn From a Cinematographer: How to Shoot a Low Light Scene

Gonzalo Amat, who’s up for an Emmy for his work on Amazon’s Man in the High Castle, tells us the steps he takes to make a scene stand out in darkness.

[ Read More ]