Let’s Edit with Media Composer – Working with Boris Sapphire’s S_LensFlare

If someone asks you what your favorite effect is, what would you say?  I always have the same answer: S_Glow from Boris Sapphire.  For me, the effects in Sapphire’s lighting bundle are second to none, and S_LensFlare ranks right up there with S_Glow as probably my favorite effect in that category.  Not only does the effect have a ton of great presets (16 new ones as of the 2020 update), it’s super simple to use, it has Mocha integration, and the Flare Designer is super awesome as well.  This one effect could easily fill 10 tutorials, as it really is that in depth, so for this lesson I decided to focus on three main features of S_LensFlare.  First, we’re going to look at adding realistic elements to your shots.  Then we’ll talk about integrated Mocha tracking, and how you can lock your lens flares to a Mocha track you just created.  Finally, we’ll cover the Flare Designer, and how you can access presets inside it to create new flares to add to your toolkit!  Enjoy!

Channel: www.youtube.com/letseditMC_avid
Facebook: http://www.facebook.com/LetsEditwithMediaComposer
Twitter: @kpmcauliffe
e-mail: kevinpmcauliffe@gmail.com


Pitch Your Script Ideas to Industry Pros Through Coverfly Pitch Week (For Free)

Ready to take your screenwriting career to the next level? Coverfly wants to help you do it through Pitch Week.

Coverfly Pitch Week is now calling for entries for its Fall 2020 initiative.

Now in its 4th season, Pitch Week gives aspiring screenwriters the chance to get paired up with literary managers, agents, producers, and development executives via video conferences and pitch meetings.

Coverfly does this a few times a year. During last month’s event, 85 writers were connected with industry pros who requested over 74 scripts. This time around, though, Coverfly is upping the ante and giving this opportunity to 100 writers.

If you haven’t heard of Coverfly before, it’s a huge database of carefully curated screenwriting competition entries. Think Match.com meets Rotten Tomatoes meets The Black List. Its algorithm helps track emerging screenwriters and hook them up with Hollywood reps.

And Pitch Week gives screenwriters the chance to pitch their screenplay ideas directly to professionals.

Read More

IFH 375: Making an “El Mariachi” Style $7000 Indie Film with Josh Stifter

Today on the show we have writer/director Josh Stifter. Josh was chosen as one of the directors to attempt to make a $7000 feature film using…

The post IFH 375: Making an “El Mariachi” Style $7000 Indie Film with Josh Stifter appeared first on Indie Film Hustle®.

Canon EOS R5 and a new Speedlite appear for certification

It looks like the Canon EOS R5 has appeared for certification for its 2.4GHz/5GHz Wi-Fi capabilities. The 5GHz specification was part of our original rumoured Read more…

Google’s AI Will No Longer Tag Photos with Gender, Will Use ‘Person’ Instead

Google has announced that its computer vision algorithm will no longer tag photos with gender. According to an email sent to Developers yesterday, the AI-powered tool will no longer use gendered tags like “woman” or “man,” and will default to “person” instead.

The change applies to the Cloud Vision API, which developers can use to tag photos based on what the computer “sees” inside the frame. According to Business Insider, which was sent a copy of the email in question, that API has now been changed in order to avoid potential bias.

“Given that a person’s gender cannot be inferred by appearance,” reads the email, “we have decided to remove these labels in order to align with the Artificial Intelligence Principles at Google, specifically Principle # 2: Avoid creating or reinforcing unfair bias.”

Testing the API out for ourselves reveals that the change has already taken effect.

The kind of bias Google is referring to is the result of “flawed training data,” which necessarily leads to the algorithm making certain assumptions. Anybody who doesn’t fit within that algorithm’s trained binary of what a “man” or “woman” might look like will, therefore, automatically be misgendered by the AI. This is what Google is attempting to avoid.

The Artificial Intelligence Principle that Google mentions in the email specifically states that Google will try to “avoid unjust impacts on people,” especially as it relates to “sensitive characteristics” like race, ethnicity, gender, political beliefs, and sexual orientation, among others.

According to Business Insider, the decision has had predictably polarizing results. A policy fellow from Mozilla whom they spoke with said the move was “very positive” and agreed with Google that a person’s gender cannot be inferred by appearance, while at least one affected developer responded to the change by asserting, “I don’t think political correctness has room in APIs.”

(via The Verge)


Image credits: Header photo by Mitchell Luo, API test photo by Philip Martin, both CC0

Reviewer: The LomoMod No.1 with the Liquid Lens is ‘The Worst Camera I Ever Tested’

Vintage glass shooter and photography YouTuber Mathieu Stern recently got the chance to try Lomography’s special LomoMod No.1 cardboard film camera and special liquid-filled lens—an ostensibly “fun” combination that turned out to be a nightmare for Stern, who has dubbed it “the worst camera I ever tested.”

Lomography’s quirkier cameras have never been for the faint of heart, but you’d think that a lover of vintage glass with years of photography experience would be the ideal audience… maybe not.

While Stern appreciated the experimental aspect of the liquid-filled lens and the “interesting” results it produced, he pretty much hated everything else about trying to build and shoot with the LomoMod No.1. Shooting with this soft fixed-focus lens was itself a bit of a nightmare, but his main complaints boil down to the build-quality of the DIY medium format camera body.

It took Stern two hours to build the camera… wrong. Then another two hours to do it right—with some help from his wife—only to have the camera’s film advance knob break after 10 minutes of use. Since he had a spare, he was able to replace the knob with the one from the other kit. That knob broke after just one more photo.

His conclusion follows from this obviously frustrating experience:

“In the end, I really like some of the images I shot with this camera,” says Stern, “but the camera itself is one of the worst things I’ve ever used. It’s hard to make, hard to use, hard to finish one single roll of film, and the results are hard to predict.”

As an alternative for experimental photography fans who don’t want to tear their hair out, Stern actually suggests you slap Lomography’s liquid-filled lens on a digital camera—though even then, the lens is extremely soft, so in Stern’s words, “don’t look for sharpness…”

Check out the full review up top to hear all of Stern’s thoughts about this camera, and unless you’re really hankering for a camera that will take you hours to build and may very well break after just a few minutes of use, we’d suggest you skip this particular camera model.

(via Fstoppers)

Video: Photographer Sean Tucker on recoloring images using Photoshop

Photographer and filmmaker Sean Tucker has published a new video teaching viewers how to accurately recolor their images using Gradient Maps in Adobe Photoshop. The process involves selecting the objects to be recolored using the Pen Tool, then using the Gradient Maps to choose the target colors and transform the selections.

The new tutorial, which is approximately 20 minutes long, joins Tucker’s other videos, which include everything from discussions on ‘law and ethics in street photography’ to tutorials related to street photography, editing portraits in Photoshop, color grading footage and more.

Are these the 7 RF lenses Canon will be announcing in 2020? [CR1]

I have been sent a roadmap of coming Canon RF lenses in 2020. I have been unable to confirm this list of lenses, but I Read more…

How Do You Craft Artificial Intelligence for a Screenplay?

Writing humans is hard but writing artificial intelligence presents a whole new set of hoops to jump through.

We’ve already passed the original year from Blade Runner (2019), and while we live in a dystopia, we don’t have flying cars or robots posing as humans. Then again, I get a lot of bot replies on Twitter, so maybe we’re coming close.

Humans have always been fascinated by science fiction. As technology gets better and better, we see great leaps in where our imaginations can take us. It’s not all aliens and time machines; sometimes we even get nuanced looks at what makes us human through the eyes of machines.

I’m talking about artificial intelligence.

Or the robots, computers, and mechanized life-forms created by humans to be…well…human.

Today I want to go over some challenges of writing artificial intelligence and things you should confront to make your screenplays deeper and more satisfying.

But first, check out this video from Richard DeZerga and let’s talk after the jump.

Read More