
Streaming and Live Feeds.

With some difficult times ahead and the need for most of us to minimise contact with others, there has never been a greater need for streaming and online video services than now.

I’m setting up some streaming gear in my home office so that I can do some presentations and online workshops over the coming weeks.

I am not an expert on this and although I did recently buy a hardware RTMP streaming encoder, like many of us I didn’t have a good setup for live feeds and streaming.

So, like so many people, I tried to buy a Blackmagic Design ATEM, which is a low-cost all-in-one switcher and streaming device. But guess what? They are out of stock everywhere, with no word on when more will become available. So I have had to look at other options.

The good news is that there are many options. There is always your mobile phone, but I want to be able to feed several sources including camera feeds, the feed from my laptop and the video output from a video card. 

OBS to the rescue!

The good news is that there is a great piece of open-source software called OBS – Open Broadcaster Software, better known in its current form as the OBS Studio streaming software.

Open Broadcast Studio Software.

 

OBS is a great piece of software that can convert almost any video source connected to a computer into a live stream that can be sent to most platforms, including Facebook, YouTube and others. If the computer is powerful enough it can also switch between different camera sources and audio sources. If you follow the tutorials on the OBS website it’s pretty quick and easy to get it up and running.

So how am I getting video into the laptop that’s running OBS? I already had a Blackmagic Mini Recorder, which is an HDMI and SDI to Thunderbolt input adapter, and I shall be using this to feed the computer. There are many other options, but the BM Mini Recorders are really cheap and most dealers stock them, as does Amazon. It’s HD only, but for this I really don’t need 4K or UHD.

Blackmagic Mini Recorder HDMI and SDI to Thunderbolt input adapter.

 

Taking things a step further, I also have both an Atomos Sumo and an Atomos Shogun 7. Both of these monitor/recorders have the ability to act as a 4-channel vision switcher. The great thing about these compared to the Blackmagic ATEM is that you can see all your sources on a single screen and you simply touch the source that you wish to go live. A red box appears around that source and it is output from the device.

The Atomos Sumo and the Shogun 7 can both act as 4-input vision switchers.

 

So now I have the ability to stream a feed via OBS from the SDI or HDMI input on the Blackmagic Mini Recorder, fed from one of four sources switched by the Atomos Sumo or Shogun 7. A nice little micro studio setup. My sources will be my FS5 and FX9, and I can use my Shogun as a video player. For workflow demos I will use another laptop or my main edit machine, feeding the video output from DaVinci Resolve via a Blackmagic Mini Monitor, which is similar to the Mini Recorder except that it is an output device with SDI and HDMI outputs. The final source will be the HDMI output of the edit computer, so you can see the desktop.

Don’t forget audio. You can probably get away with very low quality video and still get many messages across, but if the audio is hard to hear or difficult to understand then people won’t want to watch your stream. I’m going to be feeding a lavalier (tie clip) mic directly into the computer and OBS.

I think my main reason for writing this was to show that many of us probably already have most of the tools needed to put together a small streaming package. Perhaps you can offer this as a service to clients that now need to think about online training or meetings. I was lucky enough to already have all the items listed in this article; the only extra I have had to buy is a second Thunderbolt cable, as I only had one. But even if you don’t have a Sumo or Shogun 7 you can still use OBS to switch between the camera on your laptop and any other external inputs. The OBS software is free and very powerful, and it really is the keystone that makes this all work.

I will be starting a number of online seminars and sessions in the coming weeks. I do have some tutorial videos that I need to finish editing first, but once that’s done expect to see lots of interesting online content from me.  Do let me know what topics you would like to see covered and subject to a little bit of sponsorship I’ll see what I can do.

Stay well people. This will pass and then we can all get back on with life again.


Streaming and Live Feeds. was first posted on March 19, 2020 at 9:54 am.
©2018 “XDCAM-USER.COM“. Use of this feed is for personal non-commercial use only. If you are not reading this article in your feed reader, then the site is guilty of copyright infringement. Please contact me at contact@xdcam-user.com

Temporal Aliasing – Beware!

As camera resolutions increase and the amount of detail and texture that we can record increases we need to be mindful more and more of temporal aliasing. 

Temporal aliasing occurs when the differences between the frames in a video sequence create undesirable sequences of patterns that move from one frame to the next, often appearing to travel in the opposite direction to any camera movement. The classic example of this is the wagon wheels going backwards effect often seen in old cowboy movies. The camera’s shutter captures the spokes of the wheels in a different position in each frame, but the timing of the shutter relative to the position of the spokes means that the wheels appear to go backwards rather than forwards. This was almost impossible to prevent with film cameras, which were stuck with a 180 degree shutter, as there was no way to blur the motion of the spokes so that it was contiguous from one frame to the next. A 360 degree shutter would have prevented this problem in most cases, but it’s also reasonable to note that at 24fps a 360 degree shutter would have introduced an excessive amount of motion blur elsewhere.

Another form of temporal aliasing that often occurs is when you have rapidly moving grass, crops, reeds or fine branches. Let me try to explain:

You are shooting a field of wheat and the stalks are very small in the frame, almost too small to discern individually. As the stalks of wheat move left, perhaps blown by the wind, each stalk will be captured in each frame a little further to the left, perhaps by just a few pixels. But in the video they appear to be going the other way. This is because every stalk looks the same as all the others, and in the following captured frame the original stalk may have moved, say, 6 pixels to the left. But now there is also a different stalk just 2 pixels to the right of where the original was. Because both stalks look the same, it appears that the stalk has moved right instead of left. As the wind speed and the movement of the stalks change, they may appear to move randomly left or right, or a combination of both. The image looks very odd, often a jumbled mess, as perhaps the tops of the stalks appear to move one way while lower parts appear to go the other.
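As a rough sketch of the arithmetic (assuming, purely for illustration, that identical stalks repeat every 8 pixels), the apparent motion is the true motion wrapped to the nearest repeat of the pattern:

```python
def apparent_shift(true_shift_px, period_px):
    """Nearest-match apparent motion (in pixels) of a repeating pattern
    between two frames; negative = left, positive = right."""
    wrapped = true_shift_px % period_px
    if wrapped > period_px / 2:
        wrapped -= period_px  # a nearer identical copy sits the other way
    return wrapped

# True motion: 6 px to the left, identical stalks every 8 px.
# The nearest identical stalk in the next frame is 2 px to the RIGHT.
print(apparent_shift(-6, 8))  # 2
```

Any true shift of more than half the pattern spacing per frame flips the apparent direction, which is exactly the wagon wheel effect applied to wheat stalks.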

There is a great example of temporal aliasing here in this clip on Pond5 https://www.pond5.com/stock-footage/item/58471251-wagon-wheel-effect-train-tracks-optical-illusion-perception

Notice in the Pond5 clip how it’s not only the railway sleepers that appear to move in the wrong direction or at the wrong speed; notice too how the stones between the sleepers appear to exhibit a kind of boiling noise.

Like the old movie wagon wheels, one thing that makes this worse is the use of too fast a shutter speed. The more you freeze the motion of the offending objects or textures in each frame, the higher the risk of temporal aliasing with moving textures or patterns. Often a slower shutter speed will introduce enough motion blur that the motion looks normal again. You may need to experiment with different shutter speeds to find the sweet spot where the temporal aliasing goes away or is minimised. If shooting at 50fps or faster, try a 360 degree 1/50th shutter, as by the time you get to a 1/50th shutter, motion is already about as crisp as it needs to be for most types of shots, unless you are intending to do some form of frame-by-frame motion analysis.
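The shutter angle to shutter speed relationship used above is simple arithmetic: exposure time per frame is (angle / 360) / frame rate, which is why a 360 degree shutter at 50fps equals 1/50th of a second. A quick sketch:

```python
def shutter_time_s(fps, shutter_angle_deg):
    """Per-frame exposure time in seconds for a given shutter angle."""
    return (shutter_angle_deg / 360.0) / fps

print(shutter_time_s(50, 360))  # 0.02 s, i.e. 1/50th at 50fps
print(shutter_time_s(24, 180))  # ~0.0208 s, i.e. 1/48th at 24fps
```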


Temporal Aliasing – Beware! was first posted on February 13, 2020 at 5:22 pm.

Using User Files and All Files to Speed Up Switching Modes on the FX9.

Sometimes changing modes or frame rates on the FX9 can involve changing several settings. For example, if you want to go from shooting Full Frame 6K at 23.98fps to shooting at 120fps, you need to change the sensor scan mode before you can change the frame rate. One way to speed up this process is to use User Files or All Files to save your normal operating settings. Then, instead of going through pages of menu settings, you just load the appropriate file.

All Files save just about every adjustable setting in the camera, everything from your white balance settings to LUTs to network settings to any menu customisations. User Files save a bit less; in particular, User Files can be set so that they don’t change the white balance. For this reason, for things like changing the scan mode and frame rate, I prefer to use User Files.

You can add the User File and/or All File menu items to the user menu. If you place them at the top of the user menu, they will be the very first items listed when you enter the camera’s menu system for the first time after powering it on.

Both User Files and All Files are found under the “project” section in the FX9 menu system. The files are saved to an SD card in the SD Card Utility slot. This means you can easily move them from one camera to another.

Before you save a file you first have to give it a name. I recommend a name that includes the scan mode, for example “FF6K” or “2KS35”, the frame rate, and whether it’s CineEI or not.
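A hypothetical helper showing the kind of names such a scheme produces (the naming convention here is my own illustration, not a camera requirement):

```python
def user_file_name(scan_mode, fps, cine_ei):
    """Build a descriptive user-file name such as 'FF6K_2398_CineEI'.
    The scheme is just an illustration; use whatever you find readable."""
    fps_tag = str(fps).replace(".", "")          # 23.98 -> '2398'
    mode_tag = "CineEI" if cine_ei else "Custom"  # shooting mode suffix
    return f"{scan_mode}_{fps_tag}_{mode_tag}"

print(user_file_name("FF6K", 23.98, True))   # FF6K_2398_CineEI
print(user_file_name("2KS35", 120, False))   # 2KS35_120_Custom
```

The point is simply that the name alone should tell you which scan mode, frame rate and shooting mode the file will load.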

Then save your file to the SD card. When loading a User File, the “load customize data” option determines whether the camera will load any changes you have made to the user menu, and “load white data” determines whether the camera will load and overwrite the current white balance setting with the one saved in the file. When loading an All File, the white balance and any menu customizations are always loaded regardless, so your current white balance setting will be overwritten by whatever is in the All File. You can, however, choose whether to load any network user names and passwords.


Using User Files and All Files to Speed Up Switching Modes on the FX9. was first posted on February 11, 2020 at 9:55 pm.

How We Judge Exposure Looking At an Image And The Importance Of ViewFinder Contrast.

This came out of a discussion about viewfinder brightness, where the complaint was that the viewfinder on the FX9 was too bright when compared side by side with another monitor. It got me really thinking about how we judge exposure when purely looking at a monitor or viewfinder image.

To start with, I think it’s important to understand a couple of things:

1: Our perception of how bright a light source is depends on the ambient light levels. A candle in a dark room looks really bright, but outside on a sunny day it is not perceived as being so bright. But of course we all know that the light being emitted by that candle is exactly the same in both situations.

2: Between the middle grey of a grey card and the white of a white card there are about 2.5 stops. Faces and skin tones fall roughly half way between middle grey and white. Taking that a step further, between what most people will perceive as black (something like a black card or black shirt) and a white card there are around 5 to 6 stops, and faces will always be roughly 3/4 of the way up that brightness range, somewhere around 4 stops above black. It doesn’t matter whether that’s outside on a dazzlingly bright day in a Middle Eastern desert or on a dull, overcast winter’s day in the UK; those relative levels never change.
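Those stop counts can be sanity-checked with typical reflectance values (white card ~90%, middle grey 18%, black card ~2.5% are assumed figures; the arithmetic is just a base-2 logarithm, since one stop is a doubling of light):

```python
import math

def stops_between(brighter, darker):
    """Photographic stops between two reflectance/luminance values."""
    return math.log2(brighter / darker)

print(round(stops_between(90, 18), 2))   # ~2.32 stops, grey card to white card
print(round(stops_between(90, 2.5), 2))  # ~5.17 stops, black card to white card
```

Which lines up with the “about 2.5 stops” and “around 5 to 6 stops” figures above.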

Now think about this:

If you look at a picture on a screen and the face is significantly brighter than middle grey and much closer to white than middle grey, what will you think? To most it will almost certainly appear over exposed, because we know that in the real world a face sits roughly 3/4 of the way up the relative brightness range and roughly half way between middle grey and white.

What about if the face is much darker than white and close to middle grey? Then it will generally look under exposed as relative to black, white and middle grey the face is too dark.

The key point here is that we make these exposure judgments based on where faces and other similar things are relative to black and white. We don’t know the actual intensity of the white, but we do know how bright a face should be relative to white and black.

This is why it’s possible to make an accurate exposure assessment using a 100 Nit monitor or a 1000 Nit daylight viewable monitor. Provided the contrast range of the monitor is correct and black looks black, middle grey is in the middle and white looks white then skin tones will be 3/4 of the way up from black and 1/4 down from white when the image is correctly exposed.
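To illustrate that only the ratios matter, here is a sketch assuming a simple display gamma of 2.4 (an assumption for illustration; real viewfinders vary): a face sitting 3/4 of the way up the coded signal range lands at the same fraction of peak white on a 100 nit display and a 1000 nit display, even though the absolute nit values differ by 10x.

```python
GAMMA = 2.4  # assumed display gamma, for illustration only

for peak_nits in (100, 1000):
    face_nits = peak_nits * 0.75 ** GAMMA  # face at 3/4 of the coded range
    # absolute brightness differs, but the face-to-white ratio is identical
    print(peak_nits, round(face_nits, 1), round(face_nits / peak_nits, 2))
```

On both displays the face sits at the same ~50% of peak white, which is why a correctly exposed image reads as correctly exposed on either one.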

But here’s the rub: If you put the 100 Nit monitor next to the 1000 Nit monitor and look at both at the same time, the two will look very, very different. Indoors in a dim room the 1000 Nit monitor will be dazzlingly bright, meanwhile outside on a sunny day the 100 Nit monitor will be barely viewable. So which is right?

The answer is they both are. Indoors, with controlled light levels or when covered with a hood or loupe then the 100 Nit monitor might be preferable. In a grading suite with controlled lighting you would normally use a monitor with white at 100 nits. But outside on a sunny day with no shade or hood the 1000 Nit monitor might be preferable because the 100 nit monitor will be too dim to be of any use.

Think of this another way: take both monitors into a dark room and take a photo of each monitor with your phone. The phone’s camera will adjust its exposure so that both look the same, and the end result will be two photos in which the screens look identical. Our eyes have irises, just like a camera’s, and they do exactly the same thing: adjust so that the brightness falls within the range our eyes can deal with. So the actual brightness is only of concern relative to the ambient light level.

This presents a challenge to designers of viewfinders that can be used both with and without a loupe or shade, such as the LCD viewfinder on the FX9, which can be used both with the loupe/magnifier and without it. How bright should you make it? Not so bright that it’s dazzling when using the loupe, but bright enough to be useful on a sunny day without it.

The actual brightness isn’t critical (beyond whether it’s bright enough to be seen or not) provided the perceived contrast is right.

When setting up a monitor or viewfinder, it’s the adjustment of the black level and black pedestal that alters the contrast of the image (and the control for this, confusingly, is called the brightness control). This “brightness” control is the critical one, because if it raises the blacks by too much it makes the shadows and mids brighter relative to white and less contrasty, so you will tend to expose lower in an attempt to restore good contrast and a normal-looking mid range. Exposing brighter makes the mids look excessively bright relative to where white is and to the black screen surround.

If the brightness is set too low, pulling the blacks and mids down, then you will tend to over expose in an attempt to see details and textures in the shadows and to make the mids look normal.

It’s all about the monitor or viewfinder’s contrast and where everything sits between the darkest and brightest parts of the image. The peak brightness (equally confusingly, set by the contrast control) is largely irrelevant, because our perception of how bright it is depends entirely on the ambient light level. Just don’t over-drive the display.

We don’t look at a VF and think – “Ah that face is 100 nits”.  We think – “that face is 3/4 of the way up between black and white” because that’s exactly how we see faces in all kinds of light conditions – relative levels – not specific brightness.

So far I have been discussing SDR (standard dynamic range) viewfinders. Thankfully I have yet to see an HDR viewfinder, because an HDR viewfinder could actually make judging exposure more difficult: “white”, such as a white card, isn’t very bright in the world of HDR, and an HDR viewfinder would have a far greater contrast range than just the 5 or 6 stops of an SDR finder. The viewfinder’s peak brightness could well be 10 times or more brighter than the white of a white card. That complicates things, as first you would need to judge and assess where white sits within a very big brightness range. But I guess I’ll cross that bridge when it comes along.


How We Judge Exposure Looking At an Image And The Importance Of ViewFinder Contrast. was first posted on February 10, 2020 at 10:02 pm.