Apple has released Final Cut Pro X 10.4, a perfect piece of the HDR workflow puzzle. When that HDR technology is paired with one of the Atomos recorders, we can finally manage HDR shooting and editing as simply as Rec.709. The two companies have a history of close collaboration.
Apple today revealed upgrades for FCPX and Safari that enable you to work with and view 360° videos. This is long-awaited good news for 360° editors who love their Mac. But we have already had reports from GoPro Fusion users who have not been able to import 5.2K 360° videos. This is troubling, since Apple states that the new upgrades allow 360° video editing at up to 8K. We will stay on top of this development and report back any further news about this glitch.
In addition, there are other improvements: FCPX now allows editing in two key HDR formats, Rec. 2020 HDR10 and Rec. 2020 Hybrid Log Gamma, and gives you both HDR-compatible scopes (to identify how bright your video will get) and HDR monitoring on external displays through an I/O device. There are also advanced pro color grading tools built right into the app. With these, you will be able to fine-tune brightness, hue, luminance, saturation, and white balance through an inspector.
Below is the full release from Apple.
Edit 360° video in Final Cut Pro X 10.4
Import 360° media into Final Cut Pro, edit your 360° project, then share your 360° video in monoscopic or stereoscopic format. While editing, monitor your 360° project on-screen, or with a VR headset.
In Final Cut Pro X 10.4, you can import and edit monoscopic or stereoscopic 360° video in a wide range of formats and frame sizes. While editing, you can output 360° video to a connected VR headset, and simultaneously monitor the equirectangular video and the 360° video in the Final Cut Pro 360° viewer. When you’re finished editing, you can easily export your 360° project and share it to a variety of video-sharing and social media websites, including the YouTube VR channel, Facebook 360, and Vimeo 360. → continue…
Apple has released a Final Cut Pro X update that adds a slew of new features and expanded support to its video-editing software, most notable among those features being support for 360-degree and VR video. This is a major update for the software, which has been optimized to fully leverage the greater processing power of the new iMac Pro desktop systems.
In version 10.4, Final Cut Pro supports editing 360-degree videos and viewing them in real time using an HTC Vive VR headset. According to Apple, the software supports importing, editing, and delivering these VR videos, with available edits including “immersive effects,” removing camera rigs, straightening the horizon, and adding standard videos/images to VR projects.
In addition to its new 360/VR capabilities, Final Cut Pro 10.4 adds support for high dynamic range (HDR) videos in Rec. 2020 HDR10 and Rec. 2020 Hybrid Log Gamma formats, as well as new advanced color grading tools, including color wheels with controls for adjusting brightness, saturation, and hue.
The latest version of Final Cut Pro also offers color curves with multiple control points, enabling users to make “ultra-fine color adjustments,” according to Apple. Or, as our Senior Reviewer Richard Butler put it: “Curves! Curves! At long bloody last, Curves!”
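The idea behind curves with multiple control points is easy to sketch. The following is an illustrative Python approximation only (piecewise-linear via `np.interp`, where a real grading tool would use smooth splines) and is not Final Cut Pro's actual implementation; `apply_curve` and `s_curve` are names invented for this example:

```python
import numpy as np

def apply_curve(values, points):
    """Map normalized pixel values (0..1) through a tone curve.

    `points` is a list of (input, output) control points; np.interp
    interpolates linearly between them. Real grading tools fit a smooth
    spline through the points instead.
    """
    xs, ys = zip(*sorted(points))
    return np.interp(values, xs, ys)

# A gentle S-curve: darken shadows, brighten highlights, pin black and white.
s_curve = [(0.0, 0.0), (0.25, 0.18), (0.75, 0.82), (1.0, 1.0)]
pixels = np.array([0.1, 0.5, 0.9])
graded = apply_curve(pixels, s_curve)  # shadows pushed down, highlights up
```

Adding more control points is what enables the "ultra-fine" adjustments Apple describes: each extra point constrains the curve over a narrower tonal band.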
Users have both manual white balance and eye dropper color sampling options, as well as the ability to apply custom lookup tables (LUTs) from Color Grading Central, PremiumBeat, and select other color grading apps. The latest version of Final Cut Pro combined with the new iMac Pro desktops also marks the first time a Mac can be used to edit full 8K-resolution videos.
Apple lists the following additional features as arriving in Final Cut Pro 10.4:
Easily import iMovie projects from iPhone and iPad into Final → continue…
Portrait mode shootout: iPhone 8 Plus vs Google Pixel 2
Like it or not, 2017 is the year that background-blurring Portrait Modes gained major traction in smartphone photography. Apple and Google both offer improved versions of the mode in their latest devices, making for better-looking results all around. But the two manufacturers take somewhat different approaches to the process, each with different limitations and strengths. Take a look at some side-by-side shots to see how they stack up.
Pixel 2 XL F1.8 1/60sec 4.459mm ISO 382
Because the Pixel 2 back cameras use both a depth map (stereo) generated from the split pixels as well as ‘segmentation’ (which uses machine learning to identify people / faces vs. background), both subjects in this photo are largely in focus. This is a result one wouldn’t expect from real optics, since the person behind should also be blurred. This doesn’t always happen with the Pixel 2, but sometimes it does if the subjects are close to one another and both identified as people / faces. Sometimes it’s actually desirable, but at other times it can feel unnatural.
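The depth-plus-segmentation behavior described above can be illustrated with a toy compositor. This is a sketch under invented assumptions, not Google's pipeline: `box_blur`, `portrait_blur`, and every parameter here are made up for illustration.

```python
import numpy as np

def box_blur(img, k=5):
    """Crude horizontal box blur, standing in for a real lens blur."""
    pad = np.pad(img, ((0, 0), (k // 2, k // 2)), mode="edge")
    return np.stack([pad[:, i:i + img.shape[1]] for i in range(k)]).mean(axis=0)

def portrait_blur(image, depth, person_mask, focus_depth, tol=0.1):
    """Toy Portrait-mode compositor (illustrative only).

    A pixel stays sharp if it is near the focus depth OR the segmentation
    mask flags it as a person -- which is why two people at quite different
    depths can both end up in focus, unlike with real optics.
    """
    blurred = box_blur(image)
    keep_sharp = (np.abs(depth - focus_depth) < tol) | person_mask
    return np.where(keep_sharp, image, blurred)
```

The OR in `keep_sharp` is the whole story: a real lens only honors the depth term, so the segmentation term is what produces the sometimes-desirable, sometimes-unnatural result described above.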
Pixel 2 XL F1.8 1/40sec 4.459mm ISO 400
Because of the F1.8 lens and HDR+ noise averaging (with alignment of images), the Pixel 2 can take photos of even slightly moving subjects in low light. Again, note the progressive blur here: the back of the baby seat is only slightly blurred, as are the switches in the background, but the trees against the sky, very far away, are far more blurred.
With smartphones, image processing is as important—if not more important—than the camera hardware components themselves, which is why the chipset is a crucial element in the imaging pipeline. Most Android smartphones come equipped with Qualcomm’s Snapdragon chipsets, and the company has just unveiled its latest top-end product, the Snapdragon 845.
We will probably see the Snapdragon 845 in many of 2018’s high-end devices, and that’s a very good thing for the camera system.
As you’d expect from any new chipset, it’s faster than its predecessor, the Snapdragon 835, comes with more integrated AI processing power, and supports higher data speeds. However, the Snapdragon 845 also offers a range of important improvements in the imaging department.
Images can now be captured in 10-bit color with a Rec. 2020 gamut, offering a wider range of tones and hues than the previous 8-bit color. While that’s impressive, the 845’s capability to shoot 60 frames per second at 16MP is even more important when you consider image stacking applications, such as HDR modes or low-light modes that combine several frames to average out noise and improve overall quality.
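The stacking idea behind those HDR and low-light modes can be sketched in a few lines. This is a hedged, minimal illustration (the names and numbers are invented for the example); real pipelines like HDR+ also align frames and reject motion outliers before merging:

```python
import numpy as np

def stack_frames(frames):
    """Average a burst of already-aligned frames to suppress noise.

    With N frames carrying independent noise, averaging cuts the noise
    standard deviation by a factor of sqrt(N) -- which is why a chipset
    fast enough to capture 60 fps bursts matters so much for image quality.
    """
    return np.mean(np.stack(frames), axis=0)

rng = np.random.default_rng(42)
truth = np.full((16, 16), 0.5)                       # noise-free "scene"
burst = [truth + rng.normal(0, 0.1, truth.shape) for _ in range(16)]
merged = stack_frames(burst)
# Residual noise is roughly 0.1 / sqrt(16) = 0.025 per pixel.
```

A faster capture rate shortens the burst, which reduces motion between frames and makes the alignment step easier, not just the averaging.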
Another area of improvement is slow-motion video. Next year’s Android flagships will be able to record an impressive 480 fps slow motion at 720p in HDR—unfortunately, 1080p resolution will still be limited to 120 fps, lagging behind the current iPhones’ 240 fps capability. Qualcomm is catching up with Apple in terms of video frame rates, though. Like the iPhone 8, 8 Plus, and X, Snapdragon 845-equipped phones will be able to record 4K footage at 60 fps.
Bargain hunters looking for a high-quality monitor for design work or photo editing will want to consider a new panel in town: the Philips 328P6AUBREB P-line display, a 32-inch Quad HD (2560 x 1440) monitor with 100% sRGB coverage, 99% Adobe RGB coverage, and HDR technology.
This monitor is essentially Philips’ response to BenQ’s 27-inch SW2700PT, which retails for $600 and features nearly identical specs, albeit with a slightly smaller screen. Depending on your needs, you might actually prefer the higher PPI of the smaller BenQ monitor, but the ability to get a slightly higher contrast ratio—1,200:1 on the Philips vs 1,000:1 on the BenQ—and a bit more screen real estate for $100 less is definitely tempting.
Regarding the ‘HDR technology’ mentioned above, HDR viewing will definitely benefit from the relatively bright (for computer monitors, anyway) 450 nits typical brightness, and the wide color gamut support. However, the low contrast typically associated with IPS displays means you won’t want to use this as a primary HDR viewing or grading monitor.
Finally, before you put it on your wish list, there is one more thing to consider. Despite diving into the user manual, we couldn’t find any mention of direct access to the 12-bit internal LUTs for color calibration on the Philips monitor.
The BenQ monitors, by comparison, come with calibration software that directly addresses the 14-bit internal LUTs to calibrate your monitor without sacrificing the overall bit-depth of the display, minimizing the risk of banding. The color calibration solution BenQ provides, while of course requiring the purchase of a separate colorimeter or spectrophotometer, is definitely an advantage of its lineup.
The Philips 328P6AUBREB P-line display will officially go on sale in January for £439 → continue…
CamFi has launched the CamFi Pro, a souped-up version of its wireless remote controller that can allegedly transfer your photos at a rate of up to 10Mbps between the camera and a secondary device such as a laptop or smartphone. This, according to CamFi, will make the Pro “the fastest wireless camera controller in the world.”
The new device uses 5.8G communications to move data much more quickly than usual, making it possible to stream video in addition to sending larger files and sequences far more quickly. The idea is that users can transfer images directly to a computer or tablet as they are shot without tethering or relying on the slow transfer rates of most cameras’ built-in Wi-Fi systems.
The CamFi Pro will work with a range of Nikon, Canon and Sony cameras, and will allow users the choice to transmit Raw and JPEG files simultaneously, or send one type over Wi-Fi and save the other to the camera’s memory card.
Additional features include the ability to control the settings of a single camera while simultaneously triggering multiple cameras in sequence to create ‘time slice’ effects; support for HDR image capture (read: bracketing), focus stacking, and timelapse videos; and an Auto Print mode that lets the photographer send files straight to a printer via the CamFi Matrix software—ideal for providing high-quality prints on the fly while shooting events.
The company is raising funding via Indiegogo, and needs $5,000 to make the project viable. Prices start from $200 for backers, after which the cost is expected to rise to $300, so supporters of the campaign get a pretty sizable discount.
I’m running a digital filmmaking workshop in Dubai, December 15–16, 2017.
This 1.5-day course will take you through composition, lighting, and exposure (including color, gamma, and exposure index), as well as post-production, covering grading techniques such as LUTs, S-curves, and color-managed workflows. It will focus on how to create high-quality, film-like images using the latest digital techniques. It will also cover one of the hottest topics right now: HDR.
Day 2 will include practical sessions where different shooting techniques can be tested to compare how they affect the end result.
Just a very quick note that the last UK event of the Sony Pro Tour for 2017 will be in Glasgow on Thursday the 7th of December. I’ll be there to answer any questions and to give an in-depth seminar on HDR, including how to shoot HDR directly with the Sony cameras that feature Hybrid Log Gamma.
The event is free, and there will be a wide range of cameras for you to play with, including the FS5, FS7, and the new Z90 and X80, as well as monitors, mixers, and audio gear.
More info here: https://www.sony.co.uk/pro/page/sony-pro-tour-2017
Originally announced back in January and slated for a mid-year release, the HDMI 2.1 specification is just now making its debut on the world stage. Announced by the HDMI Forum yesterday, the new specification supports resolutions up to 10K, as well as other data-intensive formats such as Dynamic HDR, uncompressed 8K HDR video, and 4K at 120 fps.
A new Ultra High Speed HDMI Cable standard has also been announced that carries up to 48Gbps, and which is said to have ‘exceptionally low electro-magnetic interference’ to avoid conflict with other devices in the vicinity.
HDMI 2.1 is backwards compatible with earlier versions of the standard, as is the new high speed cable. For more information, visit the HDMI forum website.
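To get a feel for why 48 Gbps is the headline number, raw video bandwidth is just pixels × frame rate × bits per pixel. The back-of-envelope calculator below is an illustration only: it ignores blanking intervals and link-encoding overhead, so real-world requirements are somewhat higher, and `video_bandwidth_gbps` is a name invented here.

```python
def video_bandwidth_gbps(width, height, fps, bits_per_channel, channels=3):
    """Raw, uncompressed active-pixel bandwidth in Gbit/s.

    Real HDMI links also carry blanking intervals and encoding overhead,
    so the usable payload of a 48 Gbps HDMI 2.1 link is somewhat lower.
    """
    return width * height * fps * bits_per_channel * channels / 1e9

# 4K at 120 fps, 10-bit RGB: ~29.9 Gbps -- fits comfortably within 48 Gbps.
print(round(video_bandwidth_gbps(3840, 2160, 120, 10), 1))
```

Running the same arithmetic for 8K60 10-bit RGB gives roughly 59.7 Gbps, over the 48 Gbps link, which is why the very highest formats rely on techniques like chroma subsampling or compression rather than a fully uncompressed signal.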
HDMI FORUM RELEASES VERSION 2.1 OF THE HDMI SPECIFICATION
A Huge Leap Forward Supports Resolutions Up to 10K and Dynamic HDR and Introduces New Ultra High Speed HDMI Cable
SAN JOSE, California – November 28, 2017 – HDMI Forum, Inc. today announced the release of Version 2.1 of the HDMI® Specification which is now available to all HDMI 2.0 adopters. This latest HDMI Specification supports a range of higher video resolutions and refresh rates including 8K60 and 4K120, and resolutions up to 10K. Dynamic HDR formats are also supported, and bandwidth capability is increased up to 48Gbps.
Supporting the 48Gbps bandwidth is the new Ultra High Speed HDMI Cable. The cable ensures high-bandwidth dependent features are delivered including uncompressed 8K video with HDR. It features exceptionally low EMI (electro-magnetic interference) which reduces interference with nearby wireless devices. The cable is backwards compatible and can be used with the existing installed base of HDMI devices.
Version 2.1 of the HDMI Specification is backward compatible with earlier versions of → continue…
The finalized version of Google’s Android 8.1 operating system is expected to be released in December, but today the company has announced the availability of the last Developer Preview which, among other things, activates the formerly dormant Visual Core chipset in the Pixel 2 and Pixel 2 XL smartphones.
The custom-built system-on-a-chip (SOC) is designed to power and accelerate the Pixel 2 phones’ HDR+ function that achieves better dynamic range and reduced noise levels through computational imaging. The feature is already incredibly powerful, so we can’t wait to see how it gets even better with this additional hardware boost applied.
The latest Pixel smartphone generation comes with the chip built in, but it appears Google ran out of time before the Pixel 2 launch to fully optimize the Visual Core implementation in the device, and therefore decided not to activate it. With the new software version, Visual Core can now be turned on through an option in the Developer menu.
In addition to souping up the Pixel 2’s native camera app, this update also allows third-party apps using the Android Camera API to capture HDR+ shots. Previously, this function had been exclusive to the Google Camera app.
There is a wide selection of third-party apps for all types of mobile photographers available in the Google Play Store, and making HDR+ available to all of them is no doubt a positive move by Google. To install the Android Developer Preview, your Pixel 2 device needs to be registered in the Android Beta Program. Or you could just wait for the official Android 8.1 launch → continue…
Display technology has lagged behind camera technology over the years, preventing filmmakers from fully unleashing their visual creativity. Nowadays, those boundaries have been significantly reduced thanks to HDR TVs and HLG.
How HLG connects to HDR
HDR (11 stops) vs SDR (6 stops)
First, a word about HDR. HDR (which stands for High Dynamic Range) means being able to display a bigger brightness and contrast range. An HDR TV is able to present roughly ten times as much brightness as a regular SDR (Standard Dynamic Range) TV in Rec. 709.
It must be noted that not everything will be presented brighter on an HDR display. For example, a white piece of paper will still be presented as pure white; only direct sources of light will be presented brighter on a proper HDR display.
A Rec. 709 TV (Standard Dynamic Range) displays ONLY 5–6 stops – enough to present people, skin tones, and other objects that fall between black and white, and to deliver a perfectly usable image.
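HLG, mentioned above, is designed to straddle these two worlds. Its transfer function is standardized in ITU-R BT.2100 (and ARIB STD-B67); below is a direct Python transcription of that published curve, with the function name chosen here for illustration:

```python
import math

# HLG OETF constants from ITU-R BT.2100 / ARIB STD-B67.
A, B, C = 0.17883277, 0.28466892, 0.55991073

def hlg_oetf(e):
    """Map normalized scene light e (0..1) to an HLG signal value (0..1).

    Below 1/12, the curve is a square root -- close to a conventional
    gamma, which is what makes HLG roughly backward compatible with SDR
    displays. Above 1/12, a logarithmic segment compresses highlights.
    """
    if e <= 1 / 12:
        return math.sqrt(3 * e)
    return A * math.log(12 * e - B) + C
```

The hybrid shape is the point: an SDR set reads the lower, gamma-like part and shows a normal picture, while an HDR set uses the log segment to render highlights brighter.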
Cameras are far more advanced than TVs in terms of HDR
Cameras have been capable of capturing far more than 6 stops for a long time now. For example, when shooting with a Log profile, a camera can capture 14 stops, but an SDR (Rec. 709) display will squeeze those into only 6, and we will see a flat, washed-out image.
It is important to emphasize that there is no such thing as a “flat” picture profile. Those milky images are caused by the limitations of the standard Rec. 709 TV’s dynamic range.
In other words, when we shoot Log, we shoot HDR!
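To make the “Log squeezes many stops into one signal” idea concrete, here is a toy log curve. It is deliberately not any vendor’s actual S-Log/V-Log/C-Log formula (those differ in detail); it simply spreads 14 stops evenly across the 0–1 signal range:

```python
import math

STOPS = 14  # dynamic range captured by the camera, in stops

def toy_log_encode(linear):
    """Toy log curve (not a real vendor formula): spread STOPS stops of
    linear scene light evenly over the 0..1 signal range."""
    floor = 2.0 ** -STOPS          # darkest representable linear value
    linear = max(linear, floor)
    return math.log2(linear / floor) / STOPS

# Middle grey (18% reflectance) lands very high in the log signal.
mid_grey_signal = toy_log_encode(0.18)
```

With this curve, middle grey encodes to about 0.82, versus roughly 0.46 under a Rec. 709-style 2.2 gamma: shown directly on an SDR display without a conversion LUT, everything sits in a narrow, elevated band, which is exactly the milky look described above.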
HDR TVs as a filmmaker’s lifeline
HDR TVs can display 11 stops of dynamic range, and thus have the ability to reproduce the full amount of stops in the Log/flat picture profile, which leads to a → continue…
In the last few weeks you may have heard rumors about the Panasonic “GH5S” from the usual places. It all sounds quite compelling. Rather than a high-megapixel stills-and-video hybrid, the “S” model would focus even more on cinema, with some astounding capabilities not seen on anything remotely similar. But hang on to your GH5 for now, until Panasonic officially confirms what is coming! Remember these are just rumors and there is no price mooted – it could be much more expensive than the GH5 and not as capable for stills. The rumor sites …