Researchers with Switzerland’s École Polytechnique Fédérale de Lausanne (EPFL) have developed a soft exoskeleton that enables its wearer to control a drone using their upper body. Called FlyJacket, the exoskeleton is a human-robot interface (HRI) that offers “natural and intuitive control of drones,” according to the university, enabling inexperienced individuals to operate them.
Typical drone controls involve a touchscreen and/or joysticks, which researchers say are “neither natural nor intuitive” for operators. As an alternative, FlyJacket places motion sensors on the operator’s body, requiring them to spread their arms like wings (with supports to prevent fatigue) and move their upper body as if they’re flying to control the drone, while a VR headset provides a first-person perspective from the UAV’s camera.
Explaining the usefulness of the technology, the research paper states:
The development of more intuitive control interfaces could improve flight efficiency, reduce errors, and allow users to shift their attention from the task of control to the evaluation of the information provided by the drone. Human-robot interfaces could be improved by focusing on natural human gestures captured by wearable sensors. Indeed, the use of wearable devices, such as exoskeletons, has been shown to enhance control intuitiveness and immersion.
The team has tested FlyJacket using a Parrot Bebop 2 quadcopter. A smart glove expands the user’s abilities by recognizing certain finger gestures to trigger actions. Touching the thumb to the middle finger, for example, triggers the system to set a point of interest, which could be useful during search and rescue missions. The researchers are working to add additional controls to FlyJacket, including the ability to adjust the drone’s speed, according to IEEE Spectrum.
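The glove's gesture-triggered actions can be pictured as a simple dispatch table. The sketch below is purely illustrative: the thumb-to-middle-finger gesture and the point-of-interest action come from the article, but the function names and dispatch scheme are invented here and are not EPFL's actual API.

```python
# Hypothetical sketch of a gesture-to-command mapping like the one the
# article describes for FlyJacket's smart glove. Only the thumb/middle
# gesture and the point-of-interest action are from the article; the
# rest is invented for illustration.
def set_point_of_interest():
    return "POI set at current drone position"

GESTURE_COMMANDS = {
    ("thumb", "middle"): set_point_of_interest,
    # Further finger pairings could trigger other actions, e.g. the
    # speed adjustment the article says is still in development.
}

def on_gesture(finger_a: str, finger_b: str):
    """Look up and run the action bound to a finger pairing, if any."""
    handler = GESTURE_COMMANDS.get((finger_a, finger_b))
    return handler() if handler else None

print(on_gesture("thumb", "middle"))
```

An unmapped pairing simply returns `None`, so unrecognized gestures are ignored rather than raising an error.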
A paper detailing the technology is available from EPFL.
There’s a new video compression standard on the block. It’s called JPEG XS, and while it’s made by the same team behind the ubiquitous JPEG image format, it serves a much different purpose.
JPEG XS was announced earlier this week by the Joint Photographic Experts Group (JPEG), headed by École Polytechnique Fédérale de Lausanne (EPFL) professor Touradj Ebrahimi. The mission of this new format isn’t to replace the existing JPEG image standard, but to supplement it by serving as a low-energy standard for streaming video content via Wi-Fi and 5G cellular networks.
According to JPEG, the mission of JPEG XS is to “stream the files instead of storing them in smartphones or other devices with limited memory.” JPEG specifically mentions the benefits of JPEG XS for video captured and streamed by “drones and self-driving cars—technologies where long latency represents a danger for humans.”
|Photo by Samuel Schwendener|
What’s interesting is that JPEG isn’t trying to shrink the file size with JPEG XS. In fact, quite the opposite. Whereas the JPEG standard has a compression ratio of about 10:1, JPEG XS comes out to a 6:1 ratio.
“For the first time in the history of image coding, we are compressing less in order to better preserve quality, and we are making the process faster while using less energy,” said Professor Ebrahimi in the EPFL announcement post. “We want to be smarter in how we do things. The idea is to use less resources and use them more wisely. This is a real paradigm shift.”
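The stated ratios are easy to sanity-check. The arithmetic below is illustrative only (it is not JPEG code, and the 1080p/24-bit frame is an assumed example): at 6:1 a JPEG XS frame is simply larger than the same frame at JPEG's roughly 10:1.

```python
# Illustrative arithmetic only -- not JPEG code. Compares the stated
# compression ratios of classic JPEG (~10:1) and JPEG XS (~6:1) for an
# assumed raw 1920x1080 frame at 3 bytes (24 bits) per pixel.
RAW_BYTES = 1920 * 1080 * 3       # 6,220,800 bytes uncompressed

jpeg_size = RAW_BYTES / 10        # ~10:1
jpeg_xs_size = RAW_BYTES / 6      # ~6:1 -- compresses less, keeps more

print(f"raw frame:     {RAW_BYTES / 1e6:.2f} MB")
print(f"JPEG (10:1):   {jpeg_size / 1e6:.2f} MB")
print(f"JPEG XS (6:1): {jpeg_xs_size / 1e6:.2f} MB")
```

The trade: a JPEG XS frame is about 1.7x the size of its JPEG counterpart, in exchange for faster, lower-energy encoding and better quality preservation.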
JPEG XS is open source, as well as HDR-compatible, making it a prime candidate for content creators around the world. Already, the European Space Agency (ESA) has expressed interest in the standard. JPEG XS would serve as a perfect → continue…
Audio is 70% of what we see, apparently. In a VR experience, audio should theoretically be even more important than in standard video or film. RØDE has caught on to this with the NT-SF1, an affordable mic for recording multi-directional sound.
From:: RedShark News
By Al Caudullo
As you might already have heard, Adobe has its own 360 immersive toolset with Skybox. But there are some other quite good VR toolsets from BorisFX. Continuum, known as “the Swiss Army Knife of plugins,” features the VR Unit.
Five very powerful VR Plugins in one unit.
And for those of you in the Avid world, they bring 360 workflows to Avid Media Composer. Here is the full list of hosts that support the Boris Continuum VR Unit.
Adobe: Creative Cloud, CS5.5-CC 2018 – After Effects and Premiere Pro
Avid: Media Composer 6.5+
Blackmagic: DaVinci Resolve 12.5+
Sony: Vegas Pro 13
Magix: Vegas Pro 14+
Foundry: Nuke 9+
Apple: FCP X & Motion 5
For graphics cards, it is recommended that you run the latest graphics card driver version supported by your host application. A graphics card with a minimum of 1GB of RAM is required; 2GB of RAM is recommended.
Let’s take a look at how to use each one of these valuable VR Tools.
VR Insert allows users to easily insert a secondary source clip, title, logo, etc. into a 360/VR shot in true 360/VR space, in either mono or stereo format, and provides a full array of controls to reorient the source in 360/VR space.
When we first place one of these elements into a 360 spherical space, the result is a 2D flat object in a spherical format. The sizing is just not right, and if you look at it in an HMD, it is obvious. Applying the VR Insert tool with the appropriate settings corrects this.
There are a few things to keep in mind here.
First, when you apply VR Insert, you need to select the layer containing the item you are applying the effect to.
Second, in the Insert Apply mode be sure to → continue…
From:: Student Filmmakers
The World Surf League (WSL) and the Jeep® brand announced the release of their virtual reality (VR) experience, Jeep Sessions: A Surfing Journey in 360°.
Jeep Sessions: A Surfing Journey in 360° follows WSL Championship Tour surfers Jordy Smith and Malia Manuel as each embarks on a Jeep…
From:: Shoot OnLine
Yesterday, Adobe announced a ‘massive update’ to Adobe Camera Raw, Lightroom CC, and Lightroom Classic CC, adding new Adobe RAW and Custom profiles that showed the company was taking color and tonality more seriously. But that wasn’t the only update to come out of Adobe this week.
In preparation for NAB 2018, Adobe has also updated its video editing applications with useful new features for both After Effects and Premiere Pro users, and some really cool Adobe Sensei AI integration specifically for Premiere Pro.
The video above gives you a good overview, or you can keep reading to dive a bit deeper.
Adobe After Effects
After Effects received a few interesting new features, including a new Advanced Puppet tool for creating complex motions, and Master Properties that allow you to apply changes to individual effects across multiple versions of a composition. Or, to let Adobe explain it:
With Master Properties, you can create compositions that allow you to control layer properties in a parent composition’s timeline. You can push individual values to all versions of your composition or pull selected changes back to the master.
Adobe has also added Immersive Environment into After Effects, providing 360-degree and VR content creators with a more efficient workflow.
Adobe Premiere Pro
Master Properties and the Advanced Puppet tool are pretty neat, but to see the most useful and impressive new features you’ll have to open Premiere Pro.
First and foremost, the new Color Match feature leverages the Adobe Sensei AI to automatically apply the color grade of one shot to another. This feature comes complete with Face Detection so Premiere can match skin tones where necessary, and a new split-view allows you to see the results of your color grade as you go, either as an interactive slider → continue…
Putting on a full week of seminars and workshops is already a tradition at the HFF Film and Television School in Munich. Every year in March, during the semester break, the studios, auditoriums and facilities open up to receive dozens of lecturers and professionals for a full week of experimenting with equipment and exchanging knowledge. The concept is directed not so much at students as at professionals and teachers: to improve knowledge, get acquainted with new equipment and new trends, and most of all to learn from each other.
This year was especially different: more workshops, more companies involved, and a lot more state-of-the-art equipment supplied by the manufacturers and rental houses. It was indeed a very hectic week. Every single studio, seminar room and theatre in the building was occupied by lectures, with a total of 36 masterclasses running at the same time. Almost all fields were covered: post-production, advanced post, VFX, VR and 360-degree video, cameras, lenses, lighting, documentary film, sound (including sound for 360-degree video), production, color grading, green production and green shooting, thesis shoots for PhD research, and extensive testing of cameras and lenses by many of the lecturers present.
The industry — that is, the manufacturers and rental houses — supports the initiative by supplying large amounts of equipment for experimentation and testing. This year broke records for the number of lenses on hand: rehoused lenses from P+S Technik, several anamorphics from Vantage, and sphericals from Zeiss, ARRI, SW Sonderoptic/Leica and Fujinon. More than 50 companies, among them manufacturers, rental houses and post-production houses, were represented during the week, which culminated in the screening of all tests and workshops in the blue cinema room, where attendees and lecturers shared their experiments and tests.
To put all this together is only possible with the support of the SFT (Das Studienzentrum für Filmtechnologie) which stands → continue…
From:: Imago News
|NHK Fukuoka Broadcasting Bureau. Credit: Soramimi|
Japan’s national public broadcasting organization NHK is developing an 8K slow-motion camera capable of recording ultra-high-definition content at 240fps. The technology was announced in a press release (partially translated), and will be showcased at NAB 2018 in Las Vegas next week. Though 8K monitors and televisions are still in their infancy, the broadcaster is pioneering 8K technologies in anticipation of future demand.
To that end, NHK also plans to showcase a new 8K VR display during NAB 2018. The display is designed to eliminate the pixelated look common to current VR headsets.
|NHK’s 8K 240fps camera|
Finally, future 8K broadcasts may benefit from the NHK’s new transmitter technology, which reduces an 8K broadcast from a huge 40Gbps to a more manageable (but still huge) 8Gbps. The transmitter then converts the content into an IP-based signal for live broadcasting, a process that allegedly happens in “tens of microseconds.”
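The quoted figures imply roughly a 5:1 reduction. As a back-of-the-envelope check (the bitrates are from the article; nothing here is an NHK specification):

```python
# Back-of-the-envelope check of the figures quoted in the article:
# a 40 Gbps raw 8K feed compressed to an 8 Gbps IP stream.
raw_gbps = 40.0
compressed_gbps = 8.0

ratio = raw_gbps / compressed_gbps            # 5:1 reduction
bytes_per_second = compressed_gbps * 1e9 / 8  # still 1 GB every second

print(f"compression ratio: {ratio:.0f}:1")
print(f"throughput: {bytes_per_second / 1e9:.1f} GB/s")
```

Even after compression, the stream moves a gigabyte per second, which is why the article can fairly call 8Gbps "still huge."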
According to AV Watch, NHK anticipates using its new 8K technology for sports broadcasts (think Tokyo 2020 Olympics) and other content featuring fast-moving objects starting later this year. Unlike existing solutions, the NHK system is said to offer better compression and transmission for a very low delay while maintaining 8K quality for live shows.
Google has announced that it is experimenting with light field technology to improve its virtual reality content. The company detailed the work in a recent blog post, explaining that it has modified a GoPro Odyssey Jump camera so that it features 16 cameras mounted along a vertical arc, which is itself mounted to a 360-degree rotating platform.
According to Google, light field technology is one potential way to give users a “more realistic sense of presence” within a VR world. Light field-based content presents objects in different ways depending on the position of the user’s head and their distance from the object.
“Far-away objects shift less and light reflects off objects differently, so you get a strong cue that you’re in a 3D space,” Google explains. VR headsets with positional tracking take this to a new level by determining where the user is “located” within the virtual world.
Using its rotating Jump rig, Google is able to capture approximately 1,000 outward-facing viewpoints on a 70cm sphere, which ultimately offers a 2ft / 60cm diameter volume of light rays. The company explains how it translates that data into VR content:
To render views for the headset, rays of light are sampled from the camera positions on the surface of the sphere to construct novel views as seen from inside the sphere to match how the user moves their head. They’re aligned and compressed in a custom dataset file that’s read by special rendering software we’ve implemented as a plug-in for the Unity game engine.
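To get a feel for the scale of the capture, the sketch below generates about 1,000 near-uniform viewpoints on a 70cm-diameter sphere using a Fibonacci lattice. This is a generic stand-in, not Google's method: the actual rig sweeps a 16-camera arc through 360 degrees, and its real sampling pattern is not reproduced here.

```python
import math

def fibonacci_sphere(n: int, radius: float):
    """Near-uniform points on a sphere via the Fibonacci lattice.

    A generic stand-in for the ~1,000 outward-facing viewpoints the
    article says Google's rotating rig captures on a 70 cm sphere.
    The rig's actual pattern (a 16-camera arc swept through 360
    degrees) is not reproduced here.
    """
    golden_angle = math.pi * (3.0 - math.sqrt(5.0))
    points = []
    for i in range(n):
        z = 1.0 - 2.0 * (i + 0.5) / n   # even spacing along the axis
        r = math.sqrt(1.0 - z * z)      # ring radius at that height
        theta = golden_angle * i        # spiral around the axis
        points.append((radius * r * math.cos(theta),
                       radius * r * math.sin(theta),
                       radius * z))
    return points

viewpoints = fibonacci_sphere(1000, radius=0.35)  # 70 cm diameter
print(len(viewpoints))
```

Each stored viewpoint is a position from which rays can be resampled, which is what lets the renderer synthesize novel views from inside the sphere as the user's head moves.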
Demo content has been released to the public via the Steam VR app “Welcome to Light Fields.” Users will need a Windows Mixed Reality, HTC Vive, or Oculus Rift headset to view the content. Light → continue…