The American Society of Cinematographers celebrates its centennial by opening the new ASC ARRI Education Center.
The new 5,000-square-foot space is designed to inspire and educate aspiring filmmakers. Located behind the ASC Clubhouse, the ASC ARRI Education Center provides meeting spaces as well as an archive of American Cinematographer, the ASC magazine that celebrates the art of filmmaking.
“We are pleased to support the ASC in their ongoing mission to educate and inspire the next generation of filmmakers.”
The Center’s opening lands on the 100th birthday of the organization, a moment that, for ASC President Kees van Oostrum, signifies the ASC’s devotion to training and education. “It is thrilling to see the educational commitment of the ASC solidified through this endeavor,” van Oostrum noted in an ASC press release. “For me, it manifests our mission for the next 100 years.”
The new OS has arrived and it brings a lack of support for older apps. We break it down.
Most filmmakers won’t update their OS the day a new one comes out. Why?
Because if you work on time-sensitive projects, the possibility that a fresh bug will ruin your workflow and cause a missed deadline is just too high.
However, the new macOS, Catalina, shipped Monday, and if you are a filmmaker thinking you might want to update soon, there are a few things that you should absolutely be aware of.
Catalina will only run 64-bit apps
The big change here is that 32-bit apps are no longer going to open. This means the end of QuickTime Player 7, the beloved video player that many of us have used since we first played a video on a computer. While its replacement, QuickTime Player, has been around for a while now, it lacks native support for pro formats like DNx (the primary codec family for Media Composer, which is also popular with Windows users even when working in Resolve or Premiere), and so many filmmakers have stuck with 7.
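If you’re not sure whether a favorite app will survive the upgrade, you can check its binary directly: macOS executables are Mach-O files, and the first four bytes identify the architecture. Here’s a minimal sketch using Apple’s published Mach-O magic numbers (the helper name and example paths are ours, not an Apple API):

```python
import struct

# Mach-O magic numbers, in both byte orders as they can appear on disk
MAGIC_32 = {0xfeedface, 0xcefaedfe}   # 32-bit Mach-O: won't run on Catalina
MAGIC_64 = {0xfeedfacf, 0xcffaedfe}   # 64-bit Mach-O: fine
MAGIC_FAT = {0xcafebabe, 0xbebafeca}  # "fat" binary: may contain both slices

def binary_arch(path):
    """Return '32-bit', '64-bit', 'fat', or 'unknown' for a Mach-O file."""
    with open(path, 'rb') as f:
        magic = struct.unpack('>I', f.read(4))[0]
    if magic in MAGIC_32:
        return '32-bit'
    if magic in MAGIC_64:
        return '64-bit'
    if magic in MAGIC_FAT:
        return 'fat'
    return 'unknown'
```

On a Mac you’d point this at the executable inside the app bundle (the file under Contents/MacOS/ in the .app package). If you’d rather not poke at binaries, Apple also flags 32-bit apps in the System Information utility.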
What is the test screening process like for filmmakers? Let the director of Shazam tell you everything you’d ever want to know.
We all know (or at least have an idea of) how test screenings affect the film that eventually ends up on the big screen, but what is that process like for a filmmaker? In this hilarious video, David F. Sandberg (Shazam, Lights Out), also known as ponysmasher online, offers the truth behind test screenings. He breaks down how test screening results are delivered, what producers and studio execs are looking for, and how one weird opinion can change everything about your movie.
Sandberg offers up some great insider knowledge about test screenings, and it can no doubt help some of you know what to expect when your films start heading in that direction. He exposes what the Hollywood test screening process has been like for him and gives us a rare look inside a part of filmmaking few of us have been able to see.
Anthropocene: The Human Epoch directors and cinematographers unpack the ambitious scale of the visually stunning and perennially haunting project.
It’s fitting that Anthropocene: The Human Epoch, a film that attempts to convey the massive impact of humanity on the earth’s landscapes, would require such a large-scale production. The film’s three directors — Jennifer Baichwal (documentarian), Nicholas de Pencier (cinematographer), and Edward Burtynsky (photographer) — filmed in 20 countries and 43 locations over the course of five years. Their 29 cameras captured more than 50 terabytes of footage, totaling over 203 hours of material in ten different codecs.
The result is a haunting overview of the myriad ways in which humanity has dominated, and effectively re-engineered, our planet. It’s one thing to know, intellectually, that 75 percent of non-ice-covered land is occupied by humans. Or that 85 percent of forests have been cleared or degraded through human use, as narrator Alicia Vikander tells us in the film. It’s another thing entirely to be cinematically immersed in that devastation.
Apple has released Final Cut Pro 10.4.7. This new update of FCPX introduces a new Metal engine for increased performance, as well as optimizations for the upcoming Mac Pro and Pro Display XDR. I personally use FCPX and it has been a great investment. I paid $299 USD for it 8 years ago and it … Continued
Another awesome podcast today with a cinematographer that I have wanted to get on the show for a very long time. Christopher Probst ASC is our guest on the show today and we only just scratched the surface of his work and knowledge.
So much so, that we are doing a Part #2 with him as we barely touched on any of his actual work.
It was a great conversation and I am thankful there are people like Christopher out there smart enough to explain complex things in an easy manner to people like myself.
Patreon Podcast – Prisoners in Silhouette
We are dipping our toes in familiar waters this week over on the Patreon Podcast. We take a long look at establishing shots and style in Roger Deakins’ (ASC, BSC) Prisoners.
Lots of great things to go over in this film but we focus our time on shooting silhouettes and wides. This was a fun one to check out so make sure you give it a listen and check out the stills over on the Patreon page to follow along.
To see the images and listen to the special breakdown podcast click the link below:
Welcome to the ProVideo Coalition Podcast! With so much news coming out every day, it’s nearly impossible to keep up with what’s going on in and around the industry. So each week (or as close to weekly as we can get) we will sit down to record a week-in-review show to give you a rundown while you take a run, drive home, or decompress on the couch.
The ProVideo Coalition Podcast is available on Anchor and Spotify. Subscribe so you don’t miss future episodes! Have a question/comment? Shoot us a message on Instagram (@provideocoalition) or send us an email: firstname.lastname@example.org.
Artlist has just announced their first-ever horror film contest with $30,000 worth of prizes up for grabs including a brand new Blackmagic Design Pocket Cinema Camera 6K. What’s required for entry? To create a submission, you will need to feature a track from the horror collection and create a short film that is 3-minutes or … Continued
In 1984, a young director stepped behind the camera to create an original sci-fi action movie about time travel, cyborg assassins, and a woman who gives birth to the savior in the future war against the machines. The Terminator was a runaway hit, popular with audiences and critics alike. The movie launched the career of James Cameron and kicked off a multi-billion-dollar franchise that spans film, television, comic books, video games, and theme park attractions.
Thirty-five years after the original movie was released, the series is still as popular as ever. Terminator: Dark Fate picks up in the present day, after the events of Terminator 2: Judgment Day, with Linda Hamilton and Arnold Schwarzenegger reprising their now-iconic roles. James Cameron returns to the franchise as a producer and writer, and stepping into the director’s role is Deadpool director Tim Miller. Skydance’s David Ellison produces.
Miller’s debut feature film was edited on Adobe Premiere Pro after a strong recommendation from Gone Girl director David Fincher. Miller and his editorial team knew that Premiere Pro and Creative Cloud were the only choice for the incredibly demanding editing and VFX workflows behind Terminator: Dark Fate.
From unknown artist to Hollywood director
One of the reasons that Miller insisted on working with Premiere Pro is his long history with Adobe, which began decades ago. Before he became a director of Hollywood blockbusters, Miller was an unknown animator and visual effects artist. Much of his day was spent working on Adobe creative solutions.
Even then, Miller always leapt at the chance to interact with Adobe product teams. He signed up to become a beta tester and shared ideas for features or tools that he thought could help him work more efficiently or create something new.
“People may think that Adobe only listens to me because I’m working on a big movie, but I had that same level of connection even when I was just a dude with a computer,” says Miller. “Adobe’s goal is to make the best tools for the artist, and my goal is to have the best tools available. So I’m a big fan of working with a company like Adobe that actively asks for your feedback. Everyone benefits.”
Back and better than ever
Thanks to the partnership with the Adobe Premiere Pro team, the editing workflow has only gotten better since Miller directed Deadpool. Shared Projects offers rock-solid collaboration, making it easier for multiple editors to work together.
“If you’ve ever worked on feature films, you know that the assistant editors end up spending a lot of their time on housekeeping tasks, like organizing clips into the right bins,” says Miller. “Adobe Premiere Pro is very good at helping us automate and streamline those tasks so that our assistant editors can spend more time working on the actual edit.”
Seamless edits and visual effects
Another reason that Premiere Pro was the best choice for this film is its seamless integration with Adobe After Effects. Every movie has visual effects, whether it’s adjusting lighting to create a night shot or replacing the background in a scene. But with a sci-fi action movie like Terminator: Dark Fate, the number of visual effects grows exponentially. Miller and his team handled post-vis temp visual effects in-house using After Effects to make sure that shots worked before sending them to more costly visual effects vendors.
“You can have people working in Adobe After Effects and feeding it back into the edit in a way that’s organic and doesn’t waste a lot of time,” says Miller. “It makes the edit very quick and easy.”
Final Cut Pro updates to 10.4.7 with new layouts to support the XDR monitor and the arrival of Sidecar.
Apple just rolled out their new revision of macOS Catalina, and along with it a major revision of their video editing application, Final Cut Pro X. The headline feature for filmmakers is a brand-new Metal-based graphics engine that promises faster playback, rendering, exports, and real-time effects on any Mac that supports Metal.
The Sony FX9 is an affordable, compact, lightweight cine camera that can democratize Full-Frame. With a new companion Full-Frame zoom lens, this system is about the same size and weight as Super35 predecessors and still shoots both formats. Full-Frame cine becomes mainstream. And, history repeats itself.
Few things are more terrifying than a corrupted or accidentally formatted memory card. When that recently happened to me, I hastily hopped on the Internet to find memory card recovery software. After several hours of searching, I came across an obscure but free tool, and a few hours later, all my images and videos were safely and securely transferred to my hard drive.
Palmer Luckey has moved on after being ousted from Facebook. His second act as a founder is Anduril, a manufacturer based in Irvine, California. The upstart has raised over $41 million since 2014. They’ve developed the Anvil, a quadcopter designed for short, fast flights at up to 100 mph. The Anvil is equipped with sensors to identify, track, and intercept targets.
According to Luckey, the best way to respond to the threat of a drone entering unauthorized airspace, especially over military bases, is to physically smash into and disable it. “All the soft kill systems are a waste of time,” Luckey told Bloomberg in an interview, referring to a spate of anti-drone technologies that have recently surfaced.
Equipped with electro-optical and infrared sensors that are effective both in daylight and at night, rotors on the bottom, and strategically placed flight-critical parts, the Anvil takes out hostile drones from below. An operator can authorize an attack through a handheld remote while viewing a live feed. One main issue is the damage the Anvil sustains from a high-speed impact, but Luckey believes in its durability and is not too concerned.
Anduril is already working on larger, faster versions of the Anvil that will target small aircraft, helicopters, and even cruise missiles. It remains to be seen whether crashing into rogue unmanned vehicles will be an effective way to secure airspace. Said Luckey, via Twitter: “The best way to kill fast drones piloted by hostile humans is with even faster drones piloted by AI. The United States cannot allow the skies of the world to turn into the Wild West; our ability to take out aerial threats in a matter of seconds is part of the solution.”
A couple of weeks ago, we shared some photos of the world’s largest optical lens, which had just been shipped to the SLAC laboratory in Menlo Park, CA, where it would be joined with the world’s largest digital camera. Unfortunately, we weren’t there for the reveal of this record-breaking lens, but YouTube channel Physics Girl was.
For those of you who haven’t followed our coverage of the Large Synoptic Survey Telescope (LSST), here are the cliff notes. The LSST will live on a mountain in Chile, where it will use a 3.2-gigapixel camera and some massive optics to capture a 15-second exposure of the night sky every 20 seconds. At this rate, the LSST will be able to image the entire visible southern sky every few nights.
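Those cadence numbers add up quickly. A rough back-of-the-envelope sketch of the nightly data volume, where the eight-hour observing night and two bytes per pixel are our own illustrative assumptions, not LSST project figures:

```python
# Figures from the article
exposure_s = 15      # seconds per exposure
cadence_s = 20       # one new exposure every 20 seconds
pixels = 3.2e9       # 3.2-gigapixel sensor

# Assumptions for illustration (not from the article)
night_hours = 8      # length of an observing night
bytes_per_pixel = 2  # 16-bit raw readout

exposures_per_night = night_hours * 3600 // cadence_s
raw_bytes_per_night = exposures_per_night * pixels * bytes_per_pixel

print(exposures_per_night)          # 1440 exposures in a night
print(raw_bytes_per_night / 1e12)   # ~9.2 TB of raw pixel data per night
```

Under those assumptions, the camera would produce on the order of nine terabytes of raw pixels every single night, which is why the project pairs the telescope with serious data infrastructure.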
But the world’s largest digital camera can’t just snap photos on its own—it needs the world’s largest lens. That’s where this monster comes in:
This is, without a doubt, one of the most incredible pieces of optical engineering in history: a two-lens system made of fused silica that’s five feet wide and took five years and $20 million to create. We shared several photos of this lens last month after it arrived at the SLAC National Accelerator Laboratory in California, but Physics Girl host Dianna Cowern was actually invited down to the laboratory to witness the arrival and reveal for herself, and to see the camera that it would be going inside.
She was appropriately impressed:
“It’s easy to forget that this is a camera lens. The one we were looking at, that piece of glass, is a five-foot marvel. It looked like a perfect water droplet,” says Cowern.
In addition to capturing video of the lens as it was (very very VERY carefully) revealed, Cowern was also able to go inside the clean room and see the 3.2-gigapixel camera itself, which she found even more awe-inspiring. This is about the closest look we’ve gotten so far at this camera, and it will be one of only a few because, once fully assembled, many of these components won’t be visible any longer.
So whether you’re interested in science, camera sensors, optics, telescopes, or just want to see something amazing, definitely check out the full video up top. And if you want to see many more pictures of this lens, click here.
What is your background?
I was an ad/design major in college, and when I started working on films, I came up through the production side of VFX. I did some on-set data wrangling, coordinating, plate supervision, and producing, and I’m supervising now.
How did you get involved on this show?
I knew Michael Schaefer, the president of New Regency, from when he was a producer on THE MARTIAN and ALIEN: COVENANT through Scott Free, both of which I consulted on for Ridley. He called me on this one to see if I wanted to get involved.
How was the collaboration with director James Gray?
I found working with James quite interesting, as I got to have many conversations about what the meaning of a scene was and what he was thinking when he wrote it. He’s very open and freely shares his feelings. He’s also very trusting of his crew and creates relationships that allow us to freely share our thoughts about the work as well.
What were his expectations and approach regarding the visual effects?
James was so focused on story, character, and the reality of what was happening that it wasn’t good enough to just create a nice-looking shot. The shots were required to not detract from the story, to tell their own story in service of the overall movie, and to be scientifically plausible. All the graphics had to be designed with function in mind. He would ask, “What do those numbers mean?”, referring to a little text in the corner of the screen. And if there wasn’t an answer about its true function, it had to be fixed.
How did you organize the work with your VFX Producer?
Well, I was the VFX producer too, so I would shift gears as needed. During pre-production, we hired Brad Parker to supervise the plates for us. And he did a great job, which allowed me to focus on planning of the overall production to make sure we shot the pieces we needed to shoot. And then during post, it’s become fairly routine for me to juggle both jobs, so I just have to prioritize based on what’s important.
Can you tell us more about the previs and postvis process?
I hired Halon (with Clint Reagan, whom I did PROMETHEUS with) in pre-production, as we had some tricky set pieces to figure out. Mostly it had to do with how much of the sets the art department would build and how much would be digital extension. The antenna sequence and the end sequence at Neptune come to mind as falling into that category. And then there were aspects where we had to help flesh out the story more once we got into the reality of trying to convert what was on the page into something that was shootable and could successfully be put into production in post.
For postvis, Halon came back, this time with Casey Pyke leading the crew. Now it was all about helping editorial and myself work out the sequences and start populating many of the plates and blue or black screens with the appropriate backgrounds.
Where were the various parts of the show filmed?
All the stage work was done in downtown LA at LA Hangar Studios. Almost all of the interiors were location work around LA that the art department converted or enhanced to fit the vision of the movie. The moon rover chase was shot at Dumont Dunes, CA, in the Mojave Desert.
Can you explain in detail your work on, and the challenges of, Brad Pitt’s fall from the antenna?
The previs was invaluable in figuring this one out. We had to work out how much of the antenna we could build, and whether we could stage the action in a way that allowed only two pieces to be built. Once production built those against blue screen, they would shoot as many pieces with Brad as they could, and then use a double for the stunts. We added a full CG environment and extended the antenna up and down as needed. From there, after he’s blown off the tower, we went into full CG shots for a bit. The close-ups of Brad were on stage, and we added the CG BG to those. Once “Roy” gets into the atmosphere, we went out to San Luis Obispo, CA, and shot some aerial skydiving. Travis, our stunt guy, and a camera operator jumped from 13,000 feet out of a helicopter and did all the really great tumbling out of control, deploying the chute, and the landing shots. Many of those shots required BG replacement, since he’s supposed to be much higher than the 13,000 feet we did. Once in post, the biggest challenge was creating the photorealistic Earth BG environment from up high, before the action takes place. That was about 80,000 feet in the air, and there’s just not much in the way of good reference from that altitude. The normal tricks of adding haze to the CG wouldn’t work because the atmosphere is so thin at that height. So it was a struggle to make things look believable in those conditions, but I think Olaf and Mr. X did a great job.
How did you work with the art department with the various spaceships?
The ILM art department did some of the initial designs for us during the early days. As we started to figure out the angles we’d be shooting, with the help of previs or storyboards, we’d all come to agreement on what would be built. Mostly it was the hatch doors so we could enter and exit the ships and then during the Neptune sequence, the spinning antenna that he flings off of was also built on stage. And then production built all the interior ships that we would film.
Can you elaborate on the design and creation of the Cepheus?
The idea with the Cepheus was that it was a long-range traveling ship. After consulting with JPL and NASA, the feeling was that we’ll be using conventional rocket engines and fuel for quite some time when launching, since you get a very high thrust-to-weight ratio. But once in space, the idea is that you need something to help keep the craft going, so technology like nuclear fusion would be used, since you need low but continuous power. This ship, along with all the craft, was designed with utility in mind. They’re not fancy, but they feel like they could have been designed by NASA rather than a Hollywood production.
What was your approach about the zero gravity shots?
Because we were on a tight budget and shoot schedule, we didn’t have the ability to do elaborate work or use motion control like they did so successfully on GRAVITY. And this also kept in line with James’s desire to not detract from the story, so doing elaborate camera moves or anything like that would start to pull the audience out of the moment.
How were the zero-gravity shots filmed?
The art department built a horizontal set that was the complete, full interior, and then built a smaller piece that hung vertically, which we’d extend by shooting plates from the horizontal one. The actors would be hung from a wire rig in the ceiling and lowered down the set to give the feeling of floating towards camera.
How did you work with the SFX and stunts teams?
Stunts had to deal with all the rigs required to sell the zero gravity. So with Rob and his crew, it was about figuring out how to accomplish what was done in previs, how he could build a rig to achieve that, and how we could best hide it to reduce the paint cleanup. Frank and Roland in SPFX were involved in building a rotating rig used for the Roy-and-Dad spinning scene at Neptune, as well as providing elements to be used in various places. They did the big crash on the moon, where they pulled one of the rovers into a concrete barrier set out in the desert. They used frozen chunks of coffee in the desert to simulate the moon regolith impacts when the ground is hit with the Stiletto fire.
Which stunt was the most complicated to enhance?
I think the above-mentioned crash on the moon. There were many elements that had to be comped together for that: a FG rover piece, a separate piece for the stunt doubles to get their performance to line up with the timing of the shot, and of course the big impact into the concrete barrier, which had a lot of debris. Then all the digital elements got added in, like the moon surface, the solar farm, new concrete support columns, impact pieces, and dust on top. It was quite an involved shot, but I think it works quite well.
How did you design and create the Surge effect?
James was keen to make this as organic and photographic as possible. We ended up filming out the shot with the final environment and before processing, Mark Van Horne at Fotokem created a set up to flash the frames and expose them to a light on his phone, which gave us the cyan blue flash. He also damaged some of the emulsion by spraying acid on the negative and then processed the whole thing. Once those elements were complete, Louis Mackall at Lola took the pieces into the Flame and layered in the frames as needed to get the ramp up and then actual hit to the camera.
What kind of references and direction did you receive for the Surge?
We looked at old film damage reference and what happens when the emulsion sticks together after getting wet and then it’s peeled apart. It had a nice organic quality that really messed up the image in ways that don’t look like what you’d get when you approach this from a strictly digital side.
Can you explain in detail the design and creation of the Moon base?
The idea was that this was once a military base, and over the years, as space tourism began, it was split into public and government sectors. So as Roy lands on the moon, he’s entering the public and commercial areas. In the big matte shot, we introduced some of the commercialism by adding in the logos. James wanted the feeling you get at your local airport to carry over to the moon, so we added storefronts and kiosk advertisements to try to replicate that.
There is a big action sequence on the Moon’s surface. How did the low gravity affect your work?
So in this instance, we were in one-sixth gravity on the moon’s surface, and in a vacuum. So all the anim and effects simulations were run with accurate settings so we could see what would happen with the regolith, dust, and impacts. While the computer may have simulated those correctly, ultimately we changed the settings slightly, as the 100% accurate version didn’t look as believable or match what we referenced from actual moon footage. So we had to take some license and try to find what we thought represented the intent of the effects work.
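The difference those accurate settings make is easy to see with basic ballistics: in a vacuum, debris launched at a given speed travels a distance inversely proportional to surface gravity, so lunar ejecta flies roughly six times farther than it would on Earth. A quick sketch, where the launch speed and angle are arbitrary illustration values, not production numbers:

```python
import math

def vacuum_range(speed, angle_deg, g):
    """Ballistic range on flat ground with no atmosphere: v^2 * sin(2*theta) / g."""
    return speed**2 * math.sin(math.radians(2 * angle_deg)) / g

G_EARTH = 9.81  # m/s^2
G_MOON = 1.62   # m/s^2, about one-sixth of Earth's

v, angle = 10.0, 45.0  # arbitrary debris launch: 10 m/s at 45 degrees

earth = vacuum_range(v, angle, G_EARTH)
moon = vacuum_range(v, angle, G_MOON)
print(moon / earth)  # same impact throws debris about 6x farther on the Moon
```

This is exactly the kind of behavior that looks physically correct to a simulator but can read as wrong to an audience used to Earth-gravity footage, which is why the team dialed the settings away from strict accuracy.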
How did you handle the challenges of the helmet reflections?
On the moon, we tried to keep as much of the real reflection as possible where it worked. We didn’t want to have to create CG rover reflections for every shot, so we always tried to extract that information from the plate. The terrain was created and mapped into the visor. The real helmets have a gold reflection but tint green, so we mimicked that as much as we could, but also vignetted out the middle slightly to allow us to see the actors’ faces.
Can you elaborate on the lighting challenges, especially in space and on the Moon?
Well, in space everything is lit by a single light source; you only get some fill if you’re in close enough proximity to a planet or moon. So we tried to stay as true to that as possible and used rim light and slight amounts of fill just to keep the shadow side from going completely black. On the moon, even though the surface is quite dark, the lack of atmosphere makes it essentially the same as space: a very bright, hard single source, but with a lot of bounce to fill in the shadows. The original lunar photography shows the extreme contrast and brightness quite well, so we tried to mimic that feel.
How did you choose the various VFX vendors?
I’ve had good luck working with MPC in the past on many projects and thought the work they did for us on PROMETHEUS was top notch, so I wanted that same quality for our ships and environments. MPC needed to reduce their workload, and since we were looking for ways to maximize the budget, we agreed we could give Mr. X a portion of their work. I worked with Olaf Wendt on a previous show and knew he and Mr. X could do some great CG and environment work, so it was a natural fit to have them do the antenna sequence, since it was going to be a complex asset and environment. They also picked up some of the exteriors at Neptune and did the nuclear explosion sims. Method Studios (Atomic Fiction when I awarded the work) was a stand-out company in my mind. They’ve done some great CG car work (the DEADPOOL stunt scene), and I liked their environment work, so I gave them the entire moon rover chase, and they also did the spacewalk to Vesta right before the monkey sequence. And of course, as soon as I read the script, I knew I wanted Weta to do the monkey sequence. We didn’t have a ton of money, so we focused our resources on the animation and getting that right, because I knew the rest of the look would quickly fall into place with them. Once we locked animation, their first submission to me was so good that we just had little tweaks to get the look dialed in. I also knew I would need a company that could handle a good volume of wire and rig removal work at a reasonable price, and Bot came highly recommended by colleagues, so I took a chance and awarded them work. I also had two comp artists, Michael Shermis from Pixel Pirates and Brad Gayo, who would help temp up shots during the post-vis phase and then go on to do additional work for final. Once all those pieces were locked in, I started to look at different vendors for all the little miscellaneous things that were different than originally planned or hadn’t been awarded.
Soho VFX came in late in the game to do the monorail sequence and all the comfort room shots to add in the moving footage. We ended up with 12 vendors by the time we wrapped up.
Can you tell us more about your collaboration with their VFX supervisors?
For the most part, almost all the conversations were over the phone or Skype. We would post-vis a scene, send it over to the vendor, and then I’d cineSync with them to walk them through the creative brief and what we were looking for. Olaf and Guillaume did attend some in-person reviews on occasion, since they were local to LA. Michael and Brad were in my office, so I would do daily reviews with them, because it was so easy for them to submit shots and I could drop in to see them when I had a free moment. I would drop by Lola on occasion to work through some of their shots with them. The other vendors were all over the world, so the reviews had to happen virtually. We’d do an internal review, then review with James anything he needed to look at; then I’d do frame annotations if anything needed a drawing, and we’d send written notes. Then in our next vendor review, we’d go over all the comments to make sure they understood the direction. It’s a great benefit that everyone is so used to working remotely, because we can now go after the best company, regardless of where they are.
Is there something specific that gave you some really short nights?
It usually has to do with tight schedules and whether the vendor will be able to turn around enough iterations in time.
What is your favorite shot or sequence?
It’s hard not to love the moon rover sequence. It was an original idea, a new technique for shooting, excellent execution of the shots and I had such a pleasurable working experience with Ryan Tudhope (VFX Supervisor), Jed Smith (VFX Supervisor), Aidan Fraser (VFX Supervisor) and the entire Atomic Fiction/Method crew.
What is your best memory on this show?
Lunches in post. The entire VFX, editorial, sound, and music crew would all have lunch together with James. We shared a lot of laughs and many apple pies. In the end, the relationships we form and the experiences we have are what we’ll remember for many years after we’ve forgotten the trials of making a movie.
How long have you worked on this show?
Just over 2.5 years from the time I first read the script until final VFX delivery.
What’s the VFX shots count?
848 shots, which was 76% of the movie.
What was the size of your team?
Spanning such a long production, we sometimes found ourselves losing crew to other engagements. But for the most part, we had about five people who were around for the bulk of it. And one production supervisor, Elizabeth Willaman, was with me for the entire project, which gave great continuity to the department.
What is your next project?
I’m not sure. I’m consulting on a few projects with New Regency for the time being.
What are the four movies that gave you the passion for cinema?
I grew up in the ’70s and ’80s, which were great decades for kids and movies. They had been making adventure movies for years, but with the advancements in VFX, new story opportunities were starting to open up. Movies like RAIDERS OF THE LOST ARK, BACK TO THE FUTURE, TOP GUN, GHOSTBUSTERS, ALIEN, JAWS, and E.T. were incredibly entertaining. And then going to a place like Universal Studios and seeing JAWS, and how not scary he looked, really opened my eyes to what could be accomplished with editing and, of course, music. 40 years later, who wouldn’t immediately get out of the ocean if they heard the JAWS theme playing?
A big thanks for your time.
WANT TO KNOW MORE? Method Studios: Dedicated page about AD ASTRA on Method Studios website. MPC: Dedicated page about AD ASTRA on MPC website. Mr. X: Dedicated page about AD ASTRA on Mr. X website. Halon: Dedicated page about AD ASTRA on Halon website.
With technology advancing, watching sports becomes more exciting each year, but it’s quite possible that this time, the event organizers at the world track and field championships in Qatar have gone too far!