The cloud, AI and virtual production were key themes of this year’s NAB, as the huge and busy Las Vegas show successfully reunited the industry after two years away
After two years of cancelled shows, NAB finally returned to international visitors this week. Broadcast flew over to Las Vegas to catch up with a wide range of companies, in our first face-to-face meetings since 2019. And, despite fears that Covid concerns might limit visitor numbers, especially from outside the States, the show was busy and buzzy, with a truly global audience in attendance.
In terms of trends, conversations about the cloud absolutely dominated, often combined with AI, followed some way behind by virtual production.
Sony kicked off NAB 2022 on Sunday morning with a press conference where it singled out virtual production, the cloud, and the trend for live music and sports events experimenting with a ‘cinematic’ look as themes for its presence at the show.
It showcased its range of large-scale curved Crystal LED screens (pictured in the background in the image above), which are being pitched for Volume stages. Sony said the Crystal LED screens offer faithful colour reproduction and large dynamic range for virtual production backdrops.
In common with other manufacturers at the show, Sony showcased many products that weren’t completely new but hadn’t appeared on the showroom floor before. These included its HDC-F5500 system cameras, which slot into the live production workflow with the same form factor as a conventional system camera, but with the ability to give live sports and music events the shallow depth of field, cinematic look. Most recently, they were used to cover the half-time show at the Super Bowl. Sony also released a new 4K HDR system camera, the HDC-3200.
Sticking with sport, Sony showcased a number of possibilities for innovative fan engagement through real-time skeletal tracking from Hawk-Eye, including ‘live-to-AR’, which it is able to generate within two seconds of the live action. Sony expects this to be a product within the next 12 months.
Adobe and Frame.io
Adobe used NAB to promote the integration of Frame.io into Premiere Pro. Adobe acquired the video collaboration platform last year for $1.275bn, and is now including it as part of Creative Cloud subscriptions. Adobe is giving Creative Cloud subscribers the ability to use Frame.io to work on up to five different projects concurrently with another remote user; use Frame.io for an unlimited number of reviewers; make frame-accurate comments and annotations directly inside Premiere Pro and After Effects, without leaving the timeline; and use Frame.io accelerated file transfer technology for fast uploading and downloading of media, with 100GB of dedicated Frame.io storage provided as part of every subscription.
Steve Warner, vice president of Digital Video and Audio at Adobe, said: “The combination of Premiere Pro and Frame.io gives customers the world’s only end-to-end solution for video creation from ingest to editorial to output. This is the first step toward building a powerful, cloud-based platform for the future of video creation.”
LiveU’s focus at NAB 2022 was on its LU800 multicamera remote production device. The LU800 takes and synchronises the feeds from up to four cameras, and you can then vision switch the camera feeds using a touch screen display on the LU800 itself, or control everything remotely. The LU800 transmits the live vision mix by bonding up to 14 connections, with up to eight 5G/4G internal dual SIM modems, supporting up to 70Mbps. The LU800 was used at the Tokyo Olympics, the Beijing Games and a range of sports. Several Premier League clubs have also invested in the product for producing content for their own D2C platforms.
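The bonding idea described above can be illustrated with a simple sketch: packets of the encoded stream are shared out across several cellular links in proportion to each link’s measured throughput. To be clear, the link names, bandwidth figures and greedy scheduling rule below are hypothetical illustrations, not LiveU’s actual algorithm.

```python
# Illustrative sketch of cellular bonding: distribute encoded-stream
# packets across several links in proportion to measured throughput.
# Link names and bandwidth figures are hypothetical, not LiveU's.

def bond_packets(packets, links):
    """Assign each packet to the link with the most spare capacity.

    links: dict mapping link name -> estimated throughput in Mbps.
    Returns a dict mapping link name -> list of assigned packets.
    """
    # Track how many packets each link has been given so far.
    load = {name: 0.0 for name in links}
    assignment = {name: [] for name in links}
    for packet in packets:
        # Pick the link whose load-per-unit-bandwidth is lowest,
        # so faster links carry proportionally more of the stream.
        best = min(links, key=lambda n: load[n] / links[n])
        assignment[best].append(packet)
        load[best] += 1.0
    return assignment

# Example: four modems of differing quality sharing 70 packets.
links = {"5g_a": 30, "5g_b": 20, "4g_a": 12, "4g_b": 8}  # Mbps (hypothetical)
result = bond_packets(list(range(70)), links)
```

In practice the receiver also has to reorder packets that arrive at different times over different links, which is where the synchronisation the LU800 performs comes in.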
Virtual studio specialist Zero Density demonstrated the benefits of its Traxis TalentS unit, which is an AI-powered, marker-less tracking system. Zero Density developed it to enable more accurate tracking of talent inside virtual studios. It precisely extracts the talent’s 3D location from the image, sending the tracking data to Reality Engine to create accurate reflections, refractions and the virtual shadows of the talent inside the 3D space.
Zero Density also announced the launch of a new online learning platform for creators of real-time broadcast graphics and virtual sets. The Zero Density Academy features more than 50 in-depth video lessons, all free to access.
Grass Valley’s press conference shone the spotlight on how it has grown its cloud-based infrastructure platform AMPP to become what CEO Andrew Cross called “the SDI jack of the future”. He said almost all TV studios and production trucks used Grass Valley products, and Grass Valley had worked to transition all aspects of the workflow to the cloud. He displayed a long list of different components that have all been redesigned from hardware devices into cloud-based software processes. Cross said there would still be a requirement for hardware for certain parts of the workflow, but the company was aiming to become “the Apple of the broadcast industry” in the way its hardware and cloud-hosted software would combine seamlessly.
The metaverse finally came into conversation at Canon’s stand, with the company dedicating a fairly sizeable area to VR. In a bit of a throwback to NAB 2017, Canon has developed a VR fisheye lens, for use with the Canon EOS R5 and R5C cameras. The snappily titled RF5.2mm F2.8 L Dual Fisheye Lens captures stereoscopic 3D 180-degree VR imagery to a single sensor at 8K. The idea is that you can move from traditional shooting to stereoscopic capture with a simple lens swap and create immersive 3D experiences at high quality and high resolution. The list price for the lens is US$1,999.
As usual, Blackmagic had a big announcement for the show, with the launch of Blackmagic Cloud. Aimed at collaborative post-production projects using Resolve, users pay $5 per month to create and host a shared library that can be used by creatives working on the same project. Video and audio editors, colourists, VFX artists and so on can all access the Resolve project files through Blackmagic Cloud, from any location. When the project is finished, the work can be exported and the project closed, with no further payment required. Blackmagic will also refund any of the $5 that hasn’t been used. To make it easier to get the project files into the cloud, Blackmagic has created three storage devices, to be used in conjunction with Blackmagic Cloud. The cloud functions are built into the new hardware and into the major components of DaVinci Resolve.
The first device is the Blackmagic Cloud Pod, which is US$395. Customers record onto USB-C flash disks and use the Blackmagic Cloud Pod to make the disk available on the cloud. Blackmagic Cloud Pod doesn’t have any storage internally and has two USB-C ports so it can host two separate USB disks on the network at the same time.
Next up is the Blackmagic Cloud Store Mini, which is available with 8TB of flash memory. Blackmagic Cloud Store Mini costs US$2,995. Finally, there’s the Blackmagic Cloud Store (pictured above), which is available in 20TB, 80TB and 320TB models, and will be available later this year, starting from US$9,595.
Mobile Viewpoint has expanded the capabilities of its AI sports production system, iQ Sports Producer, to enable it to work with a three-camera setup. The system automates the production of different sports events, with AI controlling the acquisition and vision mixing between the different cameras. The three-camera setup makes it possible to capture a lower tier football match, for example, with a camera installed at the centre of the pitch and a camera behind each goal. The AI tracks the ball, and vision mixes between the three cameras accordingly. It also recognises when there are set pieces, such as corners, free kicks and throw-ins, and adapts the vision mixing to mimic a manned production. BT Sport is currently testing the system at Dagenham & Redbridge FC, for potential use in automating the capture of its National League football coverage from the stadium.
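The cut logic described above can be sketched in miniature: given the tracked ball position, pick whichever of the three cameras best covers the play. This is a deliberately toy, rules-based stand-in; iQ Sports Producer’s actual AI tracking and mixing logic is not public, and the thresholds and camera names here are invented for illustration.

```python
# Toy sketch of camera selection driven by a tracked ball position.
# The real system uses AI tracking; here "tracking" is just an x
# coordinate along the pitch, and the cut rule is a simple threshold.

def select_camera(ball_x, pitch_length=105.0):
    """Pick one of the three cameras described above: one at the
    centre of the pitch and one behind each goal. Thresholds are
    hypothetical."""
    if ball_x < pitch_length * 0.2:    # play near the home goal
        return "goal_cam_home"
    if ball_x > pitch_length * 0.8:    # play near the away goal
        return "goal_cam_away"
    return "centre_cam"                # midfield play

print(select_camera(52.5))  # kick-off spot -> "centre_cam"
```

A production version would also need hysteresis (so the mix doesn’t flicker between cameras near a threshold) and the set-piece detection the article mentions, which would override the position rule for corners and free kicks.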
EVS showcased its XtraMotion system at NAB 2022, which is aimed at generating super slo-mo feeds out of normal camera feeds. It makes it possible for events that can’t afford to hire a dedicated super slo-mo camera, or where it isn’t practical to use one, to benefit from similar-looking footage. The system processes the camera feed in the cloud, using AI to create interpolations of frames, turning every five frames into 15 frames. It takes 30 seconds for the XtraMotion system to turn standard footage into super slo-mo. Last year, Fox Sports used XtraMotion for its NASCAR highlights coverage (as seen above).