Steve Wind-Mozley, chief marketing officer at Vizrt Group, makes his predictions for how broadcast tech providers can target the metaverse


I raised a slightly skeptical eyebrow when Mark Zuckerberg announced that Facebook would be morphing into Meta. Like the colonial powers of old, he seemed intent on planting his flag on the supposedly unclaimed territory of the metaverse. To my mind this was a rather antediluvian move, one that ignored the fact that the landscape of the metaverse is already populated and that its fabric is, by definition, shared.

You could argue that the concept of the metaverse has been around for a while. Neal Stephenson coined the term in his 1992 novel Snow Crash, which depicts a dystopian future in which people live in a virtual world because the real one has become too bleak to bear. The metaverse idea outlined in Snow Crash has inspired plenty of people to combine the physical with the virtual, long before Zuckerberg and Facebook’s announcement and before Satya Nadella and Microsoft’s, which followed shortly after.

Perhaps mindful of this, Zuckerberg went on to explain in recent interviews that he sees the metaverse as the next evolutionary step of the internet: a more cohesive meshing of the online world and the real world, where interoperability and integration are two of the cornerstones of the new normal.


I know, this is turning into buzzword bingo – metaverse, evolution, new normal. I’m going to refer to the cloud, AR, VR, and XR in a moment, so hold tight; I will not, however, be trying to shoehorn crypto into this article.

The thing is, the pandemic has changed not only the way we work and play but the way we ‘exist’. The way we experience work, education, play and entertainment has become much more ‘hybrid’ in nature – not just in terms of which physical location we choose to be in at any juncture, but in merging the IP-based, compute-driven world with the more traditional one. Arguably, this hybrid world isn’t the new normal or the next normal, but the now normal.

In this new ‘metaverse’, augmented reality and virtual reality (collectively known as extended reality, or XR) become the foot soldiers in the vanguard of change. They can deliver additional layers of information, enriching the existing experience, offering consumers a level of personalisation that was previously unimaginable, and opening up unique ways of interacting, in some cases replacing in-person interactions entirely. However, for those experiences to be credible, immersive, usable, believable and monetisable, they must be delivered in real time, they must be accurate, and the graphics must be exceptional.

Without that accuracy and quality, these experiences won’t capture and hold consumer attention, and without that buy-in any attempt at traversing the metaverse is a bust. When I say accurate, I mean that they must be true to the viewer’s perceptions and contextually relevant. That implies two things: first, they must be seamlessly rendered with imperceptible levels of latency (that annoying delay between data being sent and it arriving, so no lagging allowed); and secondly, they must be data-driven.

Achieving low latency in the metaverse requires the workflows and computer vision solutions that are already deployed every day in a myriad of use cases: TV broadcasts, sports events, video calls, and live-stream eCommerce (vCommerce).

In these spaces, we are seeing partnerships between tech vendors as they build integrated ecosystems that allow artists and visual storytellers to make the unreality of the metaverse look and ‘feel’ believable. An example of this is the recent native integration of Epic’s Unreal Engine into Vizrt’s live broadcast graphics Viz Engine 4.3, creating a single robust workflow that enables creatives to harness its state-of-the-art rendering capabilities in a seamless and highly accessible manner. It’s this type of coalition that gives the metaverse, and the real-world storytellers and service providers who will construct it, even more options when they look to deliver immersive computer-generated graphics to viewers in ways that will quite literally make them believe their eyes.

A real-world use case of this approach is Al Arabiya, an international Arabic television channel based in Dubai with over 140 million subscribers on social media. The channel adopted this technology and the integrated workflow to bring a new perspective to viewers during critical broadcasts, transporting them into the Amazon Rainforest during conversations about climate change, and into the human bloodstream to see how COVID-19 viral particles compare to blood cells. With this technology, it can bring viewers into a hybrid world, delivering enhanced virtualized experiences without asking audiences to don VR headsets or other specialised equipment.

Secondly, the data-driven aspect speaks to the two cornerstones mentioned earlier: interoperability and integration. As with the internet of which the metaverse is part, interconnectivity is the essential ingredient, and the speed and precision with which data moves around is governed by the Internet Protocol, or IP.

At its core, IP is the network protocol that enables communication between different hosts. While it has certainly come a long way since its invention in 1974, given the level of computing power and the low-to-no latency needed to power the metaverse, there is, I think, only one option for moving the required XR and data content around: NDI.

NDI focuses on solving the main issues around moving video over IP: minimizing bandwidth consumption and latency while maximizing quality. As a ‘standard’ protocol for IP video, NDI is well placed to encourage the interoperability needed to ensure that the meta-ecosystem thrives.
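For the technically curious, here is a minimal sketch of what publishing a single video frame over IP with the NDI SDK can look like. It is an illustration rather than production code: the source name, resolution and solid-colour test frame are placeholders of my own, and the calls follow the SDK’s published C/C++ send API.

```cpp
// Minimal NDI send sketch (assumes the NDI SDK headers and runtime are installed).
#include <Processing.NDI.Lib.h>
#include <cstdint>
#include <vector>

int main()
{
    if (!NDIlib_initialize()) return 1;                  // Start the NDI runtime.

    // Describe the source as it will appear to NDI receivers on the network.
    NDIlib_send_create_t send_desc;
    send_desc.p_ndi_name = "Metaverse Test Source";      // Illustrative name.
    NDIlib_send_instance_t sender = NDIlib_send_create(&send_desc);
    if (!sender) return 1;

    // Build one 1920x1080 BGRA test frame (solid grey, purely illustrative).
    const int width = 1920, height = 1080;
    std::vector<uint8_t> pixels(width * height * 4, 128);

    NDIlib_video_frame_v2_t frame;
    frame.xres = width;
    frame.yres = height;
    frame.FourCC = NDIlib_FourCC_type_BGRA;
    frame.p_data = pixels.data();
    frame.line_stride_in_bytes = width * 4;

    // Push the frame onto the network for any receiver to pick up.
    NDIlib_send_send_video_v2(sender, &frame);

    NDIlib_send_destroy(sender);
    NDIlib_destroy();
    return 0;
}
```

Any NDI-aware receiver on the same network can then discover that source by name and pull the frames, which is exactly the kind of plug-and-play interoperability the meta-ecosystem will depend on.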

As more experiences, from commerce to enterprise solutions for knowledge workers, begin to blossom in the metaverse, the use of augmented and virtual reality will increase, along with the need to move more video (attached to a lot of data, and near instantaneously) over existing IP networks. At the same time, the compute capabilities driving this new paradigm will need to be highly optimized, be they on-premises or, increasingly, in the cloud. See, I told you the cloud would be in here somewhere.

Ready access to live IP-based video and XR solutions becomes even more important as the ability to contribute, participate, and influence correlates increasingly with mission success. Some tech vendors that have historically worked within the broadcast space are perhaps best positioned to champion this democratization of the metaverse, as they are experienced in harnessing the power of IP, compute, and networks to the creative plow of visual storytellers everywhere.

Facebook may have planted its flag in the new continent that is the metaverse, but innovative technology pioneers across broadcast and live video production are the ones with the wealth of experience and skills across many of the competencies that will become ever more vital to the exciting task of taming this brave new (albeit already existing) world. Who knows, maybe the GPUs currently mining cryptocurrencies can be reassigned back to their original mission of delivering engaging graphics… oh whoops. Sorry.


Steve Wind-Mozley is chief marketing officer at Vizrt Group.