
Into the Metaverse: 6 Next-Gen Technologies Shaping the Future of Reality

For the metaverse to flourish, a set of next-gen technologies will need sustained investment and development. But as Mark Zuckerberg said in the video announcing Facebook’s name change to Meta, this is a process that may take decades.

Still, even though a full-blown metaverse remains out of reach today, Gartner’s annual report on tech hype notes that mixed reality (MR) has reached the plateau of productivity, with more and more users adopting it thanks to the many platforms already on the market. And as Allied Market Research points out, the MR industry is expected to reach $454.73 billion by 2030.

So, what is the metaverse, and what are the tech trends that will drive its movement from vision to reality? Read on to find out!

What is the Metaverse?

The metaverse is a virtual-reality space in which users can interact with a computer-generated environment and with other users. According to Mark Zuckerberg’s keynote, the metaverse is, above all, about “experiences.” Meta’s take on the metaverse is “a new phase of interconnected virtual experiences using technologies like virtual and augmented reality.”

To understand what is already happening in this space, we can look at what big companies like Apple, Microsoft, and Meta are doing and examine the topics being discussed at major industry events like AWE and ISMAR. Based on that analysis, here are six of those next-gen technologies.

3D Reconstruction

When we talk about extended reality (MR/VR/AR), we are dealing with spatial information: 3D models, audio, and other multimedia that can be placed in space. The first step toward making that happen is developing actual 3D content.

We’ve seen what the iPad’s LiDAR and the HoloLens depth sensors are capable of, and we can also look to the metrology industry to see what the state of the art in 3D reconstruction looks like. Companies like Zeiss, Nikon, and Faro make 3D reconstruction sensors with sub-millimeter accuracy. Of course, these devices are not cheap by any means. Still, consumer versions of these technologies exist, such as Apple’s LiDAR in its mobile devices and the Matterport 3D camera, which are not as precise as Zeiss’ or Faro’s solutions but come at prices consumers can afford. There’s also Microsoft’s Azure Kinect, which takes more of an IoT-focused approach. Even Canon and other lens makers are releasing lenses built specifically for capturing VR content.

3D reconstruction itself is only part of the story; we also need to think about how to handle all that data and make it actionable for users. Think of how Sketchfab hosts an enormous library of user-generated 3D content and lets people share it seamlessly over social media like Facebook. We need an environment that lets metaverse users generate and access content on top of the real world, or inside virtual worlds, from anywhere and across multiple devices, depending on the context.

We also need to consider each MR device and how it senses the world. As devices like the HoloLens, Magic Leap, and HTC Vive Pro have shown, we can reconstruct the world in 3D in real time, letting digital content interact with our physical space as it changes. For example, if we move a piece of furniture in the real world, the 3D content should react to it and extend our reality.

Currently, these sensors are limited by what the device’s SoC (system on a chip) can handle. But since the fidelity of the reconstruction directly shapes how real and digital content interact, we can expect a leap in the coming years as more powerful silicon becomes available for XR devices. The levels of immersion will be astonishing!
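
To make this concrete, here is a minimal sketch of the math at the heart of depth-based reconstruction: back-projecting a depth image into a 3D point cloud using the pinhole camera model. The intrinsics and the depth frame below are placeholder values for illustration, not readings from any particular device.

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth image (in meters) into a 3D point cloud
    using the pinhole camera model: X = (u - cx) * Z / fx, etc."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    # Stack into an (N, 3) array and drop invalid (zero-depth) pixels
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]

# Placeholder intrinsics -- real values come from the device's calibration
depth = np.random.uniform(0.5, 4.0, size=(480, 640))  # fake depth frame
cloud = depth_to_point_cloud(depth, fx=525.0, fy=525.0, cx=320.0, cy=240.0)
print(cloud.shape)  # (307200, 3)
```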

Artificial Intelligence (AI)

Now that we’ve covered 3D reconstruction, we can look at the next big XR integration: AI. Consider an analogy. Our brains process spatial experiences along two neural pathways. The first is the dorsal pathway, which tracks relationships through space and movement; in XR systems, this role is played by 3D reconstruction and algorithms like SLAM (simultaneous localization and mapping) and visual-inertial odometry. The second is the ventral pathway, which recognizes and classifies objects, giving us context about what is happening around us; in XR systems, this is the job of AI. Working in tandem with 3D reconstruction, AI lets the system classify the environment and deliver context-rich information in the right scenarios.
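
As a rough illustration of the two pathways working together, here is a toy sketch. The pose, detection, and classifier are all hypothetical stand-ins: the “dorsal” side is represented by a camera pose of the kind a SLAM or VIO tracker would supply, and the “ventral” side by a placeholder classifier where a real system would run a trained model.

```python
import numpy as np

def classify_patch(image_patch):
    """Stand-in for the 'ventral' pathway: a real system would run a
    trained classifier (e.g., a CNN) here. Hypothetical placeholder."""
    return "chair", 0.93  # (label, confidence)

def to_world(camera_pose, point_camera):
    """'Dorsal' pathway output: SLAM gives us the camera pose, so we can
    lift a detection from camera coordinates into world coordinates."""
    R, t = camera_pose  # 3x3 rotation and translation vector from SLAM
    return R @ point_camera + t

# Hypothetical frame data: pose from a SLAM/VIO tracker, a detected patch,
# and the patch's 3D position in the camera frame (from the depth sensor)
camera_pose = (np.eye(3), np.array([0.0, 1.5, 0.0]))
label, conf = classify_patch(image_patch=None)
anchor = to_world(camera_pose, np.array([0.2, -0.1, 1.8]))
print(f"{label} ({conf:.0%}) anchored at {anchor}")  # context label in world space
```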

That’s not all. We can also highlight empathetic AI, which can augment our experiences and the way we communicate with and understand others. For example, the system could warn us if we start to fall asleep while driving or performing a hazardous task, something companies like Affectiva are working toward. Capabilities like these bring us closer to context-aware systems that deliver real-time, valuable information to metaverse users.

Internet of Everything (IoE)

But just as AI is needed to deliver context-rich information, we need data to feed such context-aware systems and make them work. For that, connectivity between devices is a must.

Think of how smart devices let you issue commands to your preferred digital assistant, such as Alexa or Google Assistant. Now take that a step further, from the Internet of Things (IoT) to the Internet of Everything (IoE): we can feed our XR systems and the metaverse with data and interactions that are relevant to the user.

The Meta announcement video showcased a reconstructed apartment in which a user turned on the TV just by looking at it and making a gesture. The IoE is what would make this a seamless reality, letting us control systems beyond our XR device and move past the limitations of voice commands with virtual assistants.
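
As a sketch of how such gaze-plus-gesture control might be wired together, consider the toy example below. The device registry, positions, and actions are entirely hypothetical; a real IoE setup would talk to devices over an actual protocol such as MQTT or a smart-home API.

```python
import numpy as np

# Hypothetical registry of connected devices, each with a world-space
# position and an action to trigger -- stand-ins for a real IoE backend
DEVICES = {
    "living_room_tv": {"position": np.array([0.0, 1.2, 3.0]),
                       "action": lambda: print("TV: power toggled")},
}

def gaze_target(origin, direction, max_angle_deg=5.0):
    """Return the device whose position lies closest to the gaze ray,
    within a small angular tolerance."""
    best, best_angle = None, max_angle_deg
    for name, dev in DEVICES.items():
        to_dev = dev["position"] - origin
        cos = to_dev @ direction / (np.linalg.norm(to_dev) * np.linalg.norm(direction))
        angle = np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))
        if angle < best_angle:
            best, best_angle = name, angle
    return best

def on_gesture(gesture, eye_origin, eye_direction):
    """Fuse gaze + gesture: a 'pinch' while looking at a device fires it."""
    target = gaze_target(eye_origin, eye_direction)
    if gesture == "pinch" and target:
        DEVICES[target]["action"]()

# User looks toward the TV and pinches
on_gesture("pinch", np.array([0.0, 1.5, 0.0]), np.array([0.0, -0.1, 1.0]))
```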

Digital Humans

A new level of immersion would also give a prominent role to digital humans: the NPCs (non-playable characters) and virtual assistants of the metaverse. According to Gartner’s 2021 tech hype cycle report, this technology will become relevant within the next ten years, and XR is its best-fit scenario.

Currently, there are companies like UNEEQ dedicated to this kind of tech. We will experience a giant leap from what we have today with Alexa or Cortana, which are faceless abstractions, to more empathetic digital assistants that can act as our concierges and deliver context-rich information, making the experience even more comfortable for the user. (This assumes the tech will go beyond the Turing test while remaining obviously AI, avoiding the dystopian scenario where we can’t tell whether we’re speaking to a real person or a machine.)

The Actual Metaverse

At almost every AWE conference, Ori Inbar, CEO of Super Ventures, has mentioned the AR Cloud and how it will allow us to go fully spatial. In his words, “AR researchers and industry insiders have long envisioned that at some point in the future, the real-time 3D (or Spatial) map of the world, the AR Cloud, will be the single most important software infrastructure in computing, far more valuable than Facebook’s Social Graph or Google’s page rank index.”

Recently, at the AWE 2021 Asia edition, Alvin Wang Graylin, President of HTC China, highlighted some opportunities and challenges for the metaverse. He also shared six laws of the metaverse, which are worth mentioning:

  • There is only one metaverse
  • No person/company owns the metaverse
  • The metaverse is open and for everyone
  • The metaverse is device-independent
  • All users will have agency, and they can impact the metaverse
  • The metaverse is the next evolution of the internet

Neural Interfaces

The neural interfaces being developed right now will be enablers for XR and other devices. However, there is still a long way to go, since connecting neural pathways to a chip is much easier said than done.

For example, Neuralink is still in the R&D phase, while NextMind launched a product this year that lets you control devices using non-invasive EEG technology. A lot of research is still needed before this area becomes relevant in the market. But as shown in this TED Talk, we can imagine a future where we control the inputs and outputs of our nervous systems, opening the door to entirely new interfaces for communication.
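
To give a flavor of the signal processing a non-invasive BCI performs, here is a toy sketch that estimates the power of the EEG alpha band (8-12 Hz) and maps it to a discrete control event. The signal, threshold, and mapping are fabricated for illustration; real systems rely on trained classifiers and per-user calibration rather than a hand-picked threshold.

```python
import numpy as np

def band_power(signal, fs, low, high):
    """Average power of an EEG signal within a frequency band,
    estimated from its FFT."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= low) & (freqs <= high)
    return spectrum[mask].mean()

def decode_intent(signal, fs=256, threshold=5.0):
    """Toy decoder: treat strong alpha-band (8-12 Hz) power as a
    'select' command. The threshold is an arbitrary illustration."""
    return "select" if band_power(signal, fs, 8, 12) > threshold else "idle"

# One second of synthetic EEG: a 10 Hz alpha rhythm plus noise
t = np.linspace(0, 1, 256, endpoint=False)
eeg = 3.0 * np.sin(2 * np.pi * 10 * t) + np.random.randn(256) * 0.5
print(decode_intent(eeg))  # -> "select"
```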

Make Your Move to the Metaverse

At Wizeline, we are no strangers to the metaverse. Our teams have contributed to conversations and projects around AR, VR, and XR, and we’re always up for the next challenge! 

Wizeline works with Fortune 500 brands and growing startups to build end-to-end digital products, implementing new trends to stay ahead of the competition. If you’re ready to explore how your brand can leverage the metaverse, we’d love to chat. Contact us today at consulting@wizeline.com to start the conversation!


Posted by Aisha Owolabi on November 3, 2021