The Metaverse & NFTs


By Thomas Graham

What is the metaverse?

Over the last six months, the metaverse has become one of the most talked-about and frequently misunderstood topics around. So what exactly is it?

From a technical perspective, the metaverse is emerging as a set of persistent, linked 3D virtual environments that together form a virtual universe. More simply, it is the internet with a highly visual layer of interactive and immersive content on top. Coupled with web3 economic functionality, such as decentralized blockchain payments and smart contracts, the metaverse promises a cornucopia of options for developers and creators to engage and incentivize users with exciting content experiences.

The online game Roblox is one of the most popular metaverse platforms currently available. Credit: Roblox

From 2D to 3D

The metaverse is often linked to VR and AR because these technologies allow users to fully immerse themselves in a 3D environment. Inside these 3D worlds, users interact with each other, play games, and build communities. But the metaverse is more than 3D gaming platforms and we don’t need cumbersome VR goggles to interact with it today. A rapidly growing number of companies are working on ‘metaverse’ initiatives that might take place in 3D environments, but for most users, they will ultimately be experienced through conventional 2D smartphone screens.

For example, large community-driven NFT projects, like Bored Ape Yacht Club and Meebits, along with the latest generation of upcoming drops, increasingly offer NFT holders and community members access to 3D assets and immersive experiences. However, most of the core user experience, including viewing persistent digital assets and chatting to distributed communities, still plays out on our smartphones. It will be some time until 3D engines, internet bandwidth, compute resources and consumer hardware are advanced enough to allow users to experience fully immersive ‘Ready Player One’ style metaverse content on a grand scale.

The highly desirable Bored Ape Yacht Club NFT collection uses procedural generation based on 170 character traits to create the collection’s 10,000 unique apes. Credit: Bored Ape Yacht Club
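The trait-based procedural generation behind collections like this can be sketched in a few lines. The snippet below is a minimal illustration only: the trait pools and category names are invented for the example (the real collection draws its 10,000 apes from 170 traits across categories like fur, eyes, mouth, and headwear, with rarity weighting that is not modeled here).

```python
import random

# Hypothetical trait pools -- real collections use far larger,
# rarity-weighted pools across many more categories.
TRAITS = {
    "fur": ["brown", "golden", "zombie", "robot"],
    "eyes": ["bored", "laser", "sleepy"],
    "mouth": ["grin", "cigar", "dumbfounded"],
    "hat": ["none", "fez", "halo", "beanie"],
}

def generate_collection(size, seed=0):
    """Sample unique trait combinations until `size` distinct characters exist."""
    rng = random.Random(seed)
    seen = set()
    collection = []
    while len(collection) < size:
        combo = tuple(rng.choice(pool) for pool in TRAITS.values())
        if combo not in seen:  # enforce uniqueness across the collection
            seen.add(combo)
            collection.append(dict(zip(TRAITS.keys(), combo)))
    return collection

apes = generate_collection(10)
```

Each resulting dictionary describes one character; in a real pipeline the trait list would then drive layered image compositing and be written into the NFT's on-chain or IPFS-hosted metadata.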

Synthetic media and the metaverse

Synthetic media has a multitude of applications in the metaverse, allowing us to create user-specific content faster and with less human intervention. When we consider that the metaverse will be a more visual compute platform filled with user-generated content, it makes sense that AI models and synthetic media will play a large role in scaling immersive and interesting metaverse worlds. A good example today is how artists use AI models, like GANs, to create generative art represented as NFTs. Other NFT collections utilize language models like OpenAI’s GPT-3 to make their characters interactive and give them unique personalities.

We believe one of the most significant applications of synthetic media in the metaverse will be the generation of hyperreal personal avatars. Virtual representations of users are increasingly being generated with AI to produce life-like personas and developers are similarly exploring ways of wrapping AI around existing “dumb” avatars to bring them to life. To heighten the immersion of virtual beings in the metaverse, avatars can even be endowed with AI-generated voices to help users better express themselves or create a new identity entirely. Elsewhere in the metaverse, AI is being used to populate games with intelligent non-playable characters that can don clothing that was also fashioned by machine learning algorithms.

From surreal to hyperreal

Today, many new-generation metaverse avatars take cartoonish or stylized forms. However, hyperreal synthetic media will ultimately transform avatars into deeply personal reflections of our real-world selves, while still allowing us to change our appearance, such as our hair or clothing. For embodied meetings and phone calls in VR, hyperreal synthetic media provides the most compelling vision of a metaverse that can replicate the authenticity and intimacy of real-world interactions.

Facebook’s Codec avatars provide a glimpse of how hyperreal synthetic media could change the game when it comes to metaverse interactions. Credit: Facebook.

The importance of getting the metaverse right

Creating new content based on training data provided by a professional human performer, or scaling a person’s likeness beyond their original performance using AI, are hot topics of debate today. There is no doubt that significant effort is required to establish ethical, legal, and economic norms to ensure that the livelihoods, privacy, and online experience of creators and users are protected from abuse. We frequently work with policymakers and lawmakers on these issues and created Synthetic Futures to raise the level of discourse around the impact of synthetic media on society. Ultimately, we are hopeful that new web3 economic modalities and increased access to AI content creation tools will give users and creators more options to benefit from what they create, both financially and in terms of attribution and permissioning.

With the metaverse moving towards the mainstream, synthetic media will be integral to delivering the avatars, NFTs, and experiences that bring its worlds and communities to life. As the technology continues to evolve, hyperreal synthetic media, in particular, will be at the forefront of the metaverse’s evolution, providing ‘meatspace’ authenticity in a wholly digital world.

