Meta CEO and Founder Mark Zuckerberg has recently shared how the rise of AI and metaverse technologies is reshaping the way consumers and businesses can experience the physical and digital worlds. For Zuckerberg, it is the combination of our physical and digital worlds that defines our reality.
The way most people access technology can be rather limiting, Zuckerberg opines, given that we too often rely on screens to tap into virtual spaces and content. In his view, this pulls us away from the moment and the people we are physically with. Mixed reality, by contrast, brings digital objects into the physical world and can create fully immersive experiences to explore.
At the Connect event, Zuckerberg explained that many of the foundational technologies needed to make this vision a reality are either already here or on the way.
Advances in Artificial Intelligence
Technology firms are starting to build state-of-the-art AI into apps that people already use. This includes Meta’s image generation model Emu (Expressive Media Universe). This uses text prompts to generate high-quality, photorealistic images in just seconds.
Instagram will soon adopt technology from Emu, including the ability to transform photos or co-create AI-generated images with connections. One example is the ‘restyle’ function, which lets the user reimagine an image by applying a range of visual styles, such as “watercolour” or “collage from magazines and newspapers, torn edges.”
Another function, ‘backdrop’, leverages learnings from Meta’s Segment Anything Model to let the user change an image’s scene or background.
Prompts such as “put me in front of a sublime aurora borealis” or “surrounded by puppies” keep the subject in the foreground while generating the background described.
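At its core, this kind of feature depends on a segmentation mask that separates the subject from everything else in the frame. The sketch below is an illustration only, not Meta's implementation: it assumes a boolean subject mask (such as one a model like the Segment Anything Model might produce) and composites the subject onto a newly generated background with NumPy:

```python
import numpy as np

def swap_background(image, mask, new_background):
    """Composite the masked subject onto a new background.

    image: (H, W, 3) array holding the original photo
    mask: (H, W) boolean array, True where the subject is
    new_background: (H, W, 3) array for the generated scene
    """
    mask3 = mask[..., None]  # broadcast the mask across colour channels
    # Keep original pixels where the mask is True, otherwise use the new scene
    return np.where(mask3, image, new_background)

# Tiny 2x2 example: one "subject" pixel is kept, the rest are replaced.
photo = np.array([[[255, 0, 0], [0, 0, 0]],
                  [[0, 0, 0], [0, 0, 0]]], dtype=np.uint8)
subject_mask = np.array([[True, False],
                         [False, False]])
scenery = np.full((2, 2, 3), 100, dtype=np.uint8)

result = swap_background(photo, subject_mask, scenery)
```

In practice, the mask would come from a segmentation model and the replacement scene from an image-generation model guided by the text prompt; the compositing step itself stays this simple.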
Images created with restyle and backdrop will indicate the use of AI to help reduce the odds of people mistaking them for human-generated content. Meta is also experimenting with forms of visible and invisible markers to help people distinguish AI-generated content.
To further expand the possibilities of artificial intelligence, Meta is building AI Studio, a new platform for creating AIs that can help people get things done.
People will be able to interact with these AIs across the Meta range of products. This can include profiles on Instagram and Facebook, and the ability to chat with them in WhatsApp, Messenger, and Instagram. Eventually, these will become embodied as avatars in the metaverse.
This has begun rolling out in beta in the U.S. There is no indication, however, of when other parts of the world will be able to experiment with the technology.
Meta AI is a new assistant that people can interact with like a person. This functionality uses a custom model based on Llama 2 technology and has access to real-time information through a partnership with Bing search. Emu is built into Meta AI, so the user can generate high-quality photorealistic images in seconds.
The plans also include developing AIs with more personality, opinions, and interests, which are said to be more fun to interact with.
In a second article, some of Meta’s new hardware, designed for furthering interactions with the metaverse, is profiled.