Meta, the company formerly known as Facebook, today hosted a presentation where it announced its rebranding and showed what it thinks the future of computing will look like. Amid all that noise, it actually had some interesting technology demonstrations. These included lifelike Codec avatars and an environment for them to inhabit, which the company said was rendered in real time and reacted to real-world objects. Mark Zuckerberg also spoke about the company's work on neural interfaces that let you control a computer just by moving your fingers.
We'll discuss the demos in a moment, but they're definitely worth seeing for yourself. You can view them in the Facebook keynote archive, or on Reuters' YouTube feed if you prefer that interface.
During the presentation, Meta showed off the work it has done on its Codec avatars to give users greater control over their eyes, facial expressions, hairstyles, and appearance. The company also demonstrated how an avatar's hair and skin react to different lighting conditions and environments, and even showed work on interactive clothing.
The presenter made it clear that this technology is "most definitely still research," and it's no wonder: creating one of these avatars requires an immense amount of data, and Meta probably uses very powerful computers to render them. But it's at least exciting that Meta's goal is to let us render ourselves with graphics on par with what advanced video game engines can produce.
The company also showcased its real-time environment rendering, which it believes will eventually give your avatar a place to interact with others. The system also lets people interact with real-life objects, with changes reflected in the virtual world. Realistic virtual environments aren't new, but making a change to an object in the real world and seeing that change appear in the virtual one is a genuinely cool demo (although the tracking markers added to the real-life objects are a little distracting in how much they stand out).
When it comes to the neural interface, Meta relies on something called electromyography, or EMG, to translate the electrical signals your brain sends to your hand into computer commands. The company showed off the wristband it uses to do this earlier this year, and today's demo didn't really show anything that wasn't already there in 2019, when Meta bought a company called CTRL-Labs and acquired its technology, but it's still good to see the work being developed.
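Meta hasn't published the decoding pipeline behind the wristband, but the basic idea of EMG input, picking muscle activations out of an electrical signal measured at the skin, can be sketched with a toy example. Everything below (the synthetic trace, the threshold value, the smoothing window) is invented for illustration and is not CTRL-Labs' actual method:

```python
import numpy as np

def detect_activation(emg, fs=1000, threshold=0.3, window_ms=50):
    """Mark samples where an EMG trace shows muscle activation.

    Rectifies the raw signal, smooths it into an amplitude envelope
    with a moving average, and flags samples where the envelope
    crosses a fixed threshold (all parameters are illustrative).
    """
    rectified = np.abs(emg)                      # EMG bursts swing both ways; keep magnitude
    win = int(fs * window_ms / 1000)             # smoothing window in samples
    kernel = np.ones(win) / win
    envelope = np.convolve(rectified, kernel, mode="same")
    return envelope > threshold

# Synthetic 1-second trace: low-amplitude noise with a burst
# (a hypothetical "pinch" gesture) from 400 ms to 600 ms.
rng = np.random.default_rng(0)
fs = 1000
signal = 0.05 * rng.standard_normal(fs)
signal[400:600] += 0.8 * rng.standard_normal(200)

active = detect_activation(signal, fs=fs)
```

A real system would map many electrode channels to fine-grained finger movements with a learned model, but the rectify-smooth-threshold step above is the classic first stage of EMG onset detection.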
Meta has said it's pursuing EMG input rather than brain-reading devices, so this may have been a preview of how you'll interact with its future hardware. Compared to visions of people playing VR ping-pong in front of an avatar audience, this seemed like the most realistic thing Meta showed on Thursday. That doesn't mean the Oculus (er, Meta, apparently) Quest 3 will ship with an EMG wristband, but in a sea of hype without a lot of substance, this demo and the other actual research Meta showed were a sight for sore eyes. Check them out if you're looking for a break from the rest of the Facebook / Meta news.