Former AMD graphics boss and current Intel graphics guru Raja Koduri wrote an article on Intel’s PR site about the company’s plans to create the “plumbing” of the metaverse, which includes some interesting projections for what’s to come. Since this is Intel PR, the piece naturally leans in the direction of how Intel can power all kinds of fun interactions, and has been doing so for a long time now (no argument there), but what Koduri’s article makes clear is that the metaverse as recently demonstrated by Mark Zuckerberg is both a huge deal and very, very far from reality.
To kick things off, Koduri describes the metaverse as “a utopian convergence of digital experiences fueled by Moore’s Law – an aspiration to enable rich, real-time, and globally interconnected virtual and augmented reality environments that will enable billions of people to work, play, collaborate and socialize in whole new ways.” Yes, that’s a lot of buzzwords bundled together in one sentence, but that’s basically what we’ve all seen so far: a 3D world where our avatars interact with each other in a whimsical setting while we’re sitting at home with some sort of HMD, or maybe even augmented reality glasses. Despite how hackneyed demonstrations of this technology have been so far, Raja states that he sees the metaverse as the next great computing platform, in the same way that mobile and the Internet have revolutionized computing in the modern age.
So what do we need to make it happen? First, Raja describes what is needed to create such a world, which would include “convincing and detailed avatars with realistic clothing, hair and skin tones – all rendered in real time and based on sensor data capturing real-world 3D objects, gestures, audio and more; data transfer at super high bandwidths and extremely low latencies; and a persistent model of the environment, which may contain both real and simulated elements.” He then asks how we can solve this problem at scale – for hundreds of millions of users simultaneously. The only logical answer is that today’s technology can’t: “We need several orders of magnitude more powerful computing capability, accessible at much lower latencies, across a multitude of device form factors,” he writes.
To do this, Intel has divided the problem of serving the metaverse into three “layers” and says the company has been working on all three. They consist of: intelligence, ops, and compute. The intelligence layer is the software and tools used by developers, which Raja says should be open and based on a unified programming model to encourage easy deployment. The “ops” layer is essentially the computing power available to users who don’t have local access to it, and the “compute” layer is simply the raw horsepower needed to make it all work, which is where Intel comes in with its CPUs and upcoming Arc GPUs. Remember, this is a PR article after all.
This is where it gets interesting. Raja says that to power all of this, we’ll need 1,000 times the computing power provided by today’s cutting-edge technology, and he notes that Intel has a roadmap to achieving zettascale computing by 2025. That’s quite a bold claim, given that Intel will only first reach exascale in 2022, when it delivers its Aurora supercomputer to the Department of Energy. Similarly, the first exascale computer in the United States, Frontier, is reportedly being installed at Oak Ridge National Laboratory, but will not be operational until next year. If we’re just now getting to exascale, how long will it take us to get to zettascale? According to Wikipedia, it took roughly 12 years to go from terascale to petascale, and another 14 years to go from petascale to exascale, so it’s reasonable to think it could be another decade, at least, until we reach zettascale.
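As a back-of-the-envelope check on that extrapolation, the commonly cited milestone systems (ASCI Red for terascale, Roadrunner for petascale, Frontier for exascale) can be lined up and the average gap projected forward. The milestone years below are well-known public dates, not figures from Koduri’s article, and the projection is a naive sketch, not a forecast:

```python
# Rough timeline of 1000x supercomputing milestones (approximate public dates).
milestones = {
    "terascale (1e12 FLOPS)": 1997,  # ASCI Red
    "petascale (1e15 FLOPS)": 2008,  # Roadrunner
    "exascale  (1e18 FLOPS)": 2022,  # Frontier
}

years = list(milestones.values())
gaps = [b - a for a, b in zip(years, years[1:])]
print(gaps)  # [11, 14]

# Naively extrapolate the average gap to the next 1000x step.
avg_gap = sum(gaps) / len(gaps)
estimate = int(years[-1] + avg_gap)
print(f"Naive zettascale estimate: ~{estimate}")  # ~2034
```

By this crude historical pacing, zettascale lands in the mid-2030s, which is what makes Intel’s 2025 target so striking.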
Koduri concludes his piece with an end goal, which he believes is achievable: “We believe the dream of delivering a petaflop of computing power and a petabyte of data within a millisecond of every human on the planet is within our grasp.” For some context, back in 2013 we noted that people were skeptical about the possibility of achieving exascale by 2020, yet we got there, albeit a year later. Note, however, that the power conundrum was never solved.
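The “within a millisecond” part of that dream runs into physics before it runs into silicon: in one millisecond, light in optical fiber only covers so much ground, which caps how far away the compute can sit. A quick sketch of that budget, assuming a typical fiber propagation speed of roughly two-thirds of c (an assumption, not a figure from the article):

```python
# Speed-of-light budget for a 1 ms round trip to remote compute.
C_VACUUM_KM_PER_MS = 299.792  # distance light travels in vacuum in 1 ms
FIBER_FACTOR = 0.67           # assumed slowdown in optical fiber (~2/3 c)
ONE_WAY_MS = 0.5              # half of a 1 ms round-trip budget

# Maximum one-way fiber distance, ignoring routing, switching and
# processing delays, which only shrink this number further.
max_radius_km = C_VACUUM_KM_PER_MS * FIBER_FACTOR * ONE_WAY_MS
print(f"Max fiber radius for a 1 ms round trip: ~{max_radius_km:.0f} km")  # ~100 km
```

In other words, a petaflop within a millisecond of every human implies serious compute within roughly 100 km of everyone, before any switching or processing overhead is counted.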
Earlier this year, when China claimed to have built two exascale systems in secret, it was reported that the power consumption was around 35 MW per system. This is significantly higher than the 20 MW target initially set. We can expect these numbers to improve over time as machines become more efficient, but any push towards zettascale will require a fundamental overhaul in power consumption and scaling.
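To see why power demands that fundamental overhaul, it’s worth scaling the reported 35 MW exascale figure up by the same 1,000x factor Koduri invokes. This is a deliberately naive linear extrapolation assuming today’s efficiency, which is exactly the scenario that can’t be allowed to happen:

```python
# Naive power extrapolation: 1000x the compute at today's efficiency.
EXASCALE_POWER_MW = 35    # reported per-system figure for China's exascale machines
SCALE_FACTOR = 1000       # exascale -> zettascale

zettascale_power_gw = EXASCALE_POWER_MW * SCALE_FACTOR / 1000  # MW -> GW
print(f"Zettascale at current efficiency: ~{zettascale_power_gw:.0f} GW")  # ~35 GW

# ~35 GW is on the order of the output of 30+ large power-plant units,
# for a single machine -- hence efficiency, not just scale, is the problem.
```

Even generous year-over-year efficiency gains leave a huge gap to close by 2025, which is why the roadmap is the most eyebrow-raising part of the piece.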