Intel: Realizing the Metaverse Is a Huge Challenge

Technology companies are racing to enter the metaverse, including by building out their own ecosystems.

Intel recently shared its own view of the metaverse in an opinion piece on its official website by Raja Koduri, Senior Vice President and General Manager of the Accelerated Computing Systems and Graphics Group.

Koduri acknowledges that the metaverse may become the next major computing platform after the world wide web and mobile.

“There is reason to believe that we are on the cusp of the next major transition in computing. A transition that enables persistent and immersive computing at scale,” said Koduri.

Koduri noted that computer animation in films is now almost indistinguishable from live action, that modern games render highly realistic graphics, and that Virtual Reality and Augmented Reality displays have grown steadily more sophisticated and immersive in recent years.

The pandemic has also forced many people to rely on digital technology as their only way to communicate, collaborate, learn, and get by.

“The explosion of decentralized digital financial technology is inspiring business models that encourage everyone to play a role in creating this metaverse,” said Koduri.

Even so, according to Koduri, the metaverse demands a great deal: convincing avatars with realistic clothing, hair, and skin tones rendered in real time; extremely high data-transfer rates and bandwidth; and very low latency.

“Imagine solving this problem at scale – for hundreds of millions of concurrent users – and you’ll soon realize our current computing, storage, and network infrastructure is not sufficient to realize this vision.”
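
To make that scale argument concrete, here is a minimal back-of-envelope sketch in Python. Every figure in it is an illustrative assumption rather than a number from Koduri's piece: a hypothetical per-user bitrate for a photorealistic, low-latency stream, and a round figure standing in for "hundreds of millions of concurrent users."

PER_USER_MBPS = 100               # assumed bitrate (Mbps) for one photorealistic, low-latency stream
CONCURRENT_USERS = 300_000_000    # round figure for "hundreds of millions of concurrent users"

# Aggregate bandwidth the network would have to carry simultaneously.
total_mbps = PER_USER_MBPS * CONCURRENT_USERS
total_tbps = total_mbps / 1_000_000   # 1 Tbps = 1,000,000 Mbps

print(f"Aggregate demand: {total_tbps:,.0f} Tbps")
# With these assumptions: 30,000 Tbps, i.e. 30 petabits per second.

Even under these rough assumptions, the aggregate demand lands in the tens of petabits per second, which gives a sense of why Koduri argues that today's infrastructure falls short.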

Koduri says this will require computing capability many times more powerful than today's, accessible at much lower latency across a wide range of device form factors.

“To enable these capabilities at scale, the entire internet infrastructure will require major upgrades,” said Koduri.
