What is the metaverse and what does it mean for you? Michael Kagan, Chief Technology Officer at NVIDIA, explains the metaverse (or omniverse, as NVIDIA calls it) and links concepts around cloud computing, data centers, digital twins, and AI.

If you've wondered what the metaverse is or how our digital world is changing, you'll want to watch this interview.

The conversation covers the metaverse and omniverse, digital twin applications, AI and federated learning, cloud computing and data centers, and cryptocurrencies.

Michael Kagan has been NVIDIA's CTO (Chief Technology Officer) since May 2020. He joined NVIDIA through the acquisition of Mellanox, which he co-founded in April 1999 and where he served as CTO. From 1983 to April 1999, Kagan held a number of architecture and design positions at Intel Corporation. He holds a BSc in Electrical Engineering from the Technion — Israel Institute of Technology.

Transcript

Michael Kagan: Train your robots in the virtual world to be better robots in the real world. Omniverse is the future. It's the bicycle of imagination or, as Steve Jobs called it, the computer bicycle of the mind.

About Michael Kagan, CTO of NVIDIA

Michael Krigsman: What is the metaverse? Michael Kagan, Chief Technology Officer of NVIDIA, explains.

Michael Kagan: I started my career at Intel, where I was designing microprocessors. The first processor I was the architect of was the i860 XP, a great vector machine that was a little bit ahead of its time. There was no software support, so it vanished.

Then I managed the Pentium MMX design. This was the program that actually brought CPU design to Intel in Israel.

In '99, I joined a small team of people to found Mellanox Technologies, a high-performance networking company. We were doing high-performance networking.

After 20 years, NVIDIA acquired Mellanox. For me, it was a great closing of the circle because NVIDIA is doing the state-of-the-art vector processors, and they have all the right software systems and ecosystem support.

I'm the lucky one to be the chief technology officer at NVIDIA. Actually, my charter is to architect across the wealth of NVIDIA technologies.

NVIDIA, I think, is the most technological company I've ever seen or ever worked with. It's an amazing opportunity to basically build the computing of the 21st Century. That's my job as the CTO of NVIDIA: to architect across technologies.

What is the metaverse or omniverse?

Michael Krigsman: NVIDIA is very involved with what people call the metaverse and NVIDIA calls the omniverse. When we talk about the metaverse or the omniverse, what is it?

Michael Kagan: Metaverse is a virtual world. It's a world where you can make your fairytale come true. Omniverse is the implementation of this idea.

You can create a virtual world. You can live in the virtual world. You can collaborate in the virtual world across the globe with no boundaries – no geographical boundaries or limitations, not even language boundaries. You can work together. You can connect this virtual world to the real world, and you basically can make your fantasy come true.

One of the projects we are embarking on at NVIDIA is a simulation of the real world, of the Earth. Basically, with the omniverse, you can make history an experimental science.

You can simulate what will happen in the future. For example, how do we handle global warming, and how do we handle climate change?

Make your fantasies come true. Making fairytales is great, but you can also simulate large projects before you build them.

You may have heard that large car manufacturers like BMW are working with us. They're actually simulating their large projects before they build them.

Once you build something, you have a digital twin of this project, be it a data center, be it a manufacturing plant, be it a smart city. You can have an interaction of this digital twin with its real twin, and you can train your robots in the virtual world to be better robots in the real world.

Omniverse is the future. It's the bicycle of imagination or, as Steve Jobs called it, computer bicycle of the mind. The omniverse is the bicycle of imagination.

On digital twin applications in the metaverse (or omniverse)

Michael Krigsman: Is it accurate to say that the metaverse or your implementation of it, the omniverse, is a digital twin, or are we talking about an elaborate digital twin? I think many of us are familiar with the concept of a digital twin in manufacturing. Is this the next evolution? Where do they fit together?

Michael Kagan: It can be exactly the digital twin, as you mentioned. It can be the digital twin of a much higher scale. And it can be a digital world, which starts with a digital twin and then you evolve it further and basically try out what will happen in the real world by simulating it in the omniverse.

This digital twin can actually interact with its real counterpart. You can get the sensor data. You can stream it into the digital twin and simulate what's going to happen and how to respond and how to react, be it good or bad.
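The loop Kagan describes – stream sensor data into the twin, simulate forward, decide how to respond – can be sketched in a few lines of code. Everything below (the class name, the first-order thermal model, the thresholds) is invented for illustration; it is not NVIDIA's implementation:

```python
# Minimal digital-twin loop: ingest a sensor reading from the real
# asset, simulate the asset a few steps ahead, and flag a response.
# The thermal model and thresholds are toy assumptions for the sketch.

class DataCenterTwin:
    def __init__(self, temp_c, cooling_rate=0.1, ambient_c=35.0):
        self.temp_c = temp_c
        self.cooling_rate = cooling_rate
        self.ambient_c = ambient_c

    def ingest(self, sensor_temp_c):
        """Sync the twin with a reading streamed from the real asset."""
        self.temp_c = sensor_temp_c

    def forecast(self, load_heat_c_per_step, steps):
        """Simulate forward without touching the real data center."""
        t = self.temp_c
        for _ in range(steps):
            t += load_heat_c_per_step - self.cooling_rate * (t - self.ambient_c)
        return t

twin = DataCenterTwin(temp_c=40.0)
twin.ingest(sensor_temp_c=42.0)              # reading from the real sensor
projected = twin.forecast(load_heat_c_per_step=1.5, steps=10)
action = "throttle" if projected > 50.0 else "steady"
print(round(projected, 1), action)
```

The point of the sketch is that the "what will happen, good or bad" question is answered on the twin, at no risk to the real facility.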

Michael Krigsman: What are the components of constructing the metaverse or your implementation, the omniverse? You're a chip designer, and you're looking at technologies across NVIDIA. Behind the scenes, what do we need to make this happen?

Michael Kagan: The omniverse is a tool to collaborate and to cooperate. You create a virtual world in the data center, and it can run an accurate simulation of physics. There are interfaces through which you can connect to this virtual world and make changes and updates. You can work on your projects from multiple places, through multiple plugins.

There are great tutorials and presentations on the technical details of the omniverse at GTC. Anyone interested in more specific details is welcome to watch the GTC recordings.

On computing platforms and the metaverse (or omniverse)

Michael Krigsman: What are the foundations? I know you're working on microprocessors. NVIDIA obviously creates GPUs. What are the pieces that must come together – you have the data – in order to build this kind of simulation?

Michael Kagan: It's an accelerated computing platform capable of simulating very large projects. A lot of data feeds this platform. Actually, it's an accelerated platform that processes the data based on AI models that were trained in the real world, and sometimes on simulated data.

There are loads of physics that we simulate with AI that actually cannot be simulated on sequential machines. In order to have such a simulation, you need a data center-scale computer, and you need the right infrastructure to accelerate the AI-based data processing.

On AI and the metaverse (or omniverse)

Michael Krigsman: Why AI? Where does AI fit into this?

Michael Kagan: AI is the new way of processing data. If you look at the traditional way of programming computers, programming machines, you basically write the program. It's like explaining to somebody what to do, and then the computer does what you told it to do.

The thing is that we cannot explain, or we don't know how to explain, certain things. Some things we just can't explain.

We can't explain how to distinguish between a cat and a dog. You can't explain how to distinguish between a man and a woman. But kids know how. You show a kid a dog and show him a cat, and he will know which one is which. He can do it because he learned from examples.

The analogy here is that AI is basically not a human writing software, but software writing software. You teach the computer with examples. You feed the neural network model, the AI model, with a lot of data from the real world. This training process results in the model being able to simulate physical phenomena – like fluid dynamics – that you cannot simulate on a conventional computer.
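The contrast between writing rules and learning from examples can be made concrete with a toy. The snippet below is an editorial illustration, not NVIDIA's tooling: a nearest-centroid classifier that is never given a rule for "cat" versus "dog", only labeled examples (the two-feature data is invented for the sketch):

```python
# Toy illustration of "software writing software": no rule is ever
# written for what makes a cat a cat — the decision procedure is
# derived from labeled examples, like the child in Kagan's analogy.

def train(examples):
    """Learn one centroid per label from (features, label) pairs."""
    sums, counts = {}, {}
    for features, label in examples:
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, x in enumerate(features):
            acc[i] += x
        counts[label] = counts.get(label, 0) + 1
    return {label: [x / counts[label] for x in acc]
            for label, acc in sums.items()}

def predict(centroids, features):
    """Assign the label whose centroid is closest (squared distance)."""
    def dist2(c):
        return sum((a - b) ** 2 for a, b in zip(features, c))
    return min(centroids, key=lambda label: dist2(centroids[label]))

# Invented (weight kg, ear length cm) examples — purely illustrative.
examples = [
    ([4.0, 6.0], "cat"), ([5.0, 7.0], "cat"),
    ([20.0, 10.0], "dog"), ([30.0, 12.0], "dog"),
]
model = train(examples)
print(predict(model, [4.5, 6.5]))
```

Real neural networks replace the centroids with millions of learned parameters, but the principle is the same: the program's behavior comes from data, not from hand-written rules.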

Michael Krigsman: You are taking this notion of simulation and then extending it out to a very far degree. Again, is this an accurate although very simplistic way of looking at what's going on?

Michael Kagan: You can look at it this way: we simulate the real world, and we can make changes. The omniverse enables you to find out, by trial and error in the virtual world, what will happen in the real world.

On collecting data for advanced digital simulations

Michael Krigsman: Where do you get the data from in order to make this possible? If you're simulating the real world and you have a jet engine, well, you know the properties of that jet engine. But if you're simulating, if you're running a virtual environment, for example, where is the data coming from?

Michael Kagan: There is a lot of data available out there. We have our own sensors, like in our data centers, and NVIDIA has a fleet of cars on the road collecting data.

Once we have this model of the virtual world, we can actually generate data – including data for conditions that are hard or very rare to capture outside. If you want to train your autonomous vehicle fleet to handle hazardous conditions on the road, you can only get so much information from real sensors and real cameras on the road. But in the virtual world, we can simulate hazardous conditions. By using the simulations, you can train your fleet of cars to handle them.

If you build an infrastructure that needs to sustain an earthquake, you can simulate an earthquake. You don't need to wait for an earthquake to get this data.

Once you have a digitally accurate model, or the ability to run digitally accurate simulations, you can generate data by yourself. The more data you generate, the better your models are trained. Then you have better solutions and, basically, you're building a better world.
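The idea of manufacturing rare training conditions can be sketched as follows. The "simulator" here is only a random perturbation of a nominal scene – a stand-in for the physically accurate simulation Kagan describes – and all field names and ranges are invented for illustration:

```python
import random

# Sketch: real sensors rarely capture hazardous conditions, so a
# simulator manufactures them. The perturbation ranges below are
# arbitrary assumptions, not calibrated physics.

def simulate_hazard(scene, rng):
    """Derive a hazardous variant of a nominal driving scene."""
    return {
        "visibility_m": scene["visibility_m"] * rng.uniform(0.05, 0.3),
        "road_friction": scene["road_friction"] * rng.uniform(0.2, 0.5),
        "label": "hazard",
    }

rng = random.Random(0)  # seeded so the sketch is reproducible
nominal = {"visibility_m": 2000.0, "road_friction": 0.9, "label": "clear"}

# Real-world data: almost all clear-weather scenes. Simulated data
# fills the gap with as many hazardous scenes as training needs.
training_set = [dict(nominal) for _ in range(95)]
training_set += [simulate_hazard(nominal, rng) for _ in range(95)]

hazards = sum(s["label"] == "hazard" for s in training_set)
print(f"{hazards}/{len(training_set)} hazardous scenes")
```

You don't wait for the earthquake, or the black ice: the simulator supplies as many rare events as the model needs.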

On federated learning and autonomous vehicles

Michael Krigsman: Right now, there's a tweet chat taking place. You can ask your questions of Michael Kagan. He's the CTO of NVIDIA. When else will you have this opportunity? Ask your questions on Twitter using the hashtag #CXOTalk.

If you are watching on LinkedIn, just put your questions into the chat. You can ask whatever you like. We'll turn to questions in just a couple of minutes.

Michael, you mentioned this concept of federated learning, and you referenced autonomous vehicles. Where do these concepts fit into the metaverse?

Michael Kagan: Actually, everything that moves will be autonomous. The reason it is, or will be, autonomous is that when you move, you need to make a decision, and you cannot rely on the decision being made for you somewhere else.

You need to make a decision. It means that everything that moves will have an AI model that helps it make decisions and respond to the conditions it encounters. These decisions are yet another form of learning.

In the morning, there is a fleet of cars out there driving, making decisions, learning something. Then, by the end of the day, they come back to the parking lot, and each one of them feeds everything it learned back to the data center.

The data center consolidates all the learnings, updates the model, and then updates the cars with the new model, which now integrates the learnings from all the other cars that were running that day. You have some sort of perpetual learning at a much higher rate than before.
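This overnight consolidation is, in essence, federated learning. A minimal sketch follows; the weight-vector "model", the local update rule, and the mileage-weighted averaging are all simplifying assumptions for illustration, not a production scheme:

```python
# Sketch of the overnight consolidation: each car refines its own
# copy of the model during the day, and the data center merges the
# copies, here weighted by how much each car drove.

def local_update(weights, observations):
    """Each car nudges its model copy toward what it saw today."""
    lr = 0.1  # learning rate — an arbitrary choice for the sketch
    return [w + lr * (obs - w) for w, obs in zip(weights, observations)]

def consolidate(updates, miles_driven):
    """Data center: average the cars' models, weighted by miles."""
    total = sum(miles_driven)
    merged = [0.0] * len(updates[0])
    for update, miles in zip(updates, miles_driven):
        for i, w in enumerate(update):
            merged[i] += w * miles / total
    return merged

# Morning: every car leaves with the same model.
global_model = [0.0, 0.0]

# During the day, each car learns from its own driving.
fleet_updates = [
    local_update(global_model, [1.0, 0.0]),   # car A's day
    local_update(global_model, [0.0, 1.0]),   # car B's day
]

# Evening: merge the learnings and redistribute the new model.
global_model = consolidate(fleet_updates, miles_driven=[100, 300])
print(global_model)
```

Each car now carries a model shaped by every other car's day of driving, without any car shipping its raw sensor data to the others.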

Just as an example: all the world's emperors used to send explorers to explore the world, and they came back a few years later. Now it happens momentarily. All the experience of the autonomous vehicles or robots out there is consolidated, and everyone gets an update of everyone else's learnings.

Michael Krigsman: Let's take a few questions from Twitter and LinkedIn. Let's jump first to LinkedIn. Suman Kumar Chandra asks, "How far are we from the metaverse being in primetime, and is this going to be limited to the technologically advanced community only?" In other words, when will ordinary people get a chance to experience this and the benefits?

Michael Kagan: In terms of technology advancement, the idea is that everybody will be able to connect to this virtual world and live in it. There are developers. There are users. If you look at the evolution of computing and data processing, at the very beginning, you had to read very large books in order to operate the computer. Today, you buy an iPhone or an Android machine, and there is not even a manual; everything is just included.

It takes a lot of engineering and a lot of smart people to make it work this way. But then everybody can consume it, starting from three-year-old kids, and you don't need to be an expert in the subject to consume it and to take advantage of it.

The same with the omniverse. We will live in this virtual world, and it will have an interface to everybody. Everybody will be able to be part of it.

Developers are one thing. Users are another. You can sell things there. You can buy things there. You can live there.

Michael Krigsman: We have another question now. This is from Twitter. Adam Martin asks, "In regard to NVIDIA's omniverse, what do you think the largest benefit will be for companies in the architectural and engineering space, and how should companies be using it today?"

Michael Kagan: The omniverse enables you to simulate large projects. You can live it before you build it.

  • Now, in chip design, we simulate devices to make sure that they work once we manufacture them.
  • If you build a large plane or plan a city, you can simulate pieces of it on conventional machines, but the omniverse enables you to simulate the entire city.
  • If you put up antennas for cellular communication, you can simulate where to place these antennas, including the signal reflections from the buildings.

You can basically simulate very large projects and, I think, in the near future, you are not going to build anything without simulating it first. By simulating, you can build much more sophisticated projects, and if you simulate them before you build them, they will most likely work on the first shot. Otherwise, a huge amount of money is spent building large plants that don't work in the end because there was some mistake along the way.

You can think about the omniverse as the ability to simulate every project that you build. One of the big projects for humanity is climate. As I mentioned before, we are building a computer that will enable us to simulate the entire Earth. That's all the way up there, way beyond plant or city simulation.

On differences between smart manufacturing digital twins and the metaverse (or omniverse)

Michael Krigsman: How is this different from the conventional notion of a digital twin that's used today in manufacturing and all kinds of different purposes?

Michael Kagan: The digital twin is a twin of something that exists. A digital twin is the twin of the real thing.

You can start from a digital twin, but then you can change it into what you want the real plant, the real world, to be – basically creating a digital twin of a real twin that has not been built yet. You can see whether this model fits your needs, whether it makes sense, whether it's economical and efficient, and then you can build it.

Michael Krigsman: Say that one more time; the digital twin of the model that was not yet built.

Michael Kagan: You can start by creating a digital twin of something that exists. Then you can apply inputs to this digital twin that do not exist, or are very rare, in the real world. You can see how the twin reacts and how it responds. Then you can basically advance your real world to do what you want it to do, what you simulated in the digital world.

Michael Krigsman: If you're listening, you should be asking questions. When else are you going to get the chance to ask Michael Kagan whatever you want about digital twins, the metaverse, and the omniverse?

This is from LinkedIn. Christopher Jablonski says, "Once adopted on a mass scale, how will the metaverse be used in combination with augmented reality, wearables, conversational computing, and smartphones, or will it be largely mutually exclusive and overtake existing paradigms?" He's asking where this is all going to be going.

Michael Kagan: We are making more and more of the real world able to be simulated. Whatever used to be a mystery before now becomes a technical problem that we need to solve.

Augmented reality is a micro digital world. You take the glasses, or whatever you have for augmented reality, and you use it and see some adjustments to reality.

But you can go much further because you can try to foresee what's going to happen if something happens in the real world. You can see what will happen in your neighborhood if a volcano erupts nearby and simulate it.

Michael Krigsman: We have another question from Twitter. This is from Arsalan Khan. Arsalan is a regular listener and always asks such good questions. He says, "The metaverse requires a lot of collaboration to acquire and share data. What if companies are not willing to share this data? Maybe they don't want to share their autonomous car driving data. What happens then?"

Michael Kagan: Whatever is based on data, whatever is based on sharing data – of course, you cannot take advantage of it if you don't have the data. Around data, there has always been this question and this anxiety about confidentiality.

You don't need to know everything in order to make things right, and not everything is really confidential. But if somebody doesn't want to share his data, he will not share the data, and the model will not take advantage of it.

Of course, he can have his own protected model (be it on-premises or in the cloud) that is fed with the data he is willing to use, and create his own virtual world with his own data.

On cloud computing platforms, data centers, and the metaverse (or omniverse)

Michael Krigsman: What about the cloud? Where is the intersection between the cloud and these virtual worlds, metaverse, omniverse, data centers? How does that piece together?

Michael Kagan: Cloud computing, and the cloud in general, turns computing and storage into services. It's the basic power generation, so to speak.

It's like everybody having electricity at home because there are large electric plants generating this electricity and delivering the power to everyone. On top of this, you can build a lot of things. In this analogy, you need that much compute power to build the virtual world model.

You will need this powerplant, or data center, to host the virtual world, and the virtual world will be connected to the clients – the different people who will live in this world. They will feed the data. They will consume the data.

You can think about Waze as a micro-virtual world of traffic. People use Waze, and they collaborate with each other. They actually create this world, this model, this data of the real-time traffic on the road. They use it and actually work with each other. They collaborate (without even knowing each other) by sharing the data and letting the Waze servers in the data center consolidate it and use it to steer the traffic.

From this example, you can go pretty much everywhere. It can be social. It can be smart cities. It can be everywhere. There is data coming to the data center that's being consolidated. It builds the model, updates it, and lets people consume this new model and new world of, basically, digitalization.

Michael Krigsman: You're saying Waze is an example of people coming together, collaborating (even though they're not necessarily aware that they are collaborating), and creating a specialized virtual world for themselves for that period of time in which they are connected using Waze.

Michael Kagan: Yeah. Waze is an example of another virtual world, a virtual world of the traffic that's of interest to me where I am.

The most amazing thing about Waze – and that's why I like this example – is that there are millions of people collaborating with each other implicitly, and they benefit from this collaboration. It could only be enabled by technology that has the ability to collect data from millions of sensors, consolidate it, and send back the insights.

Michael Krigsman: This is kind of a metaphysical question. Is the virtual world created by Waze a temporary virtual world? After all, it's going on and on and on, from the Waze standpoint. But for me as a driver, it's very temporary.

Michael Kagan: There are some basic patterns that Waze learns, and there is the momentary condition of the road right now. But you can ask Waze: I want to be in this particular place two days from now at 5 o'clock in the evening. When do I need to leave my home?

Okay, how would it know when I need to leave home? Because it has already collected data. It has experience of what's going on on the roads. It can give you a fairly accurate prediction of when you need to leave your home to be at the office or theater at 5 o'clock in the evening.

Michael Krigsman: If we go back to the data center again, is it powerful computing, powerful storage, and data? Are those the building blocks that are required to make this happen?

Michael Kagan: Actually, the data center is a computer on its own. Think about it: what we used to call a computer was a box under the table, or some bigger box in a room.

Today, the data center is the computer. It's like a powerplant. We have a single unit of computing that is delivering services to billions of users.

It requires communication. It requires computing. It requires computing acceleration. It requires a lot of data. And it learns all the time.

Michael Krigsman: It sounds like when the machine overlords take over, it's going to be the data centers that do it.

Michael Kagan: You can always get scared. At some point, people got scared about cars, then about the planes, and then about other things.

Of course, there is a way that bad guys will take advantage of the technology. You need to be aware of it, and you need to react.

By the way, that's yet another use or advantage of having digital twins, for example, of the data center. You can simulate the cyber-attacks, and you can be prepared to respond to cyber-attacks.

Now, about machines taking advantage of people: today, it's a human-machine team. The machine can process an amount of data no human can process. It can generate insights no human can generate. A human can make decisions that no machine can make. They just work together.

On cryptocurrencies and the metaverse

Michael Krigsman: We have some really interesting questions coming up on Twitter. Arsalan Khan comes back, and he says, "Do you think fiat currencies must become digital because of the metaverse, because we'll have to use cryptocurrencies in the metaverse for consumers?" He's asking about the impact on currency of the metaverse, as it becomes more broadly used.

Michael Kagan: In some sense, it's analogous to electricity. At some point, currency was real gold – metal coins or pieces of gold that people carried with them.

Today, currency is some mutual spreadsheet, some mutual table, at the bank. You go to the store, you sign some paper, and some number moves from one table to another at the bank. You walk out of the store with a cart of goods.

Okay. The currency and the money becomes more and more sort of abstract. I was thinking that money, the currency, is sort of analogous to electricity.

Electric energy is, of course, nothing by itself. But using the energy, you can do pretty much everything you want. I think this is what is happening with currency.

Right now, there are currencies that are associated with the countries and there are different exchange rates. I don't know how far it will hold, and it'll be interesting to see how this evolves with the digital world, with the digital currencies.

I don't have a crystal ball to tell you exactly what's going to happen. But if you look at the evolution of currency, the money and the finances are getting more and more abstract – becoming some sort of surrogate that you can use to build, to consume, and to make various things.

Michael Krigsman: Just another reminder to folks listening. If you haven't asked questions, well, when else are you going to get to ask Michael Hagan, the CTO of NVIDIA, whatever you want? Man, you guys should be asking questions.

We have an interesting question from Rogin Robert. He is the innovation lead at Mazars London. He says, "Can you share some insights on any exciting firmwide transformation projects that you have executed recently?"

Michael Kagan: NVIDIA has a wealth of technology. Now, we are accelerating computing.

If you look at computing acceleration, when it was based on Moore's law, it was about 2x every other year. In the last 20 years, computing has improved 2x every year. That's because there is much more innovation across the computing stack, not just at the bottom, in the process technology.

Some people refer to it as Huang's law. Jensen Huang, our CEO, made the observation and prediction that accelerated computing will improve performance by 2x every year, and that's what has happened over the last 20 years.
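The compounding difference between the two paces is easy to quantify: doubling every other year for 20 years gives about 1,000x, while doubling every year gives about 1,000,000x. A quick check:

```python
# Compounding comparison: Moore's-law-style doubling every other
# year versus the 2x-per-year pace Kagan attributes to accelerated
# computing, over the same 20-year window.

years = 20
moore = 2 ** (years // 2)   # doubling every two years
huang = 2 ** years          # doubling every year

print(f"2x every 2 years over {years} years: {moore:,}x")
print(f"2x every year over {years} years:  {huang:,}x")
```

A thousand-fold gap in cumulative speedup from the same two decades is the substance of the "Huang's law" claim.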

Making all these things work together is amazing. The more compute power we build, the more is consumed, and that creates even more demand for more compute power.

Michael Krigsman: What about the issue of heat? I can tell you; I of course deal with a lot of videos. I have the latest and greatest Intel chip and the highest-end NVIDIA graphics card, and my office becomes super-hot, and heat is a real issue. What about the heat issue? I'll ask you to answer that really quickly because we're going to run out of time.

Michael Kagan: Well, it is an issue. We are developing and deploying various technologies to handle it. [Laughter] It is an issue and, actually, how to handle it is one of the things at the top of the list in design technology.

On how the metaverse (omniverse) will change distributed computing and data storage

Michael Krigsman: Okay. We have another question from Twitter. This is a really interesting one. It is from Lisbeth Shaw. She says, "How will the metaverse and omniverse architectures change the architectures of distributed computing and data storage?" She's asking about the impact of metaverse and omniverse on the architecture of distributed computing and storage.

Michael Kagan: You don't need to go as far as the metaverse or the omniverse. Distributed computing started with high-performance computing, and now data centers are built as cloud-native supercomputers. This is the basic infrastructure, based on fast data processing engines – the GPU.

There are basically three pillars. There is the CPU, which runs the application framework; the GPU, which actually crunches the data; and the DPU, the data processing unit, which is the infrastructure computing platform running the operating system of this big machine. It's changing today, and the omniverse is one of the big consumers of this computing technology.

Michael Krigsman: Can you describe the implications for business people? We have business people watching this show, and this conversation is very technology-centric. What are the implications for those of us in business?

Michael Kagan: You can try everything. You can try everything out before you build it or before you try to exercise it in the real life.

For business people: this virtual world will have clear implications for your business – it will be at the heart of the business – and you can try it out. You can simulate what you want to do. You can see what the results look like.

Michael Krigsman: Arsalan Khan comes back again with a really, really interesting question. Again, it's kind of getting metaphysical here. He says, "Will the metaverse have a government? Who controls it? How will people get elected?" If the metaverse is a world, how is it managed?

Michael Kagan: That's an interesting question because I am also interested to know the answer. We will have to just leave it and see.

Michael Krigsman: Michael Kagan, any final thoughts before we finish up?

Michael Kagan: Don't be encumbered by past history or by limitations that we impose on ourselves. New technologies will do amazing things for us.

Be optimistic. Computers will not take over humanity.

Michael Krigsman: I have one final question that I've been curious about for many years. In business, we all know technology projects (before they go live) can fail, have trouble, what have you. When you're designing microprocessors with billions and billions of transistors, at huge scale, how do you make sure that when you ship the thing out the door, this complex thing actually works?

Michael Kagan: Simulating a chip is like a little omniverse. We are doing the simulations and simulating everything that we are building.

Michael Krigsman: You don't go to bed at night sometimes, suddenly wake up, and realize, "Oh, my God. We had this horrible flaw that we haven't picked up and the thing is about to ship or it just shipped"? That doesn't happen?

Michael Kagan: I wouldn't say it doesn't happen. When you develop a chip – and I've been in this business almost 40 years – every time, you have these concerns and fears that it will not work. You know that you have done everything you know how to do before you send the database to manufacturing. You still miss a beat when a chip comes back and, all of a sudden, something stops working.

Michael Krigsman: With that, Michael Kagan, Chief Technology Officer of NVIDIA, thank you so much for taking the time to share all this interesting knowledge with us. I really, really appreciate you being here.

Michael Kagan: Thank you very much.

Michael Krigsman: Thank you to everybody who watched. It's been a very, very fast 45 minutes.

Before you go, please subscribe to our YouTube channel. Hit the subscribe button at the top of our website so we can send you updates on these live shows. Tell people so we can get people to ask questions like you guys have been doing.

Thanks so much, everybody. I hope you have a great day, and we will see you again next time. 

Michael Kagan: Train your robots in the virtual world to be better robots in the real world. Omniverse is the future. It's the bicycle of imagination or, as Steve Jobs called it, the computer bicycle of the mind.

About Michael Kagan, CTO, of NVIDIA

Michael Krigsman: What is the metaverse? Michael Kagan, Chief Technology Officer of NVIDIA explains.

Michael Kagan: I started my career with Intel where I was designing the microprocessors. The first processor that I was the architect of was i860 XP, which was the great vector machine, which was a little bit ahead of its time. No software support, so it vanished.

Then I managed the Pentium MMX design. This was the program that actually brought CPU design to Intel in Israel.

In '99, I joined a small team of people to fund Mellanox, to start Mellanox Technologies, which is the high-performance network. We were doing high-performance networking.

After 20 years, NVIDIA acquired us, acquired Mellanox. For me, it was a great closing of the circle because NVIDIA is doing the state-of-the-art vector processors, but they do have all the right software systems and ecosystem support.

I'm the lucky one to be the chief technology officer at NVIDIA. Actually, my charter is to architect across the wealth of NVIDIA technologies.

NVIDIA, I think, is the most technological company I've ever seen or worked with. It's an amazing opportunity to basically build the computing of the 21st century. That's what my job is as the CTO of NVIDIA: to architect across technologies.

What is the metaverse or omniverse?

Michael Krigsman: NVIDIA is very involved with what people call the metaverse and NVIDIA calls the omniverse. When we talk about the metaverse or the omniverse, what is it?

Michael Kagan: The metaverse is a virtual world. It's a world where you can make your fairytale come true. The omniverse is the implementation of this idea.

You can create a virtual world. You can live in the virtual world. You can collaborate in the virtual world across the globe with no geographical boundaries, limitations, or even language boundaries. You can work together. You can connect this virtual world to the real world, and you can basically make your fantasy come true.

One of the projects we are embarking on at NVIDIA is a simulation of the real world, of the Earth. Basically, with the omniverse, you can turn history into an experimental science.

You can simulate what will happen in the future. For example, how do we handle global warming, and how do we handle climate change?

Make your fantasies come true. Making fairytales is great, but you can also actually simulate large projects before you build them.

You may have heard that large car manufacturers like BMW are working with us. They're actually simulating their large projects before they build them.

Once you build something, you have a digital twin of this project – be it a data center, a manufacturing plant, or a smart city. You can have an interaction between this digital twin and its real twin, and you can train your robots in the virtual world to be better robots in the real world.

Omniverse is the future. Steve Jobs called the computer a bicycle for the mind; the omniverse is the bicycle of imagination.

On digital twin applications in the metaverse (or omniverse)

Michael Krigsman: Is it accurate to say that the metaverse or your implementation of it, the omniverse, is a digital twin, or are we talking about an elaborate digital twin? I think many of us are familiar with the concept of a digital twin in manufacturing. Is this the next evolution? Where do they fit together?

Michael Kagan: It can be exactly the digital twin, as you mentioned. It can be a digital twin at a much higher scale. And it can be a digital world, which starts with a digital twin; then you evolve it further and basically try out what will happen in the real world by simulating it in the omniverse.

This digital twin can actually interact with its real counterpart. You can take the sensor data, stream it into the digital twin, and simulate what's going to happen and how to respond and react, be it good or bad.

Michael Krigsman: What are the components of constructing the metaverse or your implementation, the omniverse? You're a chip designer, and you're looking at technologies across NVIDIA. Behind the scenes, what do we need to make this happen?

Michael Kagan: The omniverse is a tool to collaborate and to cooperate. You create a virtual world in the data center that simulates – it can be an accurate simulation of physics. There are interfaces through which you can connect to this virtual world and make changes and updates. You can work on your projects from multiple places, through multiple plugins.

There are great tutorials and presentations on the technical details of the omniverse at GTC. Anyone interested in more specific details is welcome to watch the GTC recordings.

On computing platforms and the metaverse (or omniverse)

Michael Krigsman: What are the foundations? I know you're working on microprocessors. NVIDIA obviously creates GPUs. What are the pieces that must come together – you have the data – in order to build this kind of simulation?

Michael Kagan: It's an accelerated computing platform that is capable of simulating very large projects, with a lot of data feeding the platform. Actually, this accelerated platform processes the data based on AI models that were trained in the real world and, sometimes, on simulated data.

There is a lot of physics that we simulate with AI that actually cannot be simulated on sequential machines. In order to run such a simulation, you need a data center-scale computer, and you need the right infrastructure to accelerate the AI-based data processing.

On AI and the metaverse (or omniverse)

Michael Krigsman: Why AI? Where does AI fit into this?

Michael Kagan: AI is a new way of processing data. If you look at the traditional way of programming computers, programming machines, you basically write a program. It's like explaining to somebody what to do, and then the computer does what you told it to do.

The thing is that we cannot explain, or we don't know how to explain, certain things. Some things we just can't explain.

We can't explain how to distinguish between a cat and a dog. You can't explain how to distinguish between a man and a woman. But kids know how. You show a kid a dog and show him a cat, and he will know which one is which. That works because he learned from examples.

The analogy here is that AI is basically not a human writing software but software writing software. You teach the computer with examples. You feed the neural network model, the AI model, with a lot of data from the real world. This training process results in a model that is able to simulate physical phenomena – like fluid dynamics – that you cannot simulate on a conventional computer.
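The rules-versus-examples contrast Kagan draws can be sketched with a toy classifier. This is purely illustrative (a single learned threshold standing in for a trained neural network, with made-up feature values); the point is that the program never states a rule for what each class "is" – it derives one from labeled data.

```python
# Toy "learning from examples": instead of hand-writing a rule that
# distinguishes two classes, fit a decision threshold from labeled samples.

def train_threshold(examples):
    """examples: list of (feature, label) pairs with labels 'cat' or 'dog'.
    Learn the midpoint between the two class means."""
    cats = [x for x, y in examples if y == "cat"]
    dogs = [x for x, y in examples if y == "dog"]
    return (sum(cats) / len(cats) + sum(dogs) / len(dogs)) / 2

def classify(x, threshold):
    return "cat" if x < threshold else "dog"

# Nowhere do we explain what a cat or a dog is; the model learns from data.
data = [(2.0, "cat"), (3.0, "cat"), (7.0, "dog"), (8.0, "dog")]
t = train_threshold(data)   # midpoint of class means: (2.5 + 7.5) / 2 = 5.0
print(classify(4.0, t))     # cat
print(classify(6.0, t))     # dog
```

Training a real model follows the same pattern at vastly larger scale: many examples in, a learned decision function out.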

Michael Krigsman: You are taking this notion of simulation and then extending it out to a very far degree. Again, is this an accurate although very simplistic way of looking at what's going on?

Michael Kagan: You can look at it this way: we simulate the real world, and we can make changes. The omniverse enables you to try out, by trial and error, what will happen in the real world (by trying it first in the virtual world).

On collecting data for advanced digital simulations

Michael Krigsman: Where do you get the data from in order to make this possible? If you're simulating the real world and you have a jet engine, well, you know the properties of that jet engine. But if you're simulating, if you're running a virtual environment, for example, where is the data coming from?

Michael Kagan: There is a lot of data available out there. We have our own sensors, like in our data centers, and NVIDIA has a fleet of cars on the road that are collecting data.

Once we have this model of the virtual world, we can actually generate data, including data for conditions that are hard to capture or very rare in the real world. If you want to train your autonomous vehicle fleet to handle hazardous conditions on the road, you can only get so much information from real sensors and real cameras on the road. But in the virtual world, we can simulate hazardous conditions, and by using the simulations, you can train your fleet of cars to handle them.

If you build an infrastructure that needs to sustain an earthquake, you can simulate an earthquake. You don't need to wait for an earthquake to get this data.

Once you have a digitally accurate model, or the ability to run digitally accurate simulations, you can generate data by yourself. The more data you generate, the better your models are trained. Then you have better solutions and, basically, you're building a better world.
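The idea of generating synthetic data for rare conditions can be sketched with a toy simulator. Everything here is hypothetical (made-up condition names, friction values, and a random "sensor frame"); real pipelines render physically accurate sensor data, but the rebalancing logic is the same.

```python
# Synthetic-data sketch: rare hazardous conditions are under-represented
# in real driving logs, so we generate them on demand from a simulator.
import random

def simulate_drive(condition, rng):
    """Produce a toy 'sensor frame' for a given road condition."""
    friction = {"dry": 1.0, "rain": 0.6, "black_ice": 0.15}[condition]
    return {"condition": condition, "friction": friction,
            "visibility": round(rng.uniform(0.2, 1.0), 2)}

rng = random.Random(42)  # fixed seed for reproducibility

# Real logs might contain almost no black-ice frames; in simulation
# we can make half the training set hazardous.
dataset = [simulate_drive("black_ice", rng) for _ in range(500)]
dataset += [simulate_drive("dry", rng) for _ in range(500)]

hazard_share = sum(d["condition"] == "black_ice" for d in dataset) / len(dataset)
print(hazard_share)  # 0.5
```

The simulator, not the road, decides how often the rare event appears in the training set – which is exactly why you don't have to wait for an earthquake, or black ice, to get the data.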

On federated learning and autonomous vehicles

Michael Krigsman: Right now, there's a tweet chat taking place. You can ask your questions of Michael Kagan. He's the CTO of NVIDIA. When else will you have this opportunity? Ask your questions on Twitter using the hashtag #CXOTalk.

If you are watching on LinkedIn, just put your questions into the chat. You can ask whatever you like. We'll turn to questions in just a couple of minutes.

Michael, you mentioned this concept of federated learning, and you referenced autonomous vehicles. Where do these concepts fit into the metaverse?

Michael Kagan: Actually, everything that moves will be autonomous. The reason it is, or will be, autonomous is that when you move, you need to make decisions, and you cannot rely on those decisions being made for you somewhere else.

It means that everything that moves will have an AI model that helps it make decisions and respond to the conditions around it. These decisions are yet another source of learning.

In the morning, there is a fleet of cars out there driving, making decisions, learning something. Then, by the end of the day, they come back to the parking lot, and each one feeds everything it learned back to the data center.

This data center consolidates all the learnings, updates the model, and then updates the cars with the new model, which now integrates the learnings from all the other cars that were running that day. You have a sort of perpetual learning at a much higher rate than before.

Look at just one example: the world's emperors used to send explorers to explore the world, and they would come back a few years later. Now it happens instantly. All the experience of the autonomous vehicles or robots that are out there is consolidated, and everyone gets an update of everyone else's learnings.
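The daily cycle Kagan describes – cars learn locally, then a data center consolidates their updates and redistributes a new model – is, in essence, federated learning. A minimal sketch of the consolidation step (the "model" here is just a list of hypothetical numeric weights, not NVIDIA's actual pipeline):

```python
# Federated-averaging sketch: each car fine-tunes the model locally during
# the day; the data center averages the returned weights into one global
# model and sends it back to every car.

def consolidate(local_models):
    """Average the weights learned by each car during the day."""
    n = len(local_models)
    width = len(local_models[0])
    return [sum(m[i] for m in local_models) / n for i in range(width)]

# Three cars come back with slightly different locally-updated weights.
car_updates = [
    [0.5, 1.0],
    [1.5, 2.0],
    [1.0, 3.0],
]

global_model = consolidate(car_updates)
print(global_model)  # [1.0, 2.0]
```

Each car contributes what it learned without shipping its raw sensor data; only the model updates travel, and every car inherits the whole fleet's experience the next morning.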

Michael Krigsman: Let's take a few questions from Twitter and LinkedIn. Let's jump first to LinkedIn. Suman Kumar Chandra asks, "How far are we from the metaverse being in primetime, and is this going to be limited to the technologically advanced community only?" In other words, when will ordinary people get a chance to experience this and the benefits?

Michael Kagan: In terms of technology advancement, the idea is that everybody will be able to connect to this virtual world and live in it. There are developers, and there are users. If you look at the evolution of computing and data processing, at the very beginning, you had to read very large books in order to operate a computer. Today, you buy an iPhone or an Android device, and there isn't even a manual; everything is included.

It takes a lot of engineering and a lot of smart people to make it work this way. But then everybody can consume it, starting from three-year-old kids, and you don't need to be an expert in the subject to consume it and to take advantage of it.

The same with the omniverse. We will live in this virtual world, and it will have an interface to everybody. Everybody will be able to be part of it.

Developers are one thing. Users are another. You can sell things there. You can buy things there. You can live there.

Michael Krigsman: We have another question now. This is from Twitter. Adam Martin asks, "In regard to NVIDIA's omniverse, what do you think the largest benefit will be for companies in the architectural and engineering space, and how should companies be using it today?"

Michael Kagan: The omniverse enables you to simulate large projects. You can live it before you build it.

  • Now, in chip design, we simulate devices to make sure that they work once we manufacture them.
  • If you build a large plane, you can simulate it on conventional machines, but if you plan a whole city, the omniverse enables you to simulate the entire city.
  • If you put up antennas for cellular communication, you can simulate where to place them, including the signal reflections from the buildings.

You can basically simulate very large projects and, I think, in the near future, you are not going to build anything without simulating it first. By simulating it, you can build much more sophisticated projects, and if you simulate them before you build them, they will most likely work on the first shot. Otherwise, a huge amount of money is spent building large plants that don't work in the end because there was some mistake along the way.

You can think about the omniverse as the ability to simulate every project that you build. One of the big projects – for humanity as a whole – is climate. As I mentioned before, we are building a computer that will enable us to simulate the entire Earth. That's all the way up there, way beyond plant or city simulation.

On differences between smart manufacturing digital twins and the metaverse (or omniverse)

Michael Krigsman: How is this different from the conventional notion of a digital twin that's used today in manufacturing and all kinds of different purposes?

Michael Kagan: A digital twin is a twin of something that exists – the twin of the real thing.

You can start from a digital twin, but then you can change it into what you want the real plant, the real world, to be – basically creating a digital twin of a real twin that has not been built yet. You can see whether this model fits your needs, whether it makes sense, whether it's economical and efficient, and then you can build it.

Michael Krigsman: Say that one more time; the digital twin of the model that was not yet built.

Michael Kagan: You can start by creating a digital twin of something that exists. Then you can apply inputs to this digital twin that do not exist, or are very rare, in the real world, and see how the twin reacts and responds. Then you can basically advance your real world to do what you want it to do – what you simulated in the digital world.

Michael Krigsman: If you're listening, you should be asking questions. When else are you going to get the chance to ask Michael Kagan whatever you want about digital twins, the metaverse, and the omniverse?

This is from LinkedIn. Christopher Jablonski says, "Once adopted on a mass scale, how will the metaverse be used in combination with augmented reality, wearables, conversational computing, and smartphones, or will it be largely mutually exclusive and overtake existing paradigms?" He's asking where this is all going to be going.

Michael Kagan: We are making more and more of the real world able to be simulated. Whatever used to be a mystery now becomes a technical problem that we need to solve.

Augmented reality is a micro digital world. You put on the glasses, or whatever you have for augmented reality, and you see some adjustments to reality.

But you can go much further, because you can try to foresee what's going to happen if something happens in the real world. You can simulate, for example, what will happen in your neighborhood if a volcano erupts nearby.

Michael Krigsman: We have another question from Twitter. This is from Arsalan Khan. Arsalan is a regular listener and always asks such good questions. He says, "The metaverse requires a lot of collaboration to acquire and share data. What if companies are not willing to share this data? Maybe they don't want to share their autonomous car driving data. What happens then?"

Michael Kagan: Whatever is based on data, and on sharing data – of course, you cannot take advantage of it if you don't have the data. Around data, there has always been this question and this anxiety about confidentiality.

You don't need everything in order to make things right, and not everything is really confidential. But if somebody doesn't want to share his data, he will not share it, and the model simply will not take advantage of that data.

Of course, he can have his own protected model (be it on-premises or in the cloud) that is fed only with the data he is willing to use, and create his own virtual world with his own data.

On cloud computing platforms, data centers, and the metaverse (or omniverse)

Michael Krigsman: What about the cloud? Where is the intersection between the cloud and these virtual worlds, metaverse, omniverse, data centers? How does that piece together?

Michael Kagan: Cloud computing, and the cloud in general, turns computing and storage into services. It's the basic generation layer.

It's like electricity: everybody has electricity at home because there are large power plants generating it and delivering that power to everyone. Based on this, you can build a lot of things. In this analogy, you need that kind of compute power to build the virtual world model.

You will need this power plant, the data center, to host the virtual world, and this virtual world will be connected to the clients, to the different people who will live in this world. They will feed the data, and they will consume the data.

You can think about Waze as a micro virtual world of traffic. People use Waze, and they collaborate with each other. They actually create this model, this world of data about real-time traffic on the road. They use it and work with each other. They collaborate (without even knowing each other) by sharing the data and letting the Waze servers in the data center consolidate the data and use it to steer the traffic.

From this example, you can go pretty much everywhere. It can be social. It can be smart cities. It can be everywhere. Data comes into the data center, where it is consolidated. It builds the model and updates it, letting people consume this new model, this new digital world.

Michael Krigsman: You're saying Waze is an example of people coming together, collaborating (even though they're not necessarily aware that they are collaborating), and creating a specialized virtual world for themselves for that period of time in which they are connected using Waze.

Michael Kagan: Yeah. Waze is an example of another virtual world, a virtual world of the traffic that is of interest to me, where I am.

The most amazing thing about Waze – and that's why I like this example – is that there are millions of people collaborating with each other implicitly, and they benefit from this collaboration. It could only be enabled by technology that has the ability to collect data from millions of sensors, consolidate it, and send back the insights.
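The Waze pattern – many drivers implicitly collaborating by reporting data that a server consolidates into insights for everyone – can be sketched in a few lines. The segment names and speeds below are hypothetical; this only illustrates the consolidation step, not Waze's actual system.

```python
# Crowd-sourced traffic sketch: each driver reports (road_segment, speed);
# the server consolidates the reports into per-segment average speeds that
# every driver then benefits from, without drivers knowing each other.
from collections import defaultdict

def consolidate_reports(reports):
    by_segment = defaultdict(list)
    for segment, speed in reports:
        by_segment[segment].append(speed)
    return {seg: sum(v) / len(v) for seg, v in by_segment.items()}

reports = [
    ("highway-1", 30.0), ("highway-1", 50.0),   # slow: congestion
    ("ring-road", 90.0), ("ring-road", 110.0),  # flowing freely
]

traffic = consolidate_reports(reports)
print(traffic["highway-1"])  # 40.0
print(traffic["ring-road"])  # 100.0
```

Each driver is both a sensor and a consumer: the consolidated picture is better than anything a single participant could build alone.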

Michael Krigsman: This is kind of a metaphysical question. Is the virtual world created by Waze a temporary virtual world? After all, it's going on and on and on, from the Waze standpoint. But for me as a driver, it's very temporary.

Michael Kagan: There are some basic things that Waze learns over time, and there are the momentary conditions of the road right now. But you can also tell Waze, "I want to be in this particular place two days from now at 5 o'clock in the evening. When do I need to leave my home?"

How would it know when I need to leave home? Because it has already collected the data. It has experience of what's going on, on the roads. It can give you a fairly accurate prediction of when you need to leave your home to be at the office or the theater at 5 o'clock in the evening.

Michael Krigsman: If we go back to the data center again, is it powerful computing, powerful storage, and data? Are those the building blocks that are required to make this happen?

Michael Kagan: Actually, the data center is a computer on its own. Think about it: what we used to call a computer was a box under the table, or some bigger box in a room.

Today, the data center is the computer. It's like a power plant: a single unit of computing that is delivering services to billions of users.

It requires communication. It requires computing. It requires computing acceleration. It requires a lot of data. And it learns all the time.

Michael Krigsman: It sounds like when the machine overlords take over, it's going to be the data centers that do it.

Michael Kagan: You can always get scared. At some point, people got scared about cars, then about the planes, and then about other things.

Of course, there is a way that bad guys will take advantage of the technology. You need to be aware of it, and you need to react.

By the way, that's yet another use or advantage of having digital twins, for example, of the data center. You can simulate the cyber-attacks, and you can be prepared to respond to cyber-attacks.

Now, about machines taking over people: today, it's a human-machine team. The machine can process an amount of data that no human can process and generate insights no human can generate. A human can make decisions that no machine can make. They work together.

On cryptocurrencies and the metaverse

Michael Krigsman: We have some really interesting questions coming up on Twitter. Arsalan Khan comes back, and he says, "Do you think fiat currencies must become digital because of the metaverse, because we'll have to use cryptocurrencies in the metaverse for consumers?" He's asking about the impact on currency of the metaverse, as it becomes more broadly used.

Michael Kagan: In some sense, it's analogous to electricity. At some point, currency was real gold – coins or pieces of gold that people carried with them.

Today, currency is a shared spreadsheet, a shared table that sits in the bank. You go to the store, you sign some paper, and a number moves from one table to another in the bank. You walk out of the store with a cart of goods.

Currency and money become more and more abstract. I was thinking that money, that currency, is sort of analogous to electricity.

Electric energy is, of course, nothing by itself. But using that energy, you can do pretty much everything you want. I think this is what is happening with currency.

Right now, there are currencies associated with countries, and there are different exchange rates. I don't know how long that will hold, and it'll be interesting to see how this evolves with the digital world, with digital currencies.

I don't have a crystal ball to tell you exactly what's going to happen. But if you look at the evolution of currency, money, and finance, it is getting more and more abstract – some sort of surrogate that you can use to build, to consume, and to make various things.

Michael Krigsman: Just another reminder to folks listening. If you haven't asked questions, well, when else are you going to get to ask Michael Kagan, the CTO of NVIDIA, whatever you want? You should be asking questions.

We have an interesting question from Rogin Robert. He is the innovation lead at Mazars London. He says, "Can you share some insights on any exciting firmwide transformation projects that you have executed recently?"

Michael Kagan: NVIDIA has a wealth of technology. Now, we are accelerating computing.

If you look at computing improvement based on Moore's Law, it was 2x about every other year. In the last 20 years, computing has improved 2x every year. That's because there is much more innovation across the whole computing stack, not just at the bottom, in the process technology.

Some people refer to it as Huang's Law. Jensen Huang, our CEO, made the observation and prediction that accelerated computing will improve performance by 2x every year, and that's what has happened over the last 20 years.

Making all these things work together is amazing. The more compute power we build, the more is consumed, and that creates even more demand for compute power.
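The gap between doubling every other year and doubling every year compounds dramatically; a quick check of the arithmetic over the 20-year span Kagan mentions:

```python
# Compounding comparison: Moore's-law-style doubling every other year
# versus the 2x-per-year pace attributed to accelerated computing.

years = 20
moores_law = 2 ** (years / 2)   # doubling every two years
huangs_law = 2 ** years         # doubling every year

print(int(moores_law))  # 1024     -> roughly a 1,000x gain in 20 years
print(huangs_law)       # 1048576  -> roughly a 1,000,000x gain in 20 years
```

Over two decades, the yearly-doubling curve ends up about a thousand times ahead of the every-other-year curve, which is why stack-wide innovation matters so much more than process technology alone.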

Michael Krigsman: What about the issue of heat? I can tell you; I of course deal with a lot of videos. I have the latest and greatest Intel chip and the highest-end NVIDIA graphics card, and my office becomes super-hot, and heat is a real issue. What about the heat issue? I'll ask you to answer that really quickly because we're going to run out of time.

Michael Kagan: Well, it is an issue. We are developing and deploying various technologies to handle it. [Laughter] It is an issue and, actually, it's one of the top items in design technology: how to handle heat.

On how the metaverse (omniverse) will change distributed computing and data storage

Michael Krigsman: Okay. We have another question from Twitter. This is a really interesting one. It is from Lisbeth Shaw. She says, "How will the metaverse and omniverse architectures change the architectures of distributed computing and data storage?" She's asking about the impact of metaverse and omniverse on the architecture of distributed computing and storage.

Michael Kagan: You don't need to go as far as the metaverse or the omniverse. Distributed computing started with high-performance computing, and now data centers are built as cloud-native supercomputers. This is the basic infrastructure, based on fast data processing engines – GPUs.

There are basically three pillars: the CPU, which runs the application framework; the GPU, which actually crunches the data; and the DPU, the data processing unit, which is the infrastructure computing platform running the operating system of this big machine. That is changing today, and the omniverse is one of the big consumers of this computing technology.

Michael Krigsman: Can you describe the implications for business people? We have business people watching this show, listening to this very technology-centric conversation. What are the implications for those of us in business?

Michael Kagan: You can try everything. You can try everything out before you build it or before you exercise it in real life.

For business people, it will have clear implications. This virtual world will be at the heart of the business: you can try things out, you can simulate what you want to do, and you can see what the results look like.

Michael Krigsman: Arsalan Khan comes back again with a really, really interesting question. Again, it's kind of getting metaphysical here. He says, "Will the metaverse have a government? Who controls it? How will people get elected?" If the metaverse is a world, how is it managed?

Michael Kagan: That's an interesting question because I am also interested to know the answer. We will just have to wait and see.

Michael Krigsman: Michael Kagan, any final thoughts before we finish up?

Michael Kagan: Don't be encumbered by past history or by limitations that we impose on ourselves. New technologies will do amazing things for us.

Be optimistic. Computers will not take over humanity.

Michael Krigsman: I have one final question that I've been curious about for many years. In business, technology projects (before they go live) we all know can fail, have trouble, what have you. When you're designing microprocessors with billions and billions of transistors and at huge scale, how do you make sure that when you ship the thing out the door, this complex thing actually works?

Michael Kagan: Simulating a chip is like a little omniverse. We are doing the simulations and simulating everything that we are building.

Michael Krigsman: You don't go to bed at night sometimes, suddenly wake up, and realize, "Oh, my God. We had this horrible flaw that we haven't picked up and the thing is about to ship or it just shipped"? That doesn't happen?

Michael Kagan: I wouldn't say that. When you develop a chip – and I've been in this business almost 40 years – every time, you have these concerns and are afraid that it will not work. You know that you have done everything that you know how to do before you send the database to manufacturing. But you can always miss a beat, and the chip comes back and, all of a sudden, it doesn't work.

Michael Krigsman: With that, Michael Kagan, Chief Technology Officer of NVIDIA, thank you so much for taking the time to share all this interesting knowledge with us. I really, really appreciate you being here.

Michael Kagan: Thank you very much.

Michael Krigsman: Thank you to everybody who watched. It's been a very, very fast 45 minutes.

Before you go, please subscribe to our YouTube channel. Hit the subscribe button at the top of our website so we can send you updates on these live shows. Tell others about the show so we can get more people asking questions, like you have been doing.

Thanks so much, everybody. I hope you have a great day, and we will see you again next time.