Rethinking what’s possible with the technologies of tomorrow

Advances in the metaverse and quantum computing are enabling possibilities unthinkable even a few years ago, with even more rapid changes to come

Although the two-day COMPUTEX Forum concluded today, its last session beckoned toward the future, as NVIDIA and IBM shared technologies at the razor's edge of what's possible, already available today.

To speak on their respective advancements in the metaverse and quantum computing, COMPUTEX organizer the Taiwan External Trade Development Council (TAITRA) invited to the virtual stage Richard Kerris, head of WW developer relations and general manager for Omniverse at NVIDIA, and Norishige Morimoto, CTO and vice president of IBM Japan.

“3D workflows have become essential across all sorts of industries,” including architecture, robotics, product design and more, Kerris said, with projects getting bigger all the time.

Imagine an animation studio with hundreds of artists, all working on the same project. Some people might be using Maya while others are in Substance Painter, some across the room and others across the world. Every time they want to share between platforms, they need to export, wasting time and opening the door to frustrating compatibility issues. Add to that the size of the data set they are working with, and the problem becomes exponentially more difficult.

Yet even with these challenges, 3D workflows will increasingly come to characterize the way work is done.

“We believe that anything that will be built will be visualized, anything that moves will be autonomous and anything that’s autonomous will be simulated,” Kerris said. “Rather than go to the expense of building something in the physical world, why not build it in the synthetic world? Learn what it’s going to look like, learn how things are going to affect it,” such as the way light hits a building or robots move across a factory floor. Tweaks could be made in the virtual world free of charge, and hundreds of robots trained at once rather than one by one.

To make this ideal a reality, NVIDIA created the Omniverse. This RTX-based platform allows 3D creators to collaborate on projects in real time, regardless of where they are, the size of the data they are working with or what tools they are using.

Numerous partners across industries are already learning the value of the Omniverse. In architecture, Kohn Pedersen Fox (KPF) is testing how weather and sun will affect buildings, while Woods Bagot is simulating how buildings will affect the landscape. Ericsson is modeling the most efficient places to install 5G nodes. Even systems as large as BMW’s factories are being simulated in Omniverse, while WPP has digitized entire filming locations.

“In the future, more content and more experiences will be done in the virtual world than in the physical world,” Kerris said, with Omniverse offering a connector between them.

The forum then turned to Morimoto from IBM, whose work on quantum computing promises possibilities achievable sooner than we might think.

Advancements in computing have traditionally been driven by transistor shrinkage, with feature sizes dwindling over half a century from micrometers to nanometers. Now, with IBM's new 2nm node, transistors are being produced that are half the diameter of a coronavirus.

“However, that alone is not enough,” he said. While powerful, today’s machines are only able to process what Morimoto called “narrow AI,” meaning deep learning that requires a complete and accurate data set with only one answer. “Instead, we need to focus on much broader capabilities of AI: broad AI” unbounded by these limitations.

Even with continually improving semiconductors, “it’s very clear that human beings are going to run out of computation power really soon, so we need to come up with a third element of future computing: quantum,” he said.

As opposed to traditional bits, the qubits that power quantum computing can represent many probabilistic states simultaneously, not just a 1 or a 0, while entanglement can link different qubits together, no matter how far apart they are. Quantum computers already offer more than 20 types of logic gates, compared with seven in traditional computing, enabling far more complex algorithms.

“All this may sound very complicated, but from a user point of view, it’s just as simple as running a piece of software,” Morimoto said.
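Morimoto's point can be made concrete with a small sketch. The snippet below is not IBM code and does not use IBM's Qiskit SDK; it is a simplified NumPy model of two qubits as a state vector, using a Hadamard gate to create superposition and a CNOT gate to entangle the pair into a Bell state. From the programmer's side, it really is just a few lines of ordinary software.

```python
import numpy as np

# Single-qubit gates as 2x2 matrices
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard: puts a qubit into superposition
I = np.eye(2)

# Two-qubit CNOT (control = first qubit): flips the second qubit when the first is |1>
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

state = np.array([1, 0, 0, 0], dtype=complex)  # start in |00>
state = np.kron(H, I) @ state                  # superpose the first qubit
state = CNOT @ state                           # entangle the two qubits

probs = np.abs(state) ** 2                     # Born rule: measurement probabilities
print(probs)  # probabilities for |00>, |01>, |10>, |11> -> 0.5, 0, 0, 0.5
```

Measuring this state yields 00 or 11 with equal probability and never 01 or 10: the two qubits' outcomes are perfectly correlated, which is the entanglement Morimoto describes.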

However, there is still a lot of work to do, marked primarily by hardware scaling and programming challenges.

The first issue is cooling. At 15 millikelvin, the bottom of the dilution fridge that houses quantum chips is “the coolest place in the universe, ever,” Morimoto said.

Yet the core of quantum computing, the chip itself, is developing quickly. IBM in 2019 set out a roadmap to double the number of qubits on a chip every year, but with a jump from this year's 127-qubit Eagle to a 1,121-qubit chip by 2023, the company is far ahead of schedule.

The quantum era is just getting started. As of March, IBM’s quantum network had 250,000 users and more than 30 quantum computers, and the network is growing fast, with a 10-year research partnership signed with the Cleveland Clinic, additional Quantum Hubs planned and commercial machines to find homes in Germany and Japan.

“If you can do this with today’s quantum computers, what will be possible, two years later, three years later, five years later?” Morimoto said.