SoftBank champions Nvidia's growing dominance in AI-use chips
US company's lightning-fast GPUs lead the field in applications like self-driving tech
YUICHIRO KANEMATSU, Nikkei staff writer
PALO ALTO, U.S. -- There is a good reason that SoftBank and a Saudi Arabian investment fund are ready to pour billions of dollars into U.S. chipmaker Nvidia -- it makes the best graphics processors available for AI and self-driving technologies. And SoftBank is betting big that Nvidia is in just the right position to lead these future-defining innovations.
The name Nvidia might not be too familiar to ordinary consumers, but the chipmaker is well on its way to cementing its position as a major player in artificial intelligence applications. The key is its graphics processing units.
Highlighting its potential, a 10 trillion yen ($89.9 billion) investment fund that Japan's SoftBank Group recently set up jointly with the Saudi Arabian government and other entities has decided to buy an equity stake worth about 400 billion yen in Nvidia.
Nvidia has already gained a dominant position in self-driving technology, and is regarded as a likely favorite to grow into a chip industry behemoth in the medium-to-long term.
The fund's acquisition plan reflects SoftBank Chairman and CEO Masayoshi Son's aim to invest, from a long-term perspective, in companies that support growth industries at the fundamental level. This was highlighted in his company's recent purchase of ARM Holdings, a British company dominant in semiconductor design.
Fei-Fei Li, an associate professor at Stanford University, is a noted researcher in AI technology, for which Nvidia chips are in hot demand.
Li is known for having developed a platform to process information collected by AI very quickly and at low cost. She is the chief scientist with Google's AI and machine learning team, and advises Toyota Motor's AI development team operating in Silicon Valley.
Supporting the research of scientists like Li are Nvidia's GPUs, which can process graphics data at lightning speed by dividing huge amounts of data into blocks and processing them in parallel. Initially, the technology enjoyed growing demand from game machines, which require the processing of large amounts of graphics data.
But the chips are useful for more than just graphics data. In deep learning, a technique that is helping speed the progress of AI, computations are likewise split into many operations that run in parallel across multiple layers -- a workload that GPUs handle well.
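The divide-into-blocks idea the article describes can be sketched in a few lines of Python. This is only a toy illustration of block-wise parallel processing on CPU threads, not how GPU hardware or Nvidia's software actually works; the block size and the doubling operation are arbitrary stand-ins.

```python
from concurrent.futures import ThreadPoolExecutor

def process_block(block):
    # Stand-in for the per-block work a GPU core would do in parallel
    # (e.g., shading a tile of pixels or applying one layer's weights).
    return [x * 2 for x in block]

data = list(range(1000))
# Divide the data into fixed-size blocks, as the article describes.
blocks = [data[i:i + 250] for i in range(0, len(data), 250)]

# Process all blocks concurrently, then stitch the results back together.
with ThreadPoolExecutor() as pool:
    results = list(pool.map(process_block, blocks))
flat = [x for block in results for x in block]
```

The point is structural: because each block is independent, thousands of them can be worked on at once -- on a GPU, by thousands of cores rather than a handful of threads.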
Nvidia's chips are used for AI applications that demand high performance. These include a Harvard University project that aims to have an AI system pass a U.S. medical licensing examination, and Deep Instinct, an Israeli information security company that uses AI to achieve a remarkably high detection rate for computer viruses. Nvidia is now an important presence in the field, and virtually every engineer working in AI knows its name.
Companies that lead in global AI development, including Facebook and Amazon.com of the U.S., and China's Baidu, also use Nvidia GPUs.
The list goes on. Nvidia products, which come with support tools for developers, have found wide application, including in robot development, image analysis, image monitoring, functions to find matches between products and customers, investment decisions, and automated product design.
For now, GPUs still occupy only a small segment of the overall semiconductor market. But the application that promises to give them a significant boost is self-driving technology.
Automakers Audi of Germany and Tesla of the U.S. have built their self-driving technology around Nvidia chips from the earliest stages. Mercedes-Benz of Germany and Ford Motor of the U.S., as well as Toyota, which until recently had remained cautious, all followed suit and tied up with Nvidia. The company's products have become the de facto standard in semiconductors for self-driving technology.
Derek Aberle, president of U.S. chipmaker Qualcomm, acknowledged that its rival Nvidia is leading the industry, especially in terms of forming an ecosystem for the self-driving segment. But he also said that his company is working to solidify relationships with companies in the field.
Last year, Qualcomm announced a $47 billion deal -- the largest ever in the chip industry -- to acquire NXP Semiconductors, a Dutch chipmaker with a strong presence in the auto industry.
At a company event in May, Nvidia CEO Jensen Huang asserted that GPUs will become the driving force in advancing semiconductor technology in what he described as an era in which Moore's law no longer holds true. Moore's law states that the density of an integrated circuit doubles every two years, and such seemingly nonstop technological advancement has supported chipmakers by creating fresh demand.
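The doubling described by Moore's law compounds quickly, which is why its slowdown matters so much to chipmakers. A short sketch makes the arithmetic concrete (a simple illustration of the law as stated above, not a model of actual fabrication trends):

```python
def density_multiplier(years):
    """Transistor density growth under Moore's law:
    density doubles every two years, i.e. 2^(years / 2)."""
    return 2 ** (years / 2)

# Over a decade, five doublings compound to a 32-fold increase.
decade_gain = density_multiplier(10)  # 32.0
```

That compounding is the "fresh demand" Huang refers to: each generation of chips made previously impractical applications affordable.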
Intel Executive Vice President Stacy Smith disagrees, saying that Moore's law is not dead at the U.S. chipmaking giant, which has long benefited from the rapid technological advancement the law describes. He said the pace of decline in production cost per transistor has not changed at his company, putting it three years ahead of the overall chip industry.
According to Smith, GPUs suffer delays in processing because they have to be used in combination with central processing units. That, he said, puts Intel's AI-use chips at a speed advantage, because they use a model in which all of the processing takes place inside a CPU.
Catching up fast
For now, Nvidia's revenue is just an eighth that of Intel, but its market capitalization has grown to about half that of Intel. The market considers the company a threat to Intel.
To counter the threat, Intel earlier this year announced a deal to acquire Mobileye, whose technology is widely used in advanced driver assistance systems. The U.S. company has also teamed with German automaker BMW to develop chips for automotive use.
Nvidia has its own problems that threaten its growth. In particular, Google, which has been a major customer, is now shifting to designing its own AI chips in-house. But Huang said Wednesday on his blog that his company's latest GPU has a processing speed that far exceeds that of Google's device.