The cost of machine learning applications will come down as semiconductor advances continue to improve computing power. Lower costs will naturally lead to a sharp rise in the number of users, and using AI will become part of the daily routine at work and in private life. It is the commercialization of AI that will really bring about the "iPhone moment for AI." Once services like OpenAI's become the norm, applications of all kinds will need the support of Nvidia's datacenter hardware and dedicated chips, and we can imagine how much the industry will benefit.
Unlike traditional servers, an AI server draws roughly 80% of its computing power from GPUs and only about 10% from CPUs, whereas an ordinary datacenter server relies on CPUs for about 60–70% of its computing power. Nvidia's DGX server architecture is built on eight GPUs and two CPUs. GPT-3.5 reportedly needs some 10,000 A100 GPUs to power it, while an ordinary machine learning server may need only 500–4,000 CPUs.
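To put those configurations in perspective, here is a minimal back-of-the-envelope sketch in Python using only the figures quoted above (eight GPUs and two CPUs per DGX system, roughly 10,000 A100s for GPT-3.5). The variable names and the simple division are illustrative, not an official sizing method.

```python
# Back-of-the-envelope sizing based on the figures quoted in the text.
# Assumptions (illustrative only): 8 GPUs and 2 CPUs per DGX-style system,
# ~10,000 A100 GPUs to power GPT-3.5.

GPUS_PER_DGX = 8
CPUS_PER_DGX = 2
GPT35_GPUS = 10_000

dgx_systems_needed = GPT35_GPUS / GPUS_PER_DGX
cpus_alongside = dgx_systems_needed * CPUS_PER_DGX

print(f"DGX-class systems needed: {dgx_systems_needed:,.0f}")       # ~1,250
print(f"CPUs shipped alongside those GPUs: {cpus_alongside:,.0f}")  # ~2,500
```

Even at this rough level, the ratio makes the point: one large language model pulls in on the order of a thousand GPU-dense systems, a very different shape of demand from CPU-centric servers.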
A lot of attention has been directed towards TSMC. Indeed, the foundry is a big beneficiary of the AI movement, second only to Nvidia, and the movement is only just beginning. Nvidia launched the A100 first, and even after the release of the Hopper-based H100, the A100 remains attractive. Why does Nvidia stand out?
Before 2013, datacenter investment came to about US$55 billion a year; sharp growth only started emerging after 2013. Between 2017 and 2022, datacenter spending grew at a CAGR of 11.8%. It is estimated that over the next five years, datacenters running on Nvidia chips will grow at a CAGR of over 20% even if Nvidia's market share does not expand; more optimistic observers put the growth at more than 40%.
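For reference, a compound annual growth rate is simply the constant yearly growth that carries a starting value to an ending value. The short sketch below applies the rates quoted above to show what they imply over five years; the starting value of 1.0 is a normalized placeholder, not a market figure.

```python
# CAGR compounding: value_end = value_start * (1 + rate) ** years.
# Rates are those quoted in the text; the starting value is normalized to 1.0.

def compound(start: float, cagr: float, years: int) -> float:
    """Return the value after growing at `cagr` per year for `years` years."""
    return start * (1 + cagr) ** years

for rate in (0.118, 0.20, 0.40):
    print(f"CAGR {rate:.1%}: x{compound(1.0, rate, 5):.2f} after 5 years")
# 11.8% -> ~1.75x, 20% -> ~2.49x, 40% -> ~5.38x
```

In other words, even the conservative 20% scenario implies the Nvidia-driven portion of the datacenter market roughly two and a half times its current size within five years.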
Nvidia is taking a big chunk of the market, and Taiwanese firms are sharing much of the benefit. Any Taiwanese firm in the GPU ecosystem is now going to be very busy: Wistron, Foxconn and Quanta have reportedly already secured major orders. During Computex, Huang visited MSI and Gigabyte and also announced a partnership with MediaTek to develop smart cabin solutions. Beyond servers, many more business opportunities from the EV and IoV sectors are expected to come knocking soon.
According to IDC, the market for AI and related software will reach US$519.2 billion in 2023. Gartner says that AI accounted for 7% of companies' IT spending in 2020 and that the proportion will rise to 10.5% in 2023. Both estimates are based on the size of the global software market alone. What about the opportunities for the hardware that runs the software?
A few years ago, Jensen Huang asserted that "software is eating the world, but AI is going to eat software." I think Huang is largely right, but hardware remains irreplaceable, particularly chips. The biggest winners will still be the hardware manufacturers, and there will also be enormous opportunities for server- and datacenter-related businesses.