The core of Jensen Huang’s speech at the Beijing Chain Expo: The revolutionary arrival of AI

  • NVIDIA CEO Jensen Huang delivered a keynote at the 3rd China International Supply Chain Promotion Expo, emphasizing AI as the core driver of industrial transformation.
  • Key points from his speech:
    • AI's computational revolution: Shift from CPU-based logic to GPU/TPU-driven parallel computing, enabling machine learning from data rather than human-written rules.
    • Industry impact: AI is reshaping sectors like healthcare (medical imaging), logistics (autonomous driving), and consumer tech (WeChat, TikTok, Xiaomi smartphones).
    • Infrastructure shift: AI is becoming as essential as electricity, with decentralized cloud computing and "computing power as a service" emerging as critical enablers.
    • Future vision: Next-gen AI will integrate reasoning, physical-world understanding, and robotic collaboration, revolutionizing factories and supply chains.
    • Decentralized computing: Highlighted as vital for scalable, cost-efficient AI infrastructure, supporting real-time robotics and edge computing.
  • Huang positioned China’s supply chain ecosystem as a key beneficiary of AI-driven industrial growth.
Summary

On July 16, the 3rd China International Supply Chain Promotion Expo opened, and NVIDIA founder and CEO Jensen Huang delivered a speech at the opening ceremony. Looking at the speech as a whole, AI is the theme running through it. PowerVerse, a decentralized cloud computing platform focused on Web3 infrastructure, offers the following analysis based on the speech.

Jensen Huang first briefly reviewed NVIDIA's history. He noted that CUDA, born in 2006, turned the GPU into a general-purpose computing engine and ushered in the AI era: "Ten years ago, AlexNet running on NVIDIA GPUs triggered the AI explosion. In the past, software was written by hand and ran on CPUs; now AI learns from data and runs on GPUs. This transformation, from logic written by humans to intelligence learned by machines, is completely reshaping the chip and computer industry."

In the past, software relied on human-written rules and ran on the central processing unit (CPU). Every step of the program was defined by a person, with clear logic and a rigorous structure. The limitations of this model have gradually emerged, however: it struggles to cope with increasingly complex real-world problems, cannot automatically extract rules from data, and cannot adapt to a changing environment.

The development of artificial intelligence, especially the rise of deep learning, has brought about a new computing paradigm: instead of people writing logic, machines are allowed to "learn" intelligence from large amounts of data. Behind this change is a fundamental shift in computing needs.
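The contrast between the two paradigms can be made concrete with a toy example (an illustration of ours, not from the speech): for the relation y = 3x + 2, a programmer can write the rule by hand, or a machine can recover the same rule from example data via ordinary least squares.

```python
# Hand-written rule vs. learned rule: a toy illustration of the paradigm shift.
# The task: predict y from x, where the true relation is y = 3x + 2.

def rule_based(x):
    # A human writes the logic explicitly.
    return 3 * x + 2

def learn_from_data(xs, ys):
    # The machine estimates the relation from examples instead,
    # via ordinary least squares (closed form for a single feature).
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return lambda x: slope * x + intercept

xs = [0, 1, 2, 3, 4]
ys = [2, 5, 8, 11, 14]            # data generated by y = 3x + 2
model = learn_from_data(xs, ys)
print(rule_based(10), model(10))  # both ≈ 32
```

Deep learning replaces the closed-form fit with gradient descent over millions of parameters, but the shift is the same: the logic comes from data rather than from a programmer.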

Traditional programs deal with deterministic problems, while artificial intelligence deals with patterns and probabilistic relationships hidden in massive amounts of data. This shift has created unprecedented computing power requirements, above all a reliance on large-scale parallel computing. Although CPUs excel at complex sequential logic, they cannot keep up with the matrix operations AI requires. Graphics processing units (GPUs), with their thousands of small cores, execute parallel tasks efficiently and have quickly become the core hardware for AI training. As a result, the chip industry has begun to shift from a CPU-centric model to one dominated by GPUs and dedicated AI chips.
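Why matrix operations suit thousands of cores: each output element of a matrix product is an independent dot product, so they can all be computed at once. The stdlib-only sketch below parallelizes one task per output cell; a real GPU kernel would launch one hardware thread per cell instead.

```python
# Matrix multiplication decomposed into independent per-cell tasks,
# the structure that GPUs exploit. Pure-Python sketch for illustration.
from concurrent.futures import ThreadPoolExecutor

def matmul_parallel(A, B):
    n, k, m = len(A), len(B), len(B[0])
    cols = list(zip(*B))  # transpose B so each task reads one column

    def one_cell(idx):
        # Each output cell (i, j) is an independent dot product.
        i, j = divmod(idx, m)
        return sum(A[i][p] * cols[j][p] for p in range(k))

    with ThreadPoolExecutor() as pool:
        flat = list(pool.map(one_cell, range(n * m)))
    return [flat[i * m:(i + 1) * m] for i in range(n)]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(matmul_parallel(A, B))  # [[19, 22], [43, 50]]
```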

At the same time, the logic of chip design is undergoing a fundamental change. In the past, chips pursued versatility and compatibility, but the rise of AI has made chip makers realize that optimizing for specific tasks can achieve efficiency far beyond that of general-purpose chips. As a result, tensor processing units (TPUs), neural network processing units (NPUs), and other AI acceleration chips have emerged, greatly improving the throughput and energy efficiency of AI computing by reducing numerical precision, optimizing memory bandwidth, and designing dedicated instruction sets.

This shift from "general" to "special" has driven a wave of innovation in the entire chip industry.
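The "reduced precision" optimization mentioned above can be sketched in a few lines: symmetric int8 quantization stores each weight in one byte instead of four, at the cost of a bounded rounding error. This is an illustrative simplification; real accelerators add refinements such as per-channel scales.

```python
# A minimal sketch of symmetric int8 quantization, the kind of
# precision reduction that dedicated AI chips implement in hardware.

def quantize_int8(values):
    # Map the largest magnitude onto 127 and round everything to integers.
    scale = max(abs(v) for v in values) / 127 or 1.0
    q = [max(-128, min(127, round(v / scale))) for v in values]
    return q, scale

def dequantize(q, scale):
    return [v * scale for v in q]

weights = [0.42, -1.27, 0.05, 0.88]
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)
# Each value now occupies 1 byte instead of 4; the rounding error
# per value is bounded by scale / 2.
print(q, [round(r, 3) for r in restored])
```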

Beyond that, this change is prompting a reconstruction of computer architecture itself. The traditional von Neumann architecture separates computing and storage, so the frequent data transfers in AI workloads become a performance bottleneck. To break through this limitation, a new architectural concept, compute-in-memory, has emerged: it embeds computing units directly into or next to storage units, greatly reducing the latency and energy cost of data movement, and it has become an important direction for future high-performance computing.
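The von Neumann bottleneck can be quantified with a back-of-envelope "arithmetic intensity" estimate: how many floating-point operations a workload performs per byte moved between memory and compute, versus how many the chip needs to stay busy. The peak-compute and bandwidth figures below are illustrative assumptions, not measurements of any specific chip.

```python
# Roofline-style estimate of when data movement, not arithmetic,
# limits performance on a memory/compute-separated machine.

def matmul_intensity(n, bytes_per_elem=4):
    flops = 2 * n ** 3                    # n^3 multiply-adds
    traffic = 3 * n * n * bytes_per_elem  # read A and B, write C (ideal caching)
    return flops / traffic                # FLOPs performed per byte moved

peak_flops = 100e12  # assumed accelerator peak: 100 TFLOP/s
mem_bw = 2e12        # assumed memory bandwidth: 2 TB/s
balance = peak_flops / mem_bw  # FLOPs/byte needed to stay compute-bound

for n in (64, 512, 4096):
    ai = matmul_intensity(n)
    bound = "compute-bound" if ai >= balance else "memory-bound"
    print(f"n={n}: {ai:.1f} FLOPs/byte -> {bound}")
```

Small matrices are memory-bound under these assumptions, which is exactly the regime where moving compute next to storage pays off.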

In terms of business models, the rise of AI services has turned computing power itself into a marketable product. Cloud computing platforms built on technologies such as blockchain, smart contracts, cryptocurrency, edge computing, and AI can fully commercialize computing resources into a decentralized cloud computing market, providing on-demand AI compute to those who need it. Enterprises can train and deploy AI models without building expensive computing centers of their own, while suppliers gain a channel to monetize idle computing resources that would otherwise go unused.

This "computing power as a service" model is currently becoming a trend that is transforming the technology industry .

Jensen Huang shared in his speech:

AI is transforming every industry—from scientific research and healthcare to energy development, transportation, and logistics management.

AI powers iconic Chinese platforms such as Tencent’s WeChat, Alibaba’s Taobao, and ByteDance’s TikTok;

AI drives Xiaomi’s autonomous driving and smartphones;

AI empowers Baidu's AI search and Meituan's fast, convenient smart delivery service;

AI also supports Infervision Medical Technology's medical imaging diagnostic systems, helping to improve medical standards in more than 20 countries around the world.

We can see that AI and the fields it touches are driving each other's transformation. On the one hand, AI is spawning new applications and industries: from autonomous driving to medical image analysis, from smart voice assistants to content generation, AI is penetrating every sector. On the other hand, the fields being transformed by AI are in turn driving demand for high-performance chips and computing platforms.

In his speech, Jensen Huang said that AI, like electricity and the Internet before it, is becoming infrastructure. AI is also reshaping the supply chain, completely changing how we produce and move goods.

When AI becomes infrastructure like water and electricity, a basic need for every part of society, chips and computers will no longer be mere tools but the core infrastructure of an intelligent society. This, PowerVerse believes, is the transformative trend that Jensen Huang's speech emphasizes.

"The next wave of AI will be robotic systems with reasoning and execution capabilities that can understand the physical world. In the next decade, factories will be driven by software and AI, coordinating teams of robots that collaborate with humans to produce AI-powered smart products. AI will be at the core of every industry, enterprise, product, and service. AI has triggered a new industrial revolution and brought new growth opportunities to China's excellent supply chain ecosystem," Huang predicted at the end of his speech.

The spread of AI-driven robot systems, collaborative robots, and smart products is an exciting prospect, but one that cannot be separated from the infrastructure supporting it, and here decentralized cloud computing will play a vital role. This computing model can provide flexible, efficient, and cost-effective solutions for the data processing and analysis that large fleets of robots generate. Distributed edge computing reduces data latency, speeds up robots' real-time decision-making, and improves system reliability and availability: the decentralized architecture dynamically allocates computing tasks across multiple nodes, so the system keeps running even when some nodes fail.
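The fault tolerance described above can be sketched as a small scheduler: tasks are placed on the least-loaded node, and when a node fails, its tasks are reassigned to the survivors. The node names, task names, and least-loaded policy are illustrative assumptions, not any specific platform's scheduler.

```python
# Dynamic task allocation with failover across decentralized nodes.

class Scheduler:
    def __init__(self, nodes):
        self.load = {n: [] for n in nodes}  # node -> list of assigned tasks

    def assign(self, task):
        # Least-loaded placement keeps work spread across nodes.
        node = min(self.load, key=lambda n: len(self.load[n]))
        self.load[node].append(task)
        return node

    def fail(self, node):
        # A failed node's tasks are reassigned so the system keeps running.
        orphaned = self.load.pop(node)
        for task in orphaned:
            self.assign(task)

sched = Scheduler(["edge-1", "edge-2", "edge-3"])
for t in ["perception", "planning", "control", "telemetry"]:
    sched.assign(t)
sched.fail("edge-1")
print(sched.load)  # all four tasks now spread over edge-2 and edge-3
```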

From a market perspective, the decentralized cloud computing model helps reduce enterprises' operating costs. Companies can flexibly rent the computing resources they actually need, avoiding heavy hardware investment and maintenance costs. Within China's excellent supply chain ecosystem in particular, this lets small and medium-sized enterprises access high-quality AI services more easily.

It also promotes resource sharing and collaboration, building an open, win-win industrial ecosystem in which companies benefit mutually by sharing idle computing resources, further driving technological innovation and development.

Author: PowerBeats

This article represents the views of a PANews columnist and does not represent PANews' position; PANews assumes no legal liability.

The article and its opinions do not constitute investment advice.

Image source: PowerBeats. Please contact the author for removal if there is infringement.
