On June 11, OpenAI CEO Sam Altman published an essay titled "The Gentle Singularity" on his personal website. In it, he argues that humanity may already be in the early stages of the "singularity", the critical point at which artificial intelligence surpasses human intelligence, and imagines the 2030s and beyond: "but in other ways that still matter greatly, the 2030s may be completely different from any previous era. We don't know how far intelligence can go beyond the human level, but we are about to find out." "By the 2030s, intelligence and energy, that is, ideas and the ability to turn ideas into reality, will become extremely abundant. These two resources have long been the fundamental limits on human progress; with sufficient intelligence and energy (and good governance), we could in theory have everything else."
Intelligence represents a person's creativity, understanding and problem-solving ability.
Throughout human history there have been many brilliant, creative ideas, but for lack of an accumulated knowledge base, tools, education systems, or means of communication, those ideas often could not be verified, recorded, or put to wide use.
For example, Wan Hu of China's Ming Dynasty is said to have strapped rockets to a chair and tried to launch himself into the air; his attempts repeatedly failed, because this pioneer of exploration lacked both the basic theory of flight and the practical experience of predecessors to build on.
Likewise, ancient Greek philosophers imagined that the universe was made of atoms, but without experimental tools and the scientific method, the idea could not drive technological progress. Only with the rise of modern science did humans begin to understand the world in a truly systematic way.
These cases illustrate that innovation becomes possible only when intellectual resources accumulate and are passed on.
Energy is the power that turns ideas into reality. For thousands of years, humans relied on muscle power, animal power, and simple natural energy sources, and productivity remained limited. Even clever ideas often could not be realized for lack of sufficient energy. It was not until the Industrial Revolution brought coal and the steam engine that large-scale manufacturing and transportation became possible and the foundations of modern society were laid. It can be said that energy determines whether a society can turn creative blueprints into part of the real world.
Entering the 21st century, the development of artificial intelligence is enriching intellectual resources at an unprecedented pace. AI can not only assist research and accelerate discovery, but also give everyone access to high-quality knowledge through personalized learning systems, raising the cognitive level of society as a whole.
At the same time, clean energy technology is advancing rapidly: from controlled nuclear fusion to high-efficiency solar power, smart grids, and advanced energy storage, we are gradually shedding our dependence on finite resources and moving toward an era of abundant energy.
Of course, all of this hinges on one key factor: a good governance mechanism. Intelligence and energy are neutral in themselves; they can be used to create or to destroy. Only by establishing a fair, transparent, and efficient governance system at the institutional level can we ensure that these powerful resources benefit everyone rather than deepening inequality or fueling conflict.
Take climate change as an example: people could use AI to optimize climate models, pair it with clean electricity to power carbon capture plants, and coordinate environmental policy on a global scale. A problem that seems intractable today could become far more manageable.
In this scenario, intelligence and energy would no longer be scarce resources but a natural part of daily life, like air and water. People would no longer be constrained by basic conditions and could focus on higher-level creation and exploration, whether in art, philosophy, or the more distant stars and seas.
Automation of data center production
Sam Altman also wrote: "If we have to build the first million humanoid robots the traditional way, but then they can operate the entire supply chain (mining and refining minerals, driving trucks, running factories, and so on) to build more robots, which in turn can build more chip fabs, data centers, and so on, then the pace of progress will obviously be completely different. As data center production is automated, the cost of intelligence should eventually converge toward the cost of electricity. (People often wonder how much energy a ChatGPT query uses; the average query consumes about 0.34 watt-hours, roughly the energy of running an oven for a bit over a second, or a high-efficiency light bulb for a couple of minutes. It also uses about 0.000085 gallons of water, about one fifteenth of a teaspoon.)"
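Those comparisons are easy to sanity-check. The short sketch below redoes the arithmetic in Python, assuming a roughly 1 kW oven and a 10 W high-efficiency bulb; both wattages are illustrative assumptions rather than figures from the essay.

```python
# Back-of-the-envelope check of the per-query figures quoted above.
# The appliance wattages (1 kW oven, 10 W bulb) are illustrative assumptions.

QUERY_WH = 0.34        # average ChatGPT query, per the quote (watt-hours)
OVEN_W = 1000          # assumed typical oven draw (watts)
LED_BULB_W = 10        # assumed high-efficiency bulb draw (watts)

oven_seconds = QUERY_WH / OVEN_W * 3600    # seconds of oven use with the same energy
bulb_minutes = QUERY_WH / LED_BULB_W * 60  # minutes of bulb use with the same energy

print(f"{QUERY_WH} Wh = about {oven_seconds:.1f} s of oven use")   # about 1.2 s
print(f"{QUERY_WH} Wh = about {bulb_minutes:.1f} min of LED use")  # about 2.0 min
```

Under those assumptions, 0.34 Wh does indeed work out to a little over a second of oven use and about two minutes of light from an efficient bulb, in line with the quoted comparison.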
Through advanced robotics and intelligent systems, it may be possible to automate the production of data centers.
In terms of robotics, humanoid robots can perform delicate manipulation tasks such as assembling servers, routing cables, and installing cooling systems, using high-precision sensors and machine-learning algorithms to improve their efficiency and accuracy.
In terms of smart systems, production equipment is connected through Internet of Things (IoT) technology so that the entire production line can be monitored and managed from a central control system. This not only improves production efficiency but also allows real-time adjustments when problems arise.
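As a rough illustration of what such a central control loop might look like, here is a minimal Python sketch; the device names, telemetry fields, and thresholds are all hypothetical and stand in for whatever a real IoT stack (MQTT, OPC UA, and the like) would actually provide.

```python
# Minimal sketch of a central control loop for IoT-connected production equipment.
# Device names, telemetry fields, and thresholds are hypothetical.
import time
from dataclasses import dataclass

@dataclass
class Telemetry:
    device_id: str
    temperature_c: float   # e.g. reported by a rack-assembly station
    error_rate: float      # fraction of failed operations in the last window

def read_telemetry() -> list[Telemetry]:
    """Placeholder for polling IoT sensors; returns simulated readings here."""
    return [Telemetry("assembly-arm-01", 42.0, 0.01),
            Telemetry("cabling-cell-07", 67.5, 0.08)]

def adjust(device: Telemetry) -> None:
    """Issue a real-time adjustment when readings drift out of bounds."""
    if device.temperature_c > 60:
        print(f"{device.device_id}: throttling line speed, raising cooling")
    if device.error_rate > 0.05:
        print(f"{device.device_id}: flagging station for recalibration")

if __name__ == "__main__":
    for _ in range(3):                  # one short monitoring cycle per iteration
        for reading in read_telemetry():
            adjust(reading)
        time.sleep(1)
```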
Going further, decentralized physical infrastructure networks (DePIN), such as decentralized cloud computing, can be layered on top: distributed ledger technology (DLT), such as a blockchain, can provide a transparent and secure platform for exchanging data and ensure that all participants have fair access to resources and services, while the DePIN model pools idle or underutilized computing resources, from personal computers and data centers to mobile devices, into a globally shared pool of computing power.
In this way, DePIN supports data center production and makes automation more attainable.
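To make the idea concrete, the following minimal Python sketch records compute contributions in a hash-chained, append-only log, a toy stand-in for a real DLT; the provider names and prices are hypothetical and do not correspond to any specific DePIN network.

```python
# Toy sketch of a shared compute pool with a tamper-evident contribution log.
# Illustrative only; provider names and prices are hypothetical.
import hashlib, json, time

class ContributionLedger:
    def __init__(self):
        self.entries = []

    def record(self, provider: str, cpu_hours: float, price_per_hour: float) -> dict:
        prev_hash = self.entries[-1]["hash"] if self.entries else "genesis"
        body = {"provider": provider, "cpu_hours": cpu_hours,
                "price_per_hour": price_per_hour, "ts": time.time(),
                "prev_hash": prev_hash}
        body["hash"] = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        self.entries.append(body)
        return body

    def verify(self) -> bool:
        """Check that no recorded contribution has been altered or reordered."""
        prev = "genesis"
        for e in self.entries:
            body = {k: v for k, v in e.items() if k != "hash"}
            expected = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
            if e["prev_hash"] != prev or e["hash"] != expected:
                return False
            prev = e["hash"]
        return True

ledger = ContributionLedger()
ledger.record("home-gpu-node", 4.0, 0.12)   # an idle personal machine
ledger.record("small-dc-rack", 96.0, 0.09)  # underutilized data-center capacity
print("ledger intact:", ledger.verify())
```

A real network would add identity, payment, and consensus on top of this, but the core promise is the same: contributions and usage are recorded in a way every participant can audit.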
Can the cost of running AI models approach the cost of electricity?
Whether the cost of intelligence, that is, the cost of running AI models, can ever approach the cost of electricity is a question worth examining.
As researchers keep improving algorithms, as quantum computing continues to make breakthroughs, and as semiconductor technology and dedicated AI chips (such as TPUs and NPUs) advance, the energy consumed per unit of computing power should gradually fall, meaning each watt of electricity can support more computing work.
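To ground the comparison, the sketch below prices a single query purely by its electricity use, taking the 0.34 Wh figure quoted earlier and an assumed retail price of $0.12 per kWh; the price is an illustrative assumption, and real serving costs also include hardware, cooling, and bandwidth.

```python
# Marginal electricity cost of one query, ignoring hardware, cooling, and bandwidth.
# The electricity price is an illustrative assumption.
QUERY_WH = 0.34        # energy per query, from the quote above (watt-hours)
PRICE_PER_KWH = 0.12   # assumed retail electricity price (USD)

cost_per_query = QUERY_WH / 1000 * PRICE_PER_KWH
print(f"electricity cost per query: ${cost_per_query:.6f}")            # about $0.000041
print(f"queries per dollar of electricity: {1 / cost_per_query:,.0f}") # about 24,500
```

The gap between that marginal figure and today's actual serving prices is exactly what the "cost of intelligence converging to the cost of electricity" claim is about.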
When DePIN resources such as decentralized cloud computing are added, this decentralized model not only improves resource utilization but also reduces the need to build new large data centers, indirectly lowering overall costs.
If the cost of intelligence one day approaches the cost of electricity, it may open up a series of new possibilities and application scenarios:
● Widespread intelligent services: as costs fall dramatically, intelligent services will become extremely common, and almost everyone will be able to afford high-quality AI. Personalized health advisors, educational assistants, and even virtual companions will become an indispensable part of daily life.
● Accelerated scientific research: researchers will have easier access to powerful computing, speeding up work in fields from drug discovery to climate simulation. That means faster technological breakthroughs and social progress.
● New business models: as the value of computing resources is redefined, business models built on sharing-economy principles may emerge. For example, users could earn revenue by contributing their own computing resources, while enterprises could cut costs by renting this globally distributed compute.
However, there are still many challenges in actual operation:
Building high-performance data centers or developing new hardware requires substantial upfront investment, and even if computing costs fall, the cost of data transmission (including bandwidth fees and latency) remains a factor that cannot be ignored.
Even so, in certain scenarios we may eventually see the cost of intelligent services determined primarily by the electricity they consume, which would greatly accelerate the spread of AI technology and broaden its range of applications.
