Interview: The Round Trip
Compiled & edited by: Yuliya, PANews
As the AI wave sweeps the globe at an unprecedented speed, an "arms race" over computing power has begun. With Nvidia's market capitalization surpassing one trillion dollars and giants like AWS and Google Cloud virtually monopolizing cloud computing power, a profound challenge faces all AI innovators: will the high centralization of computing power stifle open innovation and lock the future of AI into the "walled garden" of a few companies?
With a successful track record of selling their company to Snapchat for $60 million and founding Product Science, which provides AI code-optimization services to top companies, brothers David and Daniel Liberman, co-founders of Gonka AI, draw on serial entrepreneurial experience spanning parallel computing and AR to offer the market a distinctive answer to this deadlock: building a completely community-driven, decentralized AI computing network.
In the new Founders' Talk series "The Round Trip," produced by PANews and Web3.com Ventures, David and Daniel explain why they drew inspiration from the history of Bitcoin's infrastructure build-out, aiming to replicate the "ASIC revolution" in AI through an open financial incentive framework and break the shackles of computing-power costs. They also share how Gonka AI attracted $50 million in investment from industry giants like Bitfury and offer their take on the current "AI bubble" debate.

From games and AR to decentralized AI
PANews: Welcome David and Daniel! It's great to have you here. I know you both have very strong technical backgrounds and have been working in this field for many years. Could you share some of your background stories with our audience?
Gonka AI: Hello everyone. First of all, as brothers, our lives and careers have always been closely intertwined. Our story begins in 2003, when we developed a strong interest in parallel computing and decentralized networks.
Later, we entered the online gaming field, which is essentially a form of massively parallel computing—thousands of players interacting in real time via the internet. To improve the efficiency and reduce the cost of game animation production, we then plunged into the field of computer vision.
Computer vision then led us in a completely new direction: we started developing AR avatars for Snapchat. This work was very successful, and Snapchat eventually acquired our company for $60 million, marking a significant turning point for us.
Throughout our various projects and companies, we've always held onto one aspiration: to create something truly impactful for society, particularly in how we interact. Everything changed when AI entered our lives in a completely new form: large language models (LLMs). This is no longer the machine learning we were familiar with; it's a powerful tool capable of genuine dialogue and of tangibly helping us solve problems. And this new generation of AI, built on the Transformer architecture, is not just about language models. From image and video generation to breakthroughs in biology, chemistry, and physics, and even more efficient design and operation of nuclear power plants, this wave of AI is touching almost everything.
Next will come the rapid development of robotics software and self-driving cars, and these changes are happening very quickly, right now.
But this brings with it a concern, not a "Terminator"-style sci-fi fear, but rather an anxiety about the current state of affairs. Today, approximately 65% of global cloud computing power is controlled by three American companies (AWS, Microsoft Azure, and Google Cloud). If we add China's Alibaba and Tencent, these five giants control a staggering 80% of global cloud computing power. The core of AI is computing power, and at this stage AI is almost synonymous with cloud computing power. These companies are fiercely competing, attempting to control 100% of AI computing power. If this trend continues, we will enter a very strange world:
Only a very few companies will truly own and control all AI, and these AIs will:
- Replace a large number of jobs
- Reshape the entire economic structure
- Change the way society operates
Therefore, we believe that decentralized AI is a crucial and unavoidable issue.
This is why we eventually came to Gonka AI.
PANews: Indeed, you are not newcomers to the AI field. Before founding Gonka AI, you also founded Product Science, a company that received investment from well-known institutions such as Coatue, K5, and Slow Ventures. Could you talk about this experience and how it led you to Gonka?
Gonka AI: Absolutely. The computer vision we've been deeply involved in is essentially AI and machine learning. The earliest practical applications of AI largely occurred in areas such as image generation and animation production, which is how we built our reputation in the machine learning industry.
After leaving Snap, we founded Product Science. The company uses AI to provide code-optimization services to world-leading companies such as Walmart, JPMorgan Chase, and Airbnb. While AI is now widely known for helping write code, it is equally crucial to make sure that code runs efficiently. Improving code performance was our core business until we fully shifted our focus to Gonka and the decentralization of AI infrastructure.
Gonka AI's "Bitcoin"-like vision
PANews: You mentioned the issue of centralized computing power, which is indeed worrying. The recent massive outage of Cloudflare crippled half of the crypto world, and AWS also frequently experiences outages, each impacting a large number of applications. How will Gonka AI address this issue? It doesn't seem to be a general-purpose decentralized cloud, but rather more focused on the AI field.
Gonka AI: Yes, given the current predicament of highly centralized computing power, the only way out we see is decentralization.
At the model level, we've seen independent labs like DeepSeek demonstrate their ability to train high-quality models comparable to those of tech giants, but computing power remains a core bottleneck. Currently, many cutting-edge labs rely on infrastructure built by large cloud service companies, and in the decentralized field, no solution of comparable scale has yet emerged. Even Bittensor, currently the largest decentralized AI computing network, only has about 5,000 data center-grade GPUs. Meanwhile, companies like OpenAI and xAI are building massive clusters with millions of top-tier GPUs. The difference in scale is enormous.
We realized that the only way to truly make AI belong to the people and avoid single points of failure was to build a decentralized computing network of comparable scale. At this point, we drew tremendous inspiration from Bitcoin. We didn't just see it as "digital gold," but as one of the greatest frameworks for building large-scale infrastructure.
Over the past 15 years, the Bitcoin community has built an incredible infrastructure through decentralization. Today, the Bitcoin network commands approximately 26 GW of data center capacity, exceeding the combined total of Google, Amazon, Microsoft, OpenAI, and xAI. This is a massive build-out carried out by countless independent participants worldwide in an effort to break free from centralized systems.
Equally impressive is the speed of its hardware innovation. In 15 years, the energy required to compute one terahash has dropped from roughly 5 million joules to just 15 joules, an astonishing improvement of more than 300,000 times in efficiency. We believe that if the same revolution can be brought to AI computing power, true "computing abundance" will become possible, and AI will be available to everyone on Earth.
PANews: I noticed that Bitfury, an early Bitcoin infrastructure giant, just announced a $50 million investment in you. Does this mean the market is seeing a similar pattern? Bitcoin makes energy "fungible" because energy, whether in Siberia or Silicon Valley, can be converted into homogeneous computing-power value. Are you making computing power "fungible" as well? Given that AI is very sensitive to factors like latency, will this be a challenge?
Gonka AI: We believe the same story will unfold in the computing power field. Currently, Nvidia's chips are extremely expensive, and the vast majority of the data center construction costs for companies like OpenAI go to Nvidia. But if we can replicate the innovative transformation of ASICs (Application-Specific Integrated Circuits) in the AI field, the world will be very different.
Once the hardware cost per unit of computing power drops significantly, energy costs will once again become a key variable. The fact that early mining companies and hardware manufacturers like Bitfury are now investing in this ecosystem is a strong signal that they have identified a pattern similar to that of Bitcoin's early development.
Back in 2012, GPUs were the mainstream mining devices, but just a few years later ASICs, dozens of times more efficient than general-purpose chips, became the only viable path for mining. And the companies that produced these ASICs weren't large tech giants but little-known startups. This was entirely due to Bitcoin's financial incentive framework:
- Open competition: Whoever you are, you can earn the largest share of token rewards as long as you provide the most effective computing power to the network.
- Positive cycle: As the price of tokens rises, the rewards become more attractive, thus incentivizing more people to join the race to increase the network's total computing power.
- Lower barriers to innovation: A small company in South Korea or San Francisco can design a more efficient chip without a large sales team, relationships with giants, or even traditional investors. They can simply connect the chip to the network and start making a profit as soon as it proves effective.
This framework significantly lowers the barriers to entry and complexity of the "computing power production" business. We firmly believe that this scenario will be repeated in the AI chip field. Once the protocol is established, people can earn money by connecting their computing devices—whether it's their own computers, purchased Nvidia GPUs, or rented computing power from data centers—all can contribute to the network and receive rewards. We anticipate that within the next one to two years, this innovation driven by a financial framework will bring hundreds or even thousands of times more computing power to AI networks, completely breaking through the computing power bottlenecks we face today.
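To make the mechanism concrete, here is a minimal sketch of such a proportional, Bitcoin-style reward split. Everything in it, including the epoch structure, function name, and reward figures, is an illustrative assumption rather than Gonka's actual protocol parameters.

```python
# Minimal sketch of a Bitcoin-style incentive: each epoch's token reward is
# split in proportion to the useful compute each participant contributed.
# All names and numbers are illustrative assumptions, not Gonka's protocol.

def split_epoch_rewards(epoch_reward: float, contributed_compute: dict[str, float]) -> dict[str, float]:
    """Return each participant's share of one epoch's reward, proportional to contributed compute."""
    total = sum(contributed_compute.values())
    if total == 0:
        return {node: 0.0 for node in contributed_compute}
    return {node: epoch_reward * work / total for node, work in contributed_compute.items()}

# Example: three participants of very different sizes share the same epoch.
rewards = split_epoch_rewards(
    epoch_reward=1_000.0,  # hypothetical tokens minted this epoch
    contributed_compute={
        "home_gpu": 1.0,                # a single consumer card
        "rented_h200_rack": 40.0,       # GPUs leased from a data center
        "custom_asic_newcomer": 200.0,  # a more efficient, purpose-built chip
    },
)
print(rewards)
# The most effective hardware earns the largest share, so anyone who can build
# cheaper or more efficient compute is rewarded the moment they plug in.
```

The point of the sketch is the feedback loop: as the token appreciates, the same proportional split pays out more, which pulls in more and better hardware.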
How will decentralized networks reshape the computing power market?
PANews: This model is interesting, reminiscent of early cryptocurrency miners using idle GPUs in schools for mining. Many companies now buy expensive H100 GPUs, but they sit idle most of the time because they don't know how to fully utilize them. Does your network also attract these types of users?
Gonka AI: We have indeed encountered many similar, and even more exciting, situations. Some very successful AI startups bought hundreds of H200 GPUs with investor money during the early hype, but only half of them are being effectively utilized to date.
Another, more common scenario: many companies rent computing power from large data centers to run open-source models. Through our network they can do something much smarter. Instead of running the models inefficiently themselves, they access the same models through the Gonka network's API while contributing their rented GPUs to the network as Gonka nodes. In this way they both use the AI models and earn token rewards, with significantly higher efficiency and returns than before.
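As a rough illustration of the consumer side of that pattern, a call through such a network API could look like the sketch below. The base URL, path, model id, and response shape are placeholders assumed for illustration; Gonka's own documentation defines the real interface.

```python
# Illustrative sketch only: querying a hosted open-source model through a
# network gateway instead of self-hosting it. The URL, path, model id, and
# response shape below are hypothetical placeholders, not Gonka's documented API.
import requests

BASE_URL = "https://gateway.example.invalid/v1"  # placeholder endpoint

resp = requests.post(
    f"{BASE_URL}/chat/completions",
    headers={"Authorization": "Bearer <your-api-key>"},
    json={
        "model": "open-source-llm",  # placeholder model id
        "messages": [{"role": "user", "content": "Summarize this log file for me."}],
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```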
To efficiently utilize GPUs, you need to handle tens of thousands of requests simultaneously, which is extremely difficult for a single project. Therefore, businesses either have to tolerate low utilization of their own (or leased) hardware or pay exorbitant API fees, neither of which is optimal. Connecting to the network and becoming part of the ecosystem is a better option.
Many participants in our network don't just have "idle" computing power. For example, data centers like Gcore and Hyperfusion are highly efficient commercial operators with limited idle capacity. However, in the past few months, they've discovered that connecting GPUs to the Gonka network can generate higher returns than renting them directly to customers, because they gain exposure to the value generated by network growth. Therefore, they've begun gradually migrating hundreds of GPUs from their rental business to our network.
This is precisely the key to how networks can scale from thousands of GPUs to millions. While giants like OpenAI have bought up the majority of GPUs on the market, millions of GPUs remain scattered among these independent players. They cannot compete individually, but together they form a powerful force.
This logic also applies at the national level.
A year ago, when we communicated with the governments of some countries, their mainstream idea was "we want to build our own clusters and create sovereign AI".
A year later, when we met with ministers from countries such as the UAE and Kazakhstan, they all clearly realized that independent players with a small number of GPUs simply could not compete with the giants.
However, if they join together in a large decentralized network, it is entirely possible to maintain their sovereignty, because a network that no single party controls is one everyone can trust.
The AI Bubble Debate: Is it a Tide of the Times or the Bursting of a Specific Bet?
PANews: Undeniably, the AI field is experiencing tremendous enthusiasm and rapid growth. But with investors and users holding high expectations, are we heading towards an "AI bubble"? Many are comparing it to the dot-com bubble of 2000.
Gonka AI: That's a very interesting question. Looking back at the dot-com bubble of 2000, although it experienced a "mini-burst," look at how the world has changed 25 years later. The internet was a real technological revolution, and the resulting shift in economic models was real as well. Those companies from back then have now grown into trillion-dollar giants, completely transforming our lives.
Compared to the internet, the changes brought about by AI will be far more radical and thorough. Imagine that in the next 30 to 50 years, everyone will have a personal robot that can do the work for them in the factory—this is not science fiction, but an imminent reality. Therefore, it is not irrational for investors to be willing to pour tens of billions of dollars into this technology.
Of course, there will be failed investments along the way, just as has happened in the venture capital field over the past 30 years, with a lot of money lost. But overall, the returns in this field are extremely high, and it has truly changed the world.
Therefore, whether it's a bubble or not depends on your perspective. Some companies will go bankrupt because their assumptions turn out to be flawed. For example, Gonka's assessment of the feasibility of decentralized AI may be wrong; conversely, all the investments betting on Nvidia today could turn out to be a huge bubble.
History has played out similarly before. In 2012, Nvidia's stock price surged on the cryptocurrency narrative, as the market anticipated its dominance in mining. Then came the ASIC revolution, and Nvidia almost completely lost that market. Now AI is driving even greater value growth for Nvidia, because investors anticipate a multi-trillion-dollar market. That expectation may be correct, but no one can guarantee that Nvidia will maintain its dominance forever. What would happen if the ASIC revolution were to repeat itself in the AI field?
Imagine rebuilding the entire computing power of today's Bitcoin network using Nvidia's latest Blackwell chips instead of ASIC miners—you would need to invest $500 trillion! That's clearly unsustainable.
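A rough back-of-envelope version of that figure, in which every input (network hashrate, a GPU's SHA-256 throughput, per-GPU cost) is an assumption chosen only to show how the order of magnitude arises, looks like this:

```python
# Back-of-envelope sketch only. All inputs are rough assumptions for
# illustration, not figures from the interview or from Nvidia.
network_hashrate_hs = 700e18   # assume the Bitcoin network runs at ~700 EH/s
gpu_sha256_hs = 50e9           # assume ~50 GH/s of SHA-256d per high-end GPU
gpu_cost_usd = 35_000          # assume ~$35,000 per data-center-class GPU

gpus_needed = network_hashrate_hs / gpu_sha256_hs
total_cost_usd = gpus_needed * gpu_cost_usd
print(f"{gpus_needed:.1e} GPUs, ~${total_cost_usd / 1e12:.0f} trillion")
# -> roughly 1.4e10 GPUs and on the order of $500 trillion in hardware alone
```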
Therefore, we may not be discussing an "AI bubble," but rather a bubble created by "bets on specific companies and specific technological paths." If the market's assessment of Nvidia is wrong, then five to seven trillion-dollar companies may suffer heavy losses, but this does not mean that AI itself is a bubble. AI technology will not disappear, and its process of changing lives and businesses will not stop; it's just that the companies that carry this value may change.
PANews: I completely agree. Just like how we don't say "I'm using the internet" now, but rather "I'm using an app," and that app happens to use the internet, in the future, every application will use AI in some form. It will become so ubiquitous that we won't even be aware of its existence.
Gonka AI: Absolutely correct. If you look at the Nasdaq chart from its inception to the present, you'll see that the "mega-crisis" of 2000 was just a tiny ripple in a decades-long growth curve. People thought all goods would be sold online within five years—that didn't happen, but it did happen within 15 years.
The same applies to AI. A future where robots are ubiquitous may not happen within five years, but it is almost inevitable, and no force can stop it. From this perspective, it is certain that our future demand for computing power will increase thousands of times. What we need is a long-term economic model, like Bitcoin, designed for the next few decades, to support this vision.
