A brief analysis of McKinsey’s Lilli: What development ideas does it provide for the enterprise AI market?

McKinsey's Lilli case highlights key insights for the enterprise AI market, emphasizing edge computing and small models as transformative approaches. Here are the core takeaways:

  • Enterprise data security: Lilli demonstrates how edge computing can balance local data retention with AI capabilities, addressing sensitive data concerns without relying on public cloud processing.

  • Small professional models over large general ones: Enterprises prioritize domain-specific accuracy over broad capabilities, making smaller, specialized models more practical than large, generic ones.

  • Cost efficiency of self-built AI infrastructure: While initial investment is high, edge computing and small models reduce long-term operational costs, especially for large-scale employee usage, avoiding API dependency.

  • Edge hardware market growth: Chip manufacturers like Qualcomm and MediaTek are developing low-power, high-efficiency processors tailored for edge AI, creating new infrastructure demands.

  • Rise of decentralized Web3 AI: As enterprises adopt small models, decentralized solutions for computing power, fine-tuning, and resource scheduling (e.g., Web3-based platforms) will gain traction.

This shift from resource-heavy AI (e.g., high-end GPUs) to edge-focused, efficient solutions signals a more dynamic and practical enterprise AI market.

Summary

McKinsey's Lilli case offers key development ideas for the enterprise AI market: the market opportunity in edge computing plus small models. This AI assistant, which integrates 100,000 internal documents, has reached a 70% employee adoption rate and is used an average of 17 times per week — product stickiness that is rare in enterprise tools. Here are my thoughts:

1) Enterprise data security is a real pain point: The core knowledge assets McKinsey has accumulated over 100 years, and the specific data accumulated by many small and medium-sized enterprises, are extremely sensitive and cannot be processed on the public cloud. Finding a balance where data never leaves the premises while AI capability is not compromised is a genuine market need, and edge computing is one promising direction to explore;
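The "data does not leave the premises" idea can be sketched as a simple routing policy. This is purely my illustrative assumption, not McKinsey's actual design: queries touching sensitive internal documents are only ever handled by an on-premise edge model, while generic queries may use an external API.

```python
# Illustrative sketch (an assumption, not McKinsey's architecture):
# route queries so that anything touching sensitive knowledge assets
# is served exclusively by a local/edge model.
from dataclasses import dataclass


@dataclass
class Query:
    text: str
    touches_sensitive_docs: bool


def route(q: Query) -> str:
    # "Data does not leave the premises": sensitive material stays
    # on local edge inference; everything else may use a cloud API.
    return "local-edge-model" if q.touches_sensitive_docs else "cloud-api"


print(route(Query("summarize client engagement notes", True)))
print(route(Query("explain transformer attention", False)))
```

In a real deployment the sensitivity flag would come from document classification, not a hand-set boolean, but the policy itself is this simple: the trust boundary is decided before inference, not after.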

2) Small professional models will replace large general ones: Enterprise users do not need a general model with "billions of parameters and all-round capabilities"; they need a professional assistant that answers questions in a specific domain accurately. There is an inherent tension between a large model's versatility and its professional depth, which is why small models are often valued more in enterprise scenarios;

3) Cost balance between self-built AI infra and API calls: Although the combination of edge computing and small models requires a large initial investment, long-term operating costs drop significantly. Imagine 45,000 employees frequently calling a large model via API: the resulting vendor dependence, plus costs that grow linearly with usage, make self-built AI infrastructure the rational choice for large and medium-sized enterprises;
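The trade-off above can be made concrete with a back-of-the-envelope model. The employee count (45,000) and usage frequency (17 times/week) come from the article; every price below (per-call cost, hardware capex, opex, amortization period) is a hypothetical figure chosen for illustration only.

```python
# Rough break-even sketch: metered API calls vs. self-hosted inference.
# 45,000 employees and 17 uses/week are from the article; all dollar
# figures are illustrative assumptions, not real quotes.

def api_cost(employees: int, uses_per_week: float,
             cost_per_call: float, weeks: int = 52) -> float:
    """Annual spend if every query goes through a metered API."""
    return employees * uses_per_week * weeks * cost_per_call


def self_hosted_cost(capex: float, annual_opex: float, years: int) -> float:
    """Average annual cost of self-built infra, amortizing upfront spend."""
    return capex / years + annual_opex


# Assumed: $0.10 per call, $4M upfront hardware, $1.5M/year to run,
# amortized over 3 years.
api = api_cost(45_000, 17, 0.10)
hosted = self_hosted_cost(4_000_000, 1_500_000, 3)
print(f"API: ${api:,.0f}/yr   self-hosted: ${hosted:,.0f}/yr")
```

The structural point survives any particular choice of numbers: API spend scales with headcount times usage, while self-hosted cost is roughly flat, so heavy uniform usage across a large workforce pushes the break-even point toward self-built infrastructure.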

4) New opportunities in the edge hardware market: Large-model training is inseparable from high-end GPUs, but edge inference has entirely different hardware requirements. Processors optimized for edge AI by chip makers such as Qualcomm and MediaTek are seeing real market opportunities. When every company wants to build its own "Lilli", low-power, high-efficiency edge AI chips will become infrastructure necessities;

5) The decentralized Web3 AI market is also growing: Once enterprise demand for small-model computing power, fine-tuning, and algorithms takes off, resource scheduling becomes the bottleneck. Traditional centralized scheduling will struggle to keep up, which creates substantial demand for Web3 AI solutions such as decentralized small-model fine-tuning networks and decentralized computing power service platforms.

While the market is still debating AGI's general capabilities, it is encouraging that many enterprise users are already exploring AI's practical value. Compared with the past leaps driven by monopolies on computing power and algorithms, a market focused on edge computing plus small models promises far greater vitality.


Author: 链上观

This article represents the views of a PANews columnist and does not represent PANews' position; PANews assumes no legal liability.

The article and the opinions in it do not constitute investment advice.

Image source: 链上观. Please contact the author for removal if there is infringement.
