Senior Google executive says these startups face a tough road ahead
After an explosive two years for generative AI, a growing number of startups are discovering that a thin layer on top of someone else’s model is no longer enough. Darren Mowry, who leads Google’s global startup organization across Cloud, DeepMind, and Alphabet, believes many companies built as simple large language model (LLM) wrappers or aggregators are entering a tougher phase where defensibility matters more than speed.
Why thin wrappers are stalling
LLM wrappers—products that add a user interface or narrow feature set on top of foundation models like Claude, GPT, or Gemini—thrived early on by focusing on specific audiences such as students, marketers, or sales teams. That approach helped teams move fast and ride the hype cycle, but expectations have shifted. Investors and customers now prize real moats: proprietary data, deep workflow integration, and differentiated IP.
Relying almost entirely on the underlying model to do the heavy lifting leaves little room for defensibility or margin. As Mowry suggests, the market’s patience for “white‑labeling” frontier models is waning. What’s working better are products that embed themselves into core workflows or bring exclusive data, compliance, and domain expertise—attributes that are costly and time-consuming to replicate.
What builds a moat in AI
- Proprietary or hard-to-replicate data: Unique datasets, first-party signals, and user feedback loops that meaningfully improve model outputs over time.
- Deep integration into workflows: Direct hooks into existing tools, processes, and systems of record that make switching painful and deliver measurable productivity gains.
- Domain expertise and guardrails: Vertical-specific reasoning, compliance, and safety layers that meet industry standards in areas like healthcare, finance, or law.
- Continuous learning and adaptation: Systems designed to improve with use, via reinforcement learning, retrieval pipelines, and fine-tuning tailored to customer contexts.
Some tools in coding and legal realms illustrate these principles by pairing strong product design with deep task understanding and tight integration, rather than relying on a generic chat interface.
Aggregators under pressure
AI aggregators—platforms that combine multiple models behind one interface or API and route workloads among them—also face mounting headwinds. While they offer orchestration, monitoring, and governance, the value pool available to middle layers tends to compress as model providers ship their own optimization, security, and enterprise features.
Enterprises increasingly want embedded intellectual property and decisioning logic—smart routing, retrieval, caching, and cost/latency optimization tuned to their data and use cases—rather than a generic switchboard for models. Without distinctive capabilities or services, many aggregators risk being squeezed as providers improve their offerings and customers grow more sophisticated.
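The “smart routing” the paragraph above describes can be sketched in a few lines: pick the cheapest model that satisfies quality and latency constraints, falling back to the strongest model when nothing qualifies. This is a minimal illustration, not any vendor’s actual router; the model names, prices, and quality scores below are placeholder assumptions.

```python
from dataclasses import dataclass

@dataclass
class Model:
    name: str
    cost_per_1k_tokens: float  # USD; illustrative, not real pricing
    quality: float             # 0-1; assumed benchmark score
    avg_latency_ms: int        # assumed average response time

# Hypothetical catalog of models behind one interface.
CATALOG = [
    Model("small-fast", 0.0002, 0.70, 300),
    Model("mid-tier",   0.0010, 0.85, 800),
    Model("frontier",   0.0100, 0.95, 2000),
]

def route(required_quality: float, max_latency_ms: int) -> Model:
    """Choose the cheapest model meeting quality and latency constraints."""
    candidates = [m for m in CATALOG
                  if m.quality >= required_quality
                  and m.avg_latency_ms <= max_latency_ms]
    if not candidates:
        # No model satisfies both constraints: fall back to highest quality.
        return max(CATALOG, key=lambda m: m.quality)
    return min(candidates, key=lambda m: m.cost_per_1k_tokens)
```

The defensible version of this logic is the part that can’t be copied from a blog post: routing thresholds tuned on a customer’s own traffic, task-specific quality evaluations, and caching keyed to their data.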
A pattern seen in the cloud era
The dynamic mirrors the early days of cloud computing. Back then, numerous startups tried to resell or repackage infrastructure for simplicity and consolidated billing. Over time, as cloud platforms expanded enterprise-grade features and customers learned to manage complexity, only intermediaries that delivered real services—security, migration, architecture, and DevOps—sustained durable businesses.
AI is tracing a similar arc: the further you move from commodity access and toward outcomes, integration, and trust, the harder you are to displace.
Where the momentum remains
Despite the cautionary outlook for thin wrappers and aggregators, there’s still substantial opportunity:
- Developer platforms and “vibe coding” tools: Products that compress the software creation loop—from idea to runnable code—continue to attract users and capital, especially when they pair powerful models with IDE-level integration, context awareness, and collaboration features.
- Direct-to-consumer creative tools: Consumer-facing AI for video, design, audio, and storytelling is expanding rapidly. As generative media improves, expect new workflows for students, independent creators, and studios, along with demand for rights management, safety, and provenance.
- Biotech and climate tech: These fields are benefiting from unprecedented access to large, high-quality datasets, accelerating discovery, simulation, and optimization in areas like protein design, materials, and grid management.
How startups can recalibrate
- Own your differentiation: Build around data, workflow depth, metrics, and reliability—not just prompts and UI. Show compounding advantages over time.
- Move up the stack: Offer outcomes, not engines. Tie your product to business KPIs such as cycle time, quality, and cost, and integrate where decisions are made.
- Invest in trust: Security, compliance, governance, and observability are now table stakes for enterprise adoption. Treat them as product, not paperwork.
- Design for change: Models, costs, and capabilities will keep shifting. Architect for portability, evaluation, and continuous improvement.
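“Architect for portability” often comes down to one discipline: call models through a thin, provider-agnostic interface so the backend can be swapped without touching call sites. A minimal registry-pattern sketch, with a stub standing in for a real provider SDK (the names here are illustrative, not any actual API):

```python
from typing import Callable, Dict

# Provider-agnostic completion signature: prompt in, text out.
CompletionFn = Callable[[str], str]

# Registry of adapters; swapping providers means registering a new adapter,
# not rewriting application code.
PROVIDERS: Dict[str, CompletionFn] = {}

def register(name: str):
    def wrap(fn: CompletionFn) -> CompletionFn:
        PROVIDERS[name] = fn
        return fn
    return wrap

@register("echo-stub")
def echo_stub(prompt: str) -> str:
    # Stand-in for a real model client; a production adapter would call
    # a provider SDK behind this same signature.
    return f"stub:{prompt}"

def complete(provider: str, prompt: str) -> str:
    """Route a prompt to the named provider adapter."""
    return PROVIDERS[provider](prompt)
```

With this seam in place, evaluation harnesses can run the same prompts through every registered adapter and compare cost, latency, and quality as models change underneath.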
The bottom line
The easy wins from slapping a polished interface on top of a frontier model are fading. Startups that thrive from here will look less like thin wrappers and more like products with real moats: unique data, deeper integration, operational rigor, and clear, defensible value. Aggregators can endure—but only if they deliver differentiated services and outcomes that customers can’t get from model providers alone.