Tech Trend Insights: Our AI Investment Thesis
AI/ML Infrastructure Stack

Raghu Dhara, Adam Dawkins


Original article published on November 17th, 2023.

Last updated on January 25th, 2024.

Introduction

As we engage with our LPs, founders, and VC colleagues, our team has identified a number of trends around AI that we think are critical to consider when investing in AI companies. We are following these trends closely to ensure that our investment strategy remains robust and responsive to rapid development and change in the AI space.

The AI category is transforming at an accelerating pace and continues to permeate both business and consumer applications. A large amount of funding has flowed, and continues to flow, into the AI space. It is critical to differentiate defensible AI technology companies from startups that are unlikely to endure in this rapidly evolving category.

OpenAI’s recent developer event showcased many new features, including an API that lets developers build AI assistants that can retrieve information and a new capability that allows users to import external data to train their own customized versions of GPT without the need for coding.  These features are existential threats to hundreds of venture-funded AI startups whose core value propositions were built around offering similar capabilities. Startups of this profile lack defensible technology that will allow them to build durable business models. In many cases, these startups have built their core products using API wrappers for OpenAI’s ChatGPT, and we believe these companies will be hard-pressed to create any meaningful or sustainable enterprise value. 

We have been paying close attention to the rapidly changing AI space to ensure that our investment thesis on the space remains robust and oriented towards defensible companies with significant market potential and enterprise value creation. 

We believe the durability of the AI companies in Terra Nova’s portfolio has not been significantly affected by OpenAI’s recent feature launch or by other recent expansion efforts from incumbents in the space. One reason is that the majority of our investments featuring AI capabilities are not purely stand-alone AI products. For example, our portfolio company Wellplaece leverages AI within its healthcare-specific marketplace platform. Wellplaece’s defensibility is rooted in more than capabilities that incumbent AI models could replicate, and the platform is well-positioned to leverage advancements in AI model capabilities in its own product roadmap.

Our Investment Thesis & Perspective on AI

We have organized our investment thesis and perspective on AI into three sections below:

  1. Our approach to segmenting and analyzing AI startups into three distinct categories.
  2. Emerging trends that we have identified through our network and that are likely to influence the durability of AI startups.
  3. How we approach differentiating durable AI investments from other players in the category that face significant threats from faster-paced innovators or advantaged incumbents. 

Section 1: A Framework for Categorizing AI Startups

We'll begin by outlining our strategy for segmenting AI startups, which hinges on their placement within the data needs hierarchy's three distinct layers. This categorization is essential, as the success of AI enterprises is deeply linked to effective data utilization.

The competitive edge in AI is largely determined by access to unique data sets. For AI providers, especially those training or refining models, the availability of large, proprietary datasets is crucial for differentiation. Moreover, data's role extends to crafting personalized user experiences and forms the backbone of AI operationalization in business. To put it simply, the quality of input data directly affects the output for all involved in the AI value chain.

Thus, we believe categorizing emerging AI solutions in relation to their interaction with a company's data infrastructure is the most fitting approach, given the symbiotic relationship between AI effectiveness and data integration.

Our evaluation model in the data domain revolves around the 'Data Value Accrual Pyramid.'  Value accrual and market opportunity decrease as you ascend each layer of the data hierarchy pyramid. This framework divides data needs and their associated value creation into three distinct layers:

  1. Level One - Primary Data Needs: At the pyramid's base is the storage layer, fundamental to all data activities. It's where the most stable and enduring value is generated, with major players like Snowflake and Google BigQuery exemplifying this level.
  2. Level Two - Secondary Data Needs: The next tier is the operations layer, encompassing data structures like pipelines and operations such as AI model training. Companies like Databricks and Alteryx, which offer tools for converting data storage into actionable insights as well as operational capabilities, represent this layer's growing potential. AI startups in this space are innovating with solutions that enhance these capabilities.
  3. Level Three - Tertiary Data Needs: The pyramid's peak is home to end-user applications, where AI's impact is most directly visible. This includes applications like chatbots, recommendation engines, and tools for tasks like automated reporting, summarization, and retrieval. Startups like Adept.ai and Otter.ai exemplify innovation in this layer, with products like customized copilots and transcription services. This layer also includes the large foundation models that power horizontal solutions such as ChatGPT, as well as a crowded ecosystem of startups that use existing AI models to power their own products, including the startups referenced above that built API wrappers on top of GPT and are now threatened by OpenAI's release of similar capabilities.

With this framework in mind, we can break down where we are seeing AI startups that are targeting promising addressable markets, and where new companies are struggling. 


Section 2: Potential Opportunities Within the Data Pyramid 

The primary, storage layer

The issue of data storage is largely solved. While the primary layer holds the overwhelming majority of value accrual, disruptive startups in this category come along very infrequently. For now, the storage layer is also the least impacted by the recent boom of AI technology capabilities. 

That being said, were a paradigm shift to take place that led to disruptive potential in the storage layer, we would be paying very close attention. There is an enormous potential market opportunity for the eventual technology that disrupts this layer.

The secondary, operations layer

The infrastructure layer, which includes operations such as processing, analysis, and AI model training and hosting, is the engine room of the data value pyramid: it is where data is transformed into tangible business outcomes through tools and platforms that manage and leverage data effectively.

We believe that the infrastructure layer is the most dynamic opportunity for innovative startups in the AI ecosystem over the next two to three years. 

Companies currently operating in this layer, like Databricks (and, through its Python offerings, Snowflake), offer robust data processing and analytics capabilities that are critical for developing and deploying AI models. Startups that excel in data operations infrastructure, whether through advanced data pipelining, efficient model training environments, or streamlined data integration, will likely serve as the catalysts for the next generation of AI-driven innovation; a minimal sketch of this pipeline-to-training pattern follows below.
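For illustration only, here is a hedged sketch of the level-two workflow in its simplest form: extract data from the storage layer, transform it, and feed it into model training. The file, column names, and model choice are all hypothetical rather than a reference to any particular vendor's stack.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Extract: pull raw events out of the storage layer (file name is hypothetical).
raw = pd.read_parquet("usage_events.parquet")

# Transform: drop incomplete rows and select features (columns are hypothetical).
features = raw.dropna(subset=["sessions_per_week", "support_tickets", "churned"])

# Operate: turn stored data into a trained model, the level-two step.
X = features[["sessions_per_week", "support_tickets"]]
y = features["churned"]
model = LogisticRegression().fit(X, y)
print(f"training accuracy: {model.score(X, y):.2f}")
```

Startups in this layer compete on making each of these steps reliable and repeatable at enterprise scale, not on the model itself.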

The tertiary, application layer 

The tertiary layer is the most visible segment of the AI landscape, encompassing the applications and services that businesses and consumers interact with directly. The majority of current AI startups are building at the application layer, and the space has become heavily saturated with indistinguishable business models and applications vying for market share. As a result, many of the current venture-backed startups in this layer are struggling to maintain relevance, and the challenge will likely grow as foundation model providers like OpenAI expand their feature offerings within this application layer.

Based on the above, we are extremely cautious regarding investments in the tertiary, application layer unless a startup can clearly demonstrate unique technology or the ability to create enterprise value.


Section 3: Separating the Strong from the Weak

Many AI startups at the application layer are built as veneers on OpenAI; they invoke various combinations of wrappers around ChatGPT API calls to build products for information retrieval or summarization. We believe these startups will continue to struggle as OpenAI expands its scope to offer these capabilities natively or through integrations with partners, such as Microsoft or Salesforce, who hold immense distribution advantages and host significant amounts of data for users of AI. Very few enterprises will choose to entrust their CRM and other core business functions to unfamiliar startups when the same capabilities can be rolled into platforms like Salesforce. A sketch of what such a wrapper typically amounts to follows below.
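To make the pattern concrete, here is a hedged sketch of what a thin wrapper product often reduces to: a prompt template around a hosted chat API. This uses OpenAI's Python client (v1+); the function and prompt are our own illustration, not any specific startup's code.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def summarize(document: str) -> str:
    """The entire 'product': a prompt template around a hosted chat API."""
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {"role": "system", "content": "You are a concise summarization assistant."},
            {"role": "user", "content": f"Summarize the following document:\n\n{document}"},
        ],
    )
    return response.choices[0].message.content
```

When the model provider ships the same capability natively, as OpenAI did at its developer event, there is little left for the startup to defend.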

On the other hand, there are many business models in the application layer that should continue to see value creation due to either (1.) innovative data collection methods or (2.) multimodal capabilities, which we describe below.

1. Innovative methods for accumulating proprietary datasets for AI development and training.

We see numerous examples of startups at the application layer that have leveraged innovative ways to help customers access proprietary data and develop highly performant solutions for specific use cases.

The advantage of proprietary data accumulation and usage has been a key aspect of our own investment thesis as we have assessed potential investments.  This aspect of AI differentiation was a core piece of many of the AI thesis brief articles we developed in collaboration with the 2022 EVC List investor cohort.

“The top companies are coming up with creative ways to scrape data and form partnerships with the tech giants whose compute resources are critical given how expensive it is to train on web-scale data. Folks that have proprietary data have an advantage because they can fine-tune off-the-shelf models for specific applications,” said Chris Abshire, senior associate at Toyota Ventures, in his thesis brief, Understanding Generative AI.

"The top companies are coming up with creative ways to scrape data and form partnerships with the tech giants whose compute resources are critical given how expensive it is to train on web-scale data. Folks that have proprietary data have an advantage because they can fine-tune off-the-shelf models for specific applications," said Chris Abshire, senior associate at Toyota Ventures in his thesis brief, Understanding Generative AI.

Presently, this strategy is well demonstrated in the 3D generative AI category. Many of the startups driving progress in the field have overcome the scarcity of available 3D data (orders of magnitude less plentiful than the 2D data used to train text-to-image models like Midjourney) by streamlining the process for users to capture and upload 3D models from their phone or camera. Polycam and CSM are examples in this space, and each relies on user-submitted data to improve the training of its models.

2. Creating multimodal capabilities that combine distinct features of the application layer to provide robust and durable solutions for sticky customer bases. 

Tavus, the one investment in our portfolio that falls squarely within the AI startup categories of our data hierarchy model, is an example of this. The company's platform uses both audio and video generation to create personalized replications of input video files. This multimodal leverage of both audio and video inputs makes Tavus less vulnerable to disruption by foundation model providers, and its accumulating user data inputs create a first-mover advantage.


A number of broader trends within the infrastructure layer point to the convergence of value in this area, although not all will be realized in the same time frame. 

Trend 1: An overwhelming majority of the companies seeking to deploy AI lack the needed data infrastructure to do so, emphasizing the present need for facilitators of AI adoption at the operations/infrastructure level. 

Only a very small percentage of technology-forward companies have data infrastructure sophisticated enough to deploy AI solutions that effectively leverage their internal data. Most businesses are stuck on legacy infrastructure vendors and lack the expertise needed to set up even smaller models or build basic data pipelines; they are still working through adoption of level two of the data pyramid.

Further, AI technology is evolving so fast that companies are racing to develop competencies that will be outdated very quickly. There are two primary options for adapting at this time: (1.) adopt platforms that keep up to date on innovation, or (2.) have an in-house team to keep abreast of changes.

Companies that choose to build an in-house team risk falling behind competitors that redirect those same resources toward their core competitive advantages. Companies that support the operationalization of data platforms and AI are likely to be the winners.

Several industries stand to see enormous productivity gains through the operationalization of their data infrastructure. For example, supply chain logistics creates an enormous amount of data at every step, but the current utilization of this data is abysmal. Manufacturing is another example, where opportunities such as predictive maintenance models can be unlocked through improved infrastructure for the data generated from devices and equipment.

Trend 2: In the long term, market dominance is likely to shift from a handful of larger general-purpose foundation models like GPT to a decentralized landscape of smaller, personalized models. 

AI model dominance is currently concentrated amongst a handful of companies that can afford the massive costs of training large foundation models, such as OpenAI, Anthropic, Google, and Meta. The startups that facilitate the infrastructure layer of this eventually decentralized ecosystem will likely find and create defensible value accrual. However, we believe decentralization could take a few years to develop, given that today’s large foundation models are suitable for most current user needs and maturity levels.

Our view on this long-term trend toward decentralization is driven partly by the diminishing performance returns of general-purpose foundation models that prioritize size over specificity of training data. AI model developers are discovering diminishing returns as they train models of increasing size. For many years, increasing the size of models (i.e., the parameter count) was sufficient for achieving record-setting performance gains.

More recently, this relationship has weakened and reached an inflection point: improvements are now more readily achieved by scaling training data than by increasing model size. We do not expect to see a near-term proliferation of smaller, vertical-specific models that replace ChatGPT or other foundation models. The current costs of training models that rival GPT-4 or Anthropic's Claude in scale are beyond the reach of startups today - although this could change quickly.
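For context, the compute-optimal scaling results published by Hoffmann et al. (2022, the "Chinchilla" paper) formalize this inflection. Modeled loss depends on both parameter count and training tokens, shown below in the paper's functional form (the constants are fitted empirically):

```latex
% Chinchilla-style loss model: E is irreducible loss, N is parameter count,
% D is training tokens; A, B, alpha, and beta are empirically fitted constants.
L(N, D) = E + \frac{A}{N^{\alpha}} + \frac{B}{D^{\beta}}
```

Because the fitted exponents for N and D are of similar magnitude, a compute-optimal model trains on roughly 20 tokens per parameter, which is why growing parameter count without growing training data now yields diminishing returns.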

However, as smaller models become more capable of performing on highly curated and personalized data sets, we expect a shift in preference toward smaller, personalized models over the next several years. This perspective has been an important aspect of our investment thesis over the last year. The prospect of this trend was also discussed in our collaboration with last year’s EVC List, as highlighted below by Rayfe Gaspar-Asaoka, partner at Canaan (see his thesis brief on the Next-Gen AI Tech Stack for further insights).

“Whereas I see the applications being built on AI/ML becoming highly verticalized and use-case specific, the companies that build the necessary infrastructure to support these next-generation businesses will be large companies with truly horizontal, sticky, platforms…”

We have also been hearing similar iterations of this perspective in recent weeks from AI investors. Our team is keeping a close eye on the prospect of this eventual decentralization of the application layer and how the shift would advantage infrastructure players. 

Recent deal-sourcing efforts have surfaced an increasing number of startups attempting to build the architecture that will facilitate a world of diverse models, giving users a myriad of options for each inference request. The key roadblock for these startups at the moment appears to be a lack of sufficient model diversity to support their go-to-market; open-source solutions, such as LangChain, are sufficient for the time being. A minimal sketch of the routing pattern these startups are building toward follows below.
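To illustrate, here is a hedged sketch of per-request model routing. Every name here (the Model type, the cost heuristic, the route function) is our own illustration, not any particular startup's or framework's API:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Model:
    name: str
    cost_per_1k_tokens: float       # used below as a crude capability proxy
    generate: Callable[[str], str]  # would wrap a provider SDK in practice

def route(prompt: str, models: list[Model], needs_reasoning: bool) -> str:
    """Send demanding prompts to the strongest model, the rest to the cheapest."""
    def cost(m: Model) -> float:
        return m.cost_per_1k_tokens
    chosen = max(models, key=cost) if needs_reasoning else min(models, key=cost)
    return chosen.generate(prompt)

# Usage with stubbed-out models standing in for real providers.
models = [
    Model("small-open-model", 0.1, lambda p: "[small] " + p),
    Model("frontier-model", 3.0, lambda p: "[frontier] " + p),
]
print(route("Summarize this memo.", models, needs_reasoning=False))
```

The sketch only becomes commercially interesting once there are many genuinely differentiated models to route between, which is exactly the diversity these startups currently lack.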

We are actively monitoring for shifts in the value potential for infrastructure solutions that will connect a more diverse AI application ecosystem in the future.  For now, we believe that solutions that facilitate the broader adoption of data infrastructure have the best chance to create enterprise value. 
