Lux Capital’s Grace Isford: Building the Next-Gen AI/ML Infrastructure Stack To Power Organizations Everywhere

Terra Nova Insights Team


ABSTRACT

Companies are increasingly focused on developing and deploying strategies around artificial intelligence and machine learning within their organizations. Along with the burgeoning demand comes a need for proper infrastructure to help test, build, deploy, and scale operations efficiently. Grace Isford, partner at Lux Capital, shares her investment focus in the computational sciences and dives into why it’s such an exciting time for the AI/ML industry and for the enduring infrastructure that will power the “tens of thousands” of possible applications out there.

KEY POINTS FROM GRACE ISFORD'S POV

Why is AI/ML infrastructure such an important category moving forward?

  • Advances in computing, data proliferation, lower training costs, and an explosion of open-source large language models are the tailwinds propelling the industry forward. “We’ve seen $100B invested in the space by venture capital funds in 2021 alone,” says Isford. More powerful computers and processing capabilities – “specifically the GPUs developed by NVIDIA,” she says – are pushing performance to new highs and enabling superior-quality models to surface. “We’ve seen large language models improve, particularly on complex language tasks, making them even more ‘human-like,’ and in many cases they are indistinguishable from or better performing than humans themselves.”

What are the applications or use cases that might be attached to this category?

  • On the front end, text, speech, image, video, and code generation are early applications of generative AI. “There are many more use cases emerging, and I’m especially excited about those tied to strong ROI buckets like sales, advertising, and research and development. One company I’m particularly excited about is Runway, whose team co-authored the open-source text-to-image model Stable Diffusion, pioneering text-prompted media generation. Runway offers a video editing and video generation software platform leveraging AI Magic Tools to create and hone videos at scale,” says Isford.
  • On the back end, infrastructure tools power end-user applications efficiently and reliably. “What are the infrastructure companies that will endure and specifically power AI applications at scale? I am particularly excited about infrastructure that abstracts away complexity for end users and enables models to be smaller and more accessible to deploy at scale,” she says.
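
A minimal, hypothetical sketch of that “smaller and more accessible to deploy” idea, using post-training dynamic quantization in PyTorch as one common shrinking technique; the toy model below is an illustrative stand-in, not any particular company’s stack:

    import torch
    import torch.nn as nn

    # Stand-in for a trained model that an application team wants to serve.
    model = nn.Sequential(
        nn.Linear(768, 768),
        nn.ReLU(),
        nn.Linear(768, 2),
    )
    model.eval()

    # Convert the Linear layers' weights to int8; activations are quantized on the fly.
    # The quantized copy is roughly 4x smaller for those layers and runs on commodity CPUs.
    quantized = torch.quantization.quantize_dynamic(
        model, {nn.Linear}, dtype=torch.qint8
    )

    print(quantized(torch.randn(1, 768)))

Shrinking techniques like this (quantization, pruning, distillation) are part of how infrastructure vendors hide GPU-scale complexity from the teams deploying models.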

What are some of the potential roadblocks?

  • Achieving scale with compute-intensive processes remains a big challenge for those looking to build. “Many companies are working to ink major high-performance computing cluster contracts with cloud providers like Amazon, Google, and Oracle to train, host, and run models at scale. Founders need to strike smart deals to run their back end efficiently, and be thoughtful about how they architect their back-end stack so that running models can scale,” says Isford.
  • But at the same time, incumbents pose a major threat to startups and specialized competitors looking to grab a piece of the market. “AI/ML tooling that AWS, Microsoft Azure, or Google Cloud has built or will acquire for the MLOps stack, major innovations at the application layer (that OpenAI will productize, for example), or even new open source models could make some existing applications of AI or startups innovating in the space obsolete,” she says. Proprietary datasets, closed feedback loops with customers, and limited dependencies on other platforms are crucial to surviving these threats.

IN THE INVESTOR’S OWN WORDS

Grace Isford, via email correspondence:

At Lux, my investment thesis centers on the computational sciences, or companies leveraging disruptive new technology innovations to power enterprise infrastructure. Specifically, I’m excited about AI/ML/data infrastructure and applications, especially in financial services and healthcare, and blockchain or network infrastructure.

There are a lot of exciting ‘why nows’ that have been compounding not just for the past few months but for several years, which has made 2022 an exciting inflection point for the industry.

Computers are powerful enough to run large models, and there is an explosion of massive amounts of training data, both organically available and machine-generated. Processing power, specifically the GPUs developed by NVIDIA, has improved meaningfully in performance. We’ve also seen compounding advances in neural networks, starting with Google’s Transformer architecture paper in 2017, and a growing maturation of the MLOps stack, including companies like Hugging Face.

It’s now possible to generate superior-quality models that are both more parallelizable – requiring significantly less time to train – and more easily customizable to specific domains. Costs to train image-classification systems have decreased, and we’ve seen large language models both improve and become democratized, particularly on complex language tasks, making them even more ‘human-like,’ as seen with GPT-3 and with generative models like Stable Diffusion.
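
As a minimal illustration of that democratization, a pretrained model from the maturing MLOps stack is a few lines of code away via Hugging Face’s transformers library; the checkpoint below (gpt2) is an illustrative open model, since GPT-3 itself is served through OpenAI’s hosted API rather than open weights:

    from transformers import pipeline

    # Download an openly available pretrained checkpoint and generate text locally.
    generator = pipeline("text-generation", model="gpt2")
    result = generator("AI infrastructure matters because", max_new_tokens=40)
    print(result[0]["generated_text"])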


MORE Q&A

Q: What is the time horizon for adoption of the trends that underline this thesis?

A: “The time horizon is now. We are already seeing massive Fortune 500 companies implementing ML within their organizations, along with the increasing sophistication of AWS, GCP, and Azure ML tooling, compute, and storage.

While it will likely take a few years for large language models to deeply penetrate every possible use case, within the next year or two I predict we will see wide-scale large language model adoption, and within three to five years I expect to see the emergence of a highly sophisticated AI/ML infrastructure and compute stack that can be easily adopted by any organization at scale.”

Q: What is something that other market participants often misunderstand about this category?

A: “One misunderstanding is the assumption that there are already multiple buyers and revenue streams for generative AI applications. The market is still early, and there aren’t massive budgets earmarked for generative AI plays yet, with many enterprises still figuring out how to leverage and deploy the technology internally.”


WHAT ELSE TO WATCH FOR

We will see players at the application layer create products that leverage existing, larger models and build out derivative products via specialized datasets. Few players within the ecosystem will need to train language models from the ground up themselves. “In reality, only bespoke use cases will train models from scratch. Instead, many companies will just retrain on a much smaller proportion of data for their use cases. LLMs can be pruned smaller, which makes them less cumbersome to deploy while still working about 90% as well at a smaller scale. Also, there are a lot of really exciting AI and ML use cases that don’t need to leverage large language models or unsupervised learning – especially in financial services and healthcare – that are already in production today,” says Isford.
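
A minimal sketch of that “retrain on a much smaller proportion of data” pattern: start from an open pretrained checkpoint and fine-tune it on a small labeled dataset instead of training from scratch. The model and dataset names below are illustrative placeholders (a public sentiment dataset stands in for a company’s proprietary domain data):

    from datasets import load_dataset
    from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                              Trainer, TrainingArguments)

    checkpoint = "distilbert-base-uncased"   # open pretrained model; nothing is trained from scratch
    tokenizer = AutoTokenizer.from_pretrained(checkpoint)
    model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=2)

    # A few thousand labeled examples stand in for a company's proprietary domain data.
    data = load_dataset("imdb", split="train[:2000]")
    data = data.map(lambda ex: tokenizer(ex["text"], truncation=True, padding="max_length"),
                    batched=True)

    trainer = Trainer(
        model=model,
        args=TrainingArguments(output_dir="finetuned", num_train_epochs=1,
                               per_device_train_batch_size=8),
        train_dataset=data,
    )
    trainer.train()   # adapt the pretrained weights to the narrow use case

Because only the adaptation step runs here, the compute bill is a tiny fraction of what pretraining a large model from scratch would cost.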


STARTUPS MENTIONED IN THIS BRIEF

Runway


The 2022 EVC List honors the top 50 rising stars in venture capital. Terra Nova’s Thesis Brief series showcases each investor’s insights and category expertise.
