OpenAI’s new fundraising is shaking up Silicon Valley


All this makes OpenAI sound like a typical tech sensation: a sizzling startup reliant on intrepid investors to develop a new way of doing things that it hopes will change the world. Think Google, Facebook or Uber. Yet its significance goes further than that. Generative artificial intelligence (AI), the technology on which OpenAI is built, is changing the rules of the game in Silicon Valley itself.

There are three big challenges posed by the new technology: many venture-capital (VC) stalwarts cannot afford the huge sums of money that firms like OpenAI need to train and run generative-AI models; the technology scales in different ways than they are used to; and it may rely on unfamiliar approaches to making money. In short, generative AI is bringing disruption to the home of America’s disrupters-in-chief. Enjoy the Schadenfreude.

The first shock for venture capitalists is the size of the cheques required to fund the builders of large language models (LLMs) like those powering ChatGPT. According to PitchBook, a data gatherer, the average size of a VC fund raised in America last year was about $150m. OpenAI is looking to collect more than 40 times that from investors. The biggest cheques for LLMs are thus being written not by the VC industry but by tech giants. Since 2019 Microsoft has invested $13bn in OpenAI; Amazon has invested $4bn in Anthropic, one of OpenAI’s main rivals.

The tech giants do not just offer money. Their cloud services provide computing power to train the startups’ LLMs and also distribute their products—OpenAI’s via Microsoft’s Azure cloud, and Anthropic’s via Amazon Web Services. Microsoft is expected to invest more in OpenAI’s latest funding round. Apple (which will offer ChatGPT to iPhone users) and Nvidia (which sells huge numbers of chips to OpenAI) are also likely to take part. So are sovereign-wealth funds, demonstrating the vast sums of money that are required for a seat at the table.

A few venture investors are undeterred by the high entrance fee. OpenAI’s fundraising is being led by Thrive Capital, an investment firm based in New York that has made other big investments in highly valued startups, including Stripe, a payments company most recently valued at $65bn. Big Silicon Valley investors such as Sequoia Capital and Andreessen Horowitz helped provide part of the $6bn raised in May by Elon Musk’s xAI, and contributed to the $1bn raised this month by Safe Superintelligence, a model builder that currently has negligible revenues, led by Ilya Sutskever, a co-founder of OpenAI.

But the size of the sums involved means some VCs are adopting a new modus operandi. Typically venture firms have sprayed capital thinly across an array of startups, knowing that if a few strike it rich, the returns will eclipse what is lost on those that do not. In the generative-AI era, where startups with access to the most capital, computing power, data and researchers have a big advantage, some are betting more on those that are already well-established, instead of kissing a lot of frogs.

The second big challenge to recent VC investment practice comes from how the new technology scales. Funding LLMs is coming to look less like the recent trend of backing internet startups and more like the early days of Silicon Valley, when venture capitalists backed companies, such as chipmakers, that were cracking tough scientific problems.

One of the venture mantras of the past decade has been “blitzscaling”. With the software behind most internet firms cheap to build and cheaper to run, startups could focus their money and attention on growing as fast as possible. Nowadays, the concept on everyone’s lips is “scaling laws”: the more computing power and data that you throw at AI, the cleverer models become. You thus have to invest fistfuls of money upfront to develop a competitive product, or else invent a new approach.
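
To make “scaling laws” concrete: one widely cited formulation, from DeepMind’s 2022 “Chinchilla” paper (not named in the article), models a network’s error, or “loss”, as falling predictably as its parameter count and training data grow. A stylised sketch:

```latex
% Chinchilla-style scaling law (Hoffmann et al., 2022): an illustration, not a formula cited in the article.
% N = number of model parameters, D = number of training tokens;
% E, A, B, alpha and beta are constants fitted to experiments. Lower loss means a better model.
L(N, D) = E + \frac{A}{N^{\alpha}} + \frac{B}{D^{\beta}}
```

Bigger N and D shrink the last two terms, which is why pouring in more computing power and data keeps making models cleverer, at least until the irreducible floor E is approached.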

In a recent blog post Ethan Mollick of the Wharton School at the University of Pennsylvania grouped state-of-the-art LLMs into four loose “generations”, each requiring ten times more computing power and data than the last. He calculated that in 2022, when ChatGPT was released, models typically cost $10m or less to train. Some cutting-edge models developed since then may have cost $100m or more. Those coming soon could cost $1bn. He thinks training costs will eventually exceed $10bn. As pundits quibble over how predictable these scaling laws really are, the cost of training continues to rise (see chart).

[Chart: The Economist]
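
As a back-of-the-envelope sketch of Mr Mollick’s arithmetic, treating the tenfold jump per generation as exact (his figures are rough estimates rather than audited accounts), the progression looks like this:

```python
# Illustrative only: Ethan Mollick's rough generational cost estimates, assuming
# each new generation of frontier models costs ten times more to train than the last.
base_cost = 10_000_000  # ~$10m: typical training cost around ChatGPT's release in 2022

for generation in range(1, 5):
    cost = base_cost * 10 ** (generation - 1)
    print(f"Generation {generation}: roughly ${cost:,.0f} to train")

# Generation 1: roughly $10,000,000 to train       (~$10m, 2022-era models)
# Generation 2: roughly $100,000,000 to train      (~$100m, recent cutting-edge models)
# Generation 3: roughly $1,000,000,000 to train    (~$1bn, models coming soon)
# Generation 4: roughly $10,000,000,000 to train   (~$10bn, his eventual estimate)
```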

Inference is also becoming more expensive. On September 12th OpenAI introduced a new pair of models, called o1 (nicknamed Strawberry), which are designed to take multiple “reasoning” steps to produce a more accurate response to a query, relying heavily on a process called reinforcement learning. (Ask the latest version of ChatGPT how many rs there are in “strawberry”, and it immediately says two. Ask o1 instead, and after four seconds of what it calls “thinking”, it gives the right answer.) That step-by-step approach, particularly useful for complex subjects like maths and science, improves as more computing power is used to think through a response.
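
The counting question itself is trivial for ordinary software, which is what makes the chatbots’ stumble notable. A one-line sketch (of the question, not of how the models answer it) gives the result o1 eventually reaches:

```python
# Counting characters is easy for code; chatbots have stumbled on it because
# they process text as tokens rather than individual letters.
print("strawberry".count("r"))  # prints 3, the right answer o1 reaches after "thinking"
```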

As LLMs become ever more computationally intensive, those developing them are furiously searching for ways to bring down their cost. Meanwhile, many VC firms are being priced out of the market. Instead of pouring money into models, some are instead funding the startups that are building on top of them, such as those providing coding tools, or virtual health care, or customer support.

This is bringing about a third big shift in the VC playbook, as the industry is forced to work out how startups that rely on costly LLMs can become profitable. Digital advertising, the favoured monetisation model in Silicon Valley for decades, is tricky to incorporate into generative-AI tools without undermining their credibility with users. Subscriptions may also be difficult. Software firms typically charge per user per month. But as companies roll out AI agents that can help humans do their work, the number of users may fall.

Backchat

OpenAI still has its sceptics. They struggle to see how its revenue growth can justify such a stratospheric valuation, especially given the competition it faces from smaller, cheaper models, some of which are at least partially open-source. Big investments from deep-pocketed sovereign-wealth funds are often a sign of overly exuberant expectations. Scientific breakthroughs in model-building could upend the industry. Sceptics also think the rapid turnover of top talent at OpenAI underscores lingering corporate-governance and safety concerns, following the ousting and subsequent reinstatement of Sam Altman, the company’s boss, less than a year ago.

It will certainly not be easy for the would-be hectocorn to continue galloping ahead of its rivals. Anthropic is investing heavily, with Amazon’s backing. Google, Meta and xAI all have strong offerings of their own. Competition is fierce. If the rest of Silicon Valley wants in on the action, it will need to think differently.

© 2024, The Economist Newspaper Ltd. All rights reserved. From The Economist, published under licence. The original content can be found on www.economist.com
