Accelerating the next phase of AI

OpenAI News




Today, we closed our latest funding round with $122 billion in committed capital at a post-money valuation of $852 billion.


OpenAI is becoming the core infrastructure for AI, making it possible for people around the world and businesses, big and small, to just build things. The broad consumer reach of ChatGPT creates a powerful distribution channel into the workplace, where demand is rapidly shifting from basic model access to intelligent systems that reshape how businesses operate. Developers build on and expand the platform by leveraging our APIs, and Codex is transforming how developers turn ideas into working software. Durable access to compute is the strategic advantage that compounds across the entire system: it advances research, improves products, expands access, and structurally lowers the cost of delivery at scale. Together, consumer adoption, enterprise deployment, developer usage, and compute form a reinforcing flywheel that is translating capability into economic impact.


OpenAI was the fastest technology platform to reach 10 million users, the fastest to 100 million users, and soon the fastest to 1 billion weekly active users. Within a year of launching ChatGPT, we reached $1B in revenue. By the end of 2024 we were generating $1B per quarter. We are now generating $2B in revenue per month. At this stage, we are growing revenue four times faster than the companies who defined the Internet and mobile eras, including Alphabet and Meta.
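As a quick back-of-the-envelope check, the milestones above translate into annualized run rates as follows (a Python sketch using only the figures stated in the post):

```python
# Annualized run rate implied by each revenue milestone stated above.
milestones = {
    "first year of ChatGPT": 1e9,   # $1B over the first year
    "end of 2024": 1e9 * 4,         # $1B per quarter -> $4B/yr
    "today": 2e9 * 12,              # $2B per month -> $24B/yr
}
for label, run_rate in milestones.items():
    print(f"{label}: ${run_rate / 1e9:.0f}B annualized")
```

On these numbers alone, the annualized run rate grew roughly 24x between the first milestone and today.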


This is commercial scale, and it is mission scale. The fastest way to widen the benefits of AI is to put useful intelligence in people’s hands early and let that access compound globally. AI is driving productivity gains, accelerating scientific discovery, and expanding what people and organizations can build. This funding gives us the resources to continue to lead at the scale this moment demands.


Deep conviction across global capital


Our ambition is matched by the commitment of the partners backing us. The round was anchored by our strategic partners Amazon, NVIDIA, and SoftBank, with continued participation from our long-term partner, Microsoft. SoftBank co-led the round alongside a16z, D. E. Shaw Ventures, MGX, TPG, and accounts advised by T. Rowe Price Associates, Inc. 


There was also significant participation from a diverse set of global institutions including Altimeter, Appaloosa LP, ARK Invest, affiliated funds of BlackRock, Blackstone, Coatue, D1 Capital Partners, Dragoneer, Fidelity Management & Research Company, Goanna Capital, Insight Partners, The Paragon Group, Sands Capital, Sequoia Capital, Sound Ventures, Temasek, Thrive Capital, UC Investments (University of California CIO Office), and Winslow Capital. 


For the first time, we extended participation to investors through bank channels, raising over $3 billion from individual investors. Today, we’re also announcing that OpenAI will be included in several exchange-traded funds managed by ARK Invest, further broadening ownership and giving more people the opportunity to share in the upside economics of OpenAI and the AI era.


We have also expanded our existing revolving credit facility to approximately $4.7 billion, which gives us added flexibility as we continue to invest at scale. The facility is supported by a global syndicate including JPMorgan Chase, Citi, Goldman Sachs, Morgan Stanley, Wells Fargo, Mizuho, Royal Bank of Canada, SMBC, UBS, HSBC, and Santander. The facility remains undrawn at close.



Leadership across consumer and enterprise


We are continually shipping advances across ChatGPT, the API, and our enterprise products. We recently launched GPT‑5.4, our most capable model yet, with meaningful gains in intelligence and workflow performance. We expanded Codex into a flagship coding agent. We pushed forward on memory, search, personalization, and multimodal interaction. We also expanded into areas like health, scientific discovery, and commerce.

That product momentum shows up in the numbers. ChatGPT is the overwhelming leader in consumer AI with more than 900 million weekly active users, and over 50 million subscribers. ChatGPT has 6x the monthly web visits and mobile sessions of the next largest AI app, and users spend 4x as much total time in ChatGPT as in the next largest AI app, and 4x as much as in all other AI apps combined. Search usage has nearly tripled in a year, and our ads pilot reached more than $100 million in ARR in under six weeks. These are not just growth milestones—they show that frontier AI is becoming part of everyday life for people around the world.


Momentum is just as strong on the enterprise side, which now makes up more than 40% of our revenue, and is on track to reach parity with consumer by the end of 2026. GPT‑5.4 is driving record engagement across agentic workflows. Our APIs now process more than 15 billion tokens per minute. Codex now serves over 2 million weekly users, up 5x in the past three months, with usage growing more than 70% month over month.
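For scale, the stated 15 billion tokens per minute works out as follows (a simple unit conversion; only the per-minute rate comes from the post):

```python
# Convert the stated per-minute API throughput into other time units.
tokens_per_minute = 15e9
tokens_per_second = tokens_per_minute / 60    # 250 million tokens/s
tokens_per_day = tokens_per_minute * 60 * 24  # 21.6 trillion tokens/day
print(f"{tokens_per_second:,.0f} tokens/s, {tokens_per_day:,.0f} tokens/day")
```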


Compute is a strategic advantage


Compute powers every layer of AI: frontier research and models, products, deployment, and revenue. Since ChatGPT launched, both our revenue and our available compute have scaled rapidly as demand for intelligent systems has accelerated.

With each new generation of infrastructure, we train more capable models, making each token more intelligent than before. At the same time, algorithmic and hardware improvements reduce the cost to serve each token, lowering the cost per unit of intelligence. That added intelligence makes AI useful for more complex workflows, which increases usage, drives compute demand, and accelerates the next turn of the flywheel.
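The dynamic described above can be sketched with a toy model. The halving and doubling rates below are illustrative assumptions, not OpenAI figures; the point is only that when serving cost falls while capability per token rises, the cost per unit of intelligence falls multiplicatively:

```python
# Toy model of the flywheel economics described above (illustrative only).
cost_per_token = 1.0          # normalized serving cost
intelligence_per_token = 1.0  # normalized capability per token

for gen in range(1, 4):
    cost_per_token *= 0.5          # assume improvements halve serving cost
    intelligence_per_token *= 2.0  # assume each generation doubles capability
    cost_per_unit_intelligence = cost_per_token / intelligence_per_token
    print(f"gen {gen}: cost per unit intelligence = {cost_per_unit_intelligence:.6f}")
```

Under these assumptions, each generation cuts the cost per unit of intelligence by 4x, which is the compounding the next paragraph describes.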


This creates a compounding effect: better infrastructure and better models lower the cost of delivery, while improved products and deeper enterprise deployment increase revenue per unit of compute. As utilization increases and the platform matures, this drives meaningful operating leverage over time.

Over the past 15 months, we have expanded our infrastructure strategy beyond a small number of core providers to meet the scale and reliability requirements of global AI deployment. 


NVIDIA remains the foundation of our infrastructure. Our training fleet and the majority of our inference stack continue to run on NVIDIA GPUs, and with this round we are deepening that partnership as we scale.


Demand for AI systems is growing rapidly and becoming more diverse. No single architecture can efficiently meet the needs of the entire AI frontier. To meet that demand and stay flexible, we are building a broader infrastructure portfolio across multiple cloud partners, multiple chip platforms, and deeper co-design across the stack.


This strategy now spans: cloud through Microsoft, Oracle, AWS, CoreWeave, and Google Cloud; silicon through NVIDIA, AMD, AWS Trainium, Cerebras, and our own chip in partnership with Broadcom; and data centers through partnerships with Oracle, SBE, and SoftBank. 


The OpenAI flywheel is simple. More compute drives more intelligent models. More intelligent models drive better products. Better products drive faster adoption, more revenue, and more cash flow. That gives us the ability to reinvest and deliver intelligence more efficiently to consumers, enterprises, and builders around the world.


Building an AI superapp


That is why we are building a unified AI superapp. As models become more capable, the limiting factor shifts from intelligence to usability. Users do not want disconnected tools. They want a single system that can understand intent, take action, and operate across applications, data, and workflows. Our superapp will bring together ChatGPT, Codex, browsing, and our broader agentic capabilities into one agent-first experience.


This is not just product simplification. It is a distribution and deployment strategy. By unifying our surfaces, we can translate advances in model capability directly into user adoption and engagement. Our consumer scale becomes the front door for enterprise usage, as familiarity in daily life drives adoption at work. At the same time, a single product surface allows us to improve faster, ship more coherently, and capture more of the value created by agentic workflows.


The result is a tightly integrated system: infrastructure that enables intelligence, intelligence that powers agents, and products that make those agents useful at global scale.


Moments like this do not come often. In past generations, capital markets helped build the systems that defined modern economies, from electricity to highways to the internet. This is that kind of moment again. The capital being deployed today is helping build the infrastructure layer for intelligence itself. Over time, that value will flow back into the economy, to companies, to communities, and increasingly to individuals.


Let’s go build.


