The next phase of enterprise AI
I just wrapped my first 90 days with OpenAI and have had the opportunity to meet with hundreds of our customers. What has struck me most is their immense sense of urgency and readiness. I’ve spent my entire career at the intersection of technology and enterprise transformation, and yet, I have never seen this level of conviction spread so quickly and consistently across industries. These leaders recognize AI as the most consequential shift of their lifetime, and they’re asking us how to reinvent their companies around it.
I also saw that conviction reflected in our business this quarter. Building on our consumer strength, enterprise now makes up more than 40% of our revenue, and is on track to reach parity with consumer by the end of 2026. Codex just hit 3 million weekly active users, our APIs process more than 15 billion tokens per minute, and GPT‑5.4 is driving record engagement across agentic workflows. We’re seeing demand from new customers like Goldman Sachs, Phillips, and State Farm, and also growing with existing ones like Cursor, DoorDash, Thermo Fisher, and LY Corporation. DoorDash, for example, has expanded with OpenAI by deploying ChatGPT Enterprise to its brands including Deliveroo and Wolt, and bringing Codex to its developers for code review and greater efficiency.
It’s clear we’re past the experimentation phase. AI is now doing real work, and as a result, every company is grappling with two main questions:
- How do we put the most capable AI to work across the entire business, not just individual copilots and assistants?
- How do we make AI part of people’s everyday work, so it helps them unlock their full potential?
These questions will define how companies operate and compete in the years ahead, and that’s what our enterprise strategy is building toward: Frontier as the underlying intelligence layer governing all of a company’s agents, and a unified AI superapp as the primary experience where employees get things done.
OpenAI is uniquely positioned to shape the future of enterprise because we are one of the few companies building the full stack, from infrastructure and models to the interfaces employees use every day. We are listening to our customers and quickly becoming the core infrastructure for AI, making it possible for people around the world and businesses, big and small, to just build things and confidently step into the future of work.
Enabling agents company-wide
As we’ve shared before, the world is in a phase of capability overhang, where AI models can already do far more than most people and enterprises are using them for today. We are committed to closing that gap by making frontier intelligence usable, trusted, and embedded in how work actually gets done.
One thing I hear over and over is that companies are tired of AI point solutions that don’t talk to each other and just create chaos. They want AI to be a unified operating layer for their business, with AI coworkers grounded in their company’s context, connected to internal systems and external data sources, and governed by the right permissions and controls. That is what we’re providing with OpenAI Frontier, which is helping customers like Oracle, State Farm, and Uber build, deploy, and manage agents company-wide. While other solutions embed agents within a single product or environment, Frontier enables agents to move across a company’s systems and data, work across tools, and continue to improve over time.
On top of being a research company building frontier models, we’re also a deployment company. We’ve taken what we’ve learned from working directly with hundreds of large enterprises on integrating AI agents and turned it into a scalable foundation. Together with our Frontier Alliances partners McKinsey & Company, Boston Consulting Group (BCG), Accenture, and Capgemini, and other partners like Amazon Web Services (AWS), Databricks, and Snowflake, we help enterprises integrate OpenAI’s intelligence into the infrastructure and data ecosystems they already rely on. For example, our Stateful Runtime Environment, which we’re building with AWS, makes it simple for agents to keep context, remember prior work, and operate across a business’s tools and data, making them far more effective for complex, real-world use cases.
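To make the idea concrete, here is a minimal sketch of what “stateful” means for an agent runtime: working memory that survives between invocations, so a later session can build on an earlier one’s work. The class, the JSON-on-disk storage, and the task strings are illustrative assumptions for this sketch, not the actual Stateful Runtime Environment API.

```python
# Illustrative sketch only: persistence across agent sessions.
# Nothing here is an OpenAI or AWS interface.
import json
import tempfile
from pathlib import Path

class StatefulAgent:
    """An agent whose working memory is persisted between runs."""

    def __init__(self, state_path: Path):
        self.state_path = state_path
        # Reload prior work if this agent has run before.
        if state_path.exists():
            self.memory = json.loads(state_path.read_text())
        else:
            self.memory = {"completed_tasks": []}

    def run(self, task: str) -> str:
        # A real runtime would pass self.memory to a model as context;
        # here we only record the task so a later run can see it.
        result = f"done: {task} (after {len(self.memory['completed_tasks'])} prior tasks)"
        self.memory["completed_tasks"].append(task)
        self.state_path.write_text(json.dumps(self.memory))
        return result

# Two separate "sessions" against the same state file: the second
# instance remembers the first one's work.
state = Path(tempfile.mkdtemp()) / "agent_state.json"
first = StatefulAgent(state).run("audit Q3 invoices")
second = StatefulAgent(state).run("draft the summary")
```

The design point is simply that context lives outside any single process, so an agent can be stopped, restarted, or handed off without losing what it has already done.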
Empowering individuals and teams
As AI scales across the company, it also has to effortlessly show up in the daily workflow of every person and team. That’s why we’re building towards a unified AI superapp: one place where employees can work with AI agents throughout the day to complete tasks and take action across the tools they already use. This experience will bring together the best of ChatGPT, Codex, agentic browsing, and broader capabilities in order to multiply what individual employees and small teams can accomplish.
In recent months, we’ve seen a shift: the people who are furthest ahead have gone from using AI for help on tasks to managing teams of agents that do the work for them. The shift started with agentic tools like Codex, which has grown more than 5X since the start of the year. This includes customers like GitHub, NextDoor, Notion, and Wonderful that are building multi-agent systems that can execute engineering work end-to-end. We’ve also started to see employees in every function adopting agents in their workflows. For example, our sales team brings in new business using an agent that researches inbound prospects, scores them against a rubric, sends a personalized email to qualified leads, and updates the CRM.
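The sales workflow above can be sketched as a small pipeline: research, score against a rubric, email only the qualified leads, log everything. Every detail here (the rubric weights, the `Prospect` fields, the CRM modeled as a plain list) is a hypothetical stand-in for illustration, not our production agent.

```python
# Illustrative sketch of a research -> score -> email -> CRM pipeline.
from dataclasses import dataclass

@dataclass
class Prospect:
    name: str
    company: str
    notes: str = ""
    score: int = 0

# Hypothetical rubric: weights for signals a real agent might extract.
RUBRIC = {"enterprise": 3, "budget": 2, "timeline": 1}

def research(prospect: Prospect) -> Prospect:
    # A real agent would call a model and search tools here;
    # we just annotate the notes field.
    prospect.notes = f"Researched background for {prospect.company}"
    return prospect

def score(prospect: Prospect, signals: dict) -> Prospect:
    # Sum the rubric weights of the signals that are present.
    prospect.score = sum(w for k, w in RUBRIC.items() if signals.get(k))
    return prospect

def handle(prospect, signals, crm, outbox, threshold=3):
    prospect = score(research(prospect), signals)
    if prospect.score >= threshold:
        # Only qualified leads get a personalized email draft.
        outbox.append(f"Hi {prospect.name}, following up about {prospect.company}")
    crm.append((prospect.company, prospect.score))  # CRM is always updated

crm, outbox = [], []
handle(Prospect("Ada", "Acme"), {"enterprise": True, "budget": True}, crm, outbox)
handle(Prospect("Bob", "Tiny Co"), {"timeline": True}, crm, outbox)
# Acme scores 5 and gets an email; Tiny Co scores 1 and is only logged.
```

The point of the sketch is the shape of the delegation: the human sets the rubric and the threshold, and the agent runs the whole loop.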
We’re excited to bring new solutions to enterprises that will make agents more accessible to everyone. One of OpenAI’s biggest advantages is our ability to bridge personal and professional use cases. ChatGPT has 900 million weekly users, which means employees already know how to work with it. For enterprises, that reduces rollout friction and brings forward the point at which every employee can delegate tedious tasks and take on more ambitious projects.
--
My first quarter at OpenAI has made me more convinced than ever that the AI transformation is happening faster than most people realize. Enterprises want a partner who understands the scale of this transition and can help them confidently move forward. That means meeting them in the systems they already rely on, giving them a practical path from experimentation to deployment, and making adoption easier through the right pricing and packaging. Above all, they want to trust that the company helping them make this transformation is invested in their success and building for their needs.
At OpenAI, I feel the commitment at every level, in every function. We are wholeheartedly focused on continuously earning the right to help enterprises – and the people behind them – reinvent their companies for the future of AGI with clarity, confidence, and trust. It's the opportunity and responsibility of a lifetime, and I couldn’t be more excited about what we’re building with our customers and partners.