OpenAI co-founder and President Greg Brockman confirmed on May 5, 2026, that the artificial intelligence research organization expects its annual expenditure on computing power to reach $50 billion by the 2026 fiscal year. Speaking at the Global AI Infrastructure Summit, Brockman detailed the company's roadmap for infrastructure acquisition, emphasizing that the capital will be directed toward the hardware and energy requirements of sustaining its current software suite and developing more advanced generative models. The announcement marks a significant milestone in the company's transition from a research-focused entity to a large-scale infrastructure operator.
The $50 billion figure represents a sharp escalation from previous years, reflecting the intensifying computational demands of large-scale AI training and inference. Brockman noted that the budget covers the procurement of millions of high-end graphics processing units (GPUs) and specialized AI accelerators, as well as the construction of proprietary data center facilities. This spending trajectory aligns with OpenAI's ongoing efforts to push the boundaries of scaling laws, which hold that increases in data and compute yield more capable and reliable AI systems. He added that the compute required for the next generation of models is orders of magnitude greater than what was used for previous iterations.
While Brockman did not name specific vendors for the 2026 cycle, OpenAI maintains a deep strategic partnership with Microsoft Corporation, which provides the Azure cloud infrastructure used for the company's workloads. The projected $50 billion spend is expected to involve a mix of leased cloud capacity and dedicated hardware clusters. Brockman indicated that securing a stable supply of semiconductors and energy is a primary operational priority for the organization as it moves toward its 2026 goals. He also mentioned that OpenAI is exploring diverse energy sources, including nuclear and renewable options, to power the anticipated expansion of its server farms.
According to Brockman, the investment is essential for developing frontier models that require exponentially more compute than the current GPT-4 and GPT-5 iterations. The funds will also support the global expansion of OpenAI's API services and consumer-facing products like ChatGPT, which continue to see rising user engagement across enterprise and individual sectors. Brockman described the 2026 budget as a necessary foundation for achieving artificial general intelligence (AGI), citing the need for unprecedented throughput and low-latency processing to handle complex reasoning tasks.
The announcement comes amid reports of OpenAI seeking further capital to fund its ambitious infrastructure projects. Although Brockman did not disclose specific revenue targets for 2026, he affirmed that the $50 billion compute budget is a core component of the company's long-term financial planning. The figure underscores the high capital intensity of the AI industry, where the cost of compute has become a primary differentiator for market leaders. Brockman concluded by noting that the company’s ability to execute on this spending plan will depend on global supply chain stability and the continued evolution of semiconductor efficiency.