OpenAI CEO Sam Altman has reportedly been having discussions with members of Congress about increasing the global supply of semiconductors capable of supporting AI workloads.

According to a report in the Washington Post, Altman has been involved in conversations about how and where new semiconductor fabs might be built. In August 2022, President Joe Biden signed the CHIPS and Science Act, which includes $52.7 billion in subsidies for US semiconductor manufacturers.

The US Library of Congress' main reading room, Thomas Jefferson Building – Carol M. Highsmith / Flickr

News about Altman’s conversations with members of Congress came hot on the heels of reports that he had been engaged in similar discussions with Middle Eastern investors and semiconductor manufacturers in an effort to raise billions of dollars for an AI chip venture that would include the development of a global network of fabrication plants.

In November 2023, around the time Altman was suddenly fired and then rehired by OpenAI, Bloomberg reported that the CEO had been seeking investment to build an artificial intelligence chip company.

In addition to the need for more fabrication plants, Altman recently stated that he believes an energy 'breakthrough' is necessary to advance AI models.

During a panel discussion with Bloomberg at the World Economic Forum in Davos, Switzerland, he told the audience that low-carbon energy sources, including nuclear fusion, are needed to meet the expected energy demands of AI.

"There's no way to get there without a breakthrough," he said. "It motivates us to go invest more in fusion."

Founded in 2015 as an AI research organization, OpenAI was arguably the driving force behind the generative AI wave that swept across the globe following the release of the company’s ChatGPT chatbot in November 2022.

This generative AI boom has led to a huge investment in compute power, with the energy requirements needed to support the technology on its current trajectory set to reach monumental levels.

As discussed by Alex de Vries of Digiconomist on a DCD podcast, his peer-reviewed analysis published in October 2023 estimates that at least 85.4 terawatt-hours of electricity would be required annually to continually power the 1.5 million AI server units Nvidia is on track to ship each year by 2027.
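The arithmetic behind that estimate can be sanity-checked. The per-server power draw below is an assumption on our part (not a figure from the article), chosen to be roughly in line with a DGX-class Nvidia system:

```python
# Rough sanity check of the 85.4 TWh/year estimate.
# Assumption (ours, not the article's): each AI server draws
# about 6.5 kW continuously, comparable to a DGX-class system.
SERVERS = 1_500_000        # Nvidia AI server units shipped per year by 2027
POWER_KW = 6.5             # assumed continuous draw per server, in kW
HOURS_PER_YEAR = 8760      # 24 x 365

annual_twh = SERVERS * POWER_KW * HOURS_PER_YEAR / 1e9  # kWh -> TWh
print(f"{annual_twh:.1f} TWh/year")  # ~85.4 TWh/year
```

Under that assumption the numbers line up with de Vries' lower-bound figure; a higher or lower per-server draw would shift the total proportionally.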