
Sam Altman wants up to $7 Trillion for a new chip venture, but who doesn’t!?

WHY THIS MATTERS IN BRIEF

As absurd as this sounds, on the one hand it’s a grandiose ask; on the other, it’s grounded in the reality that one day AI will run the planet …

 

Love the Exponential Future? Join our XPotential Community, future proof yourself with courses from XPotential University, read about exponential tech and trends, connect, watch a keynote, or browse my blog.

Recently the Wall Street Journal reported that OpenAI CEO Sam Altman wants to raise up to $7 trillion for a “wildly ambitious” tech project to boost the world’s chip capacity, funded by investors including the UAE — which in turn would vastly expand OpenAI’s own ability to power its giant foundational Artificial Intelligence (AI) models like GPT-4, ChatGPT and eventually GPT-5.

 


 

While this may simply be a dreamy moonshot on Altman’s part, or an Elon Musk-like hype generator, what is not in doubt is the environmental impact of such a massive effort, according to Sasha Luccioni, climate lead and researcher at Hugging Face.

“If it does work out, the amount of natural resources that will be required is just mind-boggling,” she told reporters. “Even if the energy is renewable (which it isn’t guaranteed to be), the quantity of water and rare earth minerals required is astronomical.”

 


 

For comparison, in September 2023 Fortune reported that AI tools fueled a 34% spike in Microsoft’s water consumption; Meta’s Llama 2 model reportedly guzzled twice as much water as Llama 1; and a 2023 study found that OpenAI’s GPT-3 training consumed 700,000 liters of water.

And beyond the environmental impact, shortages of rare earth minerals such as gallium and germanium have even helped inflame the global chip war with China.

 


 

Luccioni criticized Altman for not focusing on more efficient ways to develop AI — like the approaches I’ve been talking about for a while, such as shallow neural networks. Instead, she said, “he’s taking a brute force approach and people are calling it … visionary?”

But the fact is, Altman’s desire to tackle the current GPU shortages and reshape the semiconductor landscape is not unusual. Last summer, Reuters reported that access to Nvidia’s ultra-expensive, hard-to-come-by H100 high-performance computing GPU for large language model (LLM) training was becoming the “top gossip” of Silicon Valley.

And just last week, Meta offered a deep dive into its AI strategy in its latest earnings call, where CEO Mark Zuckerberg said that the first key requirement for building “full Artificial General Intelligence (AGI)” is “world-class compute infrastructure.” Zuckerberg went on to repeat what he had recently disclosed in an Instagram Reel: that by the end of this year Meta will have about 350,000 H100s — and, including its other GPUs, around 600,000 H100 equivalents of compute in total.

 


 

The company plans to continue investing aggressively in this area, he explained: “In order to build the most advanced clusters, we’re also designing novel data centers and designing our own custom silicon specialized for our workloads.”

Luccioni has also been critical of Nvidia’s transparency about the carbon footprint of its products (which are designed by the company but manufactured by the Taiwan Semiconductor Manufacturing Company): “Nvidia has yet to publish any information about the environmental footprint of their manufacturing,” she said, adding that e-waste as a whole is also a “huge issue because people want the new GPUs and they’re essentially throwing out the old ones after a year or two.”

In Nvidia’s 2023 Corporate Responsibility Report, the company said “emissions are generated at every stage of our product lifecycle, including manufacturing within our supply chain. Since 2014, we’ve expected our key silicon manufacturing and systems contract manufacturing suppliers to report their annual energy and water usage, waste, greenhouse gas (GHG) emissions, and reduction goals and objectives through the RBA Environmental Survey or CDP. We also expect suppliers to have their GHG emissions verified by a third party. We use this supplier data to better understand our product manufacturing impact and allocate carbon emissions to our customers.”

 


 

Overall, Luccioni maintains that there is less transparency today when it comes to the environmental impact of AI — and that this is unlikely to change anytime soon with Altman’s new fundraising march.

“If you look at the PaLM 1 paper from Google, which was in 2022, and then PaLM 2 [released in May 2023], the amount of information they provided drastically dropped,” she said. In the original paper, she explained, Google shared enough information for energy-use estimates to be made.

“Now [companies] don’t even say how long it took [to train], how many chips they used, there’s absolutely no information provided anymore,” she said.

But overall, Luccioni says she isn’t too worried: “I think this is just a moonshot project that won’t actually pan out,” she said. “But that will put [Altman] on par with Elon in terms of outlandish projects that attract attention and generate hype.”
