
Picture New York City on a sweltering summer night: every air conditioner straining, subway cars humming underground, towers blazing with light. Now add San Diego at the peak of a record-breaking heat wave, when demand shot past 5,000 megawatts and the grid nearly buckled.
That’s almost the scale of electricity that Sam Altman and his partners say will be devoured by their next wave of AI data centers—a single corporate project consuming more power, every single day, than two American cities pushed to their breaking point.
The announcement marks a “seminal moment” that Andrew Chien, a professor of computer science at the University of Chicago, says he has long been waiting to see come to fruition.
“I’ve been a computer scientist for 40 years, and for most of that time computing was the tiniest piece of our economy’s power use,” Chien told Fortune. “Now, it’s becoming a large share of what the whole economy consumes.”
He called the shift both exciting and alarming.
“It’s scary because … now [computing] could be 10% or 12% of the world’s power by 2030. We’re coming to some seminal moments for how we think about AI and its impact on society.”
This week, OpenAI announced a plan with Nvidia to build AI data centers consuming up to 10 gigawatts of power, with additional projects totaling 17 gigawatts already in motion. That’s roughly equivalent to powering New York City—which uses 10 gigawatts in the summer—and San Diego during the intense heat wave of 2024, when more than five gigawatts were used. Or, as one expert put it, it’s close to the total electricity demand of Switzerland and Portugal combined.
“It’s pretty amazing,” Chien said. “A year and a half ago they were talking about five gigawatts. Now they’ve upped the ante to 10, 15, even 17. There’s an ongoing escalation.”
Fengqi You, an energy-systems engineering professor at Cornell University, who also studies AI, agreed.
“Ten gigawatts is more than the peak power demand in Switzerland or Portugal,” he told Fortune. “Seventeen gigawatts is like powering both countries together.”
The Texas grid, where Altman broke ground on one of the projects this week, typically runs around 80 gigawatts.
“So you’re talking about an amount of power that’s comparable to 20% of the whole Texas grid,” Chien said. “That’s for all the other industries—refineries, factories, households. It’s a crazy large amount of power.”
Altman has framed the build-out as necessary to keep up with AI’s runaway demand.
“This is what it takes to deliver AI,” he said in Texas. Usage of ChatGPT, he noted, has jumped 10-fold in the past 18 months.
Which energy source does AI need?
Altman has made no secret of his favorite source: nuclear. He has backed both fission and fusion startups, betting that only reactors can provide the kind of steady, concentrated output needed to keep AI’s insatiable demand fed.
“Compute infrastructure will be the basis for the economy of the future,” he said, framing nuclear as the backbone of that future.
Chien, however, is blunt about the near-term limits.
“As far as I know, the amount of nuclear power that could be brought on the grid before 2030 is less than a gigawatt,” he said. “So when you hear 17 gigawatts, the numbers just don’t match up.”
With projects like OpenAI’s demanding 10 to 17 gigawatts, nuclear is “a ways off, and a slow ramp, even when you get there,” Chien said. Instead, he expects wind, solar, natural gas, and new storage technologies to dominate.
You, the energy-systems expert at Cornell, struck a middle ground. He said nuclear may be unavoidable in the long run if AI keeps expanding, but cautioned that “in the short term, there’s just not that much spare capacity”—whether fossil, renewable, or nuclear. “How can we expand this capacity in the short term? That’s not clear,” he said.
He also warned that the timeline may be unrealistic.
“A typical nuclear plant takes years to permit and build,” he said. “In the short term, they’ll have to rely on renewables, natural gas, and maybe retrofitting older plants. Nuclear won’t arrive fast enough.”
Environmental costs
The environmental costs loom large for these experts, too.
“We have to face the reality that companies promised they’d be clean and net zero, and in the face of AI growth, they probably can’t be,” Chien said.
Ecosystems could come under stress, Cornell’s You said.
“If data centers consume all the local water or disrupt biodiversity, that creates unintended consequences,” he said.
The investment figures are staggering. Each OpenAI site is valued at roughly $50 billion, adding up to $850 billion in planned spending. Nvidia alone has pledged up to $100 billion to back the expansion, providing millions of its new Vera Rubin GPUs.
Chien added that we need a broader societal conversation about the looming environmental costs of using that much electricity for AI. Beyond carbon emissions, he pointed to hidden strains on water supplies, biodiversity, and local communities near massive data centers. Cooling alone, he noted, can consume vast amounts of fresh water in regions already facing scarcity. And because the hardware churns so quickly—with new Nvidia processors rolling out every year—old chips are constantly discarded, creating waste streams laced with toxic chemicals.
“They told us these data centers were going to be clean and green,” Chien said. “But in the face of AI growth, I don’t think they can be. Now is the time to hold their feet to the fire.”
