In Memphis, Tennessee, the Colossus 2 Artificial Intelligence Megacenter launches a 1-gigawatt training cluster whose power consumption exceeds San Francisco's peak demand. xAI, founded in 2023, has accelerated construction since March 2025, installing 200 MW of cooling capacity in six months and deploying up to 35 gas turbines.
Elon Musk's Artificial Intelligence Megacenter landed on the global radar because it is not just "another data center": Colossus 2, installed in Memphis, Tennessee, was presented as the first AI training cluster with a capacity of 1 gigawatt, with consumption described as exceeding the peak demand of San Francisco and a declared ambition to reach 1.5 gigawatts by April.
This escalation places xAI, a company founded in 2023 when the sector already had established players, as a case of extreme acceleration based on infrastructure: thousands of computing units, cooling on an industrial scale, tight schedules, and a controversial energy solution, with gas turbines operating continuously in a city associated with severe air quality issues.
What Makes Colossus 2 Different From A Regular Data Center

AI data centers are multiplying, but Colossus 2 caught attention due to its power level. A 1-gigawatt training cluster is not an abstract metric: it defines the size of the computational park that can run simultaneously, the cooling requirements, electrical infrastructure, the risk of bottlenecks, and the type of operation needed to keep the system running at a constant pace.
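To make the 1-gigawatt figure concrete, a hedged back-of-envelope helps: dividing the cluster's power budget by a rough all-in draw per accelerator gives the order of magnitude of the computational park it can feed. The ~1.8 kW per-GPU figure below is an illustrative assumption, not a number from the article.

```python
# Back-of-envelope: how many accelerators can a 1 GW power budget feed?
# The ~1.8 kW all-in draw per GPU (chip + server + networking + cooling
# overhead) is an illustrative assumption, not a figure from the article.
CLUSTER_POWER_W = 1e9        # 1 gigawatt training cluster
ALL_IN_W_PER_GPU = 1_800     # assumed watts per accelerator, fully loaded

gpus_supported = CLUSTER_POWER_W / ALL_IN_W_PER_GPU
print(f"~{gpus_supported:,.0f} accelerators")
```

Under that assumption the budget lands in the mid-hundreds of thousands of accelerators, consistent with the "over half a million GPUs" described later in the article.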
The comparison serves to put the scale in perspective: Colossus 2's consumption was described as greater than the peak demand of the city of San Francisco. This kind of reference is not merely rhetorical; it moves the debate to an urban scale, taking the issue out of a technological niche and into questions of energy, emissions, and local impact.
The Strategy of xAI: Arrive Late and Run Faster Than Everyone
xAI was created in 2023, when the race for AI was already well established. Even so, the company managed to carve out a niche among the biggest players, even surpassing them in some capacity indicators.
The background is clear: those who arrive late to the AI race must compensate with speed and infrastructure, because without massive computing there is no competitive large-scale training.
Colossus 2 is treated as the symbol of this strategy because it represents not merely an upgrade but a leap in level, one that requires accelerated construction, enormous capital, and energy decisions normally debated only in industrial megaprojects.
Colossus 2 In Numbers: GPUs, Power, and Record-Scale Money
The basis of the leap is the hardware. Colossus 1 has 230,000 GPUs, and the new cluster raised the standard to over half a million GPUs. This places Colossus 2 among the most expensive centers ever built, not only for the quantity of equipment but for the entire ecosystem necessary to make this computational park function.
A report from EpochAI cited an investment of US$ 44 billion, a figure presented as around R$ 236.6 billion. This number indicates that it is not a “data center project,” but rather a training infrastructure comparable to national-scale works in terms of capital.
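Treating those headline figures as inputs, a quick sanity check ties them together. The GPU count is the "over half a million" approximation from the cluster description; the per-GPU split is my own arithmetic, not from the article.

```python
# Sanity-check the reported investment figures (illustrative arithmetic only).
INVESTMENT_USD = 44e9       # US$ 44 billion, as cited from EpochAI
INVESTMENT_BRL = 236.6e9    # R$ 236.6 billion, as presented
GPU_COUNT = 550_000         # "over half a million GPUs" (approximation)

implied_rate = INVESTMENT_BRL / INVESTMENT_USD   # BRL per USD
usd_per_gpu = INVESTMENT_USD / GPU_COUNT         # capital per accelerator
print(f"implied exchange rate: ~{implied_rate:.2f} BRL/USD")
print(f"all-in capital: ~US$ {usd_per_gpu:,.0f} per GPU")
```

On these figures, each accelerator carries roughly US$ 80,000 of total project capital, a reminder that buildings, power, and cooling, not just the chips, drive the bill.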
Even so, Microsoft's Fairwater center is expected to surpass Colossus 2 in both investment and power. The difference is that Fairwater is reportedly still under construction, so Colossus 2 stands out as the concrete, already operating example that defines the new standard of play.
The Declared Goal: 1.5 Gigawatts By April and The Pressure for Expansion
Elon Musk reportedly stated that he wants to expand the system to 1.5 gigawatts by April. Such a target is relevant because it shows that the project was not designed to stabilize at 1 gigawatt, but to grow quickly, further increasing pressure on energy, cooling, maintenance logistics, and operational decisions.
As power increases, the complexity of maintaining efficiency and avoiding failures also grows. Scale makes any problem bigger: if there is a power shortage, a massive block of capacity fails; if cooling fluctuates, the risk of instability increases; if there are regulatory restrictions, the bottleneck is not “small,” it is structural.
Infrastructure and Speed: Construction at Industrial War Pace
The history of Colossus 1 was presented as proof of the method: the company reportedly completed construction in just 122 days, considered a remarkable feat. Colossus 2 took longer, but its pace was still described as far outside the norm for megaprojects.
The Colossus 2 project began in March 2025 and, in just six months, already had 200 MW of installed cooling capacity. According to Semianalysis, this progress was faster than megaprojects attributed to Oracle and OpenAI.
Cooling, in this context, is not a technical detail. It is what prevents computing from turning into an oven. In centers with hundreds of thousands of GPUs, heat is a constant enemy, and without a robust dissipation system, operations cannot sustain high loads for long periods.
Accelerating in the Race: From “Behind” to Second Place in Capacity
A Semianalysis chart reportedly shows xAI's acceleration in training capacity: at the beginning of 2024 the company was lagging, and by September 2025 it had reached second place, behind only OpenAI.
This change helps to understand why infrastructure became the center of the strategy. Even with controversies surrounding Grok, the point described is that the investment in data centers and training capacity was fundamental to catch up with competitors. Competition, in this scenario, is not just about products; it is about raw computing power.
The Most Explosive Part: Energy, Gas Turbines, and The Environmental Cost
Feeding a 1-gigawatt Artificial Intelligence Megacenter is described as a challenging task. To this end, Musk’s company installed up to 35 gas turbines with a capacity exceeding 400 megawatts.
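Dividing the reported totals puts the turbine fleet in perspective. The turbine count and combined capacity come from the article; the comparison against the 1 GW cluster target is my own arithmetic.

```python
# Rough turbine arithmetic from the reported figures.
TURBINE_COUNT = 35          # "up to 35 gas turbines"
TURBINE_TOTAL_W = 400e6     # combined capacity exceeding 400 MW
CLUSTER_POWER_W = 1e9       # the 1 GW training cluster

per_turbine_mw = TURBINE_TOTAL_W / TURBINE_COUNT / 1e6
share_of_cluster = TURBINE_TOTAL_W / CLUSTER_POWER_W
print(f"~{per_turbine_mw:.1f} MW per turbine")
print(f"turbines cover ~{share_of_cluster:.0%} of the 1 GW cluster")
```

Even with all 35 turbines running, on these figures the on-site gas generation covers well under half the cluster's full draw, which is why grid supply and permits remain central to the story.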
This is where the environmental alert goes off. Gas turbines are described as extremely polluting, and the location exacerbates the controversy: Memphis reportedly already has poor air quality, to the point of being known as the “asthma capital.”
In other words, the project enters a territory where the discussion is not only about technology but also about public health and urban impact.
This combination creates a direct conflict. On one side, there is the need for continuous energy to maintain AI training. On the other side, there is the reality of emissions, pollution, and social pressure in a city with a sensitive history regarding air.
The Border Maneuver: Mississippi and More Lax Emission Rules
The article points to a regulatory problem: the company reportedly lacked permits for so many turbines. The described response was to exploit geography: since Colossus sits near the border with Mississippi, a state where emission laws are more lenient, some turbines were reportedly moved across it.
This maneuver is central to understanding why the case became a controversy. In practice, relocating turbines to a territory with more flexible rules reduces the legal obstacle but does not erase the discussion about emissions and their regional effects. The debate stops being merely technical and becomes political, urban, and environmental.
Why Energy Consumption Became The “Weak Point” Of AI Supermachines
The case of Colossus 2 highlights a dilemma: training increasingly larger models requires a massive physical base. It is not enough to have software or good researchers. Without GPUs, electricity, and cooling, capacity does not exist.
Therefore, the narrative of “supermachine” inevitably comes with an “environmental alert.” The consumption described as greater than the peak of San Francisco and the use of gas turbines in large volume make the project a symbol of the real cost of the race for AI.
The Domino Effect: Competitors and The New Scale Standard
When a project sets a benchmark, it pressures the rest of the market. The expectation that Microsoft’s center in Fairwater will surpass Colossus 2 shows that the trend is escalation, not stabilization.
This means that competition goes beyond who has the best model. It becomes a race to see who can build the largest infrastructure, faster, with better cooling, more GPUs, and guaranteed energy. And the larger the project, the more it collides with themes that do not fit into the technology world: permits, emissions, air quality, and local acceptance.
What The History Of Colossus 2 Reveals About The Current Phase Of AI
Colossus 2 appears as a portrait of the moment: companies betting on extreme power to avoid falling behind, accelerating schedules, investing billions, and making controversial energy decisions.
xAI entered in 2023, rushed to build Colossus 1 with construction completed in 122 days, initiated Colossus 2 in March 2025, installed 200 MW of cooling in six months, raised the GPU count from 230,000 to over half a million, and set an expansion goal of 1.5 gigawatts by April, while at the same time raising an environmental alert with up to 35 gas turbines and the controversy over permits and borders.
The competition that seemed “merely digital” now has the scent of heavy industry, with energy, air, and emissions at center stage.
Do you think the race for Artificial Intelligence Megacenters will force cities to accept more pollution to attract investments, or will environmental pressure impose a ceiling before the next leap in power?

As for the United States, given the positions of the Trump administration, the race for AI megacenters will force cities to accept more pollution in order to attract investment. Everything moves very fast; AI technology is the new frontier of capitalist expansion. Until a time comes when environmental pressure imposes a ceiling so that, contradictorily, capitalism can keep expanding, incorporating clean technologies that allow new leaps in power without greater environmental debt.