At first glance, Alberta’s AI data centre boom resembles a land grab — companies racing to secure power and build the next generation of digital infrastructure. But as a panel of experts discussed at YYC DataCon, the reality is more nuanced.
To illustrate, consider the experience of attending a Calgary Flames game.
“If you’re trying to get a drink at the Saddledome, you’re looking at five different lines, and you’re making a bet on which one moves the fastest,” said Mark Taylor, executive vice president of Captives Generation. “Turns out you always pick the slowest one.”
This analogy reflects Alberta’s data centre market. Companies are lining up for power, infrastructure, and permits, but not all intend to build. Some are merely holding their spot.
The industry even has a term for this — bragawatts.
“You see these massive gigawatt-scale data centre announcements, but many of them will never get built,” said Ian Nieboer, managing director and head of energy transition research at Enverus. “Companies overbook capacity, just like renewables did in Alberta. We’ve seen it before. The reality is, we’ll end up with a fraction of what’s being proposed.”

Who’s actually building, and who’s backing out?
The overbooking is already visible. Microsoft has re-evaluated several planned data centres due to power constraints and shifting cloud demand, though it says it remains committed to AI infrastructure. Meta (Facebook), meanwhile, is taking the opposite approach, engaging in discussions to secure a $35 billion financing package aimed at developing data centres in the U.S.
The difference lies in their business models.
“Microsoft serves foundational models to customers. They don’t need as much compute infrastructure themselves for training,” said Wish Bakshi, an energy data and AI consultant, and founder of AQ Energy. “Facebook, on the other hand, builds models and gives them away for free. They need to scale compute at an extreme level.”

Compute power refers to the processing capability of servers that run AI models, handling complex calculations that enable applications like ChatGPT and generative AI to function. As AI models advance, they require more compute power, increasing energy demand.
But as the industry scales, a more significant issue emerges — one that could determine which projects succeed and which fail.

AI’s biggest problem isn’t compute, it’s energy
Historically, AI’s primary bottleneck was access to computing power. That is no longer always the case.
“The constraints are shifting,” said Nieboer. “Right now, it’s not just about getting GPUs — it’s about whether you can power them.”
AI data centres require an enormous and constant energy supply. Unlike traditional industrial facilities that can adjust to grid availability, AI training workloads demand 99.999 percent uptime, placing unprecedented strain on the grid.
To contextualize, Alberta’s total power consumption averages around 10 gigawatts. A single gigawatt can typically power between 600,000 and 750,000 homes, depending on energy efficiency and regional consumption patterns. Currently, there are 10 gigawatts of data centre projects in Alberta’s connection queue, equivalent to the province’s entire baseload electricity demand.
“People don’t realize how much power we’re talking about,” said Bakshi. “If even two 500-megawatt data centres get built and connected to the grid, that’s 10% of our baseload power supply gone overnight.”
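For readers who want to check the arithmetic, the back-of-envelope sketch below works through the round numbers quoted above in Python. The figures are the panel’s, not official AESO data, and the homes-per-gigawatt range is the rough rule of thumb cited earlier.

```python
# Back-of-envelope check of the round numbers quoted above.
# These are the panel's figures, not official AESO statistics.

alberta_avg_load_gw = 10            # average provincial demand cited above (GW)
queued_data_centres_gw = 10         # data centre projects in Alberta's queue (GW)
homes_per_gw = (600_000, 750_000)   # rough range; varies with efficiency and region

# Two 500 MW facilities together draw 1 GW.
two_facilities_gw = 2 * 500 / 1000
print(f"Share of average load: {two_facilities_gw / alberta_avg_load_gw:.0%}")        # -> 10%

low = int(homes_per_gw[0] * two_facilities_gw)
high = int(homes_per_gw[1] * two_facilities_gw)
print(f"Roughly equivalent to {low:,}-{high:,} homes")                                 # -> 600,000-750,000

print(f"Queue vs. average load: {queued_data_centres_gw / alberta_avg_load_gw:.0%}")  # -> 100%
```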
Consequently, Alberta requires new data centres to provide their own energy supply. Known as behind-the-fence or off-grid power, this approach means companies must secure their own generation, whether from natural gas, renewables, or a combination.
Read more: Alberta unveils strategy to become AI data centre hub
“We can’t just assume data centres will get all the power they want,” said Taylor. “If we don’t plan for this properly, we could be in a situation where residents are paying the price for corporate energy demand. That’s not a winning scenario.”

The risk of AI data centres losing power
One of the biggest challenges for data centres is ensuring uninterrupted energy supply.
Electricity grids operate in a delicate balance: like a heartbeat, the system must hold a steady rhythm, in this case a frequency of 60 hertz. Sudden spikes or drops in demand can cause serious disruptions or blackouts.
If a 500-megawatt data centre loses its primary power source, for example, its backup systems typically activate first. But if redundancy fails, it may need to draw from the grid, which could disrupt system stability.
Grid operators like the Alberta Electric System Operator (AESO) are preparing for this, but the solution isn’t as simple as adding more power. Many data centres are planning for triple redundancy, meaning three independent power sources: direct grid connections, backup generation, and on-site energy storage.
“The grid is not expected to be that reliable,” said Nieboer. “They know that, and that’s why it’s designed that way. A typical large data centre will have two high-voltage substations, enough diesel generators to run for 48 hours, and multiple redundancies. The grid isn’t providing five-nines reliability; they’re building it into their facilities.”
“Five nines” refers to 99.999% uptime, meaning a system is designed to have no more than about 5 minutes of downtime per year, ensuring extreme reliability.
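As a rough illustration of both points, the sketch below converts 99.999 percent uptime into minutes of downtime per year, and shows why operators stack independent power sources. The per-source availability figure used here is an illustrative assumption, not a number from the panel.

```python
# Rough illustration of "five nines" and of why triple redundancy is used.
# The per-source availability figure below is an illustrative assumption only.

MINUTES_PER_YEAR = 365.25 * 24 * 60          # ~525,960 minutes

# 99.999% uptime -> allowed downtime per year
uptime = 0.99999
downtime_min = (1 - uptime) * MINUTES_PER_YEAR
print(f"Five nines allows about {downtime_min:.1f} minutes of downtime per year")
# -> about 5.3 minutes

# Why stack sources: assume each independent supply (grid feed, on-site
# generation, storage) is available 98% of the time and failures are independent.
per_source_availability = 0.98
all_three_fail = (1 - per_source_availability) ** 3
combined_availability = 1 - all_three_fail
print(f"Combined availability with three independent sources: {combined_availability:.5%}")
# -> 99.99920%, roughly five nines even though each source is far less reliable
```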

AI is evolving, and so is its energy footprint
The energy challenge extends beyond scale. It also involves the evolution of AI itself.
Early AI models like ChatGPT focused on next-word prediction, taking a prompt and generating the most likely next word, one step at a time. New reasoning models, however, are fundamentally different.
“Before, you gave AI a prompt, it gave you an answer,” said Akbar Nurlybayev, co-founder and COO of CentML. “Now, these models reason in multiple steps, making them far more powerful, but also much more energy-intensive.”
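A crude way to see why multi-step reasoning costs more energy: inference work, and therefore energy, scales roughly with the number of tokens a model generates, and a reasoning model produces many intermediate “thinking” tokens before its final answer. The token counts and per-token energy figure in the sketch below are illustrative assumptions, not measurements from the panel.

```python
# Crude illustration: energy per request scales roughly with tokens generated.
# All numbers below are illustrative assumptions, not measured values.

ENERGY_PER_TOKEN_J = 0.3          # assumed joules per generated token on a large model

answer_tokens = 200               # a direct, single-pass answer
reasoning_tokens = 2_000          # intermediate "thinking" tokens in a reasoning model

direct = answer_tokens * ENERGY_PER_TOKEN_J
reasoning = (answer_tokens + reasoning_tokens) * ENERGY_PER_TOKEN_J

print(f"Direct answer:    ~{direct:.0f} J")
print(f"Reasoning answer: ~{reasoning:.0f} J  ({reasoning / direct:.0f}x more)")
# -> roughly 11x more energy for the same question under these assumptions
```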
As AI becomes more sophisticated, it requires more power per request. This shift is already affecting the industry.
“Last summer, you could rent an H100 GPU for under $2 an hour,” said Bakshi. “Now, it’s $15 for an entire rack. The demand is real, and it’s only going up.”
This trend explains why companies like Meta and initiatives like Elon Musk’s xAI are intensifying their energy investments. AI is no longer solely a software issue — it has also become an infrastructure challenge.

What Alberta needs to do next
Alberta has the opportunity to position itself as a global AI infrastructure hub, but success will require careful planning. The province needs to ensure that data centres contribute to energy stability by building behind-the-fence power rather than relying solely on the grid.
Investment in energy storage will be necessary to handle peak loads and unexpected disruptions.
Policymakers must also ensure that approvals go to viable projects, preventing speculative overbooking that ties up grid capacity.
If Alberta can balance these elements and scale AI infrastructure while maintaining grid reliability, it can secure its place as a leader in the AI era without compromising energy security for residents and businesses. Achieving this will require collaboration between government, industry, and energy providers to create policies that support long-term investment while protecting grid stability.
The province has an advantage in its existing expertise in energy production, natural resources, and large-scale infrastructure.
It also has a business-friendly climate that has attracted investment in both traditional and emerging industries. These strengths position Alberta well to support AI-driven infrastructure if decision-makers can align power generation with the rapid expansion of AI data centres.
Investment in energy infrastructure will need to move at a pace that matches the demand from AI growth. While the technology world operates on short cycles, power generation and transmission projects take years to complete. Clear planning and proactive development will be necessary to ensure Alberta’s power supply does not become a bottleneck for innovation.
The next few years will determine Alberta’s role in the AI economy.
If Alberta gets this right, it will not only be home to new AI infrastructure but will also shape how AI and energy intersect in the global economy.
Digital Journal is the official media partner of YYC DataCon 2025.

This article was created with the assistance of AI. Learn more about our AI ethics policy here.
