Modern AI data centers consume enormous amounts of power, and it looks like they will get even more power-hungry in the coming years as companies like Google, Microsoft, Meta, and OpenAI strive towards artificial general intelligence (AGI). Oracle has already outlined plans to use nuclear power plants for its 1-gigawatt data centers. It looks like Microsoft plans to do the same, as it just inked a deal to restart a nuclear power plant to feed its data centers, reports Bloomberg.
Personally? I don’t think this is a bad idea. The less they drain from the grid, the less they consume fossil fuel.
The reactor isn’t active right now, it’s a PWR design, and as the 1979 incident showed, PWRs fail safely.
So long as Microsoft pays for the operation of the plant? Seems reasonable to me if they’re going to consume an assload of energy with or without public support.
We could use that extra energy to offset a bunch of existing carbon emissions now. This is still waste: if it’s going to be started up again and its energy used for something useless, it’s waste.
Microsoft would do it with or without the power plant. Make no mistake about that.
The same argument could be said if they made a 1GW solar farm, or any other form of power generation. Unless you have a way to legislatively prevent Microsoft from producing their own energy or prevent acquisition of decommissioned plants, I don’t see how you can prevent waste.
That argument presupposes that the reactor would otherwise be brought back into operation, which I don’t think is necessarily the case.
If it also shifts their current load off the existing grid, that might be beneficial.
I remember studying the Three Mile Island incident as part of my university degree. Apparently one of the biggest problems was that the control interface was hard for the human operators to understand.
So I guess if they just replaced the control system with a modern computer that would fix most of the problems. Obviously not a Windows system, otherwise we’ve just got the same issue all over again.
It was the SCADA view, right? A lot of SCADA software basically runs on top of Windows, though you typically would never see the desktop. Ignition at least is cross-platform, but that’s because the server is Java and Jython. A big part of why things run on Windows is OPC, which was traditionally all COM/DCOM and .NET. It’s basically a standard communications protocol and is what allows your HMI/SCADA to communicate with PLCs. Otherwise, you use proprietary drivers and native PLC-specific protocols.
SCADA programming/design is kind of an art and is usually written either by an overworked engineer or by someone who had far too much time on their hands. You basically build screens using specialized software, hook up buttons and UI elements to PLC signals, and pass some signals from the UI to the PLC. They are all heading in the Edge/IoT/cloud/web-based/techno-babble direction these days…
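The “hook up buttons and UI elements to PLC signals” part can be pictured as a tag-binding layer. Here’s a toy Python sketch of the idea — no real OPC stack involved, and the tag name `Pump01.Running` is invented for illustration; real packages like Ignition do this over OPC or native PLC drivers:

```python
# Toy model of HMI tag binding: UI widgets subscribe to named PLC
# "tags", and operator actions write values back through the same layer.

class TagServer:
    """Stands in for the PLC/OPC layer: a dict of named signal values."""
    def __init__(self):
        self._tags = {}
        self._subscribers = {}   # tag name -> list of callbacks

    def write(self, tag, value):
        self._tags[tag] = value
        for cb in self._subscribers.get(tag, []):
            cb(value)            # push the change to bound UI elements

    def read(self, tag):
        return self._tags.get(tag)

    def subscribe(self, tag, callback):
        self._subscribers.setdefault(tag, []).append(callback)


class IndicatorLamp:
    """A 'screen element' bound to one boolean tag."""
    def __init__(self, server, tag):
        self.lit = False
        server.subscribe(tag, self._on_change)

    def _on_change(self, value):
        self.lit = bool(value)


# Wire a lamp to a made-up pump-run signal, then simulate the PLC
# reporting the pump has started (a start button would write similarly).
plc = TagServer()
lamp = IndicatorLamp(plc, "Pump01.Running")
plc.write("Pump01.Running", True)
print(lamp.lit)   # True
```

The real systems add drivers, scan rates, alarming, and history on top, but the screens are conceptually just bindings like this.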
Ignition (the programming software is free!): https://inductiveautomation.com
Some other random ones I have seen or used in the past: https://www.siemens.com/global/en/products/automation/simatic-hmi/wincc-unified.html https://www.aveva.com/en/products/intouch-hmi/ https://www.rockwellautomation.com/en-us/products/software/factorytalk/operationsuite/view.html
“is usually written by an overworked engineer”
I’m in this post and I don’t like it.
But really, these SCADA systems are rarely well defined by the time implementation happens. Often the architect has a great plan, but by the time it’s passed from a manager to a non-software engineer, to the product engineer, to the automation team, to the contractor, the end result is “X data is pushed in with Y form and we use either a, b, or c datetime stamp and nobody knows.”
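That “a, b, or c datetime stamp and nobody knows” situation usually ends with the integrator writing a try-each-format parser. A rough sketch of that workaround — the candidate format list here is invented; a real project would pin the formats down in the interface spec instead:

```python
from datetime import datetime

# Formats actually arriving on the wire are often undocumented,
# so the ingest side ends up trying each candidate in turn.
CANDIDATE_FORMATS = [
    "%Y-%m-%dT%H:%M:%S",   # ISO-ish
    "%m/%d/%Y %H:%M:%S",   # US style
    "%d.%m.%Y %H:%M",      # European style, no seconds
]

def parse_timestamp(raw):
    """Return a datetime for whichever format matches, else raise."""
    for fmt in CANDIDATE_FORMATS:
        try:
            return datetime.strptime(raw, fmt)
        except ValueError:
            continue
    raise ValueError(f"unrecognized timestamp: {raw!r}")

print(parse_timestamp("2024-09-20T14:30:00"))
print(parse_timestamp("09/20/2024 14:30:00"))
```

It works, but it also silently guesses on ambiguous dates (is 09/10 September 10 or October 9?), which is exactly why nobody trusting the data “knows.”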
It continued operating for decades after the event. I’m sure they already solved that issue. Though I’m sure it can still be improved.
Introducing new Clippy For Reactors.
“It looks like you are trying to prevent a nuclear meltdown. I can help with that.”