Yeah I call bullshit on that. I get why they’re investing money in it, but this is a moonshot and I’m sure they don’t expect it to succeed.
These data centers can be built almost anywhere in the world. And there are places with very predictable weather patterns making solar/wind/hydro/etc extremely cheap compared to nuclear.
Nuclear power is so expensive that it makes far more sense to build an entire solar farm and an entire wind farm, each capable of powering the data center on its own in overcast conditions or moderate wind.
If you pick a good location, that’s likely to work out to running off your own power 95% of the time and selling power to the grid something like 75% of the time. For the rare 5% when you can’t run off your own power (no wind at night is rare in a good location, and no wind under thick cloud cover is almost unheard of), you’d just draw power from the grid, power produced by other data centers that have excess solar or wind right now.
In the extremely rare disruption where power wouldn’t be available even from the grid… then you just shift your workload to another continent for an hour or so. Hardly anyone would notice an extra tenth of a second of latency.
Maybe I’m wrong and nuclear power will be 10x cheaper one day. But so far it’s heading the other direction, roughly 10x more expensive than it was just a decade ago, thanks to incidents like Fukushima and that tiny radioactive capsule lost in Western Australia showing that current nuclear safety standards, even in some of the safest countries in the world, are just not good enough. That forces the industry to take additional measures (and additional costs) going forward.
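A rough sketch of the sizing arithmetic behind the overbuild idea above. Every number here is a placeholder I’m assuming for illustration, not a figure from any real project:

```python
# Illustrative sizing for "build both a solar farm and a wind farm, each able
# to carry the load alone under poor conditions". All inputs are assumed
# placeholder values, not real project data.

DC_LOAD_MW = 100                 # assumed steady data-center load
OVERCAST_SOLAR_FRACTION = 0.25   # assumed solar output under heavy cloud (fraction of nameplate)
MODERATE_WIND_FRACTION = 0.40    # assumed wind output in moderate wind (fraction of nameplate)

# Size each farm so it can cover the load by itself in its "bad" condition.
solar_nameplate_mw = DC_LOAD_MW / OVERCAST_SOLAR_FRACTION   # 400 MW
wind_nameplate_mw = DC_LOAD_MW / MODERATE_WIND_FRACTION     # 250 MW

# On a clear, breezy day both run near nameplate, so most of the output is
# surplus that could be sold to the grid.
typical_output_mw = 0.9 * solar_nameplate_mw + 0.8 * wind_nameplate_mw
surplus_mw = typical_output_mw - DC_LOAD_MW

print(f"solar nameplate needed: {solar_nameplate_mw:.0f} MW")
print(f"wind nameplate needed:  {wind_nameplate_mw:.0f} MW")
print(f"typical combined output: {typical_output_mw:.0f} MW "
      f"({surplus_mw:.0f} MW surplus to sell)")
```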
IMHO, data centers kind of need to be somewhat close to important population areas in order to ensure low latency.
You need a spot with attainable land, room to scale, close proximity to users, and decent infrastructure for power / connectivity. You can’t actually plop something out in the middle of BFE.
I remember reading a story about an email server that couldn’t send mail more than about 500 miles. After a lot of digging, it turned out a connection timeout had been misconfigured to zero, which in practice worked out to roughly 3 milliseconds. Any server much farther than about 500 miles away couldn’t finish the connection in time, so the mail got rejected.
In case anyone wants to read that: https://www.ibiblio.org/harris/500milemail.html
For the majority of applications you need data centers for, latency just doesn’t matter. Bandwidth, storage space, and energy costs, for example, are all generally far more important.
The earth has a circumference of 25,000 miles, and the speed of light in a fiber cable is about 124,000 miles per second, so going the whole way around the earth would take roughly 0.2 seconds (assuming you could send a signal that far).
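If anyone wants to play with the arithmetic, here’s a propagation-only sketch (the 124,000 mi/s figure is roughly two-thirds of the vacuum speed of light, about what light manages in glass; real routes are longer than great circles and add switching and queuing delay on top):

```python
# Propagation-only delay over fiber; routing, switching and queuing add more.
FIBER_SPEED_MI_PER_S = 124_000  # ~2/3 of c, rough speed of light in glass

def one_way_delay_ms(distance_miles: float) -> float:
    """Milliseconds for light in fiber to cover the given distance."""
    return distance_miles / FIBER_SPEED_MI_PER_S * 1_000

for miles in (25_000, 12_500, 1_300, 500):
    print(f"{miles:>6} mi: {one_way_delay_ms(miles):6.1f} ms one way")

# 25,000 mi -> ~201.6 ms (all the way around)
# 12,500 mi -> ~100.8 ms (antipodes, the worst realistic case)
#  1,300 mi ->  ~10.5 ms
#    500 mi ->   ~4.0 ms
```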
Sure, but infrastructure is not just fiber, and there is a lot of stuff in between your long stretches of fiber.
I’m not a sys ops guy, but I can pull from different data centers and see measurable differences.
This is a pretty well known phenomenon. That’s why we have cloud data centers located close to major metro areas.
That’s… Not how internet infrastructure works.
And cables are not in straight lines between you and the destination.
They really don’t. I live in regional Australia - the nearest data center is 1300 miles away. It’s perfectly fine. I work in tech and we had a small data center (50 servers) in our office with a data center grade fibre link - we got rid of it because it was a waste of money. Even the difference between 1300 miles of latency and 20 feet of latency wasn’t worth keeping it for.
To be clear, having only 0.1ms of latency locally was noticeably better for some things. But nothing that really matters. And certainly not AI, where you’re often waiting 5 seconds or even a full minute.
This isn’t a moonshot at all. Check out these eVinci microreactors by Westinghouse. They’re currently being deployed in industrial settings around the US. They’re modular too, so you just add more to scale. Pretty wild.
I searched and I can’t find any cases of such a reactor being deployed anywhere in the US.
“Microreactors for civilian use are currently in the earliest stages of development, with individual designs ranging in various stages of maturity.”
https://en.wikipedia.org/wiki/Nuclear_microreactor
The reactor you’re referring to doesn’t even have a Wikipedia page.
Weird.
I first learned about it from this project in Butte, Montana, which is in development. They also have a page describing a deployment in Saskatchewan. I don’t know if this has been completed yet, but it’s been in progress for years. There are also a lot of other planned deployments I’m finding.
I thought I saw some active deployment on the east coast last time I looked into this but haven’t been able to immediately find an example. Either way, it’s at least in progress, has regulatory backing, and is not just imaginary.
I’d say it’s imaginary if they don’t exist. Your claim that, “They’re currently being deployed in industrial settings around the US.” isn’t really accurate, is it?
Edit: some context I was able to find:
"The US has approved a single design for a small, modular nuclear reactor developed by the company NuScale Power. The government’s Idaho National Lab was working to help construct the first NuScale installation, the Carbon Free Power Project. Under the plan, the national lab would maintain a few of the first reactors at the site, and a number of nearby utilities would purchase power from the remaining ones.
With the price of renewables dropping precipitously, however, the project’s economics have worsened. Some of the initial backers started pulling out of the project earlier in the decade, although the numbers continued to fluctuate in the ensuing years.
The final straw came on Wednesday, when NuScale and the primary utility partner, Utah Associated Municipal Power Systems, announced that the Carbon Free Power Project did not have enough utility partners at a planned checkpoint and, given that uncertainty, would be shut down. In a statement, the pair accepted that “it appears unlikely that the project will have enough subscription to continue toward deployment.”"
https://arstechnica.com/science/2023/11/first-planned-small-nuclear-reactor-plant-in-the-us-has-been-canceled/
I’d consider signed agreements part of the “being deployed” process, but yeah, I haven’t been able to find evidence of any currently active deployments. I wouldn’t call it a “moonshot” though when there are so many in the works, is all.
Not really sure how NuScale is relevant as that’s (or at least the project in the article is) utility-level power and not really the same thing.
They’re both SMRs, right?
That eVinci reactor is tiny at only 5 MW. You’d need something like a thousand of them to run a single AI data center. It’s also horrifically expensive at over $100 million (each! multiply that by a thousand!), and it can only produce that amount of power for eight years; after that I’m not sure what you do. Buy a thousand more of them?
For comparison, some wind turbines provide more than twice as much power from a single turbine, and they cost single-digit millions to set up. They’re not as reliable and they’re also bigger than a micro nuclear reactor, but none of that really matters for a data center, which can draw power from the grid if it needs to.
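To make that comparison concrete, here’s the back-of-envelope using the rough figures in these comments (5 MW and ~$100 million per eVinci unit, a roughly 10 MW turbine at single-digit millions). The 5 GW load is just what “a thousand of them” implies, and the wind cost is an assumed midpoint, not a quote:

```python
# Back-of-envelope capital comparison using the thread's ballpark figures
# plus a couple of stated assumptions; none of these are vendor numbers.

DC_LOAD_MW = 5_000            # implied by "about a thousand" 5 MW reactors

REACTOR_MW = 5
REACTOR_COST_MUSD = 100       # ~$100 million per unit, per the comment
REACTOR_FUEL_LIFE_YEARS = 8   # per the comment

TURBINE_MW = 10               # "more than twice as much power" as a 5 MW reactor
TURBINE_COST_MUSD = 5         # assumed midpoint of "single-digit millions"

n_reactors = DC_LOAD_MW // REACTOR_MW    # 1000 units
n_turbines = DC_LOAD_MW // TURBINE_MW    # 500 turbines, nameplate only

print(f"reactors: {n_reactors} units, ~${n_reactors * REACTOR_COST_MUSD / 1_000:.0f}B up front, "
      f"refuel or replace every {REACTOR_FUEL_LIFE_YEARS} years")
print(f"wind:     {n_turbines} turbines, ~${n_turbines * TURBINE_COST_MUSD / 1_000:.1f}B up front "
      f"(nameplate only; capacity factor means you'd really need several times more turbines)")
```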
The only really promising small reactor I’ve heard of is the NuScale one - but it may have been vapourware. Republicans made a big splash during the 2016 election campaign and committed to paying 1/12th of the cost of a reactor as part of their clean energy “commitment”. There was no price tag, just 1/12th.
A couple of years later, after they’d won the election, they quietly abandoned that plan and agreed to pay $1.3 billion, which they claimed would be 1/4th of the budget. The subtext was that the earlier election promise had been made before a budget was figured out. But going from 1/12th to 1/4th is a pretty big jump.
And then a few years after that… when the company told the government $1.3 billion would not be enough money for the project to be financially viable… and that in order to sell electricity at all they needed the government to subsidise every single watt of power produced by the plant for the entire period that it operated… because it was going to run at a loss… that’s when the government pulled all funding (except what had already been spent, which was a lot of money) and the whole project collapsed.
I tried to find references for all of that, but the website for the project is now a “domain for sale” page. All that’s left is a few vague news articles which have conflicting information. But I’ve been following this for decades and the project you linked to was one of the ones that made it crystal clear to me that nuclear doesn’t have a future unless something really big changes.
Who knows, perhaps if the government had been really committed to NuScale, they might’ve pushed through the pain and helped it succeed in order to become cheaper later. But the government wasn’t willing to take that risk, and apparently nobody else was either.
Hey, is Signal down? Ah, reactor exploded, destroying the datacenter along with the staff on prem
Jk, cool stuff