The US National Ignition Facility has achieved even higher energy yields since breaking even for the first time in 2022, but a practical fusion reactor is still a long way off
Firstly, the energy output falls far short of what would be needed for a commercial reactor, producing barely enough to heat a bath. Worse than that, the ratio is calculated using the energy delivered by the lasers, but to deliver that 2.1 megajoules of energy, the lasers reach a peak power of 500 trillion watts, which is more than the output of the entire US national grid. So these experiments break even in a very narrow sense of the term.
It’s so refreshing to see an article at least mention that the way these tests are measured is based on the energy in the laser itself and not the total energy used.
I agree it’s good that the article is not hyping up the idea that the world will now definitely be saved by fusion and so we can all therefore go on consuming all the energy we want.
There are still some sloppy things about the article that disappoint me though…
They seem to be implying that 500 TW is obviously much larger than 2.1 MJ… but without knowing how long the 500 TW is required for, this comparison is meaningless.
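To make the comparison meaningful you have to multiply power by duration. As a back-of-envelope sketch (the ~4 ns pulse length here is an assumption chosen to match the article's figures, not a quoted spec):

```python
# Power (watts) times duration (seconds) gives energy (joules).
peak_power_w = 500e12   # 500 TW peak laser power, from the article
pulse_s = 4e-9          # assumed ~4 nanosecond pulse duration
laser_energy_j = peak_power_w * pulse_s
print(laser_energy_j / 1e6, "MJ")  # ~2 MJ, consistent with the 2.1 MJ quoted
```

So the huge wattage and the modest energy figure are two views of the same pulse: enormous power, but only for a few billionths of a second.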
They imply that using more power than available from the grid is infeasible, but it evidently isn’t, as they’ve done it multiple times, presumably by charging up local energy storage and releasing it quickly. Scaling this up is obviously a challenge though.
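The "charge slowly, discharge fast" trick is easy to quantify. The numbers below are illustrative assumptions (a ~400 MJ capacitor bank charged over a minute), not quoted facility specs:

```python
# Average draw from the grid while charging vs. peak power during the shot.
stored_energy_j = 400e6            # assumed ~400 MJ capacitor bank
charge_time_s = 60.0               # assumed charging time of one minute
peak_power_w = 500e12              # 500 TW peak, from the article

avg_charge_power_w = stored_energy_j / charge_time_s  # ~6.7 MW: trivial for the grid
ratio = peak_power_w / avg_charge_power_w             # peak is ~75 million times the draw
print(f"{avg_charge_power_w/1e6:.1f} MW average, {ratio:.1e}x peak-to-average")
```

A steady megawatt-scale draw is unremarkable; it is only the nanosecond-scale release that dwarfs the grid.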
The weird mix of metric prefixes (mega) and standard numbers (trillions) in a single sentence is a bit triggering - that might just be me though.
Electricity stuff is funny because it combines metric and imperial units sometimes to make bastard measurements
Huh? Whatchu talkin bout Willis?
A watt is a joule per second.
Volts, Amps, kWh, MJ… These are all metric.
Sssch don’t tell the Americans or they will try to wrangle in BTU in nuclear power plants
WE INVENTED IT AND BUH GAWD, WE WILL MEASURE IT IN MURICA UNITS!
Ignore how nonsensical BTUs are: Gonna shove energy and weight into a single measurement and it changes based on the initial temperature of the water.
Y’all do know what BTU stands for, right?
British Thermal Units. It’s the energy needed to heat 1 lb of water 1 degree F.
The bad part is that no one bothered to set the starting temp of the water, so there are five separate standards for what the hell a BTU actually is, which makes it a really bad standard.
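The spread between the competing definitions is small but real. The joule values below are the commonly cited ones for each variant:

```python
# Commonly cited joule values for the main BTU definitions,
# which differ by the assumed starting temperature of the water.
btu_variants_j = {
    "International Table": 1055.056,
    "thermochemical":      1054.350,
    "39 °F":               1059.67,
    "59 °F":               1054.804,
    "60 °F":               1054.68,
}
spread_j = max(btu_variants_j.values()) - min(btu_variants_j.values())
print(f"spread: {spread_j:.2f} J")  # ~5.3 J, roughly a 0.5% disagreement
```

Half a percent sounds tiny, but at the scale of national energy statistics (quoted in quadrillions of BTU) it is a genuine ambiguity.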