lupi

cow of tailed snake (gay)

avatar by @citriccenobite

you can say "chimoora" instead of "cow of tailed snake" if you want. its a good pun.


i ramble about aerospace sometimes
I take rocket photos and you can see them @aWildLupi


I have a terminal case of bovine pungiform encephalopathy, the bovine puns are cowmpulsory


they/them/moo where "moo" stands in for "you" or where it's funny, like "how are moo today, Lupi?" or "dancing with mooself"



Bovigender (click flag for more info!) -- flag by @arina-artemis



GoopySpaceShark

Let's start with the subtitle:

It’s time to redirect the billions being squandered in fusion energy and invest in solutions to the climate crisis that actually work


This is objectively false -- The Independent is a British newspaper and the article immediately references the JET reactor, so I'm very safe in my assumption that Donnachadh is going off UK funding figures. The UK earmarked only £126 million for fusion in 2022 [1].

Even going by global figures, the Fusion Industry Association reported $1.4 billion in global investment in 2022, bringing total investment in the field to $6.21 billion (nice). Whilst this may seem to lend credence to Donnachadh's claim, these are global figures for private laboratories, many of which will have goals other than exploiting fusion for power production. Make a note of this point, by the way; it will be important later.

Not off to a great start. Let's go to the first sentence of the article.

It is time to end the fusion delusion. Last week, there were headlines (again) about a “major breakthrough” in the search for unlimited, cheap, carbon-free electricity from nuclear fusion reactors.

Contrary to popular belief, "major breakthrough" does not equal "we did it, guys" -- it means a significant engineering challenge was overcome or a milestone was accomplished. Add a healthy dose of sensationalism from publications such as The Independent (not the scientists and engineers in the field), and any claims of delusion fall flat. Next.

Breathless announcements suggested that the UK’s 38-year-old JET Fusion programme had finally produced 11 megawatts of heat energy for five seconds. To the average person on the street that sounded impressive.
But it equates to the energy needed to boil a measly 60 kettles.

The JET (or Joint European Torus) reactor was never intended to be a viable power plant, or even realistically expected to break even. It was built for the sole purpose of preliminary research into larger Tokamak reactor designs [2] that would inform the design and construction of its successor, ITER (the International Thermonuclear Experimental Reactor). ITER is intended to break even, but is similarly intended to be a technology demonstration and will likely not put any power out to the grid. DEMO, ITER's successor, is intended to demonstrate commercial viability.

In short, remember that point earlier? Yeah, JET was never intended to be a power reactor. Next.
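Before moving on: for what it's worth, the kettle arithmetic itself is about the right ballpark. A quick sanity check of my own (assuming a 2-litre kettle heated from 15 °C tap water to boiling; the article doesn't say what kettle it used):

```python
# Sanity check of the "measly 60 kettles" claim.
# My assumptions: 2 L kettle, 15 C tap water, no losses.
pulse_energy_j = 11e6 * 5                  # 11 MW sustained for 5 seconds
kettle_energy_j = 2.0 * 4186 * (100 - 15)  # mass (kg) x specific heat (J/kg.K) x delta-T
kettles = pulse_energy_j / kettle_energy_j
print(round(kettles))  # ~77 -- same order of magnitude as the article's 60
```

So the number itself isn't the problem; framing a research reactor's output in kettles is.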

The sad truth is that they [the UK Atomic Energy Agency] admitted that they actually had to put 40 MW of heat into the plasma to produce 11 MW of sustained fusion heat for five seconds. They added: “It is no secret JET uses a lot of energy. It was designed in the 1970s with copper magnets and will soon pass the baton to more energy-efficient experiments.

See above, and also add the UKAEA's point of decades-old designs and hardware. Modern reactor designs, such as MIT and Commonwealth Fusion Systems' SPARC project, use Yttrium-Barium-Copper Oxide superconducting magnets, providing much stronger magnetic fields at significantly reduced power cost. Copper electromagnets were simply never designed for these applications, and the engineers who designed JET were well aware of this. The UKAEA is even explicitly telling you that JET was intended as a stepping stone.

They went on: “ITER, the larger and more advanced upgrade in France, will use superconducting magnets to drastically lower the energy cost [...] ITER aims to release 500 MW of fusion heat using only 50 MW of external heating, and if you consider the power consumption of this entire experimental facility, it will break even.
The italics [bolded] are mine – this hugely expensive new fusion experiment is not expected to produce a single net kWh of electricity.

Assuming the traditional setup of steam turbines to convert thermal energy to electrical, 10% is an extremely pessimistic efficiency rating for modern boiler and turbine setups [3]. In reality, ITER is likely to generate an electrical net gain; however, it will not output any power to the grid. This is by design: once again, ITER is not intended for power production [4]. Next.
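To put rough numbers on ITER's stated targets (a back-of-the-envelope sketch of my own; the 33% turbine efficiency is the typical central-station figure from footnote 3, not an official ITER projection):

```python
# ITER's published targets: 500 MW of fusion heat from 50 MW of external heating.
fusion_heat_mw = 500
heating_in_mw = 50
turbine_eff = 0.33  # typical steam-cycle efficiency (footnote 3); my assumption here

q_plasma = fusion_heat_mw / heating_in_mw   # plasma gain factor, Q
electric_mw = fusion_heat_mw * turbine_eff  # hypothetical electrical output

print(q_plasma)           # 10.0
print(round(electric_mw)) # ~165 MW -- well above the 50 MW of heating input
```

Even at the pessimistic 10% figure you'd get 50 MW electric, i.e. break-even against the heating alone, which is exactly the point.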

Thus, after literally a hundred years of research, since Arthur Eddington first postulated that nuclear fusion could be the stellar energy source, and untold billions of pounds invested by various governments ever since to try and replicate the creation of a mini star on earth, we still cannot produce a single net kWh of energy.
The fusion “industry” is always promising us unlimited clean energy in two to three decades time, but the cruel truth is that despite yet another annual flurry of “breakthrough” headlines, the fusion Holy Grail remains as illusory as the Grail itself.

Not sure why you put "industry" in quotations like that, but go off, I guess. Fusion - until very recently - has been underfunded to a point that undershoots even the most pessimistic research projections. Here's the referenced paper. Try cutting your own wage to a quarter of what it is now, and tell us how it went in a few decades.

Despite all these wasted billions, Boris Johnson’s government, as part of its supposed “10 Point Plan for a Green Industrial Revolution”, stated: “Our ambition is to be the first country to commercialise fusion energy, enabling low carbon and continuous power generation.”
It pledged another £222m for the spherical tokamak programme which “aims” to build the world’s first commercially viable fusion power plant by 2040, and another £184m to help found a global hub for fusion innovation in the UK.

If you believe that any promise made by Boris Johnson's government holds any value, he has a bridge to sell you: he 100% ab-so-lute-ly pinkie promised the NHS will get that extra £350 million a week in funding starting any day now, HS2 is nearing completion, and Brexit is going just great. No notes.

But in response to an excited BBC interviewer asking a fusion spokesperson when she might be able to boil her kettle with fusion energy, they said possibly in the 2050s. So, two years after the Johnson 2040 fusion promise, the delivery date is again delayed to three decades hence at the earliest.

Told you so.

But behind the headlines lies another really dirty truth about the UK AEA fusion experiments. Over just the last four years, it has consumed an eye-watering 232 million kWh of electricity to run its projects.

Let's just simplify that (and round up) to 240 GWh real quick. Oh, and divide by 4, as you took a figure from over four years, so 60 GWh. The UK's iron and steel industry -- noted for being in sharp decline -- used 2.23 TWh in 2022 alone (source); almost 10x the fusion sector's power consumption over your entire four-year figure. Steel production is also notorious for its CO2 emissions, an issue that could be mitigated with Electric Arc Furnaces powered by clean sources. Yes, that includes wind and solar, though fusion would offer a more compact solution that can deliver power on-site.
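The arithmetic, for anyone checking my rounding:

```python
# UKAEA electricity use vs UK steel (figures from the article and the linked source).
fusion_four_years_gwh = 232    # UKAEA's consumption over four years
fusion_per_year_gwh = 240 / 4  # round up to 240 GWh, then divide by 4 years
steel_2022_gwh = 2230          # UK iron & steel: 2.23 TWh in 2022 alone

print(fusion_per_year_gwh)                               # 60.0
print(round(steel_2022_gwh / fusion_four_years_gwh, 1))  # 9.6 -- "almost 10x"
```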

Not a single kilowatt of the electricity used was from a renewable energy supplier.

Citation needed.

It also consumed 23 million kWh of fossil fuel gas [...]

Where did the other 90% of the 232 GWh figure come from?

Over £0.5billion has been poured into the AEA fusion project over the last three years alone, with £100s of millions more planned over the coming decades.

A far cry from the "billions" you decried being "squandered" in the field.

Just this three year AEA budget would have insulated the roofs of 1.6 million poor people’s homes and thus reduced their heating bills and heating carbon emissions by 25 per cent. It costs just £300 to insulate the average semi-detached roof.

So could even a fraction of the cost of Brexit, or HS2, or the £4 billion of unsuitable PPE bought at the beginning of the pandemic. We could even take the money out of existing fossil fuel subsidies.

Even if fusion can eventually produce more electricity than that required to run the fusion plant, it is likely to be far more expensive than using renewables, energy efficiency and storage to eliminate carbon emissions.

A valid point. However, it is disingenuous to pretend the intent is to wholly fulfil our energy needs with fusion. Fusion can instead fill the same niche as fission power plants (that being a compact, energy-dense source), and serve even heavier applications such as the steel industry, and possibly even desalination for clean water production.

Daniel Jassby spent 25 years as a senior fusion researcher at the Princeton Plasma Research Laboratory. In a shocking expose for the Bulletin of The Atomic Scientists, he explained that even if an operating plant were successfully launched, it would require mind-boggling amounts of water to cool it and a large labour force to keep it operational and safe.

Hang on, let's grab the quote from Jassby's linked article here:

In addition, there are the problems of coolant demands and poor water efficiency. A fusion reactor is a thermal power plant that would place immense demands on water resources for the secondary cooling loop that generates steam, as well as for removing heat from other reactor subsystems such as cryogenic refrigerators and pumps. Worse, the several hundred megawatts or more of thermal power that must be generated solely to satisfy the two classes of parasitic electric power drain places additional demand on water resources for cooling that is not faced by any other type of thermoelectric power plant.

Okay, that's a lot to unpack -- first of all, I don't doubt Jassby's qualifications, but purely from an engineering point of view, using water to cool anything to cryogenic temperatures is completely batshit insane; cryogenic cooling would use a closed loop of either liquid Nitrogen or, considering the target temperatures of only a few dozen Kelvin, more likely liquid Helium. This is supported by contemporary reactor designs, such as ITER.

Water would, however, be used to pull heat away from the assembly as a whole to produce steam and drive a turbine, as we already do with fission reactors. The extra thermal power generated to satisfy electrical parasitism would realistically be cycled back into the reactor and used to maintain the reaction -- fusion reactors operate on the principle of heating their fuel, in lieu of using gravity to force it together and fuse as occurs in stars. Overall, the water required to satisfy these cooling requirements, whilst "immense" as Jassby puts it, would be similar to that of a fission reactor of equal thermal output.

As stated earlier, a fusion reactor could even dedicate a portion of its electrical output to desalination to produce the water it needs, without adversely impacting its local environment; ocean water is plentiful. As for safety, Jassby states:

Corrosion in the heat exchange system, or a breach in the reactor vacuum ducts could result in the release of radioactive tritium into the atmosphere or local water resources. Tritium exchanges with hydrogen to produce tritiated water, which is biologically hazardous. Most fission reactors contain trivial amounts of tritium (less than 1 gram) compared with the kilograms in putative fusion reactors.

Heat exchangers are a solved issue -- see fission reactors; they're closed systems specifically designed to prevent ingress of radioactive materials. A breach of a fusion reactor's vacuum vessel, however, could cause a Tritium excursion. That said, fusion reactors also contain less than a gram of Tritium in the plasma at any given moment; in fact, ITER is designed to operate with a total fuel load of less than a gram at a time. I do not know where Jassby is getting this "kilograms" figure.

As an additional note, fusion reactors cannot explode like fission reactors. Any breach of a fusion reactor's vacuum vessel will result in air entering the chamber and removing thermal energy from the fuel, rendering the reaction unsustainable and stopping it in milliseconds. The only real safety concern, as Jassby pointed out, is:

Radiation damage and radioactive waste. [...] The neutron radiation damage in the solid vessel wall is expected to be worse than in fission reactors because of the higher neutron energies. Fusion neutrons knock atoms out of their usual lattice positions, causing swelling and fracturing of the structure. [...] The problem of neutron-degraded structures may be alleviated in fusion reactor concepts where the fusion fuel capsule is enclosed in a one-meter thick liquid lithium sphere or cylinder. But the fuel assemblies themselves will be transformed into tons of radioactive waste to be removed annually from each reactor.

Immediately offset by Jassby saying,

Materials scientists are attempting to develop low-activation structural alloys that would allow discarded reactor materials to qualify as low-level radioactive waste that could be disposed of by shallow land burial.

To this, I would like to add that smart, modular reactor design can minimise the amount of material in spent reactor casings, and the half-life of such materials would be drastically shorter than that of current spent fission products, likely further reduced by low-activation materials.

Quite frankly, I think journalists should have to have at least a GCSE-level understanding of what they're writing about. Donnachadh's article is poorly researched at best, and an insultingly bad hit piece at worst -- in my eyes no better than Greenpeace's continued hatred of nuclear power in any form, which serves only to push us back into using fossil fuels.

Yes, this is a kink account. I also will not tolerate nuclear fusion slander in this household.


  1. Page 13, Government support for fusion: "We have invested over £700m from 2021/22 to 2024/25 to support the UK Atomic Energy Authority’s (UKAEA) cutting-edge research programmes and facilities and £126m in 2022 to boost UK fusion programmes."

  2. Page 61 (25 as marked on page), I.1 Objectives of Research with JET: "An important part of the experimental programme will be to use JET to extend to a reactor-like plasma, results obtained and innovations made in smaller apparatus as a part of the general Tokamak programme."

  3. Page 11, 4.4 Performance Characteristics: "Boilers and steam turbines used for large, central station electric power generation can achieve electrical efficiencies of up to 45 percent HHV though the average efficiency of all units in the field is around 33 percent. [...] Consequently, the electric generation efficiencies for the examples shown are all below 10 percent HHV. However, when the energy value of the steam delivered for process use is considered, the effective electrical efficiency is over 75 percent."

  4. 3) Contribute to the demonstration of the integrated operation of technologies for a fusion power plant: ITER will bridge the gap between today's smaller-scale experimental fusion devices and the demonstration fusion power plants of the future. Scientists will be able to study plasmas under conditions similar to those expected in a future power plant and test technologies such as heating, control, diagnostics, cryogenics and remote maintenance.



in reply to @GoopySpaceShark's post:

idk if you're british, but trust me, this kind of attitude is bone deep. nothing will ever change, there's nothing you can ever do, just suck it up as it gets worse and worse and worse and ignore that doing that lets all the rich keep robbing you.

And it doesn't even touch on some of the actual issues with fusion, like the tritium economy problem. Unless new CANDU-type reactors are built or other tritium production comes online, it is expected that by the time ITER is done with its experimental runs, global tritium stocks will be almost empty (both due to tritium consumption and decay), with ITER alone accounting for over half of that.

And even with an operating power reactor... you get one neutron out per consumed tritium atom, and you need an efficient enough breeding setup for that neutron to produce substantially more than one tritium atom on average to supply the expanding tritium stockpiles needed for large-scale deployment of fusion power, as well as account for the inevitable leakage and decay in storage.
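A toy model makes the stockpile problem concrete (entirely my own illustration; the stockpile size, burn rate, and breeding ratios are made-up round numbers, not projections for any real reactor):

```python
import math

# Toy tritium stockpile: each year the reactor burns some tritium, breeds back
# burn * TBR (tritium breeding ratio), and the whole stockpile decays with
# tritium's ~12.3-year half-life.
DECAY = math.exp(-math.log(2) / 12.3)  # fraction surviving one year

def stockpile_after(stock_kg, burn_kg_per_year, tbr, years):
    for _ in range(years):
        stock_kg += burn_kg_per_year * (tbr - 1.0)  # net gain (or loss) from breeding
        stock_kg *= DECAY                           # radioactive decay
    return stock_kg

# 5 kg stockpile, 1 kg/year burned, over a decade:
print(round(stockpile_after(5.0, 1.0, 1.00, 10), 2))  # 2.85 -- TBR of exactly 1 just decays away
print(round(stockpile_after(5.0, 1.0, 1.05, 10), 2))  # 3.22 -- barely above 1 still loses ground
print(round(stockpile_after(5.0, 1.0, 1.40, 10), 2))  # 5.82 -- needs to clear decay + losses to grow
```

Which is the point above: the breeding ratio has to beat decay and leakage with room to spare, not merely exceed 1.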

Only way that I really see around that is if D-D "side" reactions are a significant enough factor to supply a large part of the tritium burned in the reactor, or even if reactors purely fuelled by D-D, whether they can achieve He3 burnup or not, turn out to be viable.

Last I checked, ITER was going to use Lithium blankets for Tritium breeding. Lithium-7, as it turns out, produces Tritium, an alpha particle, and another neutron when struck by a sufficiently energetic neutron, as was discovered in the Castle Bravo test. This system should catch any stray neutrons that would have otherwise impacted and damaged the reactor walls, and convert them into additional Tritium.

Given the design tolerances with ITER, I'm not sure if they're going to rely on D-D reactions due to the much higher energy requirements to overcome the Coulomb barrier. It'll probably correct for any strays, but not beyond that.

I'm not that optimistic about fusion -- the National Ignition Facility's results were impressive, but far from reaching actual net gain; it's another case of decades-old equipment that was never designed to break even, using a technique (inertial confinement) known to be too inefficient for commercial power production. That said, their latest results are nothing short of exceptional.

I see more promise in magnetic confinement reactors, most likely tokamaks, though stellarators have seen some resurgence as of late - tokamaks being the likes of ITER and SPARC, the latter of which is due to start plasma testing next year if all goes according to schedule (which it so far has). They'll still be a decade away at the least, as they need to demonstrate both technical and commercial viability, but with the funding and political manoeuvring, power generation through fusion is now firmly within sight.

It'll also likely still fill the niche that current generation fission reactors do, acting as mainstay power generators and research reactors; a lot of slack will still be taken up by microgeneration and renewables.

There is a notable potential use case for fusion in spacecraft propulsion, however. If we can crack aneutronic fusion and guarantee the reaction product is charged, we can use the power the reactor generates to accelerate that product to extremely high velocities, achieving very high impulse without needing excessive amounts of reaction mass.

The NIF's results were only a net gain if you don't factor in the total power used for ignition. Whilst the energy from the reaction exceeded the energy that was put in (more specifically, the amount that reached the fuel), this doesn't account for parasitic losses. The ignition system loses a lot of energy to waste heat which can't be transferred into the fuel because of the nature of inertial confinement.
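Putting rough numbers on that (the 2.05 MJ and 3.15 MJ figures are from NIF's December 2022 announcement; the ~300 MJ wall-plug figure is the commonly cited approximation, not an exact accounting):

```python
# NIF's December 2022 shot, in rough numbers.
laser_on_target_mj = 2.05  # laser energy delivered to the target
fusion_yield_mj = 3.15     # fusion energy released
wall_plug_mj = 300         # approx. grid energy drawn to fire the laser system

print(round(fusion_yield_mj / laser_on_target_mj, 2))  # 1.54 -- the "net gain" as reported
print(round(fusion_yield_mj / wall_plug_mj, 3))        # 0.01 -- gain counting the whole facility
```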

I'm well aware of the potential power output of fusion (within roughly a factor of ten of fission's), but that's not the only limiting factor; fusion reactors will remain heavy, and will not scale down to the sizes we see in the most compact fission designs. Significant radiation shielding is still necessary to block gamma radiation during operation, and the capacitors required for ignition are bulky and require their own weight-adding safety measures. Mounting such reactors on large seagoing vessels may be possible, but it likely won't be commercially viable (fuck capitalism, by the way).

As for battery tech: Sodium glass batteries are another one to look out for! We also may be able to forget fossil fuels entirely with all the energy-intensive processes that become viable with fusion power -- everything from desalination to mass electric arc furnace adoption becomes feasible, as mentioned in the original post, so tearing apart existing, previously non-recyclable plastics and re-assembling them at the molecular level should be a cakewalk.

Some tokamak and stellarator designs can run for long periods of time per ignition by siphoning off reaction products and injecting fresh fuel during operation, but inertial confinement ignites a single, pre-placed fuel pellet; it's effectively one and done, any additional reactions require the manual replacement of the fuel and re-ignition.

I've had this argument before, but my read-through of the safety literature around fusion reactors for a near-future scifi worldbuilding project suggests that certain fusion reactor designs can absolutely explode in a loss-of-vacuum scenario if the cheese-holes line up; it's just that the explosion would be non-nuclear, though it could still spread a considerable amount of radioactive material.

A small hydrogen gas explosion, resulting from previously retained tritium and/or other hydrogen isotopes diffusing back out of the metals that comprise the reactor as it cools, can trigger a secondary dust-air explosion from eroded first-wall materials, particularly when beryllium is used.

Water intrusion into a cooling fusion reactor during a loss of vacuum incident is another potential source for a hydrogen gas explosion.

The real risk would come during commercial fusion power plant operation: without sufficient regulatory oversight, the pressure to extend the periods between baking, cleaning, and inspection cycles grows, because there is always an incentive to minimise down-time. As a result, a commercial tokamak fusion reactor could have the right conditions necessary to trigger a non-nuclear explosion in a loss-of-vacuum incident.

And then there are the proliferation risks associated with breeder blanket reactor designs if there aren't international safeguards in place to monitor the operation of commercial fusion power plants.

I can't find my modelling of equivalent nuclear fission power plant accidents, but from what I recall, we can expect about one significant accident every ten to forty years, though with the caveat that almost all significant fusion reactor accidents will be mostly confined to the plant itself, and any further spread of radioactive material or heavy metals will be mostly due to environmental causes during or immediately after an accident, like wind or water runoff. Though there was some modelling done by Chinese researchers that suggested an extremely-low-probability worst-case scenario could be as bad as an INES 6 accident.
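For a sense of what that proxy rate actually implies, here's a quick sketch assuming accidents arrive as a simple Poisson process (my framing, not the modelling referenced above):

```python
import math

def p_at_least_one(rate_per_year, years):
    # Poisson process: P(at least one event in t years) = 1 - exp(-rate * t)
    return 1.0 - math.exp(-rate_per_year * years)

# One significant accident per 10 vs per 40 years, over a 50-year horizon:
print(round(p_at_least_one(1 / 10, 50), 2))  # 0.99
print(round(p_at_least_one(1 / 40, 50), 2))  # 0.71
```

Even the optimistic end of the range makes at least one significant (if mostly plant-confined) incident more likely than not over a fleet's lifetime, which is why honesty about the risks matters.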

Fusion is something that we absolutely should pursue, but we also need to be realistic and honest about the risks.

I hate it when known though remote serious safety risks are downplayed because it always causes blowback. The only responsible thing is to talk about the known risks, and the safeguards that are put in place to prevent them, but acknowledge the fact that there is always a remote chance that irresponsible or incompetent operation and construction or even sheer bad luck, can result in those safeguards being bypassed and a worse-case scenario occurring.

Can't argue with any of that! Thank you for the insight. I didn't bring up conventional explosion risks because I wasn't sure of the potential impact, beyond "less than that in a fission reactor" due to the greatly reduced fuel loading (a few grams at any one time, as opposed to fission's several tonnes). I didn't want to comment on something I didn't have more definite information on, and the same goes for proliferation.

That said, I think the accident rate will be lower than the 10-40 year figure, as that's largely based off historical data skewed by older reactor designs and malpractice, both of which have since been remedied, or at least heavily mitigated, by regulatory changes.

Yeah, it's a very conservative estimate based on the best proxy available.

My gut feeling is that it is likely to be closer to 30 to 80 years, but that's more of a result of the observation that similar mistakes tend to be made every three to four generations.