by Greg Walcher, E&E Legal Senior Policy Fellow
As appearing in the Daily Sentinel

The recent news that scientists have finally produced a controlled nuclear fusion reaction generating a net energy gain was greeted with some skepticism, but also with considerable excitement. Some of the hype was a bit over the top, heralding the achievement at Lawrence Livermore National Laboratory as a sign that a fossil fuel-free world is just around the corner.

For much of our lives, nuclear fusion has been considered the holy grail of clean energy, because it would create virtually unlimited energy supplies, with no carbon emissions and no long-term radioactive waste. I am no scientist, so most of the technical explanation is a foreign language to me. In general terms, today's nuclear power plants are reactors that split uranium atoms, releasing massive amounts of energy that is used to make steam and generate electricity. That atom-splitting is nuclear “fission.” Such plants do not burn fossil fuels or emit carbon into the air. But they do leave behind radioactive waste that remains dangerous for thousands of years, is difficult to store safely, and has long been a political stumbling block to the expansion of nuclear power.

Nuclear “fusion,” rather than splitting atoms, combines them, literally fusing them together, and it releases far more energy than fission. Fusion is the process that powers the sun and the stars, and replicating it requires heat of similar intensity to force the nuclei together.

Fusion has been achieved artificially in laboratories before, but creating that kind of heat has always consumed more energy than the reaction generates, so fusion has remained more of a dream than a realistic option. Still, governments and foundations have spent hundreds of millions of dollars on research, and the equation may be changing. The Lawrence Livermore team has apparently achieved a technological breakthrough that, for the first time, produced a net gain of energy from fusion, reportedly generating about 3.15 megajoules of energy from the 2.05 megajoules its lasers delivered to the target.