Why the U.S. should invest a lot more in nuclear research
For the past several years, nuclear power has been a focus of sharp disagreement in the debate over climate change. Traditional environmentalists tend to oppose it, while pro-nuclear trolls insist it is the savior of mankind, held back only by green ignorance. For all the hyperbole, both sides make some good points. Nuclear power is not as dangerous as it is often portrayed, at least compared with coal, but the trolls fail to acknowledge the major problem with traditional nuclear power: its stupendous cost.
There is reason to hope, however, that this impasse can be broken. The answer lies in moving away from existing nuclear technology and toward research on alternatives. The theoretical benefits of non-standard nuclear technologies are very large, but none of them is yet in workable form, so more research could pay off handsomely.
The Department of Energy is moving in just this direction, recently awarding $60 million for new nuclear research. That’s a good step, but an insufficient one. We ought to be doing much more.
The main problem with traditional nuclear power plants is that they’re too big. Existing nuclear technology is based on uranium fission, which is generally cost-effective only in very large reactors. The plants are huge, complicated, potentially dangerous, and therefore extremely expensive to build and insure. Typically, that means large government subsidies are required to get one built, and cost overruns and other headaches are very common. As a result, many nuclear projects have been abandoned outright.