What energy innovation can we expect to come out of the United States in the coming years? Matthew Stepp, Senior Policy Analyst with the Information Technology and Innovation Foundation (ITIF) in Washington DC, takes us on a whirlwind tour of some of the most advanced American energy research going on. He visits five of the 17 National Laboratories of the US Department of Energy, where research is carried out on materials (e.g. for better batteries, solar panels, wind turbines, lightweight vehicles and carbon capture technologies), advanced nuclear energy, improved energy efficiency at data centres (to help with the integration of renewables into the grid), and more.
The National Laboratories have a storied, yet largely hidden, history and presence in the energy innovation space. Created to build the atomic bomb in the 1940s, the Labs — now totaling 17 institutions — have evolved over time to conduct “big science” that addresses leading national missions, solves complex societal problems, and keeps the United States at the leading edge of innovation. The Labs’ work in clean energy is no different.
Unfortunately, public awareness about the Labs — even in Washington — is relatively low. Harkening back to the secrecy of their atomic energy research roots, the Labs quietly work on government-funded research that often cannot be found anywhere else in the United States, if not the world. The Labs’ behind-the-scenes role in energy innovation perpetuates two issues: (1) the Labs’ fundamental role in many of today’s breakthrough science and technology developments is often ignored, and (2) funding and policy reform issues at the Labs rarely become top policy priorities.
In many ways, both issues are linked, making the Labs a natural stop during my seven-city tour across the country. In total, I had the opportunity to visit five Labs: Lawrence Livermore (LLNL), Lawrence Berkeley (LBNL), Sandia (California Campus), the National Renewable Energy Lab (NREL), and the SLAC National Accelerator Laboratory.
Each has a unique role in spurring next-generation energy innovation and leverages truly remarkable capabilities and facilities to achieve affordable, high-performance clean energy technologies. Specifically, these Labs lead in four categories (among many others) that make them stand out in the national energy innovation ecosystem: lasers, accelerators, supercomputing, and materials. The following is a snapshot of just a few of the energy innovators I met while touring the Labs and the critical research they’re working on.
Lasers are the Key to Fusion Energy…and More Efficient Engines?
It’s hard to miss the National Ignition Facility (NIF) when driving around the Lawrence Livermore campus. At over three football fields long and a football field wide, it’s one of the biggest government research projects in the country and required at least seven R&D breakthroughs simply to finish construction. It also has equally lofty research goals: achieving the first-ever fusion energy “ignition,” a fusion reaction that produces more energy than is required to start the reaction.
NIF’s capability is unlike anything else in the world. It amplifies 192 individual laser beams from one-billionth of a joule to 4 million joules and points them all at a single target the size of a pea. To put this into context, this is the equivalent of taking the energy needed to fuel a flying mosquito and amplifying it to the energy required for a car to travel at 100 mph. The result is a release of roughly 500 TW of power (for comparison, the world consumes energy at an average rate of roughly 15 TW), along with temperatures and pressures comparable to those at the center of the Sun. This unique research environment also lends itself well to plasma physics and to tests that help maintain America’s stockpile of nuclear weapons without actually detonating a nuclear bomb. The work done here is also applicable to research into the origins of the universe.
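Working through the article’s own figures gives a sense of the scale involved. The back-of-the-envelope sketch below uses only the numbers quoted above; the pulse length is simply inferred by dividing energy by power:

```python
# Back-of-the-envelope check of the NIF figures quoted above.
seed_energy_j = 1e-9        # one-billionth of a joule per seed pulse
output_energy_j = 4e6       # 4 million joules after amplification
peak_power_w = 500e12       # ~500 TW released at the target
world_avg_power_w = 15e12   # humanity's average power draw, ~15 TW

# Total amplification factor across the laser chain
gain = output_energy_j / seed_energy_j
print(f"amplification: {gain:.0e}x")                  # ~4e15

# Delivering 4 MJ at 500 TW implies a pulse only nanoseconds long
pulse_s = output_energy_j / peak_power_w
print(f"implied pulse length: {pulse_s * 1e9:.0f} ns")  # ~8 ns

# For that instant, the shot outpaces the whole world's average draw
print(f"vs world average power: {peak_power_w / world_avg_power_w:.0f}x")
```

In other words, the trick is concentration in time: a modest total energy, released in a few billionths of a second, briefly dwarfs the planet’s entire average power consumption.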
To see a video of how the NIF works, click here.
In a contrasting approach, Sandia National Laboratory’s California Campus is using much less powerful lasers in a unique way to drive breakthroughs in vehicle engine efficiency. At its simplest level, scientists at Sandia are able to use their nuclear research expertise to shoot a laser into a working internal combustion engine, allowing for unheard-of diagnostics and experimentation. This is increasingly important as vehicle makers develop more energy-efficient vehicles that deliver the speed and power performance drivers want while consuming less fuel. For example, Sandia is collaborating with the major U.S. vehicle manufacturers and oil companies to analyze how fuel combusts in an engine and where particulates form. Using that data, its researchers are building simulation tools, in partnership with industry, to quickly model new technologies that increase performance but decrease pollutants.
Accelerators Studying Moore’s Law 2.0 and Energy-Efficient Computing
In many ways, the National Labs got their start on the backs of particle accelerators, particularly the cyclotron invented by Ernest Lawrence at what is now called the Lawrence Berkeley National Lab (LBNL). Today, LBNL houses one of the world’s premier synchrotron light sources — the Advanced Light Source (ALS) — a domed, circular path of magnets that whips electrons around at nearly the speed of light. At different points along the path the electrons are agitated, giving off powerful X-ray light. These X-rays are then shot down 43 different tubes, or beamlines, that lead to specialized labs working on different experiments.
As Dr. Roger Falcone, Director of the ALS, put it to me, the facility’s X-rays “reveal the inner structure and dynamics of materials and devices.” In other words, think of it as one of the world’s most powerful microscopes. With such a tool, scientists and industry can study materials at the molecular level: improving the physical structure of pharmaceutical drugs to increase effectiveness, studying the degradation of materials in batteries to build energy storage devices that last longer, and identifying how the molecular structure of solar cells impedes energy conversion efficiency.
For example, the ALS is partnering with the semiconductor industry consortium SEMATECH to make more powerful microchips in the future using new materials and production processes. Put another way, they’re studying how to develop a Moore’s Law version 2.0. Chip makers are up against a problem: existing methods of packing more and more transistors on a chip to boost speed will run out of steam by around 2020 because of physical, economic, and energy efficiency limitations. A new generation of technologies is needed to continue making more powerful microchips and keep Moore’s Law chugging along, but doing so costs a lot of money. Dr. Falcone argues that’s where ALS comes in because, “Building the next generation of microchip manufacturing facilities is enormously expensive, so industry has to have high confidence that the process they choose works. The ALS is working on the fundamental scientific understanding of new [semiconductor] processes to reduce that risk.”
While the ALS (and many others) continues researching how to push Moore’s Law beyond its limitations, the National Renewable Energy Lab (NREL) is pushing the limits of data center energy efficiency. I had the opportunity to tour its new Energy Systems Integration Facility (ESIF), which houses the most efficient data center in the world. After soliciting industry’s best proposals, NREL worked with HP to demonstrate for the first time its breakthrough warm-water-cooled, high-performance system, which transfers heat directly from the supercomputer to water and then uses the much hotter water to heat the ESIF building.
In comparison, traditional data centers let their supercomputers warm the surrounding air, while energy-hungry chillers cool the room. As Steve Hammond, Director of NREL’s Computational Science Center, put it to me, “The traditional method isn’t very efficient. It’s like putting a beverage on your kitchen table and then going outside to turn up the air conditioner to get your drink cold.” With greater energy efficiency, NREL can run a more powerful supercomputer, which it is using to tackle the nascent clean tech industry’s most pressing problems, such as integrating intermittent renewables into the existing electricity grid.
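One common way to quantify the difference between these two cooling approaches is Power Usage Effectiveness (PUE): total facility power divided by the power that actually reaches the computing equipment, with 1.0 as the theoretical ideal. The sketch below illustrates the concept with made-up numbers, not NREL’s or HP’s measured figures:

```python
def pue(it_power_kw: float, cooling_power_kw: float, other_power_kw: float = 0.0) -> float:
    """Power Usage Effectiveness: total facility power / IT equipment power.

    1.0 is the theoretical ideal (every watt goes to computing); lower is better.
    """
    total = it_power_kw + cooling_power_kw + other_power_kw
    return total / it_power_kw

# Illustrative numbers only: a chiller-cooled room vs. a warm-water-cooled
# system that sheds most of its heat into the building's hot-water loop.
traditional = pue(it_power_kw=1000, cooling_power_kw=700, other_power_kw=100)
warm_water = pue(it_power_kw=1000, cooling_power_kw=50, other_power_kw=20)
print(f"traditional PUE: {traditional:.2f}")  # 1.80
print(f"warm-water PUE:  {warm_water:.2f}")   # 1.07
```

The point of the metric: in the hypothetical chiller-cooled room, 80 cents of overhead is spent for every dollar of computing, while the warm-water design spends only about 7, and even turns its waste heat into useful building heat.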
Developing New Materials to Make Clean Energy Cheaper, Better
If accelerators, supercomputers, and lasers are the bedrock capabilities today’s leading scientists use to advance energy technology, then developing new materials is one of their top goals, if not the top goal. Whether we’re talking about next-generation batteries, solar panels, lightweight vehicles, wind turbines, or advanced nuclear energy, the need for new materials is paramount.
At LBNL, researchers are trying to take the guesswork out of discovering new materials through computational science and supercomputers. Dubbed the Materials Project, the effort, led by Gerbrand Ceder (MIT) and Kristin Persson (LBNL), has developed computer models to predict the characteristics of different lithium-ion cathode materials for next-generation batteries. Using LBNL’s supercomputing power, the properties of over 25,000 materials have been screened for usefulness as components of energy storage systems — an astounding amount of experimentation that would have taken decades in the laboratory.
A number of new battery materials identified by the analysis have been patented and are now being turned into startups building better electric vehicle batteries. The analysis feeds into LBNL’s larger BATT Program, aimed at developing next-generation batteries, which has supported the underlying breakthroughs behind start-ups like Envia, ActaCell, and Sakti3. In the future, the Materials Project aims to conduct similar computational analysis for other materials, such as those critical to solar cell efficiency.
Another cornerstone of advanced materials R&D is LBNL’s Molecular Foundry, one of the United States’ five national nanotechnology facilities. At its core, the Foundry is a collaborative hub working to take our growing knowledge of biology and apply it to advanced nanotechnology. Ronald Zuckermann, the Facility Director for Biological Nanostructures at LBNL, described this process concisely as “bioinspiration.”
To do this, the Foundry’s facilities contain cutting-edge machinery that allows researchers to theorize, analyze, fabricate, and manipulate nanostructures within a single experimental space. Academic researchers and industry scientists utilize the space together: in-house scientists perform their own work 50 percent of the time and help external users the other 50 percent. For example, the Foundry is advancing our understanding of metal-organic frameworks (MOFs), one of the leading next-generation technologies being developed for carbon capture. And its scientists are actively studying biological instances of energy transfer, beginning at the molecular level, to create better membranes and electrodes for next-generation batteries.
Intersection of Policy and America’s Research Base
It’s relatively easy to lose perspective while touring the National Labs. It’s gratifying to know that research of this complexity, scale, diversity, and risk is being undertaken, particularly with such immense energy and climate challenges impacting the country and the world.
Digging a little deeper, though, reveals the fundamental importance of public policy in ensuring that this type of research continues. The federal government invests almost $14 billion in the National Lab system annually, appropriated through Congress and divvied out largely by the Department of Energy, but also by a number of other agencies.
It’s relatively easy to overlook the impact budget cuts and sequestration can have on the Labs because of their quiet disposition in the national energy policy debate. But make no mistake: as agency research budgets — and the Department of Energy’s in particular — continue to see up to $12.5 billion in total cuts through 2013, the Labs will be forced to scale back the ambition and scope of their research. That means fewer breakthroughs, fewer new technology spin-outs, fewer top-trained scientists and engineers, and diminished U.S. innovation capabilities.
Even a whirlwind tour through just a few Labs shows their importance and the types of wide-eyed “mad scientists” the system attracts to work on the country’s most pressing problems. It’s something unique to the United States, and it’s something we don’t want to lose. Without a fully staffed, well-funded, well-connected National Lab system, our national challenges may become even more intractable, if not impossible to solve.
This is the second in a four part series chronicling highlights from Matthew Stepp’s seven-city tour, Energy Innovation Across America. The first, a tour of Salt Lake City’s energy innovation ecosystem, can be found here. For a brief introduction to the series, visit here, and for information on the Millennial Trains Project, see here.
Matthew is a Senior Policy Analyst with the Information Technology and Innovation Foundation (ITIF) in Washington DC specialising in climate change and clean energy policy. His research interests include clean energy technology development, climate science policy development, transportation policy, and the role innovation has in economic growth. He runs a blog at the website The Energy Collective called the Capitol Energy Report. This article is reprinted from the blog with kind permission from the author.