Distributed energy sources can reduce the cost of electricity by up to 50%, study says
Traditional grids will have to change. Modeling can help find the best way forward.
Dramatic changes are coming to the old power grid. As infrastructure ages and policy dictates a move away from fossil fuels, utilities and governments are looking at Distributed Energy Resources (DERs) as potential alternatives to continually building out a centralized grid.
DERs include all kinds of hardware that the utility may not own directly: solar panels, natural gas-fired microturbines, stationary batteries, and alternative cooling systems. Demand-response schemes, in which a grid operator shifts customers' electricity use away from high-demand times (usually through incentives), are also considered DERs.
Planning for DERs makes grid management trickier than it was when a company simply built a huge new plant and ran a power line to it. Without a lot of data, it's hard to know which kinds of energy resources will have the greatest economic and environmental impact and which will be most cost-effective for utilities. But a trio of researchers from Stanford University is attempting to make this planning problem easier for utilities and policy makers to solve. The researchers published a paper in Nature Energy this week describing a program they built to model DER deployment in a way that results in the lowest cost to grid operators.
The program, called "ReMatch," uses smart grid data to match groups of consumers with different kinds of distributed resources, based on the customers' energy use and on which resources (solar panels, batteries, and so on) can feasibly be built in that area. If a business district uses a lot of power around midday, for example, it might be worthwhile to offer incentives for that area to install solar panels. If a row of restaurants is open until 9pm, perhaps offering those businesses a solar-plus-battery option would be more cost-effective.
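The article doesn't detail how ReMatch actually performs this matching, but the basic idea can be illustrated with a toy heuristic. The sketch below is not the researchers' method: the load profiles are made up, the `recommend_der` helper and the window of high solar output are assumptions for illustration. It simply recommends solar alone for a group whose demand peaks at midday and a solar-plus-battery combination for a group whose demand peaks in the evening, mirroring the two examples above.

```python
# Illustrative sketch only; the real ReMatch model is not described in this article.
# Hypothetical hourly load profiles (average kW per customer group), hours 0-23.
load_profiles = {
    "business_district": [20] * 8 + [60, 80, 95, 100, 100, 90, 70, 50] + [30] * 8,
    "restaurant_row": [10] * 10
    + [40, 60, 50, 40, 40, 50, 70, 90, 95, 85, 60, 30, 15, 10],
}

# Assumed window in which solar output is high enough to cover a demand peak.
SOLAR_HOURS = range(9, 16)


def recommend_der(profile):
    """Pick a DER option from the hour of peak demand (toy heuristic)."""
    peak_hour = max(range(24), key=lambda h: profile[h])
    # Midday peak: solar alone lines up with demand.
    # Evening peak: add a battery to shift midday solar into the evening.
    option = "solar" if peak_hour in SOLAR_HOURS else "solar_plus_battery"
    return peak_hour, option


for group, profile in load_profiles.items():
    peak_hour, option = recommend_der(profile)
    print(f"{group}: demand peaks at hour {peak_hour} -> recommend {option}")
```

A real planning model would weigh much more than peak timing, such as construction costs, local siting constraints, and demand-response potential, but the sketch shows the kind of customer-to-resource matching the article describes.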