Connect with ERS at IEPEC


The International Energy Program Evaluation Conference (IEPEC) kicks off on August 19 in Denver, Colorado. The event is renowned as a conference for energy program implementers, evaluators of those programs, local, state, national, and international representatives, and academic researchers involved in evaluation. ERS is proud to be a gold sponsor of IEPEC. ERS will provide a pre-conference workshop, presentations, a panel discussion, and a poster, and will host a sponsorship table at the conference. We encourage you to stop by our table and attend our presentations to learn more about ERS, our staff, and our project work. A list of our conference sessions is below. We will see you in Denver!

Evaluation 203: Maximizing the Value of Impact Evaluation 

Monday, August 19 | 1:00 – 5:00 PM | $95 Fee
Jon Maxwell, ERS, and Michael Noreika, Puget Sound Energy

This workshop will help attendees understand the goals of impact evaluations, the options available to complete them, and the strengths and weaknesses of the choices that affect cost, accuracy, and value. The scope will cover the most prevalent gross and net methods along with sample design fundamentals. Training will feature selective deep dives with individual and small-group exercises. For administrators, the session will include a segment on smart techniques for procuring impact evaluations.
Specific topics will include:

  • Gross – site-specific M&V
  • Gross – program-level billing analysis
  • Net-to-gross methods with a close-up look at participant survey-based methods
  • Direct-to-net evaluation options
  • Sample design with a close-up look at the stratified ratio estimation method
  • Uncertainty
  • Program improvement through impact evaluation
  • Buying and selling impact evaluation services

Why Didn’t You Tell Me Earlier? A New Way for Evaluation to Inform Implementation before Project Approval

Tuesday, August 20 | 1:30 – 3:00 PM | Session 2A
Chris Zimbelman, ERS

The presentation will discuss not only the process of creating a baseline repository but also the challenges and successes associated with it. The tool must be structured enough to be useful, yet flexible enough to acknowledge that some individual project needs cannot be captured through industry-standard practice. Meeting the needs of the broad range of target audiences is an exciting challenge that encourages collaboration among all parties, ultimately providing clarity and efficiency for projects from implementation all the way through evaluation.

Refining the Vision: Improving the Delivery of Real-time Evaluation

Tuesday, August 20 | 1:30 – 3:00 PM | Session 2A
Levon Whyte, ERS

Among the biggest criticisms of impact evaluations by implementers is that they are not timely. Evaluations that go beyond low-rigor verification and include in-field measurement over time or use billing data are especially vulnerable to this charge. At the 2016 ACEEE Summer Study, a paper presented a novel approach to evaluation that focused on quick measurement and verification (M&V) efforts and fast feedback on measure performance and program delivery. The premise was that the traditional model of conducting evaluations 2–3 years after program implementation did not provide timely feedback and that the proposed “real-time” impact evaluation approach would help implementers improve the outcomes of their programs sooner rather than later. The real-time evaluation approach was used in a utility’s Commercial and Industrial evaluation study, and the conclusion was that the approach, though not without its challenges, was promising.

Since then, this real-time strategy has been applied to a larger Commercial and Industrial Energy Efficiency program for a different client, one with greater measure diversity that carries both gas and electric measures. This application gave evaluators the opportunity to build on lessons learned and gain new insights into how to make the process work effectively in a different context.

Net Energy Use of Behind-the-Meter Battery Storage Systems

Tuesday, August 20 | 5:30 – 7:15 PM | Poster Reception
Vijay Gopalakrishnan, ERS, and Alexandra Bothner, Eversource

Behind-the-meter (BTM) energy storage systems can reduce peak demand (by charging batteries during off-peak hours and discharging during peak hours), resulting in cost savings for both the utility and customers. However, storage systems also result in a greater overall net use of energy due to the inefficiencies of battery cycling and maintaining charge. The study featured in this poster examined the tradeoff between peak demand savings and the increase in net energy usage for these systems.
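The tradeoff can be illustrated with a quick back-of-the-envelope calculation. The sketch below is purely illustrative and does not reflect the study's actual data; the 90% round-trip efficiency figure is an assumption chosen for the example.

```python
def net_energy_increase(discharged_kwh: float, round_trip_eff: float) -> float:
    """Extra grid energy consumed due to battery losses.

    Energy charged off-peak must exceed energy delivered on-peak by the
    round-trip losses: charged = discharged / efficiency, so the net
    increase is the difference between the two.
    """
    charged_kwh = discharged_kwh / round_trip_eff
    return charged_kwh - discharged_kwh


# A battery that shaves 100 kWh of peak load at an assumed 90% round-trip
# efficiency must charge about 111.1 kWh off-peak, so the site's net
# energy use rises by about 11.1 kWh.
extra = net_energy_increase(100.0, 0.90)
print(round(extra, 1))  # 11.1
```

Whether that extra energy is worthwhile depends on the spread between on-peak and off-peak costs and the value of the avoided peak demand, which is the tradeoff the study examines.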

Measuring Impact Toward Climate Change Mitigation: What Metrics are Needed?

Wednesday, August 21 | 10:30 AM – 12:00 PM | Session 5E
Moderator: Ari Michelson, ERS

The energy efficiency evaluation community is rapidly entering a time of substantial programmatic change, the most significant shift being the growing focus on carbon-based regulatory and utility policy goals. States, provinces, and municipalities are increasingly adopting aggressive emissions-reduction targets. This shift is beginning to have significant impacts on the way we measure program progress, and evaluations must keep up. New evaluation approaches will be necessary to account for the incorporation of storage to facilitate expanded distributed renewable energy systems, the resulting impacts on demand curves, the fast-rising interest in electrification for both transportation and building systems, and other factors. Evaluation approaches and metrics must also innovate to effectively assess progress against carbon-reduction goals.

This panel will bring a diverse set of perspectives together for a compelling discussion exploring the policy shifts and the program metrics required to evaluate progress. The discussion will revolve around the following questions: What new metrics are needed to evaluate carbon-reduction objectives? How can carbon metrics be implemented alongside traditional energy efficiency program evaluation? How should renewables integration and distributed energy resources (DERs) be valued? How should fuel-switching measures be evaluated?

Evaluating DR Evaluation

Wednesday, August 21 | 3:00 – 4:30 PM | Session 7A
Vijay Gopalakrishnan, ERS

This presentation aims to provide insights into appropriate evaluation methodologies, process and impact evaluation results, and the characteristics of value streams associated with different demand response (DR) technologies. Since grid reliability and the mitigation of grid constraints are increasingly important topics across the United States, this paper will help utilities and grid operators throughout the country learn the best practices and limitations of the DR technologies currently available. The lessons learned from these demonstration projects and research studies have far-reaching impacts on program design, implementation, and evaluation, thereby adding value to the energy efficiency industry.

Lessons from the Field – Best Practices for Implementation and Evaluation of Supercomputers

Thursday, August 22 | 8:30 – 10:00 AM | Session 8C
Ryan Pollin, ERS

ERS recently completed an impact evaluation of a large industrial incentive program that
included several supercomputer projects. The research, data gathering, and analysis that went into the evaluation of these projects yielded valuable information for both program implementers and evaluators faced with supercomputer projects. This presentation will detail the challenges associated with implementing and evaluating supercomputer projects and our recommended best practices.

Best Practices and Lessons Learned as NYSERDA’s CHP Inspector

Thursday, August 22 | 8:30 – 10:00 AM | Session 8C
Matt Lockwood, ERS

Matt will present live case studies and discuss the various types of issues flagged through early-stage inspections and data review. From the program design and evaluation perspective, he will also share our thoughts on best practices and lessons learned from working on this effort. We will highlight three to four instances that demonstrate the value of the inspection process and ongoing M&V to various stakeholders (end users, financiers, and incentive program staff). These will be cases where our inspection and analysis flagged major issues at a site that resulted in demonstrated action on the site’s part. We will follow up on these systems to show before-and-after results. Finally, we will share how NYSERDA is leveraging this portfolio of performance data to inform the next generation of incentive program design.