Presentation transcript:

2 Why Care About Building Energy Performance?
- Aside from building energy use increasing?
- Ignoring performance ratings is choosing to fly fairly blind, staying at the "dumb" end of the "dumb and dumber" scale
- Performance ratings are a quick evaluation, not an investigation

3 (no text content)

4 (no text content)

5 (no text content)

6 New construction has been a problem for 50 years, increasing the carbon footprint
- 2003 CBECS data with malls, kBtu/sq-ft-yr weighted means: higher source energy EUIs in newer buildings
- CBECS data show the same pattern in each survey year; life-cycle influences are shown
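The vintage comparison above is essentially a weighted group mean of source EUI by year-constructed bin. As a rough sketch of that tabulation (not the presenter's actual code), here is a pandas version; the column names source_kbtu, sqft, year_built, and weight are placeholders for the corresponding CBECS fields, and the vintage bins are illustrative.

```python
import pandas as pd

def weighted_mean_eui_by_vintage(df: pd.DataFrame) -> pd.Series:
    """Weighted mean source EUI (kBtu/sq-ft-yr) by construction-year bin.

    Placeholder column names (the real CBECS variables differ):
      source_kbtu -- annual source energy, kBtu
      sqft        -- gross floor area, sq ft
      year_built  -- year of construction
      weight      -- CBECS sampling weight
    """
    df = df.copy()
    df["eui"] = df["source_kbtu"] / df["sqft"]
    # Illustrative vintage bins; the slide's actual groupings may differ.
    bins = [0, 1945, 1959, 1969, 1979, 1989, 1999, 2003]
    df["vintage"] = pd.cut(df["year_built"], bins=bins)
    # Weighted mean of per-building EUI within each bin: sum(w*x) / sum(w)
    grouped = df.groupby("vintage", observed=True)
    return grouped.apply(lambda g: (g["eui"] * g["weight"]).sum() / g["weight"].sum())
```

Whether to average per-building EUIs or to take the ratio of weighted total energy to weighted total floor area is a real methodological choice; the slide says "weighted means", so the sketch does the former.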

7 Basic Energy Benchmarking (Performance) Info
- Go to TC 7.6 website (shown on title slide previously)
- Select Program Activities at bottom
- Chicago 2006, Seminar 17, first presentation
- Atlantic City 2002, Seminar 41, first two presentations

8 Current ASHRAE High-Performance Protocol Project
- "ASHRAE needs to provide guidance regarding the measurement and reporting of the performance of new and existing [commercial] buildings...."
- "... to further the development of building energy performance standards."
- "Measuring and Reporting the On-site Performance of Buildings..."

9 ASHRAE STANDARD 105, 1984 to now
- BSR/ANSI/ASHRAE Standard 105-1984 (RA99) covers measurement and expression of building energy performance at a basic level, with suggested optional extensions
- Standard 105-[2007?] is a major revision and has been submitted for publication. It extends the coverage of energy performance measurement and expression, and of comparison of building energy performance against others
- The nature and level of performance comparison require some performance "standard" and either require or intrinsically offer some evaluation

10 Standards of Comparison
1. Minimum prescriptions or best-practice levels (Stds 90.1, 90.2, 189P, LEED)
2. Self-reference, e.g., past and future
3. Ad-hoc building populations
4. Representative populations, e.g., CBECS and RECS for the USA, CEUS for CA

11 Applications Handbook Energy Comparisons Using CBECS
- Chapter 35 (Energy Management), 3 tables on commercial buildings
- Based on 2003 CBECS micro-data, without malls
- About 50 building types
- Site energy use indexes for the mean and percentiles 10, 25, 50, 75, and 90
- Electricity and cost indexes at the same detail
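The Handbook tables described above reduce to weighted percentiles of site EUI within each building type. A minimal sketch of that calculation, assuming the micro-data are already in a DataFrame with placeholder columns site_kbtu, sqft, building_type, and weight (the actual CBECS variable names and the Handbook's exact conventions may differ):

```python
import numpy as np
import pandas as pd

def weighted_percentile(values, weights, pct):
    """Weighted percentile of `values` (pct on a 0-100 scale)."""
    values = np.asarray(values, dtype=float)
    weights = np.asarray(weights, dtype=float)
    order = np.argsort(values)
    values, weights = values[order], weights[order]
    # Midpoint-rule cumulative weights, normalized to (0, 1).
    cum = (np.cumsum(weights) - 0.5 * weights) / weights.sum()
    return float(np.interp(pct / 100.0, cum, values))

def handbook_style_table(df: pd.DataFrame) -> pd.DataFrame:
    """Weighted mean and 10/25/50/75/90th-percentile site EUI by building type.

    Placeholder columns: site_kbtu (annual site energy, kBtu), sqft,
    building_type, and weight (sampling weight).
    """
    df = df.assign(eui=df["site_kbtu"] / df["sqft"])
    rows = {}
    for btype, g in df.groupby("building_type"):
        w = g["weight"].to_numpy()
        e = g["eui"].to_numpy()
        rows[btype] = {
            "mean": float(np.average(e, weights=w)),
            **{f"p{p}": weighted_percentile(e, w, p) for p in (10, 25, 50, 75, 90)},
        }
    return pd.DataFrame(rows).T
```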

12 Commercial Buildings Energy Consumption Survey (CBECS)
- Latest survey micro-data available = 2003; next is 2007 (released in 2010?)
- Publicly available government reports and data on the EIA website
- Nationally representative sample, with a fairly complicated cluster sampling frame
- Different versions have been available, ~5,000 records
- Not including imputation flags, there are ~350 data parameters
- Data seem to get better each time
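For orientation, a sketch of how the public micro-data file might be split into data columns and imputation flags before analysis. The file name is hypothetical, and the assumption that flag columns are the ones beginning with "Z" follows the usual CBECS public-release convention and should be verified against the EIA layout documentation.

```python
import pandas as pd

RAW_FILE = "cbecs2003_public.csv"   # hypothetical file name for the EIA extract

def load_cbecs(path: str = RAW_FILE) -> tuple[pd.DataFrame, pd.DataFrame]:
    """Split a CBECS public micro-data extract into data columns and
    imputation flags.

    Assumes (per the public-release convention) that imputation-flag columns
    are the ones whose names begin with 'Z'; check the EIA layout file before
    relying on this.
    """
    raw = pd.read_csv(path)
    flag_cols = [c for c in raw.columns if c.upper().startswith("Z")]
    return raw.drop(columns=flag_cols), raw[flag_cols]

# data, flags = load_cbecs()
# print(data.shape[1], "data parameters,", flags.shape[1], "imputation flags")
```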

13 CBECS and CEUS, some important differences
Item | CBECS 2003 | CEUS 2003
Survey approach | Phone | Site, skilled
Unit of interest | One building, even if a campus | Site, including campuses
Characteristics detail | Limited | Very detailed
Floor area limits due to masking | < 1,000,000 sq ft for valid data | No limit, but not over 2M here
Fuel data limitations | Propane data coarse | Only gas and electric real
Simulated or regressed end uses | None | Simulated
Fuel cost data | Annual by fuel | None
Fuel data intervals | Annual only | Monthly

14 Basic EUI Statistics, kBtu/sq-ft per yr
Quantity (all weighted) | CBECS 2003, N = 4678 | CEUS 2003, N = 2360
Mean |  |
10th percentile |  |
25th percentile |  |
Median |  |
75th percentile |  |
90th percentile | 449 | 521

15 Floor Area Distributions, sq ft
Quantity (all weighted) | CBECS 2003, N = 4738 | CEUS 2003, N = 2360
Mean | 14,352 | 8,…
10th percentile |  |
25th percentile |  |
Median |  |
75th percentile |  |
90th percentile |  | …,960

16 Week Schedule, hr/week open
Quantity (all weighted) | CBECS 2003, N = 4360 | CEUS 2003, N = 2360
Mean |  |
10th percentile |  |
25th percentile | 40 | 45
Median | 50 |
75th percentile |  |
90th percentile | 168 | 98

17 Worker Density, workers per 1,000 sq ft (weighted mean and 10th/25th/50th/75th/90th percentiles, CBECS 2003 N = 4360 vs CEUS 2003 N = 2352)

18 Density of PCs, PCs per 1,000 sq ft (weighted mean and 10th/25th/50th/75th/90th percentiles, CBECS 2003 N = 4360 vs CEUS 2003 N = 2127)
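The two density metrics above are simple normalizations of occupant and PC counts to floor area. A one-line derivation each, shown with placeholder column names (workers, pcs, sqft) standing in for the actual survey fields:

```python
import pandas as pd

def add_densities(df: pd.DataFrame) -> pd.DataFrame:
    """Add worker and PC densities per 1,000 sq ft of gross floor area.

    Placeholder column names for the actual CBECS/CEUS fields:
      workers -- number of workers
      pcs     -- number of personal computers
      sqft    -- gross floor area, sq ft
    """
    out = df.copy()
    out["worker_density"] = out["workers"] / (out["sqft"] / 1_000)
    out["pc_density"] = out["pcs"] / (out["sqft"] / 1_000)
    return out
```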

19 Rough-cut, Incomplete Regression Models, weighted
Parameter coefficient (intercepts not significant) | CBECS 2003, N = 4300 | CEUS 2003, N = 2352
EUI change per hr/wk |  |
EUI change with worker density |  |
Lab, change in EUI from average | 386 | NS
Offices | –52 | –49
Clinics | –42 | –33
Restaurant |  |
Fast Food |  |
Average EUI | 245 | 179
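The "rough-cut" models above amount to a weighted least-squares fit of EUI on weekly operating hours, worker density, and building-type indicators. A minimal sketch under those assumptions, using hypothetical column names (eui, hours_per_week, worker_density, building_type, weight); it is not the presenter's actual specification:

```python
import pandas as pd
import statsmodels.formula.api as smf

def fit_rough_model(df: pd.DataFrame):
    """Weighted least squares: EUI vs. weekly hours open, worker density,
    and building-type dummies, weighted by the survey sampling weight."""
    model = smf.wls(
        "eui ~ hours_per_week + worker_density + C(building_type)",
        data=df,
        weights=df["weight"],
    )
    return model.fit()

# result = fit_rough_model(df)
# result.params then holds the per-hour and per-worker-density EUI changes and
# the building-type offsets relative to the omitted reference category.
```

Point estimates from a fit like this are the analogue of the coefficients on the slide; defensible standard errors for a complex cluster sample such as CBECS would need the replicate weights or design information, which plain WLS ignores.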

20 Not done fishing yet