Development of Verification and Validation Procedures for Computer Simulation Use in Roadside Safety Applications, NCHRP 22-24: FUTURE WORK FOR TASK 8 (Worcester Polytechnic Institute)

Presentation transcript:

Development of Verification and Validation Procedures for Computer Simulation Use in Roadside Safety Applications
NCHRP 22-24: FUTURE WORK FOR TASK 8
Worcester Polytechnic Institute, Battelle Memorial Laboratory, Politecnico di Milano

Meeting Agenda
9:00-9:30 Introductions/Instructions (Niessner, Focke)
9:30-10:30 Definitions and V&V Procedures (Ray)
10:30-11:30 ROBUST Project Summary (Anghileri)
11:30-Noon Survey Results
Noon-1:00 Lunch
1:00-2:30 V&V Metrics (Ray)
2:30-4:00 Future Work for Task 8 (Ray)

PROPOSED FUTURE TASKS
Task 8: Execute the plan
– Task 8A: Roadside Safety Simulation Validation Program (RSSVP)
– Task 8B: Develop Benchmark Cases
– Task 8C: Acceptance Criteria for Quantitative Metrics
– Task 8D: Roadside Safety Model Best Practices Guide
– Task 8E: Develop standardized V&V report format and procedural Guidelines
Task 9: Test Cases
Task 10: Review the Guidelines
Task 11: Recommendations
Task 12: Final Report

Task 8: Execute the Plan
Execute the approved plan and develop the Guidelines.
– Task 8A: Roadside Safety Simulation Validation Program (RSSVP)
– Task 8B: Develop Benchmark Cases
– Task 8C: Acceptance Criteria for Quantitative Metrics
– Task 8D: Roadside Safety Model Best Practices Guide
– Task 8E: Develop standardized V&V report format and procedural Guidelines

Task 8A: Roadside Safety Simulation Validation Program (RSSVP)
Develop a computer program that:
– Synchronizes the impact time from the two time histories (i.e., method of least squares).
– Filters the data for both time histories (SAE J211).
– Calculates the following domain-specific metrics: ORA, OIV, THIV, PHD, ASI, 50-msec average, and maximum roll, pitch and yaw.
– Calculates the following shape comparison metrics for all six DOF (e.g., X, Y, Z acceleration/velocity and roll, pitch, yaw rotation-rate time histories): ANOVA, Sprague-Geers MPC, others?
– Presents: comparison time history curves, tables of result comparisons and metrics, and an assessment based on acceptance criteria.
A sketch of the synchronization and shape-comparison calculations is given below.
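The following is a minimal sketch of how these calculations could look, assuming NumPy; the function names and the max_shift default are illustrative assumptions, not the actual RSSVP source. It covers the least-squares impact-time synchronization, the Sprague-Geers magnitude/phase/combined metrics, and the ANOVA-style normalized residual statistics for one pair of test and simulation channels.

```python
# Illustrative sketch only: least-squares time synchronization, Sprague-Geers
# MPC metrics, and ANOVA-style residual statistics for a single channel.
# Names and defaults are assumptions, not RSSVP code.
import numpy as np

def synchronize(test: np.ndarray, sim: np.ndarray, max_shift: int = 200) -> int:
    """Sample shift of the simulation curve that minimizes the sum of
    squared residuals against the test curve (least-squares alignment)."""
    n = min(len(test), len(sim)) - 2 * max_shift   # overlap length after shifting
    best_shift, best_sse = 0, np.inf
    for shift in range(-max_shift, max_shift + 1):
        a = test[max_shift : max_shift + n]
        b = sim[max_shift + shift : max_shift + shift + n]
        sse = float(np.sum((a - b) ** 2))
        if sse < best_sse:
            best_shift, best_sse = shift, sse
    return best_shift

def sprague_geers(measured: np.ndarray, computed: np.ndarray) -> dict:
    """Magnitude (M), phase (P) and combined (C) Sprague-Geers metrics."""
    mm = float(np.sum(measured * measured))
    cc = float(np.sum(computed * computed))
    mc = float(np.sum(measured * computed))
    M = np.sqrt(cc / mm) - 1.0                                         # magnitude error
    P = np.arccos(np.clip(mc / np.sqrt(mm * cc), -1.0, 1.0)) / np.pi   # phase error
    return {"M": float(M), "P": float(P), "C": float(np.hypot(M, P))}

def anova_residuals(measured: np.ndarray, computed: np.ndarray) -> dict:
    """Mean and standard deviation of the residuals, normalized by the
    peak of the measured curve (the ANOVA-style comparison)."""
    r = (measured - computed) / np.max(np.abs(measured))
    return {"mean": float(r.mean()), "std": float(r.std(ddof=1))}
```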

Task 8A: Roadside Safety Simulation Validation Program (RSSVP)
Verify the calculations in the program against:
– Hand/MATLAB calculated examples and
– Analytical shapes (an example of such a check is sketched below).
The program should:
– Be consistent with TRAP.
– Be easy to use (i.e., input unfiltered, unmodified time history data from both the simulation program and the crash test).
– Be useable for BOTH verification and validation.
– Be useable for comparisons of complete models as well as components (crash test domain-specific metrics would not be calculated for component verification/validation).
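One way the "analytical shapes" check could look, reusing the sprague_geers sketch above: a sine wave compared with an amplitude-scaled copy of itself has a closed-form result (magnitude error equal to the scale factor minus one, zero phase error), so the implementation can be checked to round-off. The 50 Hz signal and 10 kHz sample rate below are illustrative choices, not project values.

```python
# Illustrative verification case against an analytical shape: a sine wave
# versus a 20%-larger copy of itself has a known Sprague-Geers result
# (M = 0.2, P = 0).
import numpy as np

t = np.linspace(0.0, 0.2, 2001)             # 0.2 s sampled at 10 kHz (assumed rate)
reference = np.sin(2.0 * np.pi * 50.0 * t)  # "measured" analytical curve
scaled = 1.2 * reference                    # "computed" curve, 20% high in magnitude

metrics = sprague_geers(reference, scaled)
assert abs(metrics["M"] - 0.2) < 1e-9       # magnitude error should be exactly 0.2
assert abs(metrics["P"]) < 1e-9             # identical phase, so P should be ~0
```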

Task 8B: Develop Benchmark Cases
Develop two or three benchmark cases for roadside safety.
Review the literature to find several complete models that:
– Use LS-DYNA features and material models that are important for roadside safety simulations.
Examples: TopCrunch

Task 8C: Acceptance Criteria for Quantitative Metrics
Use the RSSVP program developed in Task 8A to evaluate repeated crash test data sets and validate the program:
– Brown's six Festiva rigid pole tests.
– The 12 rigid barrier ROBUST Round Robin tests.
– The four deformable ROBUST Round Robin tests.
– Any others we can find.
Determine reasonable acceptance criteria based on the repeated tests (see the sketch below).
Subject Matter Expert survey similar to Moorcroft.
Coordinate expert opinions with the quantitative criteria.
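A hedged sketch of how repeated-test scatter could be turned into candidate acceptance values, reusing sprague_geers from the Task 8A sketch; the pairwise comparison and the 90th-percentile choice are assumptions for illustration only, not criteria adopted by the project.

```python
# Illustrative only: score every pair of nominally identical tests with the
# combined Sprague-Geers metric and take a high percentile of that
# distribution as a candidate acceptance threshold for simulations.
from itertools import combinations
import numpy as np

def candidate_threshold(repeated_tests: list, percentile: float = 90.0) -> float:
    """Candidate acceptance level derived from test-to-test scatter."""
    scores = [sprague_geers(a, b)["C"] for a, b in combinations(repeated_tests, 2)]
    return float(np.percentile(scores, percentile))

# Example with six repeated rigid-pole tests (curves loaded elsewhere):
# threshold_C = candidate_threshold([test1, test2, test3, test4, test5, test6])
```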

Task 8D: Roadside Safety Model Best Practices Guide
Develop a roadside safety best practices guide.
Examples:
– The SAE Committee on aircraft seats is trying to develop a best practices guide.
– ASME PTC-10 is developing a series of best practices guides.
Such a guide would be useful for
What should be in the guide?
– Mesh densities
– Time step issues (i.e., mass scaling)
– Modeling techniques

Task 8E: Develop standardized V&V report format and procedural Guidelines
Body of the report
– Develop a procedures guideline consistent with the ASME V&V
– Develop a standardized format for V&V reports.
Appendices
– Provide Benchmark cases in the standardized format as examples.
– Provide an example from Task 9 in the standardized format as an example.

Task 8E: Develop standardized V&V report format and procedural Guidelines

Task 9: Test Cases
Select an on-going project at WPI/Battelle/Milan and apply the V&V process to it:
– Apply the V&V process.
– Document in the standardized format.
Possible projects:
– Annisquam River Bridge Railing (WPI)
– Knee-Thigh-Hip model (WPI)
– Tractor-Trailer (Battelle)
– Single Unit Truck (Battelle)
– Crash cushion project (Politecnico)
Apply the Guidelines to a minimum of two selected crash simulations of roadside hardware designs.
Revise the Guidelines as appropriate.

Task 10: Review the Guidelines
– Distribute the draft guidelines and standardized report format to the subject-area experts several months prior to the meeting so they can try them out.
– Convene the meeting and discuss the subject-area experts' experiences with using the Guidelines.
– Revise/modify the Guidelines based on the comments.
– Meet with the NCHRP Panel and up to five to eight additional subject-area experts to review the guidelines developed in Task 8, approximately one month after their submittal.
– Modified guidelines resulting from review comments and discussion at the meeting shall be included in the preliminary draft report.

Task 11: Recommendations
– How can the procedures and guidelines be incorporated into the State/Federal acceptance process?
– The Guide should be a "self certification" document.
– The validation report should act in a similar way to a crash test report submitted as documentation to the FHWA for acceptance.
– Develop recommendations for implementation for hardware modification acceptance, including a standardized report format for documenting the verification and validation of the model.