Open Source Technical Support and Working Group Services for VA VistA Contract Number: VA118-16-C-0841 April 21, 2016 SLIN 0002AD – Open Source Software.


Open Source Technical Support and Working Group Services for VA VistA Contract Number: VA118-16-C-0841 April 21, 2016 SLIN 0002AD – Open Source Software and Product Selection Criteria Initial Submission

Open Source Software and Product Selection Criteria
Contents
–Overview
–Approach
–Product Selection Criteria and Tool
–Success Factors and Challenges
–Next Steps
Note: Selection Criteria and Scoring Tool v1.0 (provided separately in Excel format)

Open Source Software and Product Selection Criteria Analysis Overview

Assess Open Source Product Candidates for VA VistA Intake
Conduct “Discovery” activities, performing research and analysis to identify open source EHR products, code, and toolsets that align to, or would further enhance/expand upon, the feature set requirements as defined in the VistA 4 Product Roadmap.
Support the alignment of open source products and VistA needs:
–Produce a Gap Analysis of priority features and functions required to make progress with VA’s VistA vision, with primary emphasis on how that vision is elaborated in the Feature Set delivery schedule per the VistA 4 Product Roadmap.
–Subsequently, overlay the findings of the Gap Analysis and SWOT Analysis to document detailed Open Source Software and Product Selection Criteria.
–The Contractor shall utilize the VA open source software selection criteria to measure the degree to which open source candidates may fulfill the capability gaps.

Open Source Software and Product Selection Criteria
Open Source Software and Product Selection SOW Requirements (section 5.2.1):
1. Consolidates and prioritizes with VA the functional, technical, and performance attributes of VistA Feature Set or non-VistA Feature Set variables for further investigation.
2. Documents the constraints and assumptions or “boundary conditions” which define imposed limitations that can be physical or programmatic (e.g., specifying the latest acceptable initial operational capability (IOC) date illustrates a programmatic constraint).
3. Elaborates capability gaps identified in the respective BRDs and RSDs.
4. Elaborates the extent to which the code has been vetted and tested by the open source community, and the extent to which that code may have been previously certified via automated testing and peer review which has verified the safety, compliance, and functionality of the code both prior to and after new code submissions.
5. Assigns a quantitative metric by which to measure open source product attributes against functional, technical, capacity, performance, interoperability, and security requirements criteria. Additionally, the Contractor’s assessment shall include implementation criteria by which to assess the ease of integrating the open source code in the corresponding VA VistA application and with the application’s internal VA VistA interfaces.

Approach

Approach
–Identify initial set of open source criteria
–Add identified Feature Set 3 gaps as criteria for the initial quarters
–Develop Selection Criteria and Scoring Tool
–Plan to mature the content of selection criteria over the next several quarters
–Define, mature, and synergize the relationships of other Capabilities Based Assessments (CBA) content to the selection criteria

Approach Overview

Selection Criteria – Quarterly Maturation Plan
Q1 (current)
–Feature set 3 gaps included based on Q1 gap analysis
–Best practice criteria included
–Scoring tool developed based on criteria; will be used to evaluate the next set of relevant open source software candidates
Q2
–Include additional feature set 3 gaps based on Q2 gap analysis
–Add initial set of security selection criteria per TWG and related discussions
–Refine best practice criteria, scoring, and weighting based on feedback
–Incorporate stakeholder perspective from interviews conducted during Q2
Q3+
–Incorporate new gaps as they are identified in the gap analysis
–Include additional security criteria
–Continue to refine best practice criteria, scoring, and weighting
–Continue to mature the product selection criteria

Integration with Work Products
–The Open Source Software (OSS) and Product Selection Criteria will incorporate additional variances as they are identified through subsequent Gap Analyses
–The Product Selection Criteria, in addition to the Selection Criteria and Scoring Tool, will be used to screen OSS for SWOT analysis
–The selection criteria will be iterated in conjunction with the Gap Analysis findings and information gathered to screen for the SWOT candidates with the most potential positive impact

Product Selection Criteria and Tool

Product Selection Criteria Overview
–Selection criteria developed for multiple areas
–Criteria cover the full breadth of relevant elements, including VA-specific elements and gaps

Selection Criteria Areas
–Programmatic Constraints & Boundary Conditions
–Functional Fit / Capability Gaps
–Technical, Capacity, Performance, and Interoperability
–Implementation Risks
–Specific VistA Gaps to be Filled
–Security (weighting TBD, to be determined as criteria mature)

Specific Criteria Applied to Each Area
–Each criterion supports selection against functional, technical, and performance attributes
–Specific VistA / VA criteria drawn from the Gap Analysis and newly emerging information from VA
–Criteria phrased for consistent scoring

Programmatic Constraints & Boundary Conditions Criteria
–Fits with Roadmap plans (timing)
–No significant physical, logistical, or other constraints
–No additional open source version improvements likely; timing of intake good (vs. improvements by others anticipated, too early to use)
–Speeds substantive time-to-value for VA in the area
–Complies with mandates relevant to implementation

Functional Fit / Capability Gaps
–Fills Implementation Gaps: capability gaps identified in BRDs and RSDs
–Fills Vision Gaps: capability gaps identified by comparing implementation plans against the broad VE vision
–Measurably improves delivery of healthcare and/or access improvements
–Software can perform business functions at a high level of quality and reliability
–Software’s interface is user friendly

Technical, Capacity, Performance, and Interoperability
–Application is interoperable and integrates well with VistA architecture
–Data and data exchange are interoperable
–High level of code quality and reliability (certified)
–High capacity and scalability
–High quality of software documentation
–Minimal-to-no infrastructure changes required
–Software is rapidly responsive to users (speed of performance)
–Minimal-to-no software modifications required
–Software is easily maintainable (technical and business rules)
–Software has minimal-to-no operational support requirements
–No licensing or copyright issues such as license mismatch

Implementation Risks
–Low level of business risk for implementation of new processes and cultural change
–Low level of software technical integration and complexity risk
–Impact and rollout risks are very low
–Implementation cost is low

Specific VistA Gaps to be Filled
–Scheduling risks include development of standardized information sharing for scheduling data exchange, both internal and external to the VHA
–Ability to use population-level data to assess quality of care at the institutional protocol level (e.g., how well is one care team doing versus another with their pool of patients)
–EHR with analytics, cloud, and patient experience capabilities (VA CIO LaVerne Council, Congressional Testimony, April 14)

Product Selection Tool
The Product Selection Tool provides quantitative metrics by which to measure open source product attributes:
–Provides a score for each relevant open source software candidate
–Assesses each candidate against the criteria across all areas
–Tool developed in Excel to calculate specific criteria, which are weighted to emphasize the most important measures, balance criteria across areas, and provide “tuning” capability for areas and criteria emphasis

Product Selection Tool Weighting Criteria
–Areas weighted evenly (20% each for Q1 version)
–Security is currently not weighted since its criteria are TBD
–Weights can be adjusted later based on prioritization
Weighted Scale:
2 – Overweight
1 – Neutral
0.5 – Underweight
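As a hedged illustration of the weighting scheme above (the function and the sample data are ours, not taken from the Excel tool), the overweight/neutral/underweight weights tilt an area's score toward its emphasized criteria:

```python
# Illustrative sketch only: shows how the 2 / 1 / 0.5 criterion weights
# described above bias an area score. Names and data are hypothetical.

def weighted_area_score(criteria):
    """Weighted average of (score, weight) pairs for one criteria area."""
    total_weight = sum(weight for _, weight in criteria)
    return sum(score * weight for score, weight in criteria) / total_weight

# Same raw criterion scores, different emphasis: overweighting the
# fully-satisfied criterion pulls the area score up relative to a
# neutral weighting of both criteria.
neutral = [(1.0, 1), (0.0, 1)]       # both criteria weighted "neutral"
emphasized = [(1.0, 2), (0.0, 0.5)]  # "overweight" vs. "underweight"
print(weighted_area_score(neutral))     # 0.5
print(weighted_area_score(emphasized))  # 0.8
```

This is the same calculation an Excel SUMPRODUCT over a score column and a weight column would perform, divided by the sum of the weights.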

Product Selection Tool Criteria Scoring System
Criteria Scoring Scale:
1 – Candidate fully satisfies business requirement or decision criterion
0.5 – Candidate partially satisfies business requirement or decision criterion
0 – Unknown or null/balanced (the candidate neither satisfies nor dissatisfies business requirement or decision criterion)
-0.5 – Candidate partially dissatisfies business requirement or decision criterion
-1 – Candidate fully dissatisfies business requirement or decision criterion
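Combining the scoring scale with the even 20% area weights, a candidate's overall score can be sketched as below; this is an assumption-laden illustration (the area scores and the rounding are ours), not the actual Excel tool:

```python
# Illustrative sketch of the overall candidate scoring described above.
# The per-area scores here are hypothetical, already aggregated on the
# 1 / 0.5 / 0 / -0.5 / -1 criterion scale.

def candidate_score(area_scores, area_weight=0.20):
    """Overall score: the five Q1 areas weighted evenly at 20% each
    (Security carries no weight while its criteria remain TBD)."""
    return sum(area_weight * score for score in area_scores.values())

candidate = {
    "Programmatic Constraints & Boundary Conditions": 0.5,
    "Functional Fit / Capability Gaps": 1.0,
    "Technical, Capacity, Performance, Interoperability": 0.5,
    "Implementation Risks": -0.5,
    "Specific VistA Gaps to be Filled": 0.0,
}
print(round(candidate_score(candidate), 2))  # 0.3
```

Because every criterion scores in [-1, 1] and the area weights sum to 1, the overall score also falls in [-1, 1], which makes candidates directly comparable.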

Success Factors and Challenges

Success Factors
–Focus on business value: identify functional, business, and technical selection criteria to focus efforts on analysis of open source candidates that meet VA criteria
–Use a flexible approach to content and document development which accommodates initial development of selection criteria, with a longer-term plan and maturation over time based on usefulness and feedback from VA and the community:
 –Most selection criteria will be stable over time, and weighting may evolve
 –Criteria regarding specific gaps will evolve
 –Considerations around security criteria are emerging

Challenges
There are many documents describing aspects of VistA Evolution (VE). These various documents give rise to issues such as:
–Content overlap with varying degrees of currency
–Assorted elements (KPIs, metrics, etc.) describing aspects of the VE target
–No prior VE open source selection criteria to work from or use as templates
–VE plans and implementations are continuously evolving, and document updates lag the changes
Alignment of the selection criteria with specific gaps will be continuously shifting:
–Criteria around VistA gaps and open source software security will need to adjust as these issues are discussed and identified

Next Steps

Next Steps
–Use selection prioritization criteria and the Selection Criteria Scoring Tool to assess open source candidates for Q2
–Mature selection prioritization criteria for Q2:
 –Expand the criteria for security and Feature Set 3
 –Adjust the criteria weighting and scoring if needed, based on experience using the criteria for Q2 candidates
 –Incorporate feedback from this version
 –Integrate with other products
–Incorporate OSEHRA community input