Input from DMG (and SsS) requirements that would help to streamline the data reduction workflow (E. Villard)

Presentation transcript:


Inputs from SSG
- Open & unresolved SCIREQ tickets (367)
- Inputs from SSG to the Feb 2016 AMT/IXT meeting (190)
- Recent inputs from SSG: AQUA, Archive, P2G, Project tracker, Scheduler, SLT, Source catalog

The SCIREQ/ICT workflow could be improved
- Some ICT tickets are created without a prior SCIREQ ticket, which makes it difficult to get an overall picture.
- Most SCIREQ tickets have more than one component, which makes it difficult to filter by subsystem.
- There is no match of components between SCIREQ and ICT tickets.
- SCIREQ tickets are not used just for requirements, but also for discussions and for keeping track of work within the WG.

It's a global issue
- Data processing is difficult because it sits at the end of the flow and receives all the issues/inefficiencies from the previous parts of the flow.
- Data processing will benefit from: more detailed commissioning, strict technical assessment, more complete E2E tests, strict scheduling/observing, cleaner data, and a strict/more detailed QA0.
- The alternative is to relax the QA2 requirements.

Streamlining?
Streamlining means making things more efficient, not prettier (although prettier can mean more efficient, of course). How to improve the efficiency of data processing?
1. Make sure, before starting data processing, that the data have a very high chance of passing QA2. This means a very thorough QA0.
2. Automate the various steps of data processing as much as possible.
3. Transfer temporary tools to long-term subsystems, to relieve the people maintaining the temporary tools.

1. A more thorough QA0
I think a very thorough QA0 means:
- A strict assessment of whether an EB is useful.
- An accurate assessment of how useful an EB is (execution count for sensitivity/angular resolution).
- Pre-QA0 flags: binary flags, online flags, QL flags (+ manual/AoD?). QL flags are not fully implemented; they could be very helpful, but they are difficult, so they would need careful attention. It would be ideal if those flags could be taken into account at QA0, but it is not clear how this could be done.
- Additional checks: SNR on calibrators, metadata (a toy sketch combining these follows below).
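To make the flag handling concrete, here is a minimal sketch (Python) of how binary, online and QL flags plus an SNR check on calibrators could be combined into a single QA0 verdict per EB. The field names, severity labels and the SNR threshold are assumptions for illustration, not the actual AQUA/QA0 interface.

```python
# Minimal sketch of a pre-QA0 verdict for a single EB, combining flags and an
# SNR check on calibrators. Field names, severity labels and the threshold are
# illustrative assumptions, not the actual AQUA/QA0 data model.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class ExecBlock:
    uid: str
    binary_flags: List[dict] = field(default_factory=list)   # flags derived from the binary data
    online_flags: List[dict] = field(default_factory=list)   # flags raised by the online system
    ql_flags: List[dict] = field(default_factory=list)       # quick-look flags (not fully implemented yet)
    calibrator_snr: Dict[str, float] = field(default_factory=dict)  # per-calibrator SNR estimates

MIN_CAL_SNR = 10.0  # illustrative threshold

def qa0_verdict(eb: ExecBlock) -> str:
    """Return 'PASS', 'SEMIPASS' or 'FAIL' from pre-QA0 flags and additional checks."""
    # Any severe flag, whatever its origin, fails the EB outright.
    all_flags = eb.binary_flags + eb.online_flags + eb.ql_flags
    if any(f.get("severity") == "severe" for f in all_flags):
        return "FAIL"
    # Additional check: SNR on the calibrators.
    low_snr = [name for name, snr in eb.calibrator_snr.items() if snr < MIN_CAL_SNR]
    if low_snr or all_flags:
        return "SEMIPASS"  # usable, but should not count fully toward the execution count
    return "PASS"

# Example: one low-SNR phase calibrator downgrades the EB to SEMIPASS.
eb = ExecBlock(uid="uid://A002/X1/X2",
               calibrator_snr={"bandpass": 45.0, "phase": 6.0})
print(qa0_verdict(eb))  # SEMIPASS
```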

2. Automation
Three areas: imaging, weblog reviewing, and state transitions.
- Imaging: the first imaging PL is in operations, and first impressions are quite good. I think effort needs to continue on development.
- Weblog reviewing: effort is currently going into streamlining the layout. This is good, but I think we should start working on an overall QA score (machine learning); see the sketch below.
- State transitions: further triggering from the AQUA QA0 or QA2 status.
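As a toy illustration of what an overall weblog score could look like before any machine-learning work, the sketch below collapses per-stage pipeline QA scores into one number for triaging. The stage names, weights and penalty rule are assumptions, not the pipeline's actual scoring.

```python
# Toy sketch: collapse per-stage pipeline QA scores (0..1) into one overall
# weblog score so a reviewer can triage weblogs before opening them.
# Stage names, weights and the penalty rule are assumptions for illustration.

STAGE_WEIGHTS = {            # hypothetical: calibration stages weigh more
    "hifa_bandpass": 2.0,
    "hifa_gaincal": 2.0,
    "hif_applycal": 1.5,
    "hif_makeimages": 1.0,
}

def overall_qa_score(stage_scores):
    """Weighted mean of per-stage scores, pulled down by the worst stage."""
    total = sum(STAGE_WEIGHTS.get(s, 1.0) for s in stage_scores)
    mean = sum(STAGE_WEIGHTS.get(s, 1.0) * v for s, v in stage_scores.items()) / total
    worst = min(stage_scores.values())
    # A single very low stage should drag the overall score down.
    return round(min(mean, 0.5 * (mean + worst)), 3)

# Example: a weblog whose bandpass stage scored poorly.
print(overall_qa_score({"hifa_bandpass": 0.3, "hifa_gaincal": 0.9, "hif_applycal": 0.95}))
# -> 0.498
```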

3. Transfer of temporary tools (see next talk)
- The temporary tools were developed initially for prototyping. They are now working very well; the only issue is that they take time to maintain.
- Knowledge from the mature tools should be transferred to the official tools. This will help free resources for data processing.
- But the outcome will not be only positive: the data processing workflow is still evolving, so scientists and developers will need to continue collaborating closely for some time.
- The transition from EPT to PT will have to be planned carefully.
- I should not say that we want to have scientists and developers work together in the same place.

Two additional suggestions

Trending
Some say: "If you can't measure it, you can't manage it." Not entirely true, but partly. We need to be able to compute statistics, plot values, and correlate various quantities related to the data processing workflow (a sketch follows below). This:
- Helps evaluate the performance of the tools (e.g. the PL).
- Helps see an issue before it becomes serious.
- Helps evaluate the effectiveness of a solution when trying to resolve an issue.
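As a sketch of the kind of trending meant here, assuming a simple CSV export of per-OUS processing metrics exists (the file name and column names are invented for illustration):

```python
# Sketch of trending on the data processing workflow with pandas, assuming a
# CSV export of per-OUS metrics. File and column names are illustrative only.
import pandas as pd

df = pd.read_csv("processing_metrics.csv", parse_dates=["qa2_date"])
# assumed columns: qa2_date, pipeline_version, processing_hours, n_ebs, qa2_result

# Trend: median processing time per month, split by pipeline version
# (helps evaluate the performance of the tools, e.g. the PL).
trend = (df.set_index("qa2_date")
           .groupby("pipeline_version")
           .resample("M")["processing_hours"]
           .median())
print(trend)

# Correlation: does processing time scale with the number of EBs?
print(df[["processing_hours", "n_ebs"]].corr())

# Effectiveness of a fix: QA2 pass rate before/after a given cutoff date
# (helps check that a solution actually resolved an issue).
cutoff = pd.Timestamp("2016-01-01")
pass_rate = (df.assign(after_fix=df["qa2_date"] >= cutoff)
               .groupby("after_fix")["qa2_result"]
               .apply(lambda s: (s == "Pass").mean()))
print(pass_rate)
```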

Public release date
This is working fine in most cases, except when an issue affects a large number of datasets. It would be great if the public release date could be set automatically (sketched below). For instance:
- If an OUS is transitioned to QA3inProgress, the public release date could be set to a default date far in the future.
- After an OUS in QA3 has been re-observed and re-processed, the public release date could be reset automatically, following the policy on extension of the proprietary period.
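A minimal sketch of that automation, assuming hypothetical state names, a far-future default and a placeholder proprietary period (the real policy and archive interface may differ):

```python
# Sketch of automating the public release date around QA3. State names, the
# far-future default and the proprietary period are placeholder assumptions.
from datetime import date, timedelta
from typing import Optional

FAR_FUTURE = date(2099, 12, 31)
PROPRIETARY_PERIOD = timedelta(days=365)  # placeholder; the actual policy may extend it

def public_release_date(ous_state: str, delivery_date: Optional[date]) -> Optional[date]:
    """Return the public release date for an OUS, or None if it cannot be set yet."""
    if ous_state == "QA3inProgress":
        # Issue under investigation: push the release date far into the future.
        return FAR_FUTURE
    if ous_state == "Delivered" and delivery_date is not None:
        # Re-observed and re-processed: reset following the proprietary-period policy.
        return delivery_date + PROPRIETARY_PERIOD
    return None

# Example: an OUS flagged in QA3, then re-delivered on 2016-06-01.
print(public_release_date("QA3inProgress", None))          # 2099-12-31
print(public_release_date("Delivered", date(2016, 6, 1)))  # 2017-06-01
```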