Baker/McGreevy Day 1
- How many beamlines have equipment that cannot support the present software? Certain experiments can't be done now, but could be in the future?
- What does the current software not do?
- Need readouts in meV and Å⁻¹, and a current display of data that is not esoteric; is a peak an artifact of the instrument?
- Need data simulations during the experiment: do the results correspond to expectations? Intelligent systems?
- What is the "2 minutes" based on? Time requirements of different instruments? Walk in, crawl under the hood, push a button, analyze data versus simulation.
- Who will determine if the sample is good? A modeling program for structure will aid this?
- Experiment pre-planning will allow the sample structure determination to be what is expected: analysis software with rapid response, tied into structure databases across the planet.

Baker/McGreevy Day 1
- What leads to hypothesis-driven research? How to data-dip in an integrated way?
- Need standard modules describing resolution models for each instrument.
- Central repository of software.
- Code throw-out: lower the bar, make the development process easy and straightforward.
- Will the code be written as open source so future use cannot be denied?
- Code openness leads to a responsibility to assist others in using the code; this may lead to reluctance to share. The neutron community should support sharing of the code!
- Quality assurance is a hurdle; vote with feet.

Baker/McGreevy Day 1
- Users should have the option of leaving the facility without data reduction.
- Gory details of the sample for metadata? Encourage but do not mandate.
- Capture as much metadata as possible automatically: electronic notebook, smart card.
- Should the software and data formats identify and track errors? YES!
- Implications of computer security.
- Track the treatment of each data set.
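The points above about automatic metadata capture and tracking the treatment of each data set can be sketched as a small dataset wrapper that logs every processing step alongside the data. All names here (`TrackedDataset`, `apply`, `subtract_background`) are hypothetical illustrations, not SNS software:

```python
import datetime

class TrackedDataset:
    """A measured dataset that records every treatment applied to it."""

    def __init__(self, counts, metadata=None):
        self.counts = list(counts)
        self.metadata = dict(metadata or {})   # sample details, instrument settings, ...
        self.history = []                      # one entry per treatment step

    def apply(self, name, func, **params):
        """Apply a treatment and log what was done, when, and with which parameters."""
        self.counts = func(self.counts, **params)
        self.history.append({
            "step": name,
            "params": params,
            "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        })
        return self

def subtract_background(counts, level=0.0):
    """Example treatment: constant background subtraction."""
    return [c - level for c in counts]

data = TrackedDataset([10.0, 12.0, 11.0], metadata={"sample": "unknown", "run": 42})
data.apply("background subtraction", subtract_background, level=1.0)
# data.history now records the step name, its parameters, and a timestamp
```

Because every treatment goes through `apply`, the history travels with the data, which is what makes "track the treatment of each data set" enforceable rather than voluntary.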

Baker/McGreevy Day -1
- Different levels of access for the software repository?
- Benchmarking software is important.
- Prioritize software adapted to SNS; who will do it? Resources will never be available to support all codes.
- Need the ability to integrate "personal" code into the SNS framework.
- How will the needs of the casual user differ from those of the experienced user?
- Need to know what codes are being written.
- Two levels of codes: (1) facility supported, with a high level of detail; (2) user supplied, with a lower level of documentation.
- Capture the intellectual prowess of the community.

Baker/McGreevy Day -1
- Going from special-purpose code to general-purpose code is difficult and time-consuming.
- Common modules for instruments or analysis functions.
- Data reduction will be provided by the SNS, common across instruments; treatment routines will differ, with help provided by the facility.
- APIs (application program interfaces) are needed; the facility will provide instrument-specific output after reduction, capable of being used with analysis programs.
- Need assessments of individual code packages to see their potential for use at SNS.
- Source code is needed if the facility manages the code: trust but verify.
- Identify the classes of customers for the software.
- Inexperienced users need statistical packages (SAS..) to be integrated with other data treatments: plug and play.
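The API point above — facility-reduced output that personal analysis codes can consume — can be sketched as a minimal plug-in interface. The interface and the `PeakFinder` example are invented for illustration; they are not the actual SNS framework:

```python
from abc import ABC, abstractmethod

class AnalysisPlugin(ABC):
    """Interface a 'personal' analysis code implements to plug into the framework."""

    @abstractmethod
    def accepts(self, instrument: str) -> bool:
        """Whether this plugin can handle output from the given instrument."""

    @abstractmethod
    def analyze(self, reduced_data: dict) -> dict:
        """Consume facility-reduced data and return analysis results."""

class PeakFinder(AnalysisPlugin):
    """Toy user-supplied plugin: locate the strongest peak in reduced data."""

    def accepts(self, instrument):
        return True  # instrument-agnostic example

    def analyze(self, reduced_data):
        intensities = reduced_data["intensity"]
        peak = max(range(len(intensities)), key=lambda i: intensities[i])
        return {"peak_index": peak, "peak_value": intensities[peak]}

registry = [PeakFinder()]
reduced = {"instrument": "SNS-example", "intensity": [1.0, 5.0, 2.0]}
results = [p.analyze(reduced) for p in registry
           if p.accepts(reduced["instrument"])]
```

The design choice mirrors the two-level code idea from the notes: the facility owns the reduction step and the interface contract, while user code only has to satisfy the contract to participate.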

Baker/McGreevy Day -1
- Is a total problem-solving environment needed? An entropy module to be pulled in, along with a knowledge base.
- Top-level code from the home institution, plugged in and ported; usable at home, at the airport, and on the plane.
- Be able to cut junk and eliminate other parts of the data, but note the various subtractions. The only facility product is what comes out of the detectors; all the rest is visualization.
- Are there codes available that learn? Codes that are expert systems?
- A common component architecture (CCA) can be used to integrate pre-packaged data analysis.
- Should there be an effort to translate the experience of instrument scientists into an expert system? YES.
- An immediate incentive is needed for supplying metadata and ancillary code; convince a few super-users of the benefit.
- The electronic notebook should include metadata.
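Translating instrument-scientist experience into an expert system, as proposed above, could start as simply as a list of heuristic rules checked against an observation. The rules, field names, and thresholds below are toy assumptions, not real instrument knowledge:

```python
# Each rule encodes one piece of instrument-scientist experience as a
# function that returns advice when its condition fires, else None.

def rule_low_counts(obs):
    if obs["total_counts"] < 1000:
        return "Counting statistics are poor; consider a longer run."

def rule_peak_at_known_artifact(obs):
    # Hypothetical: each instrument lists positions where spurious peaks appear.
    if obs["peak_position"] in obs["known_artifact_positions"]:
        return "Peak coincides with a known instrument artifact."

RULES = [rule_low_counts, rule_peak_at_known_artifact]

def advise(obs):
    """Run every rule and collect the warnings that apply."""
    return [msg for rule in RULES if (msg := rule(obs)) is not None]

obs = {"total_counts": 500, "peak_position": 1.54,
       "known_artifact_positions": {1.54, 2.07}}
# advise(obs) returns both warnings for this observation
```

A rule list like this is easy for an instrument scientist to extend one heuristic at a time, which is the practical path from tacit experience to an expert system.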

Baker/McGreevy Day -1
- Identify barriers to using personal codes, and then the added value of using facility programs.
- Jennifer White and visualizing the lipids.
- User-written programs typically start at the analysis stage; SNS will provide the data reduction programs.
- SNS will enable people's science; this is a partnership, not getting in each other's way.
- What is the minimum to get in the SNS door? Should SNS be another tool in a toolbox?
- SNS should provide the framework for the full process, yet allow individuals to use their favorite tools; it should be extensible.

Baker/McGreevy Biologists
- Biologists have high-level requirements but are not code diggers; they need a high-level interface, and data formats that allow use of existing software (CCP-4).

Baker/McGreevy Chemistry
- Jennifer White and visualizing the lipids.

Baker/McGreevy Materials Science

Baker/McGreevy Physics

Baker/McGreevy Day
- 3-D goggles with colored neutrons for viewing structure, with dispersion lit up on touch.
- Flexibility to go down into pulse-level raw data, as well as up into the ultimate massaging.
- Ability for instrument simulation of resolution: may be near term or longer term.

Baker/McGreevy Paul's summary
- Data format
  - Standard formats
  - Capture (automatically) as much metadata as possible
  - Carry processing history
  - Carry error propagation
  - Carry full history of analysis (MC, MD..)
- Data visualization
  - All types of formats
  - Instantly at the instrument
  - From everywhere (office, on/off line)
- Analysis and reduction
  - Software architecture should allow plug-ins for legacy code and individual code
  - SNS to provide core validated code
  - Provide a repository and knowledge base
- Future
  - Encourage growth of the software with professional programmers and contributions from the community
- Other
  - Reliability is key (every time it is needed, whatever it is)
  - Users should have access to data from the raw-data level to the final image, as desired
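"Carry error propagation" from the summary can be illustrated with a minimal value-with-uncertainty type in which statistical errors combine in quadrature under addition and subtraction. This is a sketch of the idea, not the SNS data model; the class name `Measured` is invented:

```python
import math

class Measured:
    """A value with a 1-sigma uncertainty; errors combine in quadrature."""

    def __init__(self, value, sigma):
        self.value = value
        self.sigma = sigma

    def __add__(self, other):
        return Measured(self.value + other.value,
                        math.hypot(self.sigma, other.sigma))

    def __sub__(self, other):
        return Measured(self.value - other.value,
                        math.hypot(self.sigma, other.sigma))

signal = Measured(100.0, 10.0)      # counting statistics: sigma = sqrt(N)
background = Measured(36.0, 6.0)
net = signal - background
# net.value == 64.0; net.sigma == math.hypot(10.0, 6.0), about 11.66
```

If every arithmetic operation on the data goes through a type like this, the uncertainty travels with the value automatically, so downstream analysis and visualization never see a number stripped of its error bar.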