SCHOOL OF INFORMATION UNIVERSITY OF MICHIGAN si.umich.edu Cyberinfrastructure Requirements and Best Practices: Lessons from a Study of TeraGrid. Ann Zimmerman.

Presentation transcript:

SCHOOL OF INFORMATION UNIVERSITY OF MICHIGAN si.umich.edu Cyberinfrastructure Requirements and Best Practices: Lessons from a Study of TeraGrid. Ann Zimmerman, Research Assistant Professor, UM School of Information. OGF Workshop, May 27, 2009

SCHOOL OF INFORMATION UNIVERSITY OF MICHIGAN si.umich.edu Outline
- Background
- Challenges
- People
- Methods used
- Analysis
- Conclusions

SCHOOL OF INFORMATION UNIVERSITY OF MICHIGAN si.umich.edu Background: Learning from TeraGrid
- NSF-funded study to examine:
  – the TeraGrid collaboration
  – user needs and requirements
  – impact on research practice & outcomes
  – education, outreach & training activities
- Research team
  – Tom Finholt, PI; Ann Zimmerman, co-PI
  – Magia Krause, PhD student

SCHOOL OF INFORMATION UNIVERSITY OF MICHIGAN si.umich.edu Key Questions: User Needs
- What factors affect users' computing needs and requirements?
- What factors affect users' behavior as it relates to their use (or non-use) of TeraGrid/HPC?
- How are the needs of users expected to change over the next five years?

SCHOOL OF INFORMATION UNIVERSITY OF MICHIGAN si.umich.edu Key Questions (continued)
- Where do users currently spend time that does not count as doing science?
- What research questions do they want to answer but currently cannot? What are the barriers?

SCHOOL OF INFORMATION UNIVERSITY OF MICHIGAN si.umich.edu Data Collection, June 2006–May 2007
- 7 site visits, including 4 TeraGrid sites
- Interviews (n = ~90)
- Participant observation
- User workshop
- Document analysis and review
- Surveys
  – Survey of current TeraGrid users
  – Surveys of tutorials at TG '06 & TG '07

SCHOOL OF INFORMATION UNIVERSITY OF MICHIGAN si.umich.edu TeraGrid Planning Process
- The goal of the planning process was to develop options for delivering TeraGrid resources and services, based on the diverse needs of science and engineering communities.

SCHOOL OF INFORMATION UNIVERSITY OF MICHIGAN si.umich.edu Challenges
- Heterogeneous users
- Potentially thousands of users
- Distributed environment

SCHOOL OF INFORMATION UNIVERSITY OF MICHIGAN si.umich.edu People Included in the Study
- Current TeraGrid users
- “Target” TeraGrid users
  – Non-users
  – Science gateway developers
- Cyberinfrastructure “experts”
- TeraGrid personnel

SCHOOL OF INFORMATION UNIVERSITY OF MICHIGAN si.umich.edu METHODS

SCHOOL OF INFORMATION UNIVERSITY OF MICHIGAN si.umich.edu Workshops
- Strengths
  – Learn by seeing users interact
  – Gain both broad & detailed information
  – Relatively efficient
- Challenges
  – Require careful & creative planning (pre- and post-workshop)
  – Invite 3-4 times as many people as you want to participate in the workshop

SCHOOL OF INFORMATION UNIVERSITY OF MICHIGAN si.umich.edu Interviews
- Strengths
  – Gain detailed information
  – Information informs survey development
- Challenges
  – Time-consuming (to conduct & to analyze data)
  – Resources limit the number of people who can be interviewed

SCHOOL OF INFORMATION UNIVERSITY OF MICHIGAN si.umich.edu Interviewees by category (category: definition, number of interviewees)
- TeraGrid Users – Individual Researchers: individuals associated with a project that had a TeraGrid allocation at the time of the interview (26)
- TeraGrid Users – Science Gateway Developers: individuals who, on a day-to-day basis, spend some portion of their time working on a project designated as a TeraGrid Science Gateway (27)
- TeraGrid Personnel: individuals employed by one of the TeraGrid RP sites who have a formal or informal role in the TeraGrid project (26)
- Non-TeraGrid Users of HPC Resources: individuals who use HPC resources other than TeraGrid (3)
- Cyberinfrastructure Experts: individuals with extensive knowledge of high-performance computing (4)

SCHOOL OF INFORMATION UNIVERSITY OF MICHIGAN si.umich.edu Survey
- Advantages
  – Findings are generalizable across a larger population
  – Inexpensive (in $, not in person time)
- Challenges
  – Developing a good survey is hard
  – Getting a good response rate takes a lot of effort

SCHOOL OF INFORMATION UNIVERSITY OF MICHIGAN si.umich.edu Participant Observation
- Advantages
  – Learn about all the factors that affect the ability to serve users
  – Learn about user needs from a variety of sources
- Challenges
  – Time-consuming
  – Capturing and analyzing data

SCHOOL OF INFORMATION UNIVERSITY OF MICHIGAN si.umich.edu FINDINGS

SCHOOL OF INFORMATION UNIVERSITY OF MICHIGAN si.umich.edu Components of User Behavior
- The nature of the research problem
- Alignment between infrastructure and scientific practice
- Computational readiness
- Ease of use

SCHOOL OF INFORMATION UNIVERSITY OF MICHIGAN si.umich.edu Achieving Transformative Science
- “Easy” things can be showstoppers
- Many complexities to manage
  – virtual organization
  – diverse user needs
  – changes in science

SCHOOL OF INFORMATION UNIVERSITY OF MICHIGAN si.umich.edu Conclusions
- Different methods provide different kinds of information
- Involve more than users in your study
- Current methods are effective, but time-consuming and resource-intensive
  – New methods are required: for example, “mine” and analyze existing sources of information (wikis, user support logs and databases, user sites), as sketched below
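As a minimal illustration of the log-mining idea above, the sketch below aggregates a hypothetical CSV export of user support tickets to surface the most common problem areas. The file name and column names (support_tickets.csv, category, resolution_hours) are assumptions made for illustration only; they are not part of the TeraGrid study or any real support system.

```python
# Minimal sketch: mine an assumed CSV export of user support tickets.
# The column names "category" and "resolution_hours" are illustrative, not a real schema.
import csv
from collections import Counter
from statistics import median

def summarize_tickets(path):
    """Count tickets per category and collect resolution times (in hours)."""
    categories = Counter()
    resolution_hours = []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            categories[row["category"]] += 1
            if row.get("resolution_hours"):
                resolution_hours.append(float(row["resolution_hours"]))
    return categories, resolution_hours

if __name__ == "__main__":
    cats, hours = summarize_tickets("support_tickets.csv")
    print("Most common ticket categories:")
    for name, count in cats.most_common(5):
        print(f"  {name}: {count}")
    if hours:
        print(f"Median time to resolution: {median(hours):.1f} hours")
```

A summary like this could complement interviews and surveys by showing which problems users actually report, without requiring additional time from the users themselves.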

SCHOOL OF INFORMATION UNIVERSITY OF MICHIGAN si.umich.edu More Information
- TeraGrid evaluation study reports & Planning Process workshop reports (browse for documents by Ann Zimmerman)
- Other TeraGrid Planning Process materials

SCHOOL OF INFORMATION UNIVERSITY OF MICHIGAN si.umich.edu Acknowledgments
- TeraGrid
- Research participants
- NSF OCI grants

SCHOOL OF INFORMATION UNIVERSITY OF MICHIGAN si.umich.edu Contact Me!
- Sorry I couldn’t be here!
- Contact me at: