Assessing Research Management at Canada’s National Research Council – AEA/CES Conference 2005, Crossing Borders, Crossing Boundaries, R&D Topical Interest Group

Presentation transcript:

Assessing Research Management at Canada’s National Research Council
AEA/CES Conference 2005 – Crossing Borders, Crossing Boundaries
R&D Topical Interest Group – Presentation
Toronto, October 28, 2005

Presentation Outline
- About the National Research Council of Canada (NRC)
- Challenges in Research Management
- What is the Research Management Self-Assessment (RMSA) Tool?
- Development of the Tool
- Progress to Date, and Moving Forward

The National Research Council of Canada – Science at Work for Canada
Government of Canada’s leading resource for scientific research, development and technology-based innovation

Research and Development
- Basic/fundamental to applied
- Technology development
- National R&D infrastructure
- R&D training
- International collaborations

Innovation Systems and Support
- Regional innovation systems
- Information and knowledge networks
- National codes and standards

Economic value
- Incubators and start-up support
- Knowledge and technology transfer/diffusion
- Commercialization and new company creation
- SME support

Resources ( ):
- Full-time staff (FTE): 4,100+
- Expenditures: $712.4M
- Appropriations spent: $653.0M
- Revenues: $75.2M
- Infrastructure: 175 buildings, 517,407 m2

Research areas include: Biotechnology, Information and Communications Technologies, Aerospace, Manufacturing, Construction, Ocean engineering, Genomics, Fuel cells, Bioinformatics, High-performance computing, Photonics, Nanotechnology, Environmental and sustainable development technologies

20 institutes across Canada, 2 technology centres, IRAP, CISTI

Our Research Environment

Some Challenges in Managing Research

More generic:
- Uncertainty about where the most valuable discoveries lie
- Risks and costs associated with uncertainty
- Unpredictable results given the pursuit of the unknown
- Difficulty in assessing the contribution/impact of research results
- Often long timeframes before outcomes or impacts of research become evident

Others facing NRC:
- Balancing creativity and accountability (value for Canada)
- Managing high-quality personnel within the federal framework
- Planning in a dynamic environment
- Prioritizing with multiple stakeholder interests
- Managing multiple collaborations
- Establishing attribution of results in collaborative efforts
- Balancing cluster activities, horizontal program requirements and institute research activities

What is RMSA?
The Research Management Self-Assessment (RMSA) is a practical, diagnostic tool developed by NRC for use by its institutes in support of continuous improvement and effective management of research and research-related activities.

The RMSA:
- A framework and codified research management practices against which institutes can assess themselves
- Complementary to NRC’s annual performance reporting and program evaluation
- Helps to identify links to other corporate resources and tools

Furthermore, RMSA allows:
- A common language across NRC
- Knowledge sharing and exchange
- A non-threatening, honest, open self-examination for institutes
- Engagement of research and related staff
- Opportunity for management development

RMSA Framework (August 2005)

RMSA Development

How Does RMSA Work?

1. Assessment Planning – Establishing the institute-specific context: anticipated issues/challenges, participants, timing, communications… Output: positioning of RMSA within the Institute.
2. Pre-Assessment Session – Awareness building and prioritization of topics for assessment; includes presentation, discussion, and completion of a questionnaire. Output: prioritized categories for assessment.
3. Assessment Workshop – Facilitated session to establish the current and desired future state, focused on the priority areas established in step 2. Diagnostic results are qualitative and quantitative (based on voting). Output: key areas identified for improvement.
4. Follow-up Action – Planning and implementation of follow-up actions, determined based on prioritized opportunities from the assessment results. Monitor progress against the action plan, and re-assess as needed for other categories over time.

Sample Definition
Governance: Within our Institute, roles and responsibilities of management and professional staff have been clearly defined, communicated and are understood by those involved. Managers have authorities commensurate with their responsibilities, and are accountable for their actions. A comprehensive approach to governance is in place, including:
- use of a defined and understood accountability framework;
- use of an independent, expert advisory body;
- documenting and communicating decisions in a transparent manner; and
- having a fair and independent process to manage conflicts of interest.

The Assessment Scale

Level – Description
1 – The practices described do not exist within our Institute, and appear to describe an ideal state that is beyond our reach at the present time.
2 – A few of the defined practices have been adopted at the Institute where I work; however, these are not applied in a uniform fashion.
3 – Several of the defined practices are established within the Institute where I work, but gaps still exist.
4 – The defined practices are the norm at the Institute where I work, and part of our culture.
5 – Not only are the defined practices the norm at the Institute where I work, but we were early adopters and continue to seek ways to improve and share our practices with others in this area.
9 – Do not have sufficient knowledge – cannot comment.

Discussion to focus on: 1) strengths (what is done well); 2) what can be done better (opportunities); and 3) some challenges/barriers to keep in mind. The qualitative information is key – the numbers provide some scale of the gap that exists (“as is” and “to be”) to help in prioritizing issues.
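Because the workshop voting is the only quantitative part of the diagnostic, a small script can illustrate how the “as is” versus “to be” gap might be summarized to support prioritization. This is a minimal sketch, not part of the RMSA tool itself: the category names and vote values are hypothetical, the use of the median is an assumption, and level 9 (“cannot comment”) votes are simply excluded before summarizing.

```python
from statistics import median

# Hypothetical workshop votes per category: (current "as is", desired "to be").
# A value of 9 means "cannot comment" and is excluded from the calculation.
votes = {
    "Governance": [(2, 4), (3, 4), (9, 9), (2, 5)],
    "Planning": [(3, 4), (3, 3), (4, 4)],
    "Collaboration": [(1, 4), (2, 5), (2, 4)],
}

def gap_summary(pairs):
    """Return median current level, median desired level, and the gap between them."""
    valid = [(cur, des) for cur, des in pairs if cur != 9 and des != 9]
    cur = median(c for c, _ in valid)
    des = median(d for _, d in valid)
    return cur, des, des - cur

# Rank categories by the size of the "as is" vs "to be" gap, largest gap first.
ranked = sorted(votes, key=lambda cat: gap_summary(votes[cat])[2], reverse=True)
for cat in ranked:
    cur, des, gap = gap_summary(votes[cat])
    print(f"{cat}: as-is {cur}, to-be {des}, gap {gap}")
```

As the slide notes, the numbers only size the gap; the qualitative discussion of strengths, opportunities, and barriers remains the core of the assessment.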

Progress to Date and Moving Forward
- Pilot testing of content and process
- Results to date have been positive:
  - Process has been well received
  - Adjustments are expected to refine the content and framework
- Pilots to be completed by late Fall 2005
- Recommendations to be made to NRC senior executives regarding:
  - broader rollout of the tool based on pilot experiences
  - opportunities for corporate follow-up to address cross-cutting issues/gaps (e.g., training, corporate resources and support)

Some Messages to Date
- Keep the tool flexible rather than policy – cannot have a cookie-cutter approach
- Use of the tool by institutes in two ways:
  - as a quick checklist (Are we doing this or not?)
  - as a full assessment (How well are we doing in these various areas?)
- RMSA should be positioned for use as an internal management tool rather than as a means of measuring institute performance and resource allocation
- Accountability needs to be built in for follow-through on improvement
- More support could be provided at the corporate level to institutes for research management – specifics will emerge with assessment activities

Questions?
For further information:
Flavia Leung
National Research Council of Canada
Website: