Evaluation & Assessment


1 Evaluation & Assessment
Cindi Dunn and Wendi Stark Office of Educational Innovation and Evaluation (OEIE)

2 Project Evaluation Components
Process/Implementation (Years 1–5):
- Progress toward project goals and milestones
- Fidelity of Implementation
- Challenges
- Lessons Learned

Outputs:
- Project Participants
- Research
- External Engagement
- Scholarly Work: Publications, Conference Materials Generated
- Workforce Development Training: LINK, TGEF, REU, RET, SURE, Courses
- Competitiveness: Proposals, Seed Grants, SBIR/STTR, Patents/Licenses
- Multi-Discipline, Institution, Team Collaboration & Products

Outcomes/Impacts (Years 3–5):
- Transfer of Knowledge
- Use of Products and Tools
- Applied Practices
- Expanded Industry Partnerships
- New Hire and Seed Grant Recipient Products
- Changes in Skills & Knowledge
- CIMM Participants: Diversified Workforce, Retention and Graduation
- Increased Funding & Expanded Portfolio

3 CIMM Evaluation Resource (CIMMer)
Module completion:
- 94% of CIMM senior investigators completed modules (milestone = 100%)
- 77% of CIMM participants (post-docs, students) completed modules (milestone > 80%)

CIMMer modules: Background & Demographics; Research Participants; Publications & Presentations; Proposals; Intellectual Property; Honors & Awards; Noteworthy Activities; Internet Dissemination; Extended Visits & Internships; Collaborators & Partners; Building Networks; Outreach Activities; Research Infrastructure

Project Progress Survey:
- 77% of CIMM senior investigators completed the survey

4 Process/Implementation
Majority of milestones met or exceeded:
- Targets met for REU, RET, SURE grants and Seed Grants
- 89% of CIMM senior investigators attended at least one of the two annual meetings (metric = 90%)
- 100% of research groups represented at the Symposium (metric = 100%)
- All CIMM project participants report satisfaction with project progress and management (metric = 80%)

Survey ratings (5-point scale):

Year (response rate) | Process | Outcomes | Commitment | Efficiency | Effectiveness | Productivity
Year 1 (82%)         | 4.07    | 3.90     | 4.36       | 4.41       | 4.13          | 3.93
Year 2 (77%)         | 4.22    | 4.31     | 4.48       | 4.56       | 4.33          | 4.26

Every rating increased from Year 1 to Year 2; all Year 2 ratings are at or above 4.22.

Challenges reported:
- Coordination of CIMM activities and members
- Delays within institutions for account management and purchases

5 Outputs Year 2 CIMM Participants

6 Outputs (Years 1 - 2) Project Products
- SJR range: 0.19 – 3.68
- Average attendance: 231 per event
- Average attendance: 25 per event
- 6 Phase 1 and 2 awards funded for > $1.6M
- 20 funding agencies
- 5 NSF directorates

7 Outcome/Impact (Increase Dissemination)

8 Outcome/Impact (Multi-Institutional Collaboration)
Overall, project participants reported more new collaborations (shown as red lines) in Year 2 (56.7%) than in Year 1 (28.05%).

Year 2 also showed fewer disconnected components than Year 1: there were five distinct components in Year 1 but only two in Year 2. Compared to Year 1, network members were better integrated in Year 2.

Compared to Year 1, fewer bridges (shown by boxes) were observed in Year 2. A bridge is any node that is the only connection to the rest of the network for other nodes. In Year 1, several interconnected network members from SUBR were connected to the rest of the network through a single participant from LSU; as a bridge, removing this LSU member would largely disconnect the SUBR project members from the network. Similarly, several network members from SUBR and LSU were connected to the rest of the network only through a single participant from LA Tech; again, as a bridge, this LA Tech member was key to connecting multiple distinct components.

While a bridge was also observed among Year 2 responses, the components it connected were less evenly matched. The Year 2 bridge member was from Grambling, and removing this participant would have effectively disconnected all Grambling individuals from the network; however, removing this member does little to change the network for the other institutions. Notably, several simultaneous connections were observed among respondents from LA Tech, LSU, and UNO.
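The "bridge" nodes described above correspond to what graph theory calls articulation points (cut vertices): nodes whose removal disconnects part of the network. As an illustration only (this is not the project's actual analysis code, and the node names are hypothetical, mimicking the Year 1 SUBR/LSU pattern), a minimal sketch using the standard depth-first-search low-link method:

```python
from collections import defaultdict

def articulation_points(graph):
    """Return nodes whose removal disconnects the network
    (the slide's 'bridges'), via the DFS low-link algorithm."""
    disc, low, points = {}, {}, set()
    timer = [0]

    def dfs(u, parent):
        disc[u] = low[u] = timer[0]
        timer[0] += 1
        children = 0
        for v in graph[u]:
            if v == parent:
                continue
            if v in disc:                      # back edge to an ancestor
                low[u] = min(low[u], disc[v])
            else:
                children += 1
                dfs(v, u)
                low[u] = min(low[u], low[v])
                # non-root u is an articulation point if subtree at v
                # cannot reach above u without going through u
                if parent is not None and low[v] >= disc[u]:
                    points.add(u)
        if parent is None and children > 1:    # root with 2+ DFS subtrees
            points.add(u)

    for node in list(graph):
        if node not in disc:
            dfs(node, None)
    return points

# Hypothetical Year-1-style network: a SUBR cluster reaches the rest
# of the network only through SUBR_b, and the LSU/LA Tech cluster is
# reachable only through LSU_x.
edges = [("SUBR_a", "SUBR_b"), ("SUBR_a", "SUBR_c"), ("SUBR_b", "SUBR_c"),
         ("SUBR_b", "LSU_x"), ("LSU_x", "LSU_y"),
         ("LSU_x", "LaTech_z"), ("LSU_y", "LaTech_z")]
graph = defaultdict(list)
for a, b in edges:
    graph[a].append(b)
    graph[b].append(a)

print(sorted(articulation_points(graph)))  # the two bridge members
```

Removing either `SUBR_b` or `LSU_x` splits this toy network in two, exactly the situation the Year 1 map showed; every other node can be removed without disconnecting anyone.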

9 Outcome/Impact (Collaborative Products)
Notes: counts include only published work. Multi-campus = authors may be from another CIMM campus, a non-CIMM campus in Louisiana, outside Louisiana, or abroad. CIMM campus = publications with authors from other CIMM campuses. It should be noted that 78% of STT1 and 45% of STT2 papers include another CIMM author (at the author's own campus or another campus); most CIMM author collaboration appears to be within the same campus. Looking at collaborative publications, presentations, and proposals across the project, most are within the same campus.

Goals:
- 5 publications/conference proceedings acknowledge ICME (Year 1 = 2)
- 50% of CIMM senior investigators/research groups adopt ICME tools and protocols by Year 4 (Year 1 = 19%; Year 2 = 79%)
- 12% of CIMM senior investigators report contribution of CIMM data to partnering ICME portals (Year 1 = 5%)

10 Outcome/Impact (Changes in Skills & Knowledge)
CIMM students: mean ratings > 4.12 on a 5-point scale.

Exposure to Multiple Disciplines:
- Locate information from other disciplines that is relevant to my research
- Apply research skills to various disciplines
- Value perspectives from other disciplines

Collaboration in Research:
- Increased research quality is worth the effort to overcome barriers to collaboration
- Importance of dividing work among team members
- Listen to others' ideas

Research students share that, as a result of their CIMM experience, they have gained skills in many areas. The highest-rated areas relate to the importance of multiple disciplines and collaboration in research, skills that will be expected and helpful as they enter the workforce. CIMM is preparing students to be interdisciplinary researchers.

11 Outcome/Impact Example (Retention and Graduation)
CIMM students:
- ~50% stated the experience defined or enhanced their career choice
- Graduates: 14 graduate students (2 in higher-education positions; 4 Ph.D.); 19 undergraduate students (5 to industry, 3 to graduate school)

REU students:
- ~95% state they are more inclined to stay in a STEM field
- Graduates: 4 undergraduate students (1 to industry)

12 Reporting for May 2017 – April 2018
Next Steps: Reporting for May 2017 – April 2018
- Open CIMMer: July
- Deadline to Submit Participants: February 2018
- Close CIMMer: March 2018

Evaluation Champions:
- GSU: Pedro Derosa
- LaTech: Ramu Ramachandran
- LSU: Dimitris Nikitopoulos
- UNO: Paul Schilling
- SUBR: Patrick Mensah

13 Evaluation & Assessment
Cindi Dunn and Wendi Stark Office of Educational Innovation and Evaluation (OEIE)

