Assessment Review Committee Report
Research & Graduate Studies
Ron Mitchelson
May 5, 2014
Institutional Planning, Assessment & Research
Mentoring/Review Process
ARC Membership
– 7 ARC members (Mitchelson, Gemperline, Hall, Sanders, Epley, Van Scott, Morrissey/IA rep)
ARC Meetings
– August 23: initial ARC training
– September 11: reviewed two assessment reports together
– The group decided to email Ron with questions and then decide whether to meet again
Review Process
– Most reviews were done individually (Mitchelson and Morrissey did a few together)
– 1-4 units per reviewer
2012-13 Component Data
Component                     Developing   Acceptable   Proficient
Outcome                           16           13           27
Means of Assessment               10           25            –
Criteria for Success               9           22           28
Results                           19            –           22
Actions Taken                     27           13           19
Follow-Up to Actions Taken        28           11           15
Data Visualization
2012-13 Best Practices – “Closing the Loop” [Institute for Coastal Science & Policy – Research]
Outcome: ICSP will support research, instruction, and outreach by soliciting and securing funding from local, state, and national private and public sources.
Means of Assessment: Annually enumerate the number of grants of external research support received (new and continuing grants/contracts) by ICSP faculty and staff.
Criteria for Success: Each faculty member receives external research support in two years of each three-year period.
2012-2013 Results: On average, faculty have 2.8 funded grants each and access to $303,558 of funding each. Three faculty members are unfunded, one of whom has been unfunded for two of the last three years.
Actions Taken (based on analysis of results): The ICSP director has met with those faculty members who have not received funding in the past two years to discuss their research agendas and plans for future proposal submissions. As a result of developing a standard operating procedures manual for ICSP, grant proposal submissions and awards for external funding have become an expectation for faculty. Therefore, this means of assessment will be made inactive, and a new one that more clearly measures this expectation will begin in the 2013-14 reporting year.
2012-13 Best Practices – “Closing the Loop” [Office of Sponsored Programs]
Outcome: The Office of Sponsored Programs will improve the completeness of award files maintained in RAMSeS, the electronic records management system.
Means of Assessment: Award file completeness is measured by determining the level of completeness of award checklists, called Award Coversheets.
Criteria for Success: Based on a monthly examination of all Award Coversheets, 95% of the Award Coversheets should be 100% complete upon submission by the Grant Officer to the OSP Data Manager.
2012-2013 Results: Monthly examination of all Award Coversheets showed completeness rates ranging from 88% to 100%. There was no apparent link between coversheet completeness and months with heavy proposal activity.
Actions Taken (based on analysis of results): Factors contributing to incomplete Award Coversheets included missing CFDA numbers and missing IRB and IACUC documentation. Monthly review of the data with staff has raised awareness and led to steady improvement in the completeness of Award Coversheets.
2012-13 Best Practices – “Closing the Loop” [Office of Innovation & Economic Development – Public Service]
Outcome: The Office of Innovation & Economic Development will assist communities by increasing the capacity for public leadership within those communities.
Means of Assessment: Number of communities/local agencies enrolled in the Talent Enhancement and Capacity Building (TECB) program.
Criteria for Success: At least 10 communities represented in each cohort per fiscal year.
2012-2013 Results: In 2012-13, there were 13 new TECB communities.
Actions Taken (based on analysis of results): Expanded the core on-campus training curriculum to include local government leadership development and nonprofit organizational development. Provided project development and grant writing assistance to 15 TECB community partners, leveraging more than $7 million for local projects.
Substantive Changes (for example, all new Outcomes/MOAs or a reorganization of the assessment unit)
Program/Unit: The Office of Research Compliance Administration and the Office of Human Research Integrity were merged to create the Office of Research Integrity and Compliance, effective July 1, 2013.
Justification for Changes:
– Reduction in administrative costs (one fewer director)
– Improved coordination and communication to the research community regarding compliance trends, issues, policies, and procedures
Rubric and Review Process Feedback
What worked for your ARC?
– ARC team composition: level of expertise, willing participants
– Met soon after the initial training; most reviews were completed in a timely fashion
What difficulties were encountered?
– Tracking the movement of units and the timing of the review process (centers and institutes moving out of RGS)
– Several reviews were completed late, mostly because the reviewer forgot and then became busy with other responsibilities
– A few assessment units have not invested adequately in the process, which makes review difficult
Process for addressing “developing” and “acceptable” components
– In a few instances, the reviewer worked with the unit to address developing ratings, but this was not done consistently.