Reconsidering GenEd: The Most Direct Path to Institutional Value
- Note that "value" does not oppose "quality."
- GenEd is our one chance to create the "Citizen-Scientist": future voters, homeowners, neighbors, business leaders, and politicians.
- It also supports majors/minors recruitment.
Possible Models: Traditional, Hybrid, Online
- Penn State: Geology of the National Parks (see examples on the SERC website)
- Earth Hazards (n > 150)
- CSU Fullerton Freshman Programs: developed for majors recruitment, but more significant for increasing our institutional value
  - Depth vs. breadth: "Shake and Bake"; Dinosaur World
  - Enrollment limited to freshmen; target undeclared freshmen
  - 3-day field trip funded by Associated Students, Inc.
  - Integrated lab
Size Counts! Technology, Teaching Techniques, Successful Implementation
- Technology: classroom facilitation/management programs; clicker surveys; online quizzes
- Teaching techniques: think-pair-share (discussion); What-Do-You-See? (writing requirement)
- Successful implementation: use your best instructors; revisit the Personnel Document?; assign WTU proportional to enrollment (need to track enrollment rates)
Details to Consider: Rewards and Constraints
- Rewards: faculty release time for development; flexible scheduling that accommodates meetings and field work; be creative and teach something you love
- Constraints: available room sizes/times; university technology support (testing center; Blackboard enrollment/support)
- Advertisement: Advising Center, community colleges, Freshman Programs
Keys to Success
- PLANNING, ACTION, ASSESSMENT
- Compare planning, action, and assessment to the research process: research plan, experiment, analysis (what worked)
Looking Ahead to Learning Assessment
- Not mysterious: we do this every day in our research lives.
- It is no more than assuring ourselves that we are accomplishing the goals we set.
- Both formative and summative.
Instructional Program Assessment
Common elements of an IPA:
- Assessment of student learning in individual courses, at specific points in the degree (benchmark exams), and in capstone experiences
- Analysis of learning objectives in the entire degree program
- Perception surveys regarding student learning
Student Learning: Individual Courses
- Learning objectives are first defined for each course.
- "Successful completion of the course satisfactorily demonstrates student mastery of the learning objectives," assessed by class GPA or portfolios (e.g., UVermont lab portfolios).
Student Learning: Individual Courses
- Pre-/post-testing: mastery of SLOs assessed by changes in scores [need link to examples]
- Tracking of annual student achievement data for high-enrollment, "standardized" classes (e.g., Indiana University of Pennsylvania, Concise Assessment Plan 2008, re. introductory classes, in table)
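One common way to summarize pre-/post-test score changes is the Hake normalized gain: the fraction of the possible improvement a student actually realized. The sketch below is illustrative only; the function name and the paired scores are hypothetical, and the slide does not prescribe this particular statistic.

```python
def normalized_gain(pre: float, post: float, max_score: float = 100.0) -> float:
    """Hake normalized gain: g = (post - pre) / (max - pre).

    Undefined when the pre-test score is already at the maximum.
    """
    if pre >= max_score:
        raise ValueError("pre-test score is already at the maximum")
    return (post - pre) / (max_score - pre)

# Class-average gain over hypothetical (pre, post) score pairs:
scores = [(40, 70), (55, 80), (30, 45)]
gains = [normalized_gain(pre, post) for pre, post in scores]
class_gain = sum(gains) / len(gains)
```

Averaging per-student gains (rather than computing one gain from class means) keeps the statistic insensitive to which students happened to start high or low.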
Student Learning: Benchmark Exams
- Annual exams (e.g., Winona State). Issue: how to keep students serious about taking a demanding, ungraded exam?
- Exit exams (e.g., ASBOG). Issue: does the content reflect your degree's SLOs?
  - commercial flashcards
- ASBOG = National Association of State Boards of Geology
Student Learning: Capstone Experiences
- Research experiences/theses, assessed by:
  - satisfactory completion (standardized rubrics can be used by advisors)
  - professional presentations (abstracts, papers), with assessment provided by the peer-review process
  - in-house presentations: department seminar; college publication of student research
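A standardized advisor rubric reduces to a weighted mean of criterion scores. The sketch below assumes a hypothetical four-criterion thesis rubric on a 1–4 scale; the criteria and weights are invented for illustration, not taken from any rubric the slides reference.

```python
# Hypothetical thesis rubric: criterion -> (weight, advisor's score on a 1-4 scale).
rubric = {
    "research question": (0.2, 3),
    "methods":           (0.3, 4),
    "data analysis":     (0.3, 2),
    "writing":           (0.2, 3),
}

def weighted_score(rubric: dict) -> float:
    """Weighted mean of criterion scores (weights need not sum to 1)."""
    total_weight = sum(w for w, _ in rubric.values())
    return sum(w * s for w, s in rubric.values()) / total_weight

overall = weighted_score(rubric)  # about 3.0 on the 4-point scale
```

Keeping per-criterion scores (not just the overall number) is what makes the rubric useful for program assessment: weak criteria aggregated across students point at specific curricular gaps.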
Student Learning: Capstone Experiences
- In-house presentations (continued): Research Day
  - assessed using rubrics, or consider awarding prizes in categories (undergraduate vs. graduate; proposals vs. completed projects)
  - judged by industry representatives, the Advisory Board, and/or colleagues from neighboring institutions (e.g., community college outreach)
Student Learning: Capstone Experiences
- Field camp, assessed by:
  - satisfactory completion (CSU Chico rubric; Miami University rubric)
  - a final field-based exam
Instructional Program Assessment
Common elements of an IPA:
- Assessment of student learning in individual courses, at specific points in the degree (benchmark exams), and in capstone experiences
- Analysis of learning objectives in the entire degree program (mapping the curriculum)
- Perception surveys regarding student learning
Assessment of the Degree Program: CSUF
- The mapping exercise may serve as a useful springboard for redesigning the curriculum, faculty hires, resource allocation…
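The mapping exercise above amounts to a courses-by-objectives matrix: for each course, record which program SLOs it addresses, then look for objectives nothing covers. A minimal sketch, with hypothetical course numbers and SLO names (not CSUF's actual map):

```python
# Hypothetical curriculum map: course -> program SLOs it addresses.
curriculum_map = {
    "GEOL 101": {"content knowledge"},
    "GEOL 310": {"content knowledge", "quantitative skills"},
    "GEOL 490": {"quantitative skills", "communication"},
}
program_slos = {"content knowledge", "quantitative skills",
                "communication", "field methods"}

covered = set().union(*curriculum_map.values())
gaps = program_slos - covered  # SLOs no course addresses
```

Here `gaps` flags "field methods" as unaddressed, which is exactly the kind of finding that motivates a curriculum redesign, a new hire, or a resource request.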
Instructional Program Assessment
Common elements of an IPA:
- Assessment of student learning in individual courses, at specific points in the degree (benchmark exams), and in capstone experiences
- Analysis of learning objectives in the entire degree program
- Perception surveys regarding student learning
Student Learning: Perception Surveys
- User surveys:
  - student surveys: exit surveys/interviews; alumni surveys
  - employer surveys
- Examples are posted on the SERC site: Program Metrics and Instruments
Student Learning: Perception Surveys
- Exit surveys and interviews:
  - surveys given in classes populated by seniors, or administered by web or mail
  - interviews conducted by the Chair or by an advisor
Student Learning: Perception Surveys
- Alumni surveys: collect similar data over an extended time period; consider separating data by graduation date (e.g., 3-yr, 10-yr, 20-yr alumni only)
- Lists of Exit/Alumni Survey Questions
Student Learning: Perception Surveys
- Alumni surveys (continued):
  - add one-time questions to inform current issues regarding curriculum development, scheduling, outreach, etc.
  - solicit input regarding other graduate programs for use in advising current seniors
- Lists of Exit/Alumni Survey Questions
Student Learning: Perception Surveys
- Employer survey: brainstorm possible survey questions:
  - How many graduates are employed? What are their strengths/weaknesses? How do they compare to graduates of other programs?
  - Consider incentives to participate: the possibility of improving the "product"; the opportunity to give expert advice
- Advisory Boards [SERC link?]
Student Learning: Surveys
- Web-based survey instruments: SurveyMethods, Zoomerang, SurveyMonkey
- How to "code" qualitative responses
- Note: "Guidelines for maximizing response rates"
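"Coding" open-ended survey responses means assigning each response to one or more thematic codes, then tallying codes across respondents. In practice the codes come from human raters; the sketch below only illustrates the bookkeeping, using a hypothetical keyword-based codebook and invented responses.

```python
from collections import Counter

# Hypothetical codebook: theme code -> keywords that trigger it.
codebook = {
    "field experience": ["field camp", "field trip", "mapping"],
    "advising":         ["advisor", "advising", "mentoring"],
    "facilities":       ["lab", "equipment", "computers"],
}

def code_response(text: str) -> set:
    """Return every code whose keywords appear in the response."""
    text = text.lower()
    return {code for code, kws in codebook.items()
            if any(kw in text for kw in kws)}

responses = [
    "Field camp was the highlight; my advisor was great.",
    "More modern lab equipment would help.",
]
tally = Counter(code for r in responses for code in code_response(r))
```

The resulting tally ("N respondents mentioned advising") is what turns qualitative feedback into a number an annual assessment report can track year over year.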