Designing Assessment and Institutional Research for a New Institution
Georgia Gwinnett College
Lily Hwang, Director, Institutional Research
Juliana Lancaster, Director, Institutional Effectiveness
Origins
4-year state college in the University System of Georgia
Authorized by the Georgia Legislature in May 2005
President hired in September 2005
Campus opened with 118 students and 10 faculty in August 2006
Home of the Grizzlies!
Current Status
Students:
Fall 2007 enrollment: 787 headcount
Spring 2007 enrollment: 867 headcount
Fall 2008 enrollment: 1,563 headcount
Faculty (Fall 2008):
Instructional full-time faculty: 120
Instructional part-time faculty: 10
Facilities:
6 buildings: A, B, C, D (Student Services Ctr), E (Valentine Bldg), F (Fitness Ctr); Building E not yet occupied
Total: 474,351 square feet
Parking deck: 734 cars
Total acreage: >200
Four degree programs: BBA, Business; BS, Biology; BS, Information Technology; BS, Psychology
Reimagining Higher Education for the 21st Century
Commitment at every level to student learning and effectiveness
Institutional focus on interdisciplinary/integrated education
Openness to going "outside the box," provided there is a plan for assessment
Created the opportunity for a ground-up design of an INSTITUTIONAL assessment plan and of well-integrated institutional research functions
Institutional Effectiveness:
Initial Design
The First Full Year
Lessons Learned
Next Steps
Institutional Effectiveness: Initial Design
Advantages of starting from scratch:
Strong executive-level support for and understanding of IE
Limited number of programs and offices at start-up
Absence of legacy or standing processes and structures
Disadvantages of starting from scratch:
Each individual brings a different set of assumptions and expectations
Rapid growth and hiring leads to a continuous need for explanation/education
Institutional Effectiveness: Initial Design (2006-07)
To achieve the "…ongoing, integrated, and institution-wide research-based planning and evaluation processes…" required by SACS, we needed:
Structure and resources
Broad buy-in, consensus, and agreement
Working "ground rules":
Institution-wide and pervasive
Integrated with the institution's mission and strategic plan
Faculty/staff participation and basic control
Interdisciplinary and developmental assessment of student learning
Institutional Effectiveness: Initial Design (2006-07)
Program-level student learning outcomes and assessment plans
General Education curriculum designed around student learning outcomes
Agreement to develop and assess institutional student learning outcomes
Agreement to integrate curricular and co-curricular student learning efforts
Leading to: Integrated Educational Experience (IEE) Student Learning Outcome Goals for GGC
Institutional Effectiveness: Continuing Design (2007-08)
Conceptual Relationships Among Outcome Goals and Objectives (hierarchy diagram): Institutional Goals; Administrative Unit Outcome Goals; Integrated Educational Experience SLO Goals; General Education Goals; Program of Study Goals; Student Affairs Goals; Course Goals; Student Affairs Activity Goals; Lesson Objectives
Institutional Effectiveness: Continuing Design (2007-08)
Organizational Structure to Manage the Resulting Flood of Data (organizational chart):
Assessment Steering Committee: integrated review of all assessment results; strategic analysis of results and their impact on strategic plans
IEE Assessment Review Committee: communication; integrated review of IEE assessment results
Administrative Review Committee
General Education Committee
General Education Goal Teams, IEE Goal Team, and Program Goal Teams: interdisciplinary; operationally define and plan assessment(s); integrated review of program findings
Institutional Effectiveness: The First Full Year
Planning
All operating units, both academic and administrative, developed assessment plans.
Academic units focused on course-level, embedded assessments.
All faculty and numerous staff engaged in discussing and planning assessment.
Goal teams developed operational definitions of each institution-level student learning outcome (GE and IEE).
Institutional Effectiveness: The First Full Year
Execution
All units attempted to fully execute their assessment plans.
Some outcomes were not measurable; some measures called for unobtainable data.
All units were able to collect valid data on at least one outcome.
Most units were able to identify at least one needed action in response to assessment:
60% identified needed changes in curriculum or operations
34% identified needed changes in assessment plans
Institutional Effectiveness: Lessons Learned
Challenges & Lessons Learned
Implementing program-level assessment plans while still developing the institutional framework
Communicating the history of, and basis for, having both General Education and IEE student learning outcomes at the institutional level
Articulating the initial task of the Goal Teams: to operationally define each Student Learning Outcome
Managing expectations at multiple levels
Institutional Effectiveness: Next Steps
Review the conceptual and actual relationships between the two sets of institution-wide student learning outcomes
Initiate a campus-wide discussion about whether to make changes and, if so, what those might be
Continue developing a broad base of informed, skilled individuals across campus to lead assessment efforts
Continue efforts to establish systematic, manageable assessment at all levels
Institutional Research:
Unique Setting/Environment
Major Tasks
Major Challenges
Institutional Research
Institutional Environment
Banner-hosted institution: the technical environment is located centrally at the Office of Information & Instructional Technology (OIIT)
Internal support available for IR: a core data manager (Banner functional person) and a programmer (IT)
From the outset, GGC has been a Banner-hosted institution (one of the first cohort). The Institutional Research Office therefore takes on greater responsibility, working with the Registrar's Office internally and with OIIT directly for institutional data validation, management, and reporting.
Institutional Research
Major Tasks
Learn the legacy data systems, e.g., the Student Information Reporting System (SIRS) and Curriculum Inventory Reporting (CIR)
Learn USG reports and their definitions, e.g., the Semester Enrollment Report (SER)
Learn the new Academic Data Mart (ADM) system
Produce reports (routine and ad hoc, internal and external)
Produce the College Fact Book
The major tasks of our first year included managing the transition from the legacy system (SIRS/CIR) to ADM reporting and producing the first year's College Fact Book.
Institutional Research
New Major Tasks
Began IPEDS reporting
Began many other surveys:
CUPA Faculty Salary Survey (began earlier)
National Postsecondary Student Aid Study (did not have data due to non-Title IV status at the data point)
Consortium for Student Retention Data Exchange (CSRDE) and National Student Clearinghouse, supported by USG
Institutional Research
Major Challenges
Entering the transition period from the legacy data system to the new ADM system, which allowed only a very brief learning curve
Learning together with other units, e.g., the Registrar's Office and Human Resources, which requires close working relationships
Institutional Research
Example: a collaborative effort to establish a CIP list representing GGC's teaching disciplines/areas
Why is this important for GGC? GGC does not have departments; its academic structure is School >> Major (program) >> Track/Concentration.
Institutional Research
IE and IR
Like every unit of GGC, IR operates within the college framework that IE facilitates and monitors.
Specific tasks for IR in support of IE operations:
Institutional information requests for accreditation purposes
Information support for assessment projects, e.g., NSSE and course evaluations
Anticipated tasks for IE in support of IR:
Providing benchmark and assessment data for the Fact Book
Collaboration in the design of specific studies
Questions & Comments
THANK YOU!
Presenters:
Juliana Lancaster, Director, Institutional Effectiveness
Lily Hwang, Director, Institutional Research