1 Academic Disciplines and Level of Academic Challenge Gary R. Pike University of Missouri–Columbia.


2 The Opportunity The MU benchmark score for Level of Academic Challenge was significantly lower than the benchmark score for our peer institutions (AAU Public Research Universities). This was particularly true for our seniors.

3 Step 1: Item Analysis Items with substantial differences: –Spending significant amounts of time studying (0.10) –Number of written papers of 20 pages or more (0.19) –Coursework emphasis: Analysis (0.08) –Coursework emphasis: Synthesis (0.12) –Coursework emphasis: Evaluation (0.17)

4 Step 2: Identifying Disciplinary Differences Rationale –A large body of literature indicates that different academic disciplines pose different types of academic challenge. –Identifying disciplinary differences may make it possible to target improvement actions at specific disciplines.

5 The Approach Ratcliff, Jones, and their colleagues developed a method of linking specific patterns of course taking with gains in general education. It should be possible to use a variation of this approach to identify disciplinary differences in Level of Academic Challenge items.

6 The Method Calculate mean scores on each item for each discipline using AAU data and self-reported major. –Majors are variables –Items are observations Cluster together majors with similar response profiles. Use discriminant analysis to identify how the clusters differ.
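
The clustering step above can be sketched in a few lines of pure Python. The majors, items, and numeric profiles below are hypothetical placeholders (not NSSE or AAU data); for simplicity majors are treated as rows of a majors-by-items table rather than the variables-versus-observations arrangement described on the slide, and single linkage stands in for whatever agglomeration rule the original analysis used.

```python
import math

# Hypothetical NSSE-style item means per major (illustrative numbers only).
# Columns: hours studying, papers of 20+ pages, coursework emphasis: evaluation.
item_means = {
    "Engineering":       (3.4, 2.1, 2.0),
    "Physical Sciences": (3.3, 2.0, 2.1),
    "Business":          (2.6, 2.4, 2.6),
    "Social Sciences":   (2.7, 2.5, 2.7),
    "Education":         (2.9, 3.0, 3.1),
    "Humanities":        (3.0, 3.1, 3.2),
}

def dist(a, b):
    """Euclidean distance between two item-mean profiles."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def cluster(profiles, k):
    """Single-linkage agglomerative clustering down to k clusters."""
    clusters = [[name] for name in profiles]
    while len(clusters) > k:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(dist(profiles[a], profiles[b])
                        for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] += clusters.pop(j)   # merge the closest pair of clusters
    return clusters

groups = cluster(item_means, 3)
print(groups)
```

With these illustrative profiles, the three clusters recover science/engineering, business/social science, and education/humanities groupings that mirror the pattern reported on slide 8.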

7 The Method (Continued) Calculate parallel cluster means for each item using only the MU data. Compare AAU and MU means –Are the response profiles similar? –Are there substantive differences in means between the AAU and MU clusters?
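
Once the clusters are fixed, the AAU-versus-MU comparison described above reduces to item-by-item differences in cluster means. In the sketch below the absolute means are invented for illustration; only the resulting gaps (0.18, 0.23, 0.23) echo differences reported later in the deck for MU's cluster 1, and the 0.10 flagging threshold is an assumption, not a rule from the presentation.

```python
# Hypothetical cluster-1 item means: the AAU and MU values are invented,
# chosen only so the gaps match differences reported later in the deck.
aau = {"time studying": 3.20, "papers 20+ pages": 1.90, "synthesis": 2.95}
mu  = {"time studying": 3.02, "papers 20+ pages": 1.67, "synthesis": 2.72}

# Positive gap = MU trails its AAU cluster on that item.
gaps = {item: round(aau[item] - mu[item], 2) for item in aau}

# Flag items where the shortfall is 0.10 or more (assumed threshold).
flagged = sorted(item for item, gap in gaps.items() if gap >= 0.10)
print(gaps, flagged)
```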

8 Results: Cluster Analysis Cluster 1: Science, Math, Engineering –Biological Sciences, Computer & Information Sciences, Engineering, Health-Related Fields, Mathematics, Physical Sciences, & Visual Arts. Cluster 2: ? –Agriculture, Business, Communication, General Studies, Public Administration, & Social Sciences. Cluster 3: ? –Education, Foreign Languages, Humanities, Interdisciplinary, & Parks and Recreation.

9 Results: Discriminant Analysis Function 1 (Cluster 1): –High on studying, class preparation, and application. –Low on number of texts, writing, and evaluation. Function 2 (Cluster 3 vs. Cluster 2): –(3) High on class preparation and synthesis. –(3) Low on analysis and application.
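
The discriminant step can be illustrated with a two-group Fisher discriminant computed by hand. The per-student scores below are fabricated stand-ins for science/engineering-like (cluster 1) and education/humanities-like (cluster 3) response patterns; a real discriminant analysis over three clusters would yield two functions rather than this single direction.

```python
# Hypothetical per-student scores on two items: (hours studying, evaluation emphasis).
group1 = [(3.2, 2.0), (3.4, 2.1), (3.1, 1.9), (3.3, 2.2)]   # cluster-1-like
group3 = [(2.8, 3.0), (2.9, 3.2), (2.7, 3.1), (3.0, 2.9)]   # cluster-3-like

def mean(pts):
    n = len(pts)
    return tuple(sum(p[i] for p in pts) / n for i in range(2))

def scatter(pts, m):
    """Within-group scatter matrix (2x2), summed over points."""
    s = [[0.0, 0.0], [0.0, 0.0]]
    for p in pts:
        d = (p[0] - m[0], p[1] - m[1])
        for i in range(2):
            for j in range(2):
                s[i][j] += d[i] * d[j]
    return s

m1, m3 = mean(group1), mean(group3)
s1, s3 = scatter(group1, m1), scatter(group3, m3)
sw = [[s1[i][j] + s3[i][j] for j in range(2)] for i in range(2)]

# Fisher direction w = Sw^-1 (m1 - m3), inverting the 2x2 matrix by hand.
det = sw[0][0] * sw[1][1] - sw[0][1] * sw[1][0]
diff = (m1[0] - m3[0], m1[1] - m3[1])
w = ((sw[1][1] * diff[0] - sw[0][1] * diff[1]) / det,
     (-sw[1][0] * diff[0] + sw[0][0] * diff[1]) / det)

def proj(p):
    """Score a student on the discriminant function."""
    return w[0] * p[0] + w[1] * p[1]
```

On these fabricated data, the function weights studying positively and evaluation emphasis negatively, separating the two groups along one axis much as Function 1 separates cluster 1 from the rest.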

10 Results for MU Seniors Cluster 1 (MU lower): –Time spent studying (0.18)* –Preparing for class (0.11)* –Papers of 20 or more pages (0.23) –Analysis (0.12) –Synthesis (0.23) –Evaluation (0.20) Cluster 1 (MU higher): –Assigned texts (0.10)

11 Results for MU Seniors Cluster 2 (MU lower): –Papers of 20 or more pages (0.13)* –Evaluation (0.11) Cluster 2 (MU higher): –Class preparation (0.15)* –Application (0.08)*

12 Results for MU Seniors Cluster 3 (MU lower): –Time spent studying (0.16) –Class preparation (0.07)* –Assigned texts (0.28) –Papers of 20 or more pages (0.29) –Papers of less than 20 pages (0.16) –Analysis (0.27)* –Synthesis (0.19)* –Evaluation (0.28) –Application (0.14)* Cluster 3 (MU higher): –Worked harder than you expected (0.16)

13 Conclusions Disciplines do make a difference, and the results for MU were generally consistent with those for other AAU institutions. MU's cluster 2 students were not different from cluster 2 students at other AAU institutions. MU's cluster 1 students were somewhat lower than cluster 1 students at other AAU institutions. MU's cluster 3 students were substantially lower than cluster 3 students at other AAU institutions.