2008 FAEIS Annual Longitudinal Assessment With a Comparison to the 2007 Survey Results

Presentation transcript:

2008 FAEIS Annual Longitudinal Assessment With a Comparison to the 2007 Survey Results
The purpose of the FAEIS annual evaluation is to develop longitudinal and trend analyses that assess progress and guide the development of future program components.

Objectives
- Describe selected demographic and other variables that have potential association with FAEIS use.
- Determine the level of usefulness of FAEIS data resources and applications.
- Determine the level of satisfaction with selected components of FAEIS.
- Determine levels of adoption of FAEIS and factors that explain the levels of adoption.
- Determine the relationship of selected demographic and usage variables (involvement, use, experience, primary job title) with perceived usefulness and satisfaction with FAEIS.

Methodology
- Panel of Experts – 7 FAEIS project staff, USDA representatives, and the chair of the FAEIS Panel
- FAEIS Panel review and approval
- Instrument development
- Population: 1,070 users; 285 respondents (a response rate of roughly 27%)
- Follow-up reminders

2008 Participants
- Staff
- Faculty
- Department Head
- College Associate/Assistant Deans
- College Dean or VP
- University-level administration
- “Government” (added with recoding)

Question 1 (2008). My involvement with FAEIS includes: (select as many as apply; all possible combinations displayed; N=285)

Question 1. My Involvement with FAEIS Includes: (select as many as apply; all possible combinations displayed; N=285)
- Data Entry
- User of FAEIS Products: 8 (2007); 9.1 (2008)
- Data Entry & Analytical: 7 (2007); 7.7 (2008)
- Analytical & User of FAEIS products: 3 (2007); 6.0 (2008)
- Analytical: 6 (2007); 5.6 (2008)
- Data entry & User of FAEIS products: 5 (2007); 5.6 (2008)
- None: 3 (2007); 2.8 (2008)
- Data entry, Analytical, & User of FAEIS products: 4 (2007); 2.5 (2008)
- No answer: 2 (2007); 2.5 (2008)
- Analytical, Product developer, & User of FAEIS products
- Data entry & Product developer: 3 (2007); 1.1 (2008)
- Data entry, Analytical, Product developer, & User of FAEIS products: 2 (2007); 0.7 (2008)
- Data entry, Analytical & Product developer: 2 (2007); 0.4 (2008)
- Analytical & Product developer
- Product developer: 1 (2007); 0.4 (2008)
- Product developer & User of FAEIS products

Question 1. Involvement with FAEIS of survey respondents (“None” and “No answer” responses eliminated)
Involvement Type | 2007 Percent of Total (N=308) | 2008 Percent of Total (N=285)
Data Entry | 81 | 75
User of FAEIS Products | 21 | 27
Analytical | 25 | 15
Product Developer | 9 | 5
Analysis – Compared with 2007, the 2008 respondents included fewer data-entry users and more users of FAEIS products, but fewer respondents reported analyzing the data or developing their own products.
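The analysis above compares the 2007 and 2008 percentages descriptively; the slides do not report a significance test for these year-to-year shifts. As a purely illustrative sketch (not part of the FAEIS evaluation itself), the snippet below shows how a two-proportion z-test could check such a shift. The respondent counts are back-calculated from the reported percentages (81% of 308 in 2007 vs. 75% of 285 in 2008 for data entry), and scipy is assumed to be available.

```python
# Hypothetical two-proportion z-test for the change in the share of
# respondents reporting "data entry" involvement (2007 vs. 2008).
# Counts are back-calculated from the reported percentages, so they are
# approximate; this illustrates the technique, not the evaluation's own analysis.
from math import sqrt
from scipy.stats import norm

n_2007, n_2008 = 308, 285            # survey respondents each year
x_2007 = round(0.81 * n_2007)        # ~249 data-entry respondents in 2007
x_2008 = round(0.75 * n_2008)        # ~214 data-entry respondents in 2008

p1, p2 = x_2007 / n_2007, x_2008 / n_2008
p_pool = (x_2007 + x_2008) / (n_2007 + n_2008)              # pooled proportion under H0
se = sqrt(p_pool * (1 - p_pool) * (1 / n_2007 + 1 / n_2008))

z = (p1 - p2) / se
p_value = 2 * norm.sf(abs(z))        # two-sided p-value
print(f"z = {z:.2f}, p = {p_value:.3f}")
```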

Question 2 (2008). How often have you used FAEIS in the past 12 months? (N=285)

Question 2. How often have you used FAEIS in the past 12 months? (2007 Percent of Total, N=308; 2008 Percent of Total, N=285)
- Not at all
- 1-2 times
- 3-4 times
- More than 4 times
- No answer: 0.3 (2007); -- (2008)
Analysis – slight increase in the more frequent use categories (3-4 times and more than 4 times) in 2008.

Question 3 (2008). My total experience using FAEIS is: (N=285)

Question 3. My total experience using FAEIS is: (2007 Percent of Total, N=308; 2008 Percent of Total, N=285)
- More than 3 years
- 1 to 3 years
- Less than 1 year
- No answer: 2 (2007); 2.5 (2008)
Analysis – a greater share of “experienced” users (more than 3 years) and a greater share of “less experienced” users (less than 1 year) in 2008 compared with 2007.

Question 4 (2008). My professional position is best described as: (N=285)

Question 4. My professional position is best described as:
- Staff
- Faculty: 18 (2007); 7.7 (2008)
- Department Head: 6 (2007); 19.3 (2008)
- College Associate/Assistant Dean
- College Dean or VP: 5 (2007); 7.7 (2008)
- University-level Administration: 5 (2007); 2.5 (2008)
- Government: -- (2007); 0.7 (2008)
- No answer: 1 (2007); -- (2008)
(2007 Percent of Total, N=308; 2008 Percent of Total, N=285)
Analysis – More administrators (non-staff/faculty) participated in the survey in 2008 (44.9%) than in 2007 (32%).

Question 5 (2008). My level of adoption (use) of FAEIS is best described as: (N=285)

Question 5. “My level of adoption (use) of FAEIS is best described as:” (“No answer” responses eliminated from analysis; 2007 Percent of Total, N=290; 2008 Percent of Total, N=285)
- 1 = Unaware
- 2 = Aware of it but do not use it
- 3 = Use it but not sure of long-term use
- 4 = Use it and it is integral to my job
- 5 = Integral to my job and expect long-term use
- 6 = So important as to never be without FAEIS
- Mean Score

Question 5. “My level of adoption (use) of FAEIS is best described as:” (“No answer” responses eliminated from analysis)
Group | 2007 Mean | 2008 Mean
Data Entry Users Only | 2.49 (N=159) | 2.30 (N=152)
All Other Users | 3.16 (N=117) | 3.24 (N=125)
Total | 2.78 (N=276) | 2.73 (N=277)
Scale: 1 = Unaware; 2 = Aware of it but do not use it; 3 = Use it but not sure of long-term use; 4 = Use it and it is integral to my job; 5 = Integral to my job and expect long-term use; 6 = So important as to never be without FAEIS

Question 6 (2008). Please rate the usefulness of the data resources in FAEIS (for student enrollment, degrees awarded and placement, disciplines, gender, ethnicity, faculty salaries, etc.). (1 = not useful, 5 = very useful)

Question 6. Please rate the usefulness of the data resources in FAEIS (for student enrollment, degrees awarded and placement, disciplines, gender, ethnicity, faculty salaries, etc.). (2007 Percent of Total, N=192; 2008 Percent of Total, N=176; “no answer” and “not used” responses eliminated)
- 1, Not useful
- 5, Very useful: 24 (2007); 21 (2008)
- Mean score
Analysis – no significant changes between 2007 and 2008.

Question 7 (2008). Please rate the usefulness of FAEIS at your institution for internal applications (institutional benchmarking, student and faculty recruitment, faculty hiring and fundraising). (1 = not useful, 5 = very useful)

Question 7. Please rate the usefulness of FAEIS at your institution for internal applications (institutional benchmarking, student and faculty recruitment, faculty hiring and fundraising). (1 = not useful, 5 = very useful) (2007 Percent of Total, N=160; 2008 Percent of Total, N=158; “no answer” and “not used” responses eliminated)
- 1, Not useful
- 5, Very useful: 15 (2007); 12 (2008)
- Mean score
Analysis – slight increase in higher ratings in 2008 (ratings of 4 and 5 combined: 34% in 2007, 38% in 2008). Lower percentage of “not used” answers in 2008 (43% in 2007; 36% in 2008).

Question 8 (2008). Describe your level of satisfaction with FAEIS components (e.g., newsletter, help desk, data entry, report builder, instructions). (1 = not satisfied; 5 = very satisfied)

Question 8. Describe your level of satisfaction with FAEIS components (e.g., newsletter, help desk, data entry, report builder, instructions). (1 = not satisfied; 5 = very satisfied) (2007 Percent of Total, N=222; 2008 Percent of Total, N=208; “Unknown,” “no answer” and “not used” responses eliminated)
- 1, Not satisfied
- 5, Very satisfied: 20 (2007); 18 (2008)
- Mean score
Analysis – slight increase in higher ratings in 2008 (ratings of 4 and 5 combined: 52% in 2007, 55% in 2008). Similar percentage of “not used” answers (23% in 2007; 23.7% in 2008).

2008 Annual FAEIS Evaluation
Did the type of involvement (Q1) have a significant relationship with ratings of:
- Frequency of use (Q2)
- Experience with FAEIS (Q3)
- Level of adoption (Q5)
- Usefulness of data (Q6)
- Usefulness of FAEIS (Q7)
- Satisfaction with FAEIS components (Q8)
Respondents listing “data entry” as their only type of involvement were compared with respondents reporting any other involvement. “Not used” and “no answer” responses for Q6, Q7 and Q8 were removed from the analysis. The two groups differed significantly on frequency of use (Q2), level of adoption (Q5), usefulness of data (Q6), usefulness of FAEIS to the institution (Q7), and satisfaction with FAEIS components (Q8); there was no significant difference between the group means for experience with FAEIS (Q3). On Q2, Q5, Q6, Q7 and Q8, respondents listing “data entry” as their only involvement had significantly lower means than the other group.
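The slide reports significant group differences but does not name the statistical test used. The sketch below illustrates one common choice for this kind of two-group mean comparison, an independent-samples (Welch's) t-test; the rating arrays are made-up placeholders rather than FAEIS survey data, and scipy is assumed to be available.

```python
# Sketch of a two-group mean comparison of the kind described above:
# "data entry only" respondents vs. all other respondents on one question's
# ratings. The ratings are hypothetical placeholders, not FAEIS data.
from scipy.stats import ttest_ind

data_entry_only = [2, 3, 2, 1, 3, 2, 2, 4, 1, 3]   # hypothetical Q5 ratings
all_other_users = [4, 3, 5, 4, 2, 5, 3, 4, 4, 3]   # hypothetical Q5 ratings

t_stat, p_value = ttest_ind(data_entry_only, all_other_users, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")       # "significant" here means p < 0.05
```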

2008 Annual FAEIS Evaluation
Did “Frequency of Use” (Q2) have a significant relationship with:
- Adoption of FAEIS (Q5)
Yes. There is a positive correlation between “Frequency of use” (Q2) and “Adoption of FAEIS” (Q5): as the frequency of use increases, so does the adoption rating. F = 45.0 (P < 0.05)
Key: How often have you used FAEIS in the past 12 months? 1 = Not at all; 2 = 1-2 times; 3 = 3-4 times; 4 = More than 4 times
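The slides describe this relationship as a positive correlation and report an F statistic, which is consistent with a one-way analysis of variance across the four frequency-of-use groups, although the exact method is not stated. The sketch below shows such an ANOVA under that assumption; the ratings are invented placeholders and scipy is assumed to be available. The same approach would apply to the Q6, Q7 and Q8 comparisons on the following slides.

```python
# Minimal one-way ANOVA sketch: do mean adoption ratings (Q5) differ across
# the four frequency-of-use groups (Q2)? All ratings are hypothetical.
from scipy.stats import f_oneway

not_at_all     = [1, 2, 2, 1, 2]     # Q5 ratings, "Not at all" group
one_to_two     = [2, 3, 2, 3, 2]     # "1-2 times" group
three_to_four  = [3, 4, 3, 3, 4]     # "3-4 times" group
more_than_four = [4, 5, 4, 5, 4]     # "more than 4 times" group

f_stat, p_value = f_oneway(not_at_all, one_to_two, three_to_four, more_than_four)
print(f"F = {f_stat:.1f}, p = {p_value:.4f}")   # p < 0.05: group means differ
```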

2008 Annual FAEIS Evaluation
Did “Frequency of Use” (Q2) have a significant relationship with:
- Usefulness of data (Q6)
Yes. There is a positive correlation between “Frequency of use” (Q2) and “Usefulness of data” (Q6): as the frequency of use increases, so does the rating of the usefulness of the data. F = 58.2 (P < 0.05)
Key: How often have you used FAEIS in the past 12 months? 1 = Not at all; 2 = 1-2 times; 3 = 3-4 times; 4 = More than 4 times

2008 Annual FAEIS Evaluation
Did “Frequency of Use” (Q2) have a significant relationship with:
- Usefulness of FAEIS (Q7)
Yes. There is a positive correlation between “Frequency of use” (Q2) and “Usefulness of FAEIS” (Q7): as the frequency of use increases, so does the rating of the usefulness of FAEIS. F = 25.2 (P < 0.05)
Key: How often have you used FAEIS in the past 12 months? 1 = Not at all; 2 = 1-2 times; 3 = 3-4 times; 4 = More than 4 times

2008 Annual FAEIS Evaluation
Did “Frequency of Use” (Q2) have a significant relationship with:
- Satisfaction with FAEIS components (Q8)
Yes. There is a positive correlation between “Frequency of use” (Q2) and “Satisfaction with FAEIS components” (Q8): as the frequency of use increases, so does the rating of satisfaction with FAEIS components. F = 26.1 (P < 0.05)
Key: How often have you used FAEIS in the past 12 months? 1 = Not at all; 2 = 1-2 times; 3 = 3-4 times; 4 = More than 4 times

2008 Annual FAEIS Evaluation
Did the “years of experience with FAEIS” (Q3) have a significant relationship with ratings of:
- Level of adoption (Q5)
For level of adoption (Q5), the “less than one year of experience with FAEIS” group was significantly lower than the other two experience groups. This stands to reason – the less experience with FAEIS, the lower the level of adoption of FAEIS.

2008 Annual FAEIS Evaluation
Did the “years of experience with FAEIS” (Q3) have a significant relationship with ratings of:
- Usefulness of data (Q6)
- Usefulness of FAEIS to your institution (Q7)
- Satisfaction with FAEIS components (Q8)
There was no significant difference among the three experience groups for usefulness of data (Q6), usefulness of FAEIS to their institution (Q7), or satisfaction with FAEIS components (Q8).

2008 Annual FAEIS Evaluation
Did the professional position (Q4) have a significant relationship with ratings of:
- Frequency of use (Q2)
- Years of experience with FAEIS (Q3)
- Level of adoption (Q5)
- Usefulness of data (Q6)
- Usefulness of FAEIS (Q7)
- Satisfaction with FAEIS components (Q8)
The only significant differences involved the “government” group, for frequency of use (Q2) and level of adoption (Q5); this group had higher means than the other groups on both questions.

2008 Annual FAEIS Evaluation
Did the level of adoption (Q5) have a significant relationship with ratings of:
- Usefulness of data (Q6)
- Usefulness of FAEIS (Q7)
- Satisfaction with FAEIS components (Q8)
Respondents answering “unaware” or “aware of it but do not use it” on Q5 were compared with respondents in the remaining Q5 response groups (“Use it, but not sure of long term use;” “Use it and it is integral to my job;” “Integral to my job and expect long term use;” and “So important as to never be without FAEIS”). There were significant differences between the two groups for all three questions above, and the means of the “unaware”/“aware but do not use” group were lower on every question.

2008 Annual FAEIS Evaluation – Responses by non-data-entry users only (2007 Mean vs. 2008 Mean)
“My level of adoption (use) of FAEIS is best described as:”
Scale: 1 = Unaware; 2 = Aware of it but do not use it; 3 = Use it but not sure of long-term use; 4 = Use it and it is integral to my job; 5 = Integral to my job and expect long-term use; 6 = So important as to never be without FAEIS

2008 Annual FAEIS Evaluation – Responses by all users (2007 Mean vs. 2008 Mean)
- Rate the usefulness of the data resources in FAEIS. (Scale: 1 = Not useful, 5 = Very useful)
- Rate the usefulness of FAEIS at your institution for internal applications. (Scale: 1 = Not useful, 5 = Very useful)
- Describe your level of satisfaction with FAEIS components. (Scale: 1 = Not satisfied, 5 = Very satisfied)