
1 Assessing Effectiveness: Do your program activities make a difference? May 21, 2008 Mimi Lufkin Chief Executive Officer National Alliance for Partnerships in Equity Education Foundation Tricia Berry Director Women in Engineering Program And Texas Girls Collaborative Project The University of Texas at Austin

2 STEM Equity Pipeline Project of the National Alliance for Partnerships in Equity Education Foundation Funded by the National Science Foundation Human Resources Directorate, Gender in Science and Engineering Program, Extension Services Grant

3 Goals Build the capacity of the formal education community Institutionalize the implemented strategies by connecting the outcomes to existing accountability systems Broaden the commitment to gender equity in STEM education

4 STEM Equity Pipeline Project Methods Professional Development Teacher Training Consulting and Technical Assistance Virtual Web-based Professional Learning Community Best Practices Handbook

5 How can you get involved? Participate on your State Team if you are from CA, OK, MO, WI, or IL Submit an application to be one of the next states in the project (due July 18, 2008) Participate in the virtual learning community at www.stemequitypipeline.org

6 Assessing Effectiveness: Do your program activities make a difference? Insights learned from the Assessing Women and Men in Engineering (AWE) Project Tricia Berry Director, Women in Engineering Program Director, Texas Girls Collaborative Project The University of Texas at Austin May 21, 2008 STEM Equity Pipeline Webinar

7 Are You an Assessment Guru? Make your selection in the poll to the right and then hit the submit button below the poll window.

8 Overview AWE Background What is Assessment Justification for Assessment Steps to Assess and Evaluate A Real World Example Insights Gained and Words of Wisdom

9 What is AWE? National Science Foundation funded project to develop exportable assessment tools and methods Tools tested and validated with students at AWE partner institutions including… – Pennsylvania State University – The University of Texas at Austin – Georgia Institute of Technology – University of Louisville – University of Arizona – Rensselaer Polytechnic Institute Co-PIs on original grant: Barbara Bogue, Pennsylvania State University Rose Marra, University of Missouri

10 AWE Goals Provide the tools and researched knowledge base to create an assessment-based culture Create exportable assessment tools for typical engineering pre-college and undergraduate retention activities Develop capacity building tools for program directors, organizers and implementers

11 AWE Addresses Real World Problems for Implementing Good Assessment Lack of time and money to develop and conduct good assessment Lack of easily accessed expertise to conduct good assessment Bad habits, such as recycling borrowed or existing assessment instruments, which yield data that are not necessarily relevant to objectives and goals Practitioner orientation of most program directors/activity coordinators and developers – Judged on fundraising or participation – Small or volunteer staffs – Understandable focus on well-run outreach and support activities

12 www.aweonline.org Registration is free

13 Overview of AWE Tools: Assessment & Evaluation Instruments Longitudinal Assessment of Engineering Self-Efficacy Survey (LAESE) Undergraduate Mentor, Mentee and “Pretty Darn Quick” Instruments Students Leaving/Persisting in Engineering Instruments Pre-College Participation Instruments College Choice Survey

14 Example of Instrument Information Undergraduate Engineering Mentor Surveys Pre-Survey Post-Survey

15 AWE Pre-college Instruments Discipline-specific – Engineering, Science, & Computer versions – Designed for adaptation to additional disciplines Core Surveys – Cover demographics; measure self-efficacy, confidence, career awareness, interest in or attitudes toward STEM disciplines/careers, and student evaluation of activity (post only) Optional Question Modules – Add to core surveys to measure sense of community, skills development, recruitment/branding

16 Overview of AWE Tools: Capacity Building Tools Research Overviews – Summarize relevant research – Include overviews on Self-Efficacy, Spatial Visualization, Sense of Community, etc. Annotated Bibliography – Highlights readings that can help shape successful efforts Workshops – Provide introduction and training to others

17 Example of Literature Overview Mentoring and Women in Engineering

18 What is Assessment? Process of gathering data to determine the extent to which a person’s performance or a product or program has met its intended objectives Focus on objectives Focus on data that provide information about achievement of objectives Underlying question: How did the participant or program or activity perform relative to stated objectives? Issues: – Creating instruments, validity, reliability – Ensuring that activities further larger mission and goals

19 Formative vs. Summative Assessment Formative Assessment – On-going assessments or reviews – Focused on the process – Assessment “for learning” Summative Assessment – End assessment – Focused on objectives or learning outcomes – Assessment “of learning”

20 Doesn’t Everyone Do Assessment? Many do…but the “how” varies Typically formative assessment As practiced, often ineffective or misleading – Just-in-time approach – Based on confirming participant enjoyment of event – Often based on someone else’s survey rather than on carefully crafted objectives – Measures level of enjoyment rather than identifying effective and ineffective practices

21 What is on Your Post-Survey? Make your selections in the poll to the right and then hit the submit button below the poll window. You may select more than one option.

22 Closed Feedback Loop Typical “Happy Face” Survey: Did you enjoy this activity? → Yes, but… talks are boring; I like action → Improvements in delivery of activity Missing: Have the Objectives Been Met?

23 Justification for Assessment Determine if we actually accomplish anything Objectively evaluate program offerings Identify opportunities Compare initiatives across program/institution Drive allocation of resources Elevate program value Justify existence to administration Report to funders; attract and secure funding

24 Steps to Assess and Evaluate 1. Determine fit with mission 2. Set goals 3. Define measurable objectives 4. Develop/modify/implement assessments 5. Analyze and evaluate the data 6. Do something with the results

25 Real World Example: WEP FIGs First-year Interest Groups Weekly seminar with WEP staff facilitator and student mentor Cohort classes Maximum 25 students grouped by major

26 Steps to Assess and Evaluate 1. Determine Fit with Mission The mission of the Women in Engineering Program is to increase the overall percentage of women in the Cockrell School of Engineering at The University of Texas at Austin. WEP strives to: educate girls and women about engineering inspire women to pursue the unlimited opportunities within the world of engineering and empower women engineers to benefit society

27 Steps to Assess and Evaluate 2. Set Goals Contribute to the overall goal of WEP to recruit, retain and graduate women in the Cockrell School of Engineering at The University of Texas at Austin Provide first year engineering women with the information and resources they need to make a successful transition from high school to college and informed decisions about their majors and career paths Provide a positive female role model in the upper-division peer mentor Be a forum where students can explore their options and discuss issues relevant to their first year Help first year engineering women to feel a connection to the Cockrell School of Engineering and a sense of belonging within the UT engineering community

28 Steps to Assess and Evaluate 3. Define Measurable Objectives Objectives should… Be specific – Address the target audience – Define a specific change we want in our participants Support overall program goals Be measurable

29 Steps to Assess and Evaluate 3. Define Measurable Objectives 100% of FIG students participate in at least 1 additional WEP event Retain 95% of FIG students into their 2nd year 100% of FIG participants are registered with the Engineering Career Center Retain 80% of FIG students into their 3rd year 95% of FIG students feel part of the engineering community as indicated on end-of-semester surveys 95% of FIG students indicate on end-of-semester survey that the FIG met their goals for participation 95% of FIG participants will indicate on the end-of-semester survey that they are confident that they will complete an engineering degree

30 Steps to Assess and Evaluate 4. Develop/Modify/Implement Assessments Participant Registration and Event Check-in Data Participant Immediate Post Surveys End of Semester Surveys Retention and Graduation Data

31 Steps to Assess and Evaluate 4. Develop/Modify/Implement Assessments FIG End of Semester Questions The WEP FIG met my goals for participation. 5=Strongly Agree, 4=Agree, 3=Neutral, 2=Disagree, 1=Strongly Disagree I am confident that I will complete an engineering degree. 5=Strongly Agree, 4=Agree, 3=Neutral, 2=Disagree, 1=Strongly Disagree As a result of my participation in the WEP FIG, I feel like a part of the Engineering Community. 5=Strongly Agree, 4=Agree, 3=Neutral, 2=Disagree, 1=Strongly Disagree
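The three Likert items above reduce to a simple tally at analysis time. A minimal sketch of one common approach (illustrative only, with made-up responses; not an official WEP or AWE tool): report the share of respondents choosing Agree or Strongly Agree for each item.

```python
# Illustrative sketch only (hypothetical responses, not an official WEP/AWE
# tool): tally 5-point Likert responses and report the share who chose
# 4 (Agree) or 5 (Strongly Agree).

def percent_agree(responses):
    """Percent of responses that are 4 (Agree) or 5 (Strongly Agree)."""
    if not responses:
        return 0.0
    return 100.0 * sum(1 for r in responses if r >= 4) / len(responses)

# Hypothetical end-of-semester responses to one survey item.
item = "The WEP FIG met my goals for participation."
responses = [5, 4, 4, 3, 5, 5, 4, 2, 4, 5]
print(f"{item!r}: {percent_agree(responses):.0f}% agree")  # 80% agree
```

Collapsing to "percent agree" matches how the FIG objectives are phrased (e.g. "95% of FIG students indicate…"), which keeps the survey analysis directly comparable to the stated targets.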

32 Steps to Assess and Evaluate 4. Develop/Modify/Implement Assessments Session Immediate Post Survey

33 Steps to Assess and Evaluate 5. Analyze and Evaluate the Data 100% of FIG students participate in at least 1 additional WEP event Retain 95% of FIG students into their 2nd Year 100% of FIG participants are registered with the Engineering Career Center (90% registered) Retain 80% of FIG students into their 3rd Year 95% of FIG students feel part of the engineering community as indicated on end of semester surveys (100%) 95% of FIG students indicate on end of semester survey that the FIG met their goals for participation (91%) 95% of FIG participants will indicate on the end of semester survey that they are confident that they will complete an engineering degree (91%)
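Step 5 amounts to lining each measured result up against its target. A hedged sketch of that comparison, using abbreviated objective names and the percentages reported on the slide above (the structure is illustrative, not an actual WEP script):

```python
# Illustrative sketch: compare each measured result against its stated
# objective. Objective names are abbreviated from the slide; the target and
# actual percentages are the ones it reports.

def check(target, actual):
    """Return 'MET' if the measured result reaches the stated objective."""
    return "MET" if actual >= target else "NOT MET"

objectives = [
    ("Registered with Engineering Career Center", 100.0, 90.0),
    ("Feel part of the engineering community",     95.0, 100.0),
    ("FIG met goals for participation",            95.0, 91.0),
    ("Confident of completing engineering degree", 95.0, 91.0),
]

for name, target, actual in objectives:
    status = check(target, actual)
    print(f"{name}: target {target:.0f}%, actual {actual:.0f}% -> {status}")
```

Writing objectives as (target, actual) pairs makes the evaluation mechanical, which is the point of Step 3's insistence on measurable objectives.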

34 Steps to Assess and Evaluate 5. Analyze and Evaluate the Data Longitudinal Data

Academic Year             2002-2003   2003-2004               Overall
FIG Name                  Fab Few     WISE                    Summary
2nd semester retention    93.33%      83.33%      100.00%     93.35%
4th semester retention    66.67%      75.00%      100.00%     76.34%
6th semester retention    60.00%      75.00%      57.14%      59.14%
8th semester retention    60.00%      75.00%      57.14%      64.05%
ANY UT DEGREE 4 YRS       53.33%      66.66%      28.57%      52.9%
CSE DEGREE 4 YRS          33.33%      66.66%      28.57%      44.1%
ANY UT DEGREE 5 YRS       80.00%      75.00%      57.14%      73.5%
CSE DEGREE 5 YRS          60.00%      66.66%      57.14%      61.8%
ANY UT DEGREE 6 YRS       86.66%                              86.7%
CSE DEGREE 6 YRS          66.66%                              66.7%
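When combining cohorts into an "Overall" column like the one above, pool the raw counts across cohorts rather than averaging the cohort percentages. A sketch under a stated assumption: the cohort sizes used here (15, 12, and 7 students) are inferred for illustration from the 4-year degree row, not figures stated on the slide.

```python
# Illustrative sketch: pooled vs. naive averaging for an "Overall" column.
# Cohort sizes (15, 12, 7) are an assumption chosen so the per-cohort
# percentages reproduce the slide's 4-year "any UT degree" row; they are
# not stated in the source.
cohorts = [(8, 15), (8, 12), (2, 7)]  # (graduates within 4 yrs, cohort size)

pooled = 100.0 * sum(g for g, _ in cohorts) / sum(n for _, n in cohorts)
naive = sum(100.0 * g / n for g, n in cohorts) / len(cohorts)

print(f"pooled any-UT-degree-in-4-years rate: {pooled:.1f}%")  # 52.9%
print(f"naive average of cohort percentages:  {naive:.1f}%")
```

The pooled figure weights each student equally, so small cohorts (like the 7-student one) don't distort the overall rate the way an unweighted average of percentages would.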

35 Steps to Assess and Evaluate 6. Do Something With the Results Report to Funders or Administration

36 How Are You Feeling? Make your selection in the poll to the right and then hit the submit button below the poll window.

37 Can We Help You Feel Better? Make your selection in the poll to the right and then hit the submit button below the poll window.

38 Insights Learned Assessment is a process, not a one-time event Building a culture of assessment takes time…but it is worth it If assessment is included in the planning cycle, it is easier to accomplish Good assessments are available…we don’t need to create our own from scratch

39 Words of Wisdom Start small and focused in your assessment – Don’t try to measure everything – Think about the use of the results first Don’t reinvent the wheel – Use the AWE products – Use others’ assessments with modifications Keep at it…it will eventually become a part of your program’s culture

40 For More Information… Tricia Berry (tsberry@mail.utexas.edu) AWE Project – Dana Hosko (dhosko@engr.psu.edu) – Barbara Bogue (bbogue@psu.edu) – Rose Marra (rmarra@missouri.edu)

41 Next Webinar Building Effective Program Assessments: How to Use the AWE Tools Monday, June 16, 2008 2pm ET, 1pm CT, 12 noon MT, 11am PT Tricia Berry, Director Women in Engineering Program The University of Texas at Austin For more information go to www.stemequitypipeline.org

42 Questions? Mimi Lufkin Chief Executive Officer National Alliance for Partnerships in Equity Education Foundation www.stemequitypipeline.org mimilufkin@napequity.org Tricia Berry Director Women in Engineering Program And Texas Girls Collaborative Project The University of Texas at Austin tsberry@mail.utexas.edu

