Assessing Effectiveness: Do your program activities make a difference? May 21, 2008. Mimi Lufkin, Chief Executive Officer, National Alliance for Partnerships in Equity Education Foundation.

Presentation transcript:

Assessing Effectiveness: Do your program activities make a difference? May 21, 2008 Mimi Lufkin, Chief Executive Officer, National Alliance for Partnerships in Equity Education Foundation Tricia Berry, Director, Women in Engineering Program and Texas Girls Collaborative Project, The University of Texas at Austin

STEM Equity Pipeline Project of the National Alliance for Partnerships in Equity Education Foundation Funded by the National Science Foundation Human Resources Directorate, Gender in Science and Engineering Program, Extension Services Grant

Goals Build the capacity of the formal education community. Institutionalize the implemented strategies by connecting the outcomes to existing accountability systems. Broaden the commitment to gender equity in STEM education.

STEM Equity Pipeline Project Methods Professional Development Teacher Training Consulting and Technical Assistance Virtual Web-based Professional Learning Community Best Practices Handbook

How can you get involved? Participate on your State Team if you are from CA, OK, MO, WI, or IL. Submit an application to be one of the next states in the project (due July 18, 2008). Participate in the virtual learning community on the project website.

Assessing Effectiveness: Do your program activities make a difference? Insights learned from the Assessing Women and Men in Engineering (AWE) Project Tricia Berry, Director, Women in Engineering Program; Director, Texas Girls Collaborative Project, The University of Texas at Austin May 21, 2008 STEM Equity Pipeline Webinar

Are You an Assessment Guru? Make your selection in the poll to the right and then hit the submit button below the poll window.

Overview AWE Background What is Assessment Justification for Assessment Steps to Assess and Evaluate A Real World Example Insights Gained and Words of Wisdom

What is AWE? National Science Foundation funded project to develop exportable assessment tools and methods Tools tested and validated with students at AWE partner institutions, including… – Pennsylvania State University – The University of Texas at Austin – Georgia Institute of Technology – University of Louisville – University of Arizona – Rensselaer Polytechnic Institute Co-PIs on original grant: Barbara Bogue, Pennsylvania State University; Rose Marra, University of Missouri

AWE Goals Provide the tools and researched knowledge base to create an assessment-based culture Create exportable assessment tools for typical engineering pre-college and undergraduate retention activities Develop capacity building tools for program directors, organizers and implementers

AWE Addresses Real World Problems for Implementing Good Assessment Lack of time and money to develop and conduct good assessment Lack of easily accessed expertise to conduct good assessment Bad habits, such as recycling borrowed or existing assessment practices, which yield data that are not necessarily relevant to objectives and goals Practitioner orientation of most program directors/activity coordinators and developers – Judged on fundraising or participation – Small or volunteer staffs – Understandable focus on well-run outreach and support activities

Registration is free

Overview of AWE Tools: Assessment & Evaluation Instruments Longitudinal Assessment of Engineering Self-Efficacy Survey (LAESE) Undergraduate Mentor, Mentee and “Pretty Darn Quick” Instruments Students Leaving/Persisting in Engineering Instruments Pre-College Participation Instruments College Choice Survey

Example of Instrument Information Undergraduate Engineering Mentor Surveys Pre-Survey Post-Survey

AWE Pre-college Instruments Discipline-specific – Engineering, Science, and Computer versions – Designed for adaptation to additional disciplines Core Surveys – Cover demographics; measure self-efficacy, confidence, career awareness, and interest in or attitudes toward STEM disciplines/careers; include student evaluation of the activity (post only) Optional Question Modules – Add to core surveys to measure sense of community, skills development, recruitment/branding

Overview of AWE Tools: Capacity Building Tools Research Overviews – Summarize relevant research – Include overviews on Self-Efficacy, Spatial Visualization, Sense of Community, etc. Annotated Bibliography – Highlights readings that can help shape successful efforts Workshops – Provide an introduction and training to others

Example of Literature Overview Mentoring and Women in Engineering

What is Assessment? Process of gathering data to determine the extent to which a person’s performance or a product or program has met its intended objectives Focus on objectives Focus on data that provide information about achievement of objectives Underlying question: How did the participant or program or activity perform relative to stated objectives? Issues: – Creating instruments, validity, reliability – Ensuring that activities further larger mission and goals

Formative vs. Summative Assessment Formative Assessment – On-going assessments or reviews – Focused on the process – Assessment “for learning” Summative Assessment – End assessment – Focused on objectives or learning outcomes – Assessment “of learning”

Doesn’t Everyone Do Assessment? Many do…but the “how” varies Typically formative assessment As practiced, often ineffective or misleading – Just-in-time approach – Based on confirming participant enjoyment of the event – Often based on someone else’s survey rather than on carefully crafted objectives – Measures level of enjoyment rather than identifying effective and ineffective practices

What is on Your Post-Survey? Make your selections in the poll to the right and then hit the submit button below the poll window. You may select more than one option.

Closed Feedback Loop Typical “Happy Face” Survey: Did you enjoy this activity? → Yes, but… talks are boring; I like action → Improvements in delivery of the activity. Missing: Have the objectives been met?

Justification for Assessment Determine if we actually accomplish anything Objectively evaluate program offerings Identify opportunities Compare initiatives across the program/institution Drive allocation of resources Elevate program value Justify existence to administration Report to funders; attract and secure funding

Steps to Assess and Evaluate 1. Determine fit with mission 2. Set goals 3. Define measurable objectives 4. Develop/modify/implement assessments 5. Analyze and evaluate the data 6. Do something with the results

Real World Example: WEP FIGs First-year Interest Groups Weekly seminar with WEP staff facilitator and student mentor Cohort classes Maximum 25 students grouped by major

Steps to Assess and Evaluate 1. Determine Fit with Mission The mission of the Women in Engineering Program is to increase the overall percentage of women in the Cockrell School of Engineering at The University of Texas at Austin. WEP strives to: educate girls and women about engineering, inspire women to pursue the unlimited opportunities within the world of engineering, and empower women engineers to benefit society.

Steps to Assess and Evaluate 2. Set Goals
Contribute to the overall goal of WEP to recruit, retain, and graduate women in the Cockrell School of Engineering at The University of Texas at Austin
Provide first-year engineering women with the information and resources they need to make a successful transition from high school to college and informed decisions about their majors and career paths
Provide a positive female role model in the upper-division peer mentor
Be a forum where students can explore their options and discuss issues relevant to their first year
Help first-year engineering women feel a connection to the Cockrell School of Engineering and a sense of belonging within the UT engineering community

Steps to Assess and Evaluate 3. Define Measurable Objectives Objectives should… Be specific – Address the target audience – Define a specific change we want in our participants Support overall program goals Be measurable

Steps to Assess and Evaluate 3. Define Measurable Objectives
100% of FIG students participate in at least 1 additional WEP event
Retain 95% of FIG students into their 2nd year
100% of FIG participants are registered with the Engineering Career Center
Retain 80% of FIG students into their 3rd year
95% of FIG students feel part of the engineering community as indicated on end-of-semester surveys
95% of FIG students indicate on the end-of-semester survey that the FIG met their goals for participation
95% of FIG participants will indicate on the end-of-semester survey that they are confident that they will complete an engineering degree

Steps to Assess and Evaluate 4. Develop/Modify/Implement Assessments Participant Registration and Event Check-in Data Participant Immediate Post Surveys End of Semester Surveys Retention and Graduation Data

Steps to Assess and Evaluate 4. Develop/Modify/Implement Assessments FIG End-of-Semester Questions
The WEP FIG met my goals for participation. (5=Strongly Agree, 4=Agree, 3=Neutral, 2=Disagree, 1=Strongly Disagree)
I am confident that I will complete an engineering degree. (5=Strongly Agree, 4=Agree, 3=Neutral, 2=Disagree, 1=Strongly Disagree)
As a result of my participation in the WEP FIG, I feel like a part of the engineering community. (5=Strongly Agree, 4=Agree, 3=Neutral, 2=Disagree, 1=Strongly Disagree)
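The slides do not show how these responses are tallied, so here is a minimal sketch of that step, assuming responses are stored as (question, rating) pairs on the 5-point scale above. The question labels and sample data are hypothetical, not AWE or WEP data.

```python
from collections import defaultdict

# Hypothetical records: (question label, rating on the 5-point scale above)
responses = [
    ("met_goals", 5), ("met_goals", 4), ("met_goals", 3),
    ("confident_degree", 4), ("confident_degree", 5), ("confident_degree", 2),
    ("community", 5), ("community", 4), ("community", 4),
]

TARGET = 0.95  # objective: 95% answer Agree (4) or Strongly Agree (5)

counts = defaultdict(lambda: [0, 0])  # label -> [agree_count, total_count]
for question, rating in responses:
    counts[question][1] += 1
    if rating >= 4:  # Agree or Strongly Agree
        counts[question][0] += 1

for question, (agree, total) in counts.items():
    share = agree / total
    status = "met" if share >= TARGET else "not met"
    print(f"{question}: {share:.0%} agree ({status}; target {TARGET:.0%})")
```

Counting ratings of 4 or 5 as agreement is one way to match the wording of the 95% objectives on the earlier slide.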

Steps to Assess and Evaluate 4. Develop/Modify/Implement Assessments Session Immediate Post Survey

Steps to Assess and Evaluate 5. Analyze and Evaluate the Data
100% of FIG students participate in at least 1 additional WEP event
Retain 95% of FIG students into their 2nd year
100% of FIG participants are registered with the Engineering Career Center (90% registered)
Retain 80% of FIG students into their 3rd year
95% of FIG students feel part of the engineering community as indicated on end-of-semester surveys (100%)
95% of FIG students indicate on the end-of-semester survey that the FIG met their goals for participation (91%)
95% of FIG participants will indicate on the end-of-semester survey that they are confident that they will complete an engineering degree (91%)
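To make the comparison on this slide explicit, here is a minimal sketch that checks measured results against the stated targets. The mapping structure is illustrative only; the numbers simply echo the slide.

```python
# Each entry: objective description -> (target share, measured share)
objectives = {
    "Registered with the Engineering Career Center": (1.00, 0.90),
    "Feel part of the engineering community":        (0.95, 1.00),
    "FIG met goals for participation":               (0.95, 0.91),
    "Confident of completing an engineering degree": (0.95, 0.91),
}

for name, (target, measured) in objectives.items():
    flag = "MET" if measured >= target else "NOT MET"
    print(f"{flag:8} {name}: measured {measured:.0%} vs. target {target:.0%}")
```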

Steps to Assess and Evaluate 5. Analyze and Evaluate the Data – Longitudinal Data

| Academic Year / FIG Name | Overall | Fab Few | WISE    | Summary |
|--------------------------|---------|---------|---------|---------|
| 2nd semester retention   | 93.33%  | 83.33%  | 100.00% | 93.35%  |
| 4th semester retention   | 66.67%  | 75.00%  | 100.00% | 76.34%  |
| 6th semester retention   | 60.00%  | 75.00%  | 57.14%  | 59.14%  |
| 8th semester retention   | 60.00%  | 75.00%  | 57.14%  | 64.05%  |
| Any UT degree, 4 yrs     | 53.33%  | 66.66%  | 28.57%  | 52.9%   |
| CSE degree, 4 yrs        | 33.33%  | 66.66%  | 28.57%  | 44.1%   |
| Any UT degree, 5 yrs     | 80.00%  | 75.00%  | 57.14%  | 73.5%   |
| CSE degree, 5 yrs        | 60.00%  | 66.66%  | 57.14%  | 61.8%   |
| Any UT degree, 6 yrs     | 86.66%  |         |         | 86.7%   |
| CSE degree, 6 yrs        | 66.66%  |         |         | 66.7%   |
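The slide presents the finished rates; below is a minimal sketch of how such figures could be computed from per-student enrollment records. The Student record layout and sample roster are hypothetical; in practice these values would come from institutional retention and graduation data.

```python
from dataclasses import dataclass

@dataclass
class Student:
    fig: str                  # FIG cohort name
    semesters_completed: int  # consecutive semesters enrolled in engineering
    cse_degree: bool          # earned a Cockrell School of Engineering degree

# Hypothetical roster (FIG names taken from the table above)
cohort = [
    Student("WISE", 8, True), Student("WISE", 4, False), Student("WISE", 2, False),
    Student("Fab Few", 8, True), Student("Fab Few", 6, False),
]

def retention_rate(students, semester):
    """Share of the cohort still enrolled at the given semester."""
    return sum(s.semesters_completed >= semester for s in students) / len(students)

for sem in (2, 4, 6, 8):
    print(f"Semester {sem} retention: {retention_rate(cohort, sem):.2%}")

degree_rate = sum(s.cse_degree for s in cohort) / len(cohort)
print(f"CSE degree completion: {degree_rate:.2%}")
```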

Steps to Assess and Evaluate 6. Do Something With the Results Report to Funders or Administration

How Are You Feeling? Make your selection in the poll to the right and then hit the submit button below the poll window.

Can We Help You Feel Better? Make your selection in the poll to the right and then hit the submit button below the poll window.

Insights Learned Assessment is a process, not a one-time event Building a culture of assessment takes time…but it is worth it If assessment is included in the planning cycle, it is easier to accomplish Good assessments are available…we don’t need to create our own from scratch

Words of Wisdom Start small and focused in your assessment – Don’t try to measure everything – Think about the use of the results first Don’t reinvent the wheel – Use the AWE products – Use others’ assessments, with modifications Keep at it…it will eventually become a part of your program’s culture

For More Information… Tricia Berry AWE Project – Dana Hosko – Barbara Bogue – Rose Marra

Next Webinar Building Effective Program Assessments: How to Use the AWE Tools Monday, June 16, pm ET, 1pm MT, 12 noon CT, 11am PT Tricia Berry, Director Women in Engineering Program The University of Texas at Austin For more information go to

Questions? Mimi Lufkin, Chief Executive Officer, National Alliance for Partnerships in Equity Education Foundation Tricia Berry, Director, Women in Engineering Program and Texas Girls Collaborative Project, The University of Texas at Austin