Best Practices in Measuring Success & Demonstrating Outcomes for Microenterprise Development Programs

Facilitated by:
Marian Doub, Associate, Friedman Associates
Jason Friedman, Principal, Friedman Associates

Guest Practitioners:
Nancy Swift, Executive Director, Jefferson Economic Development Institute (JEDI), Mt. Shasta, CA
Alex Forrester, Chief Operations Officer, Rising Tide Capital, Jersey City, NJ

Hosted by Little Dixie Community Action Agency and funded in part by the U.S. Small Business Administration's Program for Investment in Microentrepreneurs (PRIME).
Introduction to Friedman Associates & Webinar Instructions

As a community-based organization that helps low-wealth individuals and communities build wealth, create jobs, and start small businesses, your work is essential to the nation's economic recovery. The mission of Friedman Associates is to help you achieve your vision for a sustainable and economically vibrant community – and demonstrate the results that lead to increased funding and long-term success. Areas of specialization include product development and staff training in microfinance and small business lending; business development services; systems for client tracking and program performance; and strategic planning, board development, and fund development strategies.
Presenter: Marian Doub

- One of the nation's top specialists in integrated systems for monitoring and evaluating microenterprise development programs.
- Certified as a MicroTest trainer by the Aspen Institute.
- Research and Evaluation Manager for Women's Initiative for Self Employment (S.F. & Oakland, CA) from 1998-2004.
- Developed practical systems for promoting best practices and innovation with dozens of microenterprise and community economic development providers and intermediaries (Aspen Institute, MicroTest, LISC, NeighborWorks America).
- MA from Tufts University's Department of Urban and Environmental Policy & Planning, Medford, MA.
Guest Presenter: Nancy Swift, ED & Program Director, JEDI

- A 19-year veteran of the field who has served as a practitioner, advocate, and visionary.
- The Jefferson Economic Development Institute is an award-winning microenterprise and asset development corporation located below the Mt. Shasta volcano in the northernmost frontier region of California.
- Serves approximately 350 people annually, primarily low-income and most likely to be women, Native American, African American, or people with a disability.
- Since 1996, has assisted over 4,500 people to create 1,365 jobs or new businesses.
- Nearly 50% of businesses increased their revenues by 50% after working with JEDI for an average of 1.5 years.
Guest Presenter: Alex Forrester, COO, Rising Tide Capital (RTC)

- Co-founded RTC, based in Jersey City, NJ, in 2004 with fellow Harvard classmate Alfa Demmellash.
- Serves as Chief Operations Officer, with primary oversight of financial management, institutional fundraising, grants management, outcome measurement, and technology.
- Core programs: the Community Business Academy (a 40-hour training course) and Business Acceleration Services (year-round coaching and seminars).
- 280 graduates of the CBA since Dec 2006: 100 currently in business; 128 in the planning stages.
- CEO Alfa Demmellash was selected in 2009 as a CNN Hero and recognized by President Obama in a White House speech in June 2009.
Our Agenda for Today

- Measuring Success – why now more than ever?
- Practical Paths for Measuring Success:
  1. Most Basic/Pre-Standard
  2. Basics + Industry Standards
  3. Integrated Systems
- Resources, Process & Products
- Strengths and Weaknesses
- Examples from the field
- National standards, benchmarks
- Q&A and Action Planning
What's Going On in the Big Picture for MDOs? – External Factors

- The pressure is on to 'make the case' for programs that support micro- and small businesses in competitive funding and policy environments.
- Demand for accountability and evidence of track records and outcome results (what happens during and after services) is also at an all-time high.
- Strategic demand for greater MDO scale and results: increasingly, MDOs must prove and improve program results by tracking outcomes.
- Early adopters of robust program performance & outcomes systems are now accepted leaders in the field.
What's Going On in the Big Picture? – Internal Factors

- Weak or minimal data and knowledge management systems severely limit MDO attempts to assess and improve performance, attract funding, and demonstrate their relevance to the community. Where do I start? What can I expect? How do I plan for and implement these systems?
- Resources for building, sustaining, and optimizing data & knowledge management systems are limited – very few capacity-building resources exist. Where do I turn for resources?
What is Measuring Success?

- The regular and systematic use of information about program performance and outcomes to prove and improve results.
- The regular, systematic tracking of the extent to which program participants experience the benefits or changes intended by the Mission (United Way, 2002).
- The Measuring Success process fosters continuous learning and evidence-based change to prove and improve program results during data definition, collection, entry, storage, and use.
Measuring Success answers 3 kinds of questions:*

1. How is our program performing?
2. How are our clients doing?
3. Are we achieving our Mission?

*Thank you to the Aspen Institute's MicroTest Program for material on this page.
Measuring Success Uses 3 Types of Monitoring and Evaluation Data*

- Program Performance questions can be answered with data that an ME program needs to collect and maintain in order to function: in a client contact database, in loan portfolio management systems, in accounting systems, etc.
- Client Outcomes questions can only be answered by going outside the program and surveying clients who have received substantial services and have had time to put what they learned/received into use.
- Program Impact measures how strongly the outcomes are related to the program experience, using a control group, statistical tests, large sample sizes, or data gathered at set points in time over a long period.

*Thank you to the Aspen Institute's MicroTest Program for material on this page.
Standard Client Outcomes Indicators

The most commonly required indicators of success are outcomes – which occur a year or more after a loan or training – and prove our role in economic development and recovery:

- New businesses (start-ups) (HUD/CDBG, SBA, MT)
- Jobs created and retained (HUD/CDBG, SBA, HHS, MT)
- Annual business revenue increases (HUD/CDBG, SBA, HHS, MT)
- Existing businesses stay in business (survive) & thrive
- Business profitability
- Owners improve their personal and household financial stability and security
- Serving distressed, underserved communities: women, people of color, un- and under-employed, etc.
Why Measure Success?

- Strategic visioning and decision making
- Program planning (innovation, goal setting)
- Fundraising (proposals and reports)
- Program management (decisions, work flow, customer service)
- Monitoring and evaluation (are we meeting the needs and achieving our Mission?)
- Communicating results to clients, Board, community, policy makers, and other supporters
What is Your Path to Measuring Success?

Why & how much do you Measure Success? How do you best know and use your results?

- Gather and organize information that can help you improve program management and decision-making
- Make the case well
- Improve work-flow
- What do you need to know vs. want to know?
- What is your return on investment/value proposition for Measuring Success?
3 Common Paths for MDOs: Most Basic/Pre-Standard

Most Basic: Use a data management system (database, data collection tools) for client contact information, demographics, and basic program activity.

- Fulfills basic reporting and program management requirements.
- Databases: Excel spreadsheets, Customer Relationship Management (CRM) systems, loan fund management software.
3 Common Paths for MDOs: Basics + Industry Standards

Basics + Industry Standards: Data management system plus adoption of MicroTest Program Performance and Outcomes standards and tools.

- Meets the standards for the MDO industry and provides outcome monitoring results on an annual basis.
- MicroTest membership includes tools and customized reports.
3 Common Paths for MDOs: Integrated Measuring Success

Use of Mission-driven outcomes throughout program services and the data management system.

- Provides Mission-driven, just-in-time information about program performance and outcomes to continuously improve results for clients and staff.
- MIS/database must manage historical, relational data – changes over time for clients and their businesses.
- Produces MicroTest results.
- Resource-intensive to transition; efficiency and effectiveness improve as data integrity and use (analysis) improve.
What are the Pros and Cons of Each Approach?
Data Integrity

Basics+ Standards Path – Pros:
1. Standard industry tools, indicators & definitions
2. Maintains program performance & contact data
3. Outcomes data collection, entry, analysis, and reporting once a year
4. Staff learn & use data more consistently
5. Client follow-up improves client results & data
6. Direct service staff manage data – requires attention to detail

Basics+ Standards Path – Cons:
1. No/little historical outcome data for daily use
2. Hard to stay in touch with clients if follow-up is not a program feature
Often:
3. Multiple or parallel data sources develop
4. Data not validated or consistent
5. 'Just in time' design: danger of 'too many cooks in the kitchen' with new data needs

Integrated Path – Pros:
1. Reliable, validated Mission-driven MIS tools, indicators & definitions integrated with program delivery
2. Staff & clients rely on & use the data
3. Staff & clients maintain up-to-date, accurate data

Integrated Path – Cons:
1. Constant data management required to maintain, clean, and update data & systems
2. Data analyst and/or manager staff position or FTE needed
Data Use

Basics+ Standards Path – Pros:
1. Funder-driven
2. Program management
3. Simple reports about program performance
4. MDO trend and benchmark analysis – how you are doing in comparison to others
5. Stats complement 'success stories'

Basics+ Standards Path – Cons:
1. Up-to-date outcomes not known for day-to-day use by staff
2. Hard to report changes that occur after the loan or training – 'jobs created,' 'business starts'
3. Funder-driven
4. Direct service staff or Board rarely see/use reports

Integrated Path – Pros:
1. Clear, concise use of success indicators & metrics throughout the organization
2. Accurate, up-to-date data focuses customer service on results
3. Reports customized to staff needs, policy advocacy, etc.
4. Continuous feedback & communication

Integrated Path – Cons:
1. Requires a high level of data analysis capacity on staff or under contract to produce high-integrity reports as needed
Start-Up Resources

Basics+ Standards Path – Pros:
1. Often free or low cost
2. 'Accidental techies' & Excel power-users on staff can manage it
3. Max. of $500/year MT membership – waived with new member training
4. MIS assessment & enhancement to track the basics + other program areas or requirements
5. Dedicated staff time and attention: buy-in improves

Basics+ Standards Path – Cons:
1. Excel is best as an analysis and reporting tool, not for data storage & management
2. Start from scratch, reinvent MDO specifications
3. MicroTest is time-intensive the first few years, depending on data integrity and MIS

Integrated Path – Pros:
1. Creates a strong foundation for future organizational growth
2. Database manager, analyst, or coordinator
3. Configuration and conversion to a database that tracks historical, relational data

Integrated Path – Cons:
1. Higher initial costs: database configuration & conversion; staff training; staffing
Integrated Measuring Success: The JEDI and RTC Experiences

- What is your return on investment/value proposition for deciding to integrate Measuring Success – outcome tracking – throughout your organization? Why & how much do you hope to Measure Success?
- What were your systems like when you started?
- What have you done so far? Where are you in the process?
- What has changed for your organization?
Basic Industry Standard Best Practice Resource: MicroTest Performance and Outcomes
www.microtest.org
What is MicroTest?

- An initiative of the Aspen Institute's FIELD Program.
- A management tool that empowers microenterprise practitioners to gauge and improve the performance of their program and the outcomes of their clients.
- Practitioner-built tools and protocols for collecting and using data to answer the 3 important questions.
- Uses standard indicators and metrics to document and define standard and top performance for the MDO field in the U.S., in aggregate and in peer groups.
- An active peer group of microenterprise development programs monitoring performance and outcomes.
MicroTest is…
MT Performance Workbook Basics

What is it? The MT Performance Workbook is a set of linked Excel worksheets that gathers key information on your microenterprise program's training and credit activities and provides immediate feedback on the costs, efficiency, and sustainability of those activities. The integrated custom report also lets you see how your program is changing over time, how it compares to other similar microenterprise programs, and how it compares to "top performance" in the industry.

Why do people use it? The MT Performance Workbook provides information crucial to adapting and refining program services and assembling winning grant proposals. Programs that complete the workbook also cite expanded data collection and analysis capacity within their organizations as a key reason for participating.

How is the MicroTest Performance Workbook unique? MT defined the set of standard measures accepted by the microenterprise industry, which allows you to hone in on your microenterprise organization's performance and discuss it using terms and definitions the industry agrees on. TA from MT staff allows the data to really mean something and be used in a productive way by the program.

FIELD - The Aspen Institute
MicroTest Program Performance Custom Report

Interactive features of the custom report allow you to further personalize the document for your program:

- Look at your program's progress over time using trend data for all 50 MT measures.
- Compare your program's performance to those MT programs achieving Top Performance for key measures.
- Compare your program's performance to your peers.

FIELD - The Aspen Institute
JEDI 'data treasure': MicroTest Performance Report

What do you need to know vs. want to know?

Target Market Reach (n=58 MicroTest members submitting 2007 data)

| Measure | Rural Programs (n=8) | Mature Programs (n=32) | Training-Led Programs (n=34) | Low-Income-Focused Programs (n=23) | Your Program 2007 | Your Program 2008 |
|---|---|---|---|---|---|---|
| Total Clients Served | 56 | 278 | 199.5 | 153 | | 171 |
| % Women Served | 39% | 68% | 73% | | | 80% |
| % Minorities Served | 39% | 58% | 72% | 76% | 5% | 20% |
| % Disabled Served | 11% | 9% | | 6% | 11% | |
| % Low Income (100% of HHS) | 39% | 26% | 32% | 34% | 39% | 23% |
| % Low Income (150% of HHS) | 65% | 48% | 51% | 56% | 65% | 51% |
| % Low Income (80% of HUD) | 90% | 75% | 78% | 77% | 90% | 74% |
| % of TANF Clients | 13% | 5% | 8% | 10% | 18% | 4% |
MT Outcomes Workbook Basics
Adapted from: http://fieldus.org/Microtest/OutcomesDetails.html

What is it? The MicroTest Outcomes Workbook is a series of linked Excel worksheets – including intake (baseline) and survey (outcomes) – with an accompanying Instruction Guide. MicroTest staff provide data cleaning, analysis, a custom report, and technical assistance. The analysis and custom report include: non-response bias, dashboard, overview, and longitudinal analysis of results.

Why do programs use it? It is designed to answer key questions about clients' business and household outcomes: Are the clients in business? Are the businesses growing? Creating jobs? Does the business contribute income to the household? Are clients moving out of poverty?

- The indicators are few and focused on key questions managers must constantly answer with respect to program effectiveness.
- The data collection process and analysis is simple.
- The approach is a way for programs to monitor outcomes – it is not outcomes assessment or evaluation.
- The MicroTest Outcomes Workbook and Instruction Guide are updated every year based on members' feedback.
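The non-response bias analysis mentioned above amounts to comparing survey respondents with non-respondents on characteristics recorded at intake. The sketch below is an illustrative outline of that idea, not the actual MicroTest workbook logic; all field names and dollar figures are invented.

```python
# Illustrative sketch of a basic non-response bias check: compare survey
# respondents with non-respondents on a characteristic recorded at intake.
# Field names and dollar figures below are hypothetical.

def response_rate(clients):
    """Share of surveyed clients who completed the outcome survey."""
    return sum(1 for c in clients if c["responded"]) / len(clients)

def mean_baseline(clients, field, responded):
    """Average of an intake (baseline) field for one response group."""
    group = [c[field] for c in clients if c["responded"] == responded]
    return sum(group) / len(group)

clients = [
    {"responded": True,  "baseline_revenue": 12000},
    {"responded": True,  "baseline_revenue": 8000},
    {"responded": False, "baseline_revenue": 3000},
    {"responded": True,  "baseline_revenue": 15000},
]

rate = response_rate(clients)
gap = (mean_baseline(clients, "baseline_revenue", True)
       - mean_baseline(clients, "baseline_revenue", False))
# A large gap means respondents looked different from non-respondents at
# baseline, so reported outcomes may not represent the full client base.
```

If the gap is large relative to the baseline averages, survey-based outcome figures should be read with that skew in mind.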
JEDI 'data treasure': MicroTest Outcomes Report
Integrated Measuring Success (Path Three) Best Practice Process & Products
Measuring Success: Four Practical Steps for Building Integrated Systems

Step One: Name and Define Success – What does your Mission-driven success look like?

Purpose: Mission-driven goals & framework for describing program and longer-term benefits (measures of success)

Tools: Theory of Change, Logic Model, Stakeholder Interviews, Data Treasure Hunt, Benchmarks Literature Review
Name Your Program Success on your terms… key outcomes indicators

Mission: JEDI increases the economic well-being of people and communities through business development and local wealth creation.

JEDI succeeds when entrepreneurs succeed, in one or more years, to:
1. Start businesses;
2. Strengthen, formalize, and expand businesses;
3. Establish and maintain profitable businesses;
4. Create and/or retain employment for themselves and others; and
5. Improve household financial security.
Name Your Program Success on your terms… two tools others use
JEDI's key outcomes indicators

JEDI knows businesses are successful when, in 2 to 6 years:
– Income from the business contributes to household financial self-sufficiency and security;
– Business revenue allows owners and other employees to increase purchasing of goods and services (household & business-to-business spending) and increase the income tax base;
– Businesses provide local communities with needed goods and services;
– Businesses attract regional markets and investment;
– Businesses provide local communities with cultural and social assets; and
– Owners give back and reinvest in the local community.
JEDI's key outcomes indicators defined: JEDI SUCCEEDS WHEN CLIENTS SUCCEED IN ONE OR MORE YEARS TO DEVELOP BUSINESSES AND…

Businesses start
How we know success: Feasibility assessment complete; under 1 year of consistent sales; establishing operations, marketing, and plans; date business begins.
How our results measure up: 66 percent of clients without businesses at entry start businesses (64% national average; 31 businesses started in 2008).

Businesses stabilize
How we know success: Business does one or more of the following and maintains the change for at least 3 months: reaches break-even; stabilizes operations.

Businesses formalize
How we know success: Owner does two of the following: completes 2 action steps in each area (business management, marketing, financial management); completes a business plan ready for investors or partners.
How our results measure up: 76 percent of clients complete business plans (national average: 63%; national top performers: 88%).

Businesses grow (and more…)
How we know success: Business does one or more of the following and maintains the change for at least 3 months: increases revenue (average net change of >5%); opens a new location or expands square footage; increases the number of employees; launches a new product line or service.
How our results measure up: 42 percent of businesses increase revenue by 46 percent (median increase of $8,900; nationally, 37% median revenue increase). 22 percent of JEDI businesses increase revenue by 50 percent or more a year. Businesses hire 0.5 new employees on average.
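Indicators like the revenue-growth figures above reduce to simple arithmetic over paired baseline/follow-up records. A minimal sketch of that arithmetic, using invented (baseline, follow-up) revenue pairs rather than JEDI data:

```python
# Hypothetical sketch of the arithmetic behind indicators such as "percent
# of businesses increasing revenue" and "median revenue increase". The
# (baseline, follow-up) revenue pairs below are invented, not JEDI data.
from statistics import median

def pct_change(before, after):
    """Fractional change from baseline to follow-up."""
    return (after - before) / before

pairs = [(10000, 16000), (20000, 19000), (5000, 9000), (8000, 9000)]
changes = [pct_change(before, after) for before, after in pairs]

# Share of businesses with a net revenue change above 5%.
share_growing = sum(1 for c in changes if c > 0.05) / len(pairs)

# Share of businesses growing by 50% or more.
share_growing_50 = sum(1 for c in changes if c >= 0.50) / len(pairs)

# Median dollar change in revenue across all businesses.
median_change = median(after - before for before, after in pairs)
```

The same pattern extends to jobs created or any other paired before/after indicator; the hard part in practice is collecting clean baseline and follow-up data, not the calculation.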
RTC 'data treasure': Quarterly Dashboard Report (mock up/draft)

What do you need to know vs. want to know?
Measuring Success: Four Practical Steps for Building Integrated Systems

Step Two: Assess and Align Internal Capacity to Manage the Data You Need

Purpose: Design & resource your MIS – data collection, entry, storage, use – to measure and result in Mission-driven success.

Tools: Monitoring & Evaluation Plan; Data Fields Inventory, Audit and Alignment; data collection tool templates; pilot

1. Assess and select outcome tracking resources & method(s);
2. Inventory/audit baseline and outcome fields/questions in databases, data collection tools, and reports;
3. Revise forms, databases, reports, and processes as needed.
Data Collection for Measuring Success – A Good Intake Tool…

- Sets the baseline questions for measuring long-term outcomes – the questions are asked and tracked throughout the system in almost exactly the same way at intake, on update forms, and on surveys.
- Encourages program staff and clients to assess and update progress on a regular basis – it is useful for more than data.
- Asks for 1-2 other contact people to help stay in touch.
- Records the data event, collection, and entry dates as well as staff names.
- 'Translates' evaluation & metrics into a language everyone can use.
- Communicates clear guidelines for information use.
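The "same questions at intake, on update forms, and on surveys" principle can be audited mechanically when each tool's fields are listed out. A hypothetical sketch – the form names and field names below are invented for illustration:

```python
# Hypothetical sketch: check that every data collection tool asks the same
# baseline questions, so intake answers stay comparable on update forms
# and surveys. Form names and field names are invented for illustration.

forms = {
    "intake":         {"business_stage", "monthly_revenue",
                       "employees", "household_income"},
    "update_form":    {"business_stage", "monthly_revenue", "employees"},
    "outcome_survey": {"business_stage", "monthly_revenue",
                       "employees", "household_income"},
}

all_fields = set.union(*forms.values())

# For each form, list the baseline questions it is missing.
gaps = {name: sorted(all_fields - fields)
        for name, fields in forms.items()
        if all_fields - fields}
# Here gaps flags that the update form never asks about household income,
# breaking the intake-to-survey comparison for that indicator.
```

Running this kind of inventory against real forms during the Data Fields Audit step surfaces exactly which outcome comparisons the current tools can and cannot support.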
Data Collection for Measuring Success – RTC

- RTC Program Applications/Intake Forms
- Baseline Goals
- Scope of Work/Terms of Service Agreements
Data Collection for Measuring Success – In Action

- RTC One-on-One Coaching form
- RTC Outcome Update form (for staff)
- JEDI Outcome Update form (for staff)
- JEDI Client Action Plan

Encourage staff and clients to achieve & report Mission-driven results.
Data Collection for Measuring Success – A Good Survey Tool is…

- Administered no more than once a year, by phone, web, or mail-in. Phone is usually most successful.
- Uses incentives – a raffle of something useful to business owners – for those who complete the survey.
- Administered by trained staff or volunteers who are not direct service providers.
- Looked forward to once the 'check-in' practice is established.
- Does not take the place of program evaluation or experimental research.
Data Collection for Measuring Success
JEDI Phone Interview Survey Tool
Data Collection for Measuring Success
RTC Online Survey Tool
Measuring Success: Four Practical Steps for Building Integrated Systems

Step Three: Redesign and Implement the Data Management System

Purpose: Mobilize and implement MIS and human resources to support measuring success.

Tools: Database Needs Assessment; Database Product Assessment & Selection; database re-engineering (configuration) & transition (conversion of existing data); MIS for Microenterprise: A Practical Approach to Managing Information Successfully, http://fieldus.org/Publications/MISManual.pdf

- The goal is to increase the efficiency and effectiveness of internal systems – eliminate parallel data sources, duplicate data entry, time-intensive reports, etc.
- Dedicate enough resources – time, staff, money – to do this project well; this is a long-term investment.
- Document!
Step 3: Select MIS Products for Tracking Program Data – Including Outcomes

Promising options:
- VistaShare Outcome Tracker
- WebCATS (Client Activity Tracking Software), used primarily by SBA grantees
- The Exceptional Assistant, by Common Goals Software
- Salesforce (with other platforms for outcome tracking, such as MicroTest Outcomes or Success Measures)
- Efforts to Outcomes/Social Solutions
- Money In, Money Out, Technical Assistance (MIMOTA) by Villagesoft
- Portfol, by Applied Business Software
- Others?
Measuring Success: Four Practical Steps for Building Integrated Systems

Step Four: Use and Sustain the Results & System

Purpose: Use Mission-driven goals, framework, and information to prove and improve program success.

Tools: Adjust staffing plan/job descriptions to include data analysis, collection, entry, management/coordination, and reporting; report/use design, pilot, plan; staff training & TA/support (ongoing); operations manuals

- Program management: scorecards, dashboards, regular reports
- Public relations/fundraising materials
- Learning Circles – highlight and explore strategic issues using results with staff, clients, Board, and communities
- And many more…
Why Do You Measure Success?
JEDI 'data treasure': Fact Sheet 2009
RTC Fundraising Proposal:

Over the past few years, Rising Tide Capital has trained over 300 entrepreneurs in the basics of business planning and management. 117 of these graduates are currently in business and an additional 150 are currently in the planning stages. An outcome survey conducted during Summer 2010 showed that:

– Within one year of going through our programs, our entrepreneurs have experienced an average increase in business revenue of 80% and a corresponding increase in household income of 14%.
– Collectively, these businesses are generating nearly $2 million in annual business sales – producing significant economic activity in an area of otherwise concentrated poverty.
– These outcomes mean that for every dollar donors give to Rising Tide Capital, we are able to produce $3.80 in economic impact.
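The "economic impact per donated dollar" figure is a simple leverage ratio: client business sales attributed to the program divided by donor funding. A sketch of the arithmetic, with placeholder amounts rather than RTC's actual financials:

```python
# Sketch of the "economic impact per donated dollar" ratio quoted above.
# The dollar amounts are placeholders, not figures from RTC's books.

def impact_per_dollar(annual_business_sales, annual_donations):
    """Business sales generated per dollar of donor funding."""
    return annual_business_sales / annual_donations

ratio = impact_per_dollar(annual_business_sales=1_900_000,
                          annual_donations=500_000)
print(f"${ratio:.2f} in economic impact per donated dollar")  # prints $3.80 …
```

A ratio like this is only as credible as the outcome data behind the numerator, which is one more reason the survey and data-integrity steps above matter for fundraising.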
09/15/10
JEDI & RTC Comments

- What step is your organization taking now?
- Examples of the most useful tools – framework & data collection (forms & process) & reports
- Most useful results & surprises
- Challenges & lessons learned
What is Your Path to Measuring Success?

Why & how much do you Measure Success? How do you best know and use your results?

- Gather and organize information that can help you improve program management and decision-making
- Make the case well
- Improve work-flow
- What do you need to know vs. want to know?
- What is your return on investment/value proposition for Measuring Success?
MicroTest Training from Friedman Associates for new and renewing members

Individual Onsite Training – 1.5 days
- Hands-on introduction and practicum in the MicroTest Performance and Outcomes systems
- Assessment and recommendations for optimizing systems related to MT
- Up to five (5) one-hour phone consultations
- First year of MT membership for new members

Group – 2 days for 3+ organizations, location TBD
- Case study-based introduction and practicum in the MicroTest Performance and Outcomes systems
- Up to five (5) one-hour phone consultations for each organization
- First year of MT membership for new members
Questions?
Please contact us for more information:

Jason Friedman – jason@friedmanassociates.net – (319) 341-3556
Marian Doub – marian@friedmanassociates.net – (415) 730-1873
Nancy Swift – nswift@e-jedi.org – (530) 926-6676
Alex Forrester – alex@risingtidecapital.org – (201) 432-4316

THANK YOU!