Facilitated by: Marian Doub, Associate, Friedman Associates


Transcript of Facilitated by: Marian Doub, Associate, Friedman Associates

Page 1: Facilitated by:   Marian Doub, Associate, Friedman Associates

Best Practices in Measuring Success & Demonstrating Outcomes for Microenterprise Development Programs

Facilitated by:
Marian Doub, Associate, Friedman Associates
Jason Friedman, Principal, Friedman Associates

Guest Practitioners:
Nancy Swift, Executive Director, Jefferson Economic Development Institute (JEDI), Mt. Shasta, CA
Alex Forrester, Chief Operations Officer, Rising Tide Capital, Jersey City, NJ

Hosted by Little Dixie Community Action Agency and funded in part by the U.S. Small Business Administration's Program for Investment in Microentrepreneurs (PRIME).

Page 2: Facilitated by:   Marian Doub, Associate, Friedman Associates

Introduction to Friedman Associates & Webinar Instructions

• As a community-based organization that helps low-wealth individuals and communities build wealth, create jobs and small businesses, your work is essential to the nation’s economic recovery.

• The mission of Friedman Associates is to help you achieve your vision for a sustainable and economically vibrant community and to demonstrate the results that lead to increased funding and long-term success.

• Areas of specialization include product development and staff training in microfinance and small business lending; business development services; systems for client tracking and program performance; strategic planning, board development and fund development strategies.

Page 3: Facilitated by:   Marian Doub, Associate, Friedman Associates


Presenter: Marian Doub

• One of the nation's top specialists in integrated systems for monitoring and evaluating microenterprise development programs. Certified as a MicroTest trainer by the Aspen Institute.

• Research and Evaluation Manager for Women’s Initiative for Self Employment (S.F. & Oakland, CA) from 1998-2004.

• Developed practical systems for promoting best practices and innovation with dozens of microenterprise and community econ. development providers and intermediaries (Aspen Institute, MicroTest, LISC, NeighborWorks America).

• MA in Urban and Environmental Policy from Tufts University’s Department of Urban and Environmental Policy & Planning in Medford, MA.

Page 4: Facilitated by:   Marian Doub, Associate, Friedman Associates


Guest Presenter: Nancy Swift, ED & Program Director, JEDI

• A 19-year veteran of the field who has served as a practitioner, advocate, and visionary.

• The Jefferson Economic Development Institute is an award-winning microenterprise and asset development corporation located under the Mt. Shasta volcano in the northernmost frontier region of California.

• Serves approximately 350 people annually, who are primarily low-income and most likely to be women, Native American, African American, or people with disabilities.

• Since 1996, JEDI has assisted over 4,500 people to create 1,365 jobs or new businesses.

• Nearly 50% of businesses increased their revenues by 50% after working with JEDI for an average of 1.5 years.

Page 5: Facilitated by:   Marian Doub, Associate, Friedman Associates


Guest Presenter: Alex Forrester, COO, Rising Tide Capital (RTC)

• Co-founded RTC, based in Jersey City, NJ, in 2004 with Harvard classmate Alfa Demmellash.

• Serves as Chief Operations Officer, with primary oversight on financial management, institutional fundraising, grants management, outcome measurement, and technology.

• Core programs: The Community Business Academy (40-hour training course) and Business Acceleration Services (year-round coaching and seminars).

• 280 graduates of CBA since Dec 2006. 100 currently in business; 128 in the planning stages.

• CEO Alfa Demmellash was selected as a CNN Hero in 2009 and was recognized by President Obama in a White House speech in June 2009.

Page 6: Facilitated by:   Marian Doub, Associate, Friedman Associates


Our Agenda for Today

• Measuring Success —why now more than ever?

• Practical Paths for Measuring Success:

1. Most Basic/Pre-Standard

2. Basics+ Industry Standards

3. Integrated Systems

For each path: resources, process & products; strengths and weaknesses; examples from the field; national standards and benchmarks

• Q&A and Action Planning

Page 7: Facilitated by:   Marian Doub, Associate, Friedman Associates


What’s Going On in the Big Picture for MDOs? – External Factors

• The pressure is on to ‘make the case’ for programs that support micro- and small businesses in competitive funding and policy environments.

• Demand for accountability and evidence of track records and outcome results (what happens during and after services) is also at an all-time high.

• Strategic demand for greater MDO scale and results.

• Increasingly, MDOs must prove and improve program results by tracking outcomes.

• Early adopters of robust program performance & outcomes systems are now accepted leaders in the field.

Page 8: Facilitated by:   Marian Doub, Associate, Friedman Associates


What’s Going On in the Big Picture? – Internal Factors

• Weak or minimal data and knowledge management systems severely limit MDO attempts to assess and improve performance, attract funding, and demonstrate their relevance to the community.

Where do I start? What can I expect? How do I plan for and implement these systems?

• Limited resources for building, sustaining and optimizing data & knowledge management systems—very few capacity building resources exist.

Where do I turn for resources?

Page 9: Facilitated by:   Marian Doub, Associate, Friedman Associates


What is Measuring Success?

• Regular and systematic use of information about program performance and outcomes to prove and improve results.

• The regular, systematic tracking of the extent to which program participants experience the benefits or changes intended by the Mission (United Way, 2002).

• The Measuring Success process fosters continuous learning and evidence-based change to prove and improve program results during data definition, collection, entry, storage, and use.

Page 10: Facilitated by:   Marian Doub, Associate, Friedman Associates

Measuring Success answers 3 kinds of questions:*

1. How is our program performing?

2. How are our clients doing?

3. Are we achieving our Mission?

*Thank you to the Aspen Institute’s MicroTest Program for material on this page.

Page 11: Facilitated by:   Marian Doub, Associate, Friedman Associates


Measuring Success Uses 3 types of Monitoring and Evaluation Data:

• Program Performance questions can be answered with data that an ME program needs to collect and maintain in order to function: in client contact databases, loan portfolio management systems, accounting systems, etc.

• Client Outcomes questions can only be answered by going outside the program and surveying clients who have received substantial services and have had time to put what they learned/received into use.

• Program Impact measures how strongly the outcomes are related to the program experience using a control group, statistical tests, large sample sizes, or data gathered at set points in time over a long period (a simple illustration follows below).

*Thank you to the Aspen Institute’s MicroTest Program for material on this page.
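Program impact analysis goes beyond what most MDOs track day to day. Purely as a hypothetical illustration of the "statistical tests" mentioned above (not part of the MicroTest toolkit), the sketch below compares revenue change for program participants against a comparison group with a simple two-sample test; every number and variable name is made up.

```python
# Hypothetical illustration: comparing annual revenue change ($) for program
# participants vs. a comparison group. All numbers are made up.
from statistics import mean
from scipy import stats  # SciPy's two-sample (Welch) t-test

participants = [4200, 1500, 8900, 0, 3100, 6400, 2750, 5000]
comparison   = [800, -1200, 2100, 0, 1500, 300, -400, 900]

t_stat, p_value = stats.ttest_ind(participants, comparison, equal_var=False)

print(f"Participants mean change: ${mean(participants):,.0f}")
print(f"Comparison mean change:   ${mean(comparison):,.0f}")
print(f"Welch t-test: t = {t_stat:.2f}, p = {p_value:.3f}")
# A small p-value suggests the difference is unlikely to be chance alone,
# but real impact evaluation also needs careful group selection and larger samples.
```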

Page 12: Facilitated by:   Marian Doub, Associate, Friedman Associates


Standard Client Outcomes Indicators

• The most commonly required indicators of success are outcomes, which occur a year or more after a loan or training and prove our role in economic development and recovery (a sketch of tallying several of these follows this list):

New businesses (start-ups) (HUD/CDBG, SBA, MT)

Jobs created and retained (HUD/CDBG, SBA, HHS, MT)

Annual business revenue increases (HUD/CDBG, SBA, HHS, MT)

Existing businesses stay in business (survive) and thrive

• Business profitability

• Owners improve their personal and household financial stability and security

• Serving distressed, underserved communities: women, people of color, the un- and under-employed, etc.
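To make these indicators concrete, here is a minimal, hypothetical sketch of how an MDO might tally a few of them from follow-up records. The field names (in_business_at_intake, revenue_baseline, and so on) are illustrative only, not a MicroTest or funder specification.

```python
# Hypothetical client follow-up records; field names are illustrative only.
clients = [
    {"in_business_at_intake": False, "in_business_now": True,
     "revenue_baseline": 0,     "revenue_now": 18000, "jobs_now": 1},
    {"in_business_at_intake": True,  "in_business_now": True,
     "revenue_baseline": 42000, "revenue_now": 55000, "jobs_now": 3},
    {"in_business_at_intake": True,  "in_business_now": False,
     "revenue_baseline": 12000, "revenue_now": 0,     "jobs_now": 0},
]

# New businesses: clients with no business at intake who are in business now.
starts = sum(1 for c in clients
             if not c["in_business_at_intake"] and c["in_business_now"])

# Survival: businesses that existed at intake and are still operating.
existing = [c for c in clients if c["in_business_at_intake"]]
survival_rate = sum(c["in_business_now"] for c in existing) / len(existing)

# Jobs and net revenue change across all respondents.
jobs = sum(c["jobs_now"] for c in clients)
revenue_change = sum(c["revenue_now"] - c["revenue_baseline"] for c in clients)

print(f"Business starts: {starts}")
print(f"Survival rate of existing businesses: {survival_rate:.0%}")
print(f"Jobs in client businesses: {jobs}")
print(f"Net change in annual revenue: ${revenue_change:,}")
```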

Page 13: Facilitated by:   Marian Doub, Associate, Friedman Associates


Why Measure Success?

• Strategic visioning and decision making

• Program planning (innovation, goal setting)

• Fundraising (proposals and reports)

• Program management (decisions, work flow, customer service)

• Monitoring and evaluation (are we meeting needs and achieving our Mission?)

• Communicating results to clients, Board, community, policy makers, and other supporters

Page 14: Facilitated by:   Marian Doub, Associate, Friedman Associates


What is Your Path to Measuring Success?

• Why & How Much Do You Measure Success?

• How Do You Best Know and Use Your Results?

Gather and organize information that can help you improve program management and decision-making

Make the case well

Improve work-flow

What do you need to know vs. want to know?

• What is your return on investment/value proposition for Measuring Success?

Page 15: Facilitated by:   Marian Doub, Associate, Friedman Associates


3 Common Paths for MDOs: Most Basic/Pre-Standard

• Most Basic: Use a data management system (database, data collection tools) for client contact information, demographics, and basic program activity.

Fulfills basic reporting and program management requirements.

Databases: Excel spreadsheets, Customer Relationship Management (CRM) systems, loan fund management software.

Page 16: Facilitated by:   Marian Doub, Associate, Friedman Associates

3 Common Paths for MDOs: Basic Industry Standard

• Basics+ Industry Standards: Data management system and adoption of MicroTest Program Performance and Outcomes Standards and Tools

– Meets the standards for the MDO industry and provides outcome monitoring results on an annual basis.

– MicroTest membership includes tools and customized reports.

Page 17: Facilitated by:   Marian Doub, Associate, Friedman Associates


3 Common Paths for MDOs: Integrated Measuring Success

• Use of Mission-driven outcomes throughout program services and the data management system.

Provides Mission-driven, just-in-time information about program performance and outcomes to continuously improve results for clients and staff.

The MIS/database must manage historical, relational data: changes over time for clients and their businesses (see the schema sketch after this list).

Produces MicroTest results.

Resource-intensive to transition.

Efficiency and effectiveness improve as data integrity and use (analysis) improve.
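As one way to picture "historical, relational data," this sketch uses SQLite to pair a client table with a time-stamped outcome-update table, so each client accumulates dated records instead of overwriting a single row. The schema, table names, and figures are hypothetical, not any particular vendor's product.

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # throwaway demo database
conn.executescript("""
CREATE TABLE client (
    client_id   INTEGER PRIMARY KEY,
    name        TEXT,
    intake_date TEXT             -- baseline captured once at intake
);
CREATE TABLE outcome_update (
    update_id      INTEGER PRIMARY KEY,
    client_id      INTEGER REFERENCES client(client_id),
    update_date    TEXT,         -- each check-in is a new dated row
    in_business    INTEGER,      -- 0/1
    annual_revenue REAL,
    employees      INTEGER
);
""")

conn.execute("INSERT INTO client VALUES (1, 'Sample Client', '2008-03-01')")
conn.executemany(
    "INSERT INTO outcome_update (client_id, update_date, in_business, annual_revenue, employees) "
    "VALUES (?, ?, ?, ?, ?)",
    [(1, "2008-03-01", 0, 0, 0),        # baseline
     (1, "2009-03-15", 1, 24000, 1)],   # one year later
)

# Rough change per client (demo query): highest minus lowest recorded revenue.
for row in conn.execute("""
    SELECT c.name,
           MAX(o.annual_revenue) - MIN(o.annual_revenue) AS revenue_change
    FROM client c JOIN outcome_update o USING (client_id)
    GROUP BY c.client_id"""):
    print(row)
```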

Page 18: Facilitated by:   Marian Doub, Associate, Friedman Associates

What are the Pros and Cons of Each Approach?

Page 19: Facilitated by:   Marian Doub, Associate, Friedman Associates


Data Integrity: Pros and Cons

Basics+ Standards Path

Pros:
1. Standard industry tools, indicators & definitions
2. Maintains program performance & contact data
3. Outcomes data collection, entry, analysis and reporting once a year
4. Staff learn & use data more consistently
5. Client follow-up improves client results & data
6. Direct service staff manage data (requires attention to detail)

Cons:
1. No/little historical outcome data for daily use
2. Hard to stay in touch with clients if not a program feature
Often:
1. Multiple or parallel data sources develop
2. Data not validated or consistent
3. 'Just in time' design: danger of 'too many cooks in the kitchen' with new data needs

Integrated Path

Pros:
1. Reliable, validated Mission-driven MIS tools, indicators & definitions integrated with program delivery
2. Staff & clients rely on & use the data
3. Staff & clients maintain up-to-date, accurate data

Cons:
1. Constant data management required to maintain, clean, and update data & systems
2. Data analyst and/or manager staff position or FTE needed

Page 20: Facilitated by:   Marian Doub, Associate, Friedman Associates


Data Use: Pros and Cons

Basics+ Standards Path

Pros:
1. Funder-driven
2. Program management
3. Simple reports about program performance
4. MDO trend and benchmark analysis: how you are doing in comparison to others
5. Stats complement 'success stories'

Cons:
1. Up-to-date outcomes not known for day-to-day use by staff
2. Hard to report changes that occur after loan or training ('jobs created', 'business starts')
3. Funder-driven
4. Direct service staff or Board rarely see/use reports

Integrated Path

Pros:
1. Clear, concise use of success indicators & metrics throughout the organization
2. Accurate, up-to-date data focuses customer service on results
3. Reports customized to staff needs, policy advocacy, etc.
4. Continuous feedback & communication

Cons:
1. Requires a high level of data analysis capacity on staff or contract to produce high-integrity reports as needed

Page 21: Facilitated by:   Marian Doub, Associate, Friedman Associates


Start-Up Resources: Pros and Cons

Basics+ Standards Path

Pros:
1. Often free or low cost
2. 'Accidental techies' & Excel power-users on staff manage
3. Max. of $500/year MT membership (waived with new member training)
4. MIS assessment & enhancement to track basics + other program areas or requirements
5. Dedicated staff time and attention: buy-in improves

Cons:
1. Excel is best as an analysis and reporting tool, not data storage & management
2. Start from scratch, reinvent MDO specifications
3. MicroTest is time-intensive the first few years, depending on data integrity and MIS

Integrated Path

Pros:
1. Creates a strong foundation for future organizational growth
2. Database manager, analyst or coordinator
3. Configuration, conversion to a database that tracks historical, relational data

Cons:
1. Higher initial costs: database configuration & conversion; staff training; staffing

Page 22: Facilitated by:   Marian Doub, Associate, Friedman Associates

Integrated Measuring Success: The JEDI and RTC Experiences

• What is your return on investment/value proposition for deciding to integrate Measuring Success (outcome tracking) throughout your organization?

• Why & how much do you hope to Measure Success?

• What were your systems like when you started?

• What have you done so far? Where are you in the process?

• What has changed for your organization?

Page 23: Facilitated by:   Marian Doub, Associate, Friedman Associates

Basic Industry Standard: Best Practice Resource

MicroTest Performance and Outcomes

www.microtest.org

Page 24: Facilitated by:   Marian Doub, Associate, Friedman Associates

What is MicroTest?

• An initiative of the Aspen Institute’s FIELD Program.

• Management tool that empowers microenterprise practitioners to gauge and improve the performance of their program and the outcomes of their clients.

• Practitioner-built tools and protocols for collecting and using data to answer the 3 important questions.

• Uses standard indicators and metrics to document and define standard and top performance for the MDO field in the U.S. in aggregate and peer groups.

• Active peer-group of microenterprise development programs monitoring performance and outcomes.

Page 25: Facilitated by:   Marian Doub, Associate, Friedman Associates

MicroTest is…

Page 26: Facilitated by:   Marian Doub, Associate, Friedman Associates

MT Performance Workbook Basics

What is it?

The MT Performance Workbook is a set of linked Excel worksheets that gathers key information on your microenterprise program's training and credit activities and provides immediate feedback on the costs, efficiency and sustainability of those activities. Plus, the integrated custom report allows you to see how your program is changing over time, how it compares to other similar microenterprise programs, and how it compares to "top performance" in the industry. (A simplified sketch of this kind of calculation appears below.)

Why do people use it?

The MT performance workbook provides information crucial to adapting and refining program services and assembling winning grant proposals. Programs that complete the workbook also cite an expanded data collection and analysis capacity within their organizations as a key reason for participating.

How is the MicroTest performance workbook unique?

MT defined the set of standard measures accepted by the microenterprise industry, which lets you home in on your microenterprise organization's performance and discuss it using terms and definitions the industry agrees on. TA from MT staff helps the data really mean something and be used in a productive way for the program.

Source: FIELD, The Aspen Institute
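The workbook itself is Excel, but the kind of arithmetic it automates is straightforward. The sketch below uses hypothetical figures and simplified formulas, not MicroTest's official measure definitions, to show the flavor of cost-per-client and cost-recovery calculations.

```python
# Hypothetical program-year figures; formulas are simplified illustrations,
# not MicroTest's official measure definitions.
program_expenses = 250_000.00   # total microenterprise program costs ($)
earned_revenue   = 40_000.00    # fees, interest, and other earned income ($)
clients_served   = 171          # clients served during the year

cost_per_client = program_expenses / clients_served
cost_recovery   = earned_revenue / program_expenses

print(f"Cost per client served: ${cost_per_client:,.0f}")
print(f"Cost recovery rate:     {cost_recovery:.0%}")
```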

Page 27: Facilitated by:   Marian Doub, Associate, Friedman Associates


MicroTest Program Performance Custom Report

Interactive features of the custom report allow you to further personalize the document for your program.

• Look at your program's progress over time using trend data for all 50 MT measures.

• Compare your program's performance to those MT programs achieving Top Performance for key measures.

• Compare your program's performance to your peers.

Page 28: Facilitated by:   Marian Doub, Associate, Friedman Associates


JEDI 'data treasure': MicroTest Performance Report
What do you need to know vs. want to know?

Target Market Reach (n=58 MicroTest members submitting 2007 data)

Measure                      | Rural Program (n=8) | Mature Program (n=32) | Training-Led Program (n=34) | Low-Income-Focused Program (n=23) | Your Program 2007 | Your Program 2008
Total Clients Served         | 56   | 278  | 199.5 | 153  | 153  | 171
% Women Served               | 39%  | 68%  | 73%   | 73%  | 80%  | 80%
% Minorities Served          | 39%  | 58%  | 72%   | 76%  | 5%   | 20%
% Disabled Served            | 11%  | 9%   | 9%    | 6%   | 11%  | 11%
% Low Income (100% of HHS)   | 39%  | 26%  | 32%   | 34%  | 39%  | 23%
% Low Income (150% of HHS)   | 65%  | 48%  | 51%   | 56%  | 65%  | 51%
% Low Income (80% of HUD)    | 90%  | 75%  | 78%   | 77%  | 90%  | 74%
% of TANF Clients            | 13%  | 5%   | 8%    | 10%  | 18%  | 4%

Page 29: Facilitated by:   Marian Doub, Associate, Friedman Associates

MT Outcomes Workbook Basics
Adapted from: http://fieldus.org/Microtest/OutcomesDetails.html

What is it?
The MicroTest Outcomes Workbook is a series of linked Excel worksheets, including intake (baseline) and survey (outcomes) worksheets, with an accompanying Instruction Guide. MicroTest staff provide data cleaning, analysis, a custom report, and technical assistance. The analysis and custom report include: non-response bias, dashboard, overview, and longitudinal analysis of results.

Why do programs use it?

Designed to answer key questions about clients’ business and household outcomes: Are the clients in business? Are the businesses growing? Creating jobs? Does the business contribute income to the household? Are clients moving out of poverty?

The indicators are few and focused on key questions managers must constantly answer with respect to program effectiveness. The data collection process and analysis are simple (a minimal matching sketch follows below).

The approach is a way for programs to monitor outcomes; it is not outcomes assessment or evaluation.

The MicroTest Outcomes Workbook and Instruction Guide are updated every year based on members’ feedback.
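As a rough picture of what "intake (baseline) plus survey (outcomes)" means in practice, the sketch below pairs baseline and follow-up records by client ID and summarizes response rate, business status, and revenue change. The record layout is hypothetical, not the actual workbook format.

```python
# Hypothetical baseline (intake) and follow-up (survey) records keyed by client ID.
baseline = {
    101: {"in_business": False, "annual_revenue": 0},
    102: {"in_business": True,  "annual_revenue": 30000},
    103: {"in_business": True,  "annual_revenue": 15000},
}
survey = {
    101: {"in_business": True,  "annual_revenue": 12000},
    102: {"in_business": True,  "annual_revenue": 41000},
    # client 103 did not respond: this gap feeds the non-response bias analysis
}

matched = {cid: (baseline[cid], survey[cid]) for cid in baseline if cid in survey}
response_rate = len(matched) / len(baseline)
in_business = sum(1 for _, after in matched.values() if after["in_business"])
revenue_change = [after["annual_revenue"] - before["annual_revenue"]
                  for before, after in matched.values()]

print(f"Survey response rate: {response_rate:.0%}")
print(f"Respondents currently in business: {in_business} of {len(matched)}")
print(f"Average revenue change among respondents: ${sum(revenue_change)/len(revenue_change):,.0f}")
```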

Page 30: Facilitated by:   Marian Doub, Associate, Friedman Associates


JEDI ‘data treasure’: MicroTest Outcomes Report

Page 31: Facilitated by:   Marian Doub, Associate, Friedman Associates

Integrated Measuring Success (Path Three)

Best Practice

Process & Products

Page 32: Facilitated by:   Marian Doub, Associate, Friedman Associates


Measuring Success: Four Practical Steps for Building Integrated Systems

Step One: Name and Define Success

What does your Mission-driven success look like?

Purpose: Mission-driven goals & framework for describing program and longer term benefits (measures of success)

Tools: Theory of Change, Logic Model, Stakeholder Interviews, Data Treasure Hunt, Benchmarks Literature Review

Page 33: Facilitated by:   Marian Doub, Associate, Friedman Associates


Name Your Program Success on your terms…key outcomes indicators

Mission: JEDI increases the economic well-being of people and communities through business development and local wealth creation.

JEDI succeeds when entrepreneurs succeed in one or more years to:

1. Start businesses;

2. Strengthen, formalize, and expand businesses;

3. Establish and maintain profitable businesses;

4. Create and/or retain employment for themselves and others; and

5. Improve household financial security.

Page 34: Facilitated by:   Marian Doub, Associate, Friedman Associates


Name Your Program Success on your terms…two tools others use

JEDI Theory of Change

Activities: JEDI provides Siskiyou County residents with business development, asset and credit building, and community development training, technical assistance, and linkages to resources.

Outcomes:
• Personal & household financial health practices & status improve
• Businesses start, are strengthened & sustained
• Small business owners create, retain, & sustain living-wage jobs for themselves & local residents
• Business revenue increases household income to improve household self-sufficiency & assets
• Local businesses create a healthier local economy
• A healthier local economy leverages more micro- and small-business markets, financing & resources

Mission: JEDI business development & local wealth creation increases the economic well-being of people and communities.

Page 35: Facilitated by:   Marian Doub, Associate, Friedman Associates

JEDI’s key outcomes indicators

JEDI knows businesses are successful when in 2 to 6 years:

– Income from business contributes to household financial self-sufficiency and security;

– Business revenue allows owners and other employees to increase purchasing of goods and services (household & business-to-business spending) and increase the income tax base;

– Businesses provide local communities with needed goods and services;

– Businesses attract regional markets and investment;

– Businesses provide local communities with cultural and social assets; and

– Owners give back and reinvest in local community.

Page 36: Facilitated by:   Marian Doub, Associate, Friedman Associates

JEDI's key outcome indicators defined:
JEDI SUCCEEDS WHEN CLIENTS SUCCEED IN ONE OR MORE YEARS TO DEVELOP BUSINESSES AND… (read down)

Indicator | How we know success… | How our results measure up…
Businesses start | Feasibility assessment complete; under 1 year of consistent sales; establishing operations, marketing, and plans; date business begins | 66 percent of clients without businesses at entry start businesses (national average: 64%); 31 businesses started in 2008
Businesses stabilize | Business does one or more of the following and maintains the change for at least 3 months: reaches break-even; stabilizes operations |
Businesses formalize | Owner does two of the following: completes 2 action steps in each area (business management, marketing, financial management); completes a business plan ready for investors or partners | 76 percent of clients complete business plans (national average: 63%; national top performers: 88%)
Businesses grow (and more…) | Business does one or more of the following and maintains the change for at least 3 months: increases revenue (average net change of >5%); opens a new location or expands square footage; increases number of employees; launches a new product line or service | 42 percent of businesses increase revenue, by an average of 46 percent (median increase of $8,900; nationally, 37% median revenue increase); 22 percent of JEDI businesses increase revenue by 50 percent or more a year; businesses hire 0.5 new employees on average

Page 37: Facilitated by:   Marian Doub, Associate, Friedman Associates

RTC Quarterly Dashboard (mock data)

RTC ‘data treasure’: Quarterly Dashboard Report (mock up/draft)

What do you need to know vs. want to know?

Page 38: Facilitated by:   Marian Doub, Associate, Friedman Associates

RTC Quarterly Dashboard (mock data)

RTC ‘data treasure’: Quarterly Dashboard Report (mock up/draft)

What do you need to know vs. want to know?

Page 39: Facilitated by:   Marian Doub, Associate, Friedman Associates

RTC Quarterly Dashboard (mock data)

RTC ‘data treasure’: Quarterly Dashboard Report (mock up/draft)

What do you need to know vs. want to know?

Page 40: Facilitated by:   Marian Doub, Associate, Friedman Associates


Measuring Success: Four Practical Steps for Building Integrated Systems

Step Two: Assess and Align Internal Capacity to Manage the Data You Need

Purpose: Design & resource your MIS (data collection, entry, storage, use) to measure and produce Mission-driven success.

Tools: Monitoring & Evaluation Plan; Data Fields Inventory, Audit and Alignment; Data collection tool templates, pilot

1. Assess and select outcome tracking resources & method(s);

2. Inventory/audit of baseline and outcome fields/questions in databases, data collection tools, and reports (see the sketch after this list);

3. Revise forms, databases, reports, and processes as needed.
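A data-fields inventory can be as simple as listing the questions each tool asks and checking where they diverge. In this hypothetical sketch, field lists from an intake form, a database, and a follow-up survey are compared to flag outcome fields that lack a baseline or are never stored; all field names are illustrative.

```python
# Hypothetical field inventories for three tools; names are illustrative only.
intake_form = {"annual_revenue", "employees", "in_business", "household_income"}
database    = {"annual_revenue", "employees", "in_business"}
survey      = {"annual_revenue", "employees", "in_business", "household_income", "profitability"}

# Outcome questions asked on the survey that were never captured at baseline:
no_baseline = survey - intake_form
# Fields collected on paper but with no home in the database:
not_stored = (intake_form | survey) - database

print("Survey fields with no baseline at intake:", sorted(no_baseline))
print("Collected fields missing from the database:", sorted(not_stored))
```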

Page 41: Facilitated by:   Marian Doub, Associate, Friedman Associates


Data Collection for Measuring Success—A Good Intake Tool…

• Sets the baseline questions for measuring long-term outcomes: the questions are asked and tracked throughout the system in almost exactly the same way at intake, on update forms, and on surveys (see the record sketch after this list).

• Encourages program staff and clients to assess and update progress on a regular basis; it is useful for more than data collection.

• Asks for 1-2 other contact people to help stay in touch.

• Records the data event, collection, and entry dates as well as staff names.

• ‘Translates’ evaluation & metrics into a language everyone can use.

• Communicates clear guidelines for information use.
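One way to read the bullets above as a record layout: this hypothetical sketch keeps the baseline answers together with the data event, collection, and entry dates, the staff member's name, and alternate contacts, so the same questions can be re-asked verbatim on update forms and surveys.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class IntakeRecord:
    """Hypothetical intake record: baseline answers plus collection metadata."""
    client_id: int
    # Baseline questions, asked the same way at intake, on updates, and on surveys
    in_business: bool
    annual_revenue: float
    employees: int
    # Data event, collection, and entry dates, plus the staff member's name
    event_date: date
    collected_on: date
    entered_on: date
    entered_by: str
    # 1-2 alternate contacts to help stay in touch for follow-up surveys
    alternate_contacts: list[str] = field(default_factory=list)

record = IntakeRecord(
    client_id=101, in_business=False, annual_revenue=0.0, employees=0,
    event_date=date(2010, 9, 1), collected_on=date(2010, 9, 1),
    entered_on=date(2010, 9, 3), entered_by="program staff",
    alternate_contacts=["sibling", "business partner"],
)
print(record)
```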

Page 42: Facilitated by:   Marian Doub, Associate, Friedman Associates


Data Collection for Measuring Success: RTC Program Applications/Intake Forms

• Baseline
• Goals
• Scope of Work / Terms of Service Agreements

Page 43: Facilitated by:   Marian Doub, Associate, Friedman Associates


Data Collection for Measuring Success – In Action

• RTC One-on-One Coaching form
• RTC Outcome Update form (for staff)
• JEDI Outcome Update form (for staff)
• JEDI Client Action Plan

Encourage Staff and Clients to achieve & report Mission-driven results

Page 44: Facilitated by:   Marian Doub, Associate, Friedman Associates


Data Collection for Measuring Success—A Good Survey Tool is…

• Administered no more than once a year by phone, web-based, or mail-in. Phone is usually most successful.

• Uses incentives (e.g., a raffle of something useful to business owners) for those who complete the survey.

• Administered by trained staff or volunteers who are not direct service providers.

• Looked forward to once the ‘check-in’ practice is established.

• Does not take the place of program evaluation or experimental research.

Page 45: Facilitated by:   Marian Doub, Associate, Friedman Associates


Data Collection for Measuring Success: JEDI Phone Interview Survey Tool

Page 46: Facilitated by:   Marian Doub, Associate, Friedman Associates


Data Collection for Measuring Success: RTC Online Survey Tool

Page 47: Facilitated by:   Marian Doub, Associate, Friedman Associates


Measuring Success: Four Practical Steps for Building Integrated Systems

Step Three: Redesign and Implement Data Management System

Purpose: Mobilize and implement MIS and human resources to support measuring success.

Tools: Database Needs Assessment; Database Product Assessment & Selection; Database re-engineering (configuration) & transition (conversion of existing data);

MIS for Microenterprise: A Practical Approach to Managing Information Successfully

http://fieldus.org/Publications/MISManual.pdf

The goal is to increase the efficiency and effectiveness of internal systems: eliminate parallel data sources, duplicate data entry, time-intensive reports, etc.

Dedicate enough resources (time, staff, money) to do this project well; this is a long-term investment.

Document!

Page 48: Facilitated by:   Marian Doub, Associate, Friedman Associates


Step 3: Select MIS Products for Tracking Program Data—including Outcomes

Promising Options

• VistaShare Outcome Tracker

• WebCATS (Client Activity Tracking Software), used primarily by SBA grantees

• The Exceptional Assistant, by Common Goals Software

• Salesforce, paired with other platforms for outcome tracking (MicroTest Outcomes or Success Measures)

• Efforts to Outcomes, by Social Solutions

• Money In, Money Out, Technical Assistance (MIMOTA), by Villagesoft

• Portfol

• Applied Business Software

• Others?

Page 49: Facilitated by:   Marian Doub, Associate, Friedman Associates


Measuring Success: Four Practical Steps for Building Integrated Systems

Step Four: Use and Sustain the Results & System

Purpose: Use Mission-driven goals, framework, and information to prove and improve program success.

Tools: Adjust staffing plan/job descriptions to include data analyst, collection, entry, management/coordination, reporting; Report/use design, pilot, plan; Staff training & TA/support (ongoing); Operations manuals

• Program management: scorecards, dashboards, regular reports (a minimal dashboard sketch follows this list)

• Public relations/fundraising materials

• Learning Circles: highlight and explore strategic issues using results with staff, clients, Board, and communities

• And many more…
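As a toy illustration of a quarterly dashboard (not RTC's actual report), the sketch below rolls dated outcome updates up into quarterly counts with pandas; the column names and data are hypothetical.

```python
import pandas as pd

# Hypothetical outcome updates with dates; column names are illustrative only.
updates = pd.DataFrame({
    "update_date": pd.to_datetime(["2010-01-15", "2010-02-03", "2010-04-20",
                                   "2010-05-11", "2010-06-30", "2010-06-30"]),
    "in_business": [True, False, True, True, True, False],
    "jobs":        [1, 0, 2, 1, 3, 0],
})

# Roll up to calendar quarters for a simple dashboard table.
updates["quarter"] = updates["update_date"].dt.to_period("Q")
dashboard = updates.groupby("quarter").agg(
    clients_updated=("in_business", "size"),
    in_business=("in_business", "sum"),
    jobs=("jobs", "sum"),
)
print(dashboard)
```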

Page 50: Facilitated by:   Marian Doub, Associate, Friedman Associates


Why Do You Measure Success? JEDI ‘data treasure’: Fact Sheet 2009

Page 51: Facilitated by:   Marian Doub, Associate, Friedman Associates

RTC Fundraising Proposal:

Over the past few years, Rising Tide Capital has trained over 300 entrepreneurs in the basics of business planning and management. 117 of these graduates are currently in business and an additional 150 are currently in the planning stages. An outcome survey conducted during Summer 2010 showed that:

– Within one year of going through our programs, our entrepreneurs have experienced an average increase in business revenue of 80% and a corresponding increase in household income of 14%.

– Collectively, these businesses are generating nearly $2 million in annual business sales—producing significant economic activity in an area of otherwise concentrated poverty.

– These outcomes mean that for every dollar donors give to Rising Tide Capital, we are able to produce $3.80 in economic impact.
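The $3.80 figure is RTC's; the calculation behind it is not shown in this deck. Purely as a hypothetical illustration of how such a leverage ratio could be framed, the sketch below divides client business sales attributed to a program by the donations that funded it. Every number here is made up.

```python
# Entirely hypothetical numbers: this is NOT RTC's methodology or data.
annual_client_business_sales = 1_900_000.00   # sales generated by client businesses ($)
annual_donations             =   500_000.00   # contributed program funding ($)

impact_per_donated_dollar = annual_client_business_sales / annual_donations
print(f"Economic activity per donated dollar: ${impact_per_donated_dollar:.2f}")
# 1,900,000 / 500,000 = 3.8, i.e. $3.80 of client business sales per $1 donated,
# under these made-up assumptions.
```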

Page 52: Facilitated by:   Marian Doub, Associate, Friedman Associates
Page 53: Facilitated by:   Marian Doub, Associate, Friedman Associates

09/15/10

Page 54: Facilitated by:   Marian Doub, Associate, Friedman Associates


JEDI & RTC Comments

• What step is your organization taking now?

• Examples of the most useful tools: framework, data collection (forms & process), and reports

• Most useful results & surprises

• Challenges & lessons learned

Page 55: Facilitated by:   Marian Doub, Associate, Friedman Associates


What is Your Path to Measuring Success?

• Why & How Much Do You Measure Success?

• How Do You Best Know and Use Your Results?

Gather and organize information that can help you improve program management and decision-making

Make the case well

Improve work-flow

What do you need to know vs. want to know?

• What is your return on investment/value proposition for Measuring Success?

Page 56: Facilitated by:   Marian Doub, Associate, Friedman Associates

MicroTest Training from Friedman Associates for new and renewing members

Individual Onsite Training (1.5 days)

• Hands-on introduction and practicum in the MicroTest Performance and Outcomes systems
• Assessment and recommendations for optimizing systems related to MT
• Up to five (5) one-hour phone consultations
• First year of MT membership for new members

Group Training (2 days, for 3+ organizations, location TBD)

• Case study-based introduction and practicum in the MicroTest Performance and Outcomes systems
• Up to five (5) one-hour phone consultations for each organization
• First year of MT membership for new members

Page 57: Facilitated by:   Marian Doub, Associate, Friedman Associates

Questions?

Page 58: Facilitated by:   Marian Doub, Associate, Friedman Associates


Please contact us for more information:

Jason Friedman: [email protected], (319) 341-3556

Marian Doub: [email protected], (415) 730-1873

Nancy Swift: [email protected], (530) 926-6676

Alex Forrester: [email protected], (201) 432-4316

THANK YOU!