Family Works HB and Results Based Accountability
What did we do and how well did we do it? A consistent approach to identifying and measuring outcomes.
RBA – what it means to Family Works HB
- A systematic approach to planning and measuring outcomes
- A framework for monitoring and evaluating what we do
- A framework for reflection
- A framework for reporting
- A framework for planning
- To know that what we are doing is making a difference
How did we get started? The evaluation cycle: monitoring, evaluation and learning
1. Define and plan service delivery
2. Do it – implement the service
3. Analyse data and review service outcomes
4. Reflect on outcomes and refine the service
Accountability · Clients · Governance · Funders · Communications (internal & external) · Fundraising
Evaluation in Action
Action Research model
Process used to establish the evaluation framework:
1. Map the programme – focus-group discussions with staff describing "what we do, how we do it and how we know when it's working"
2. Build a programme logic derived from the programme map
3. Identify the most useful performance measures, based on an RBA approach
The Journey
- In 2008 we developed our programme logic and embarked on RBA.
- In July 2009 we merged another organisation into Family Works HB – 10 new staff, 2 new services and a number of new programmes.
- In 2010 we purchased a database and have spent time ensuring a fit with our tools, processes etc., designed around an RBA framework.
- This work is largely completed.
PROGRAMME LOGIC: a way of describing a programme for planning and evaluation.
Resources + what you do = results (within your control).
- Planned work: INPUTS and ACTIVITIES
- Intended results: OUTPUTS – the direct products of the activities
- Impacts: OUTCOMES – short term (results we expect to see), medium term (results we want to see), longer term (results we hope to see)
The same programme logic, with performance measures / indicators attached at each stage – inputs, activities, outputs and outcomes – to answer the question: how do you know this has happened?
Performance measures / indicators across the programme logic:
- INPUTS – tracking inputs: costs, staff time, # clients, resources
- ACTIVITIES – quantity & quality of activities: level & quality of engagement with the family; quality of the services; monitoring & review of intra/inter-agency processes
- OUTPUTS – tracking outputs: assessment completed; contract agreed; goal plan in progress; # sessions; # reviews; case status on closure; safety assessment completed; community participation assessment completed
- OUTCOMES, short term (working with FW): client participation rate; change achieved in issues in the Goal Plan – client reviews; changes in Safety, Care & Stability – SW reviews; community participation reviewed
- OUTCOMES, medium term (on closure with FW): client participation rate; change achieved in issues in the Goal Plan – intake vs closure; changes in Safety, Care & Stability – intake vs closure; client service evaluation – incl. PSNZ measure; community participation
- OUTCOMES, longer term (6–12 months post?): educational status of children; family PHO registration – going to GP as required; number of adverse contacts with agencies for client families (CYF, WINZ, Police – s.15 notifications; s.19 referrals); families re-presenting through PR
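Purely as an illustration of how a logic-plus-measures grid like this can be kept in one place, here is a minimal sketch in Python. The stage names and example measures come from the slides above, but the data structure and field names are hypothetical – this is not Family Works' actual system.

```python
# A minimal sketch (all names hypothetical) of holding the programme-logic
# stages and the performance measures attached to each stage in one place.
from dataclasses import dataclass, field

@dataclass
class Stage:
    name: str                 # e.g. "OUTPUTS"
    description: str          # what the stage represents
    measures: list[str] = field(default_factory=list)

programme_logic = [
    Stage("INPUTS", "resources", ["costs", "staff time", "# clients"]),
    Stage("ACTIVITIES", "what you do",
          ["level & quality of engagement with the family"]),
    Stage("OUTPUTS", "direct products of the activities",
          ["assessment completed", "# sessions", "# reviews"]),
    Stage("OUTCOMES (short term)", "results we expect to see",
          ["client participation rate"]),
    Stage("OUTCOMES (medium term)", "results we want to see",
          ["change in Goal Plan issues, intake vs closure"]),
    Stage("OUTCOMES (longer term)", "results we hope to see",
          ["educational status of children"]),
]

# "How do you know this has happened?" - list the measures per stage.
for stage in programme_logic:
    print(f"{stage.name}: {', '.join(stage.measures)}")
```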
How did we start? Programme mapping.
Performance measures – the RBA quadrants (quantity vs quality, effort vs effect):

How much did we do? (quantity of effort)
- How much service did we deliver? # clients, # sessions, # reviews, funding, case completions vs non-completions, referral sources, agencies involved
- Client demographics, presenting issues, match with priority populations
- Background info: community profiles, events / issues in the community

How well did we do it? (quality of effort)
- How well did we deliver it? Client service evaluation – satisfaction survey; social worker satisfaction survey (workloads, resources, support – supervision & direction etc.); social work practice – progress with the Goal Plan etc., % reviews at 6–8 sessions; closures / completion rates / reasons for non-completion; client participation rates / level and quality of engagement with clients

Is anyone better off? (quantity and quality of effect)
- How much change / effect did we produce, and of what quality? Ratings show changes in issues listed in the client contract – client assessed; ratings show changes in child safety dimensions – SW assessed; ratings show changes in community participation rates – SW assessed
- Client service evaluation – satisfaction survey including the PSNZ measure; client engagement & participation rate
- After 6 & 12 months: educational status of children – children enrolled and attending school; family PHO registration – going to GP as required; number of adverse contacts with agencies for client families (CYF, WINZ, Police – s.15 notifications; s.19 referrals)
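As a hedged illustration only – the case records and field names below are invented, not drawn from Family Works' CMS – the three RBA questions can be computed from case data along these lines:

```python
# Minimal sketch: answering the three RBA questions from hypothetical
# case records. Field names are illustrative, not the actual CMS schema.
cases = [
    {"sessions": 8,  "completed": True,  "goal_plan_improved": True},
    {"sessions": 3,  "completed": False, "goal_plan_improved": False},
    {"sessions": 10, "completed": True,  "goal_plan_improved": True},
]

n_clients = len(cases)

# 1. How much did we do? (quantity of effort: # clients, # sessions)
n_sessions = sum(c["sessions"] for c in cases)

# 2. How well did we do it? (quality of effort: completion rate)
completion_rate = sum(c["completed"] for c in cases) / n_clients

# 3. Is anyone better off? (effect: % with improved Goal Plan ratings)
better_off_rate = sum(c["goal_plan_improved"] for c in cases) / n_clients

print(f"How much did we do?    {n_clients} clients, {n_sessions} sessions")
print(f"How well did we do it? {completion_rate:.0%} completion rate")
print(f"Is anyone better off?  {better_off_rate:.0%} improved on Goal Plan")
```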
What have we achieved to date? Social Work results:
- Client plans completed: 2008/09 57%; 2009/10 45%; 2010/11 96%; 2011/12 95%
- Client outcomes met: 2008/09 29%; 2009/10 55%; 2010/11 75%; 2011/12 85%
Our results – what happened
Review findings:
- Tools were okay – some minor tweaks
- Staff were inconsistent in their use of tools and in reporting
- Inadequate QA
Action plan:
- Improve QA and monitor practice closely
- Policy changes and training around "forming a belief" and reporting accurately
- Infrastructure changes
Key enablers
- Outcomes reporting and RBA were supported by MSD
- My CEO and Executive wanted a tool to evaluate what we did
- Staff understand our need to report outcomes
- The CMS aided the process of implementation
Barriers / challenges
- New manager, new staff, new systems, new CMS
- Existing capability / capacity versus increasing demand and complexity
- Greater levels of quality assurance needed
- The CMS added an additional challenge
- Fear – a tool to measure individual performance?
- Fear of change
- Inconsistent practice
Lessons learnt
- Bring all staff on the journey; don't leave anyone behind
- Be clear with staff about our legal obligations for reporting to funders, our Board and our clients
- One step at a time
- Don't be afraid of change
- This is about the service, not individuals
- Look for patterns – these tell a story on their own
Lessons learnt (continued)
- Be prepared to delve into the files for solutions and to get the story behind the data, e.g. 25% disengagement – find out when in the process it happens and why, and take action
- Review data management tools and rating scales – are they doing what you want them to do? Don't be afraid of change
What surprised me?
- Our results. Not all of them are good, but they gave us a starting place from which to launch practice improvements
- We have to have a framework on which to hang our results
- Client demographics remain stable – no surprises
Issues and challenges
- Consistency, completeness, case reviews, quality assurance – protect against garbage in, garbage out
- All our work is on the CMS and reporting is easier
- A recent restructure and subsequent staff changes challenge our ability to meet demand
- Reviewing and updating our practice model – change raises levels of anxiety
Recap
- Programme mapping
- Vigilance around reporting
- A good QA system
- Integrity of data
Would I recommend RBA? Most definitely.
- It provides structure; it asks, and allows us to answer, the key questions: do we make a difference in the lives of those we work with? And if not, why not?
- It helps to justify our existence by providing an evidence-based framework.
RBA key terms:
- RESULT or OUTCOME: a condition of well-being for children, adults, families or communities (whole population), e.g. children born healthy, children succeeding in school, safe communities, a clean environment, a prosperous economy.
- INDICATOR or BENCHMARK: a measure which helps quantify the achievement of a result, e.g. rate of low-birthweight babies, rate of high school graduation, crime rate, air quality index.
- PERFORMANCE MEASURE: a measure of how well a programme, agency or service system is working (client population). Three types: 1. How much did we do? 2. How well did we do it? 3. Is anyone better off? (= customer results)