We acknowledge the traditional owners of the land and we pay respect to their Elders both past and present.


Implementing an evidence-based program

Evidence-based programs are intended to lead to positive outcomes for participants, using effective program design and planning processes. Implementation is the planned and intentional process of putting the program design and plan into practice in a real-world setting [1] [2]. The success of a program (i.e. whether it leads to the intended outcome) will depend on the quality of implementation.

Many programs have failed to reach their full potential due to issues inherent in the service system, as well as in the implementation process. For example, social problems such as drug and alcohol use, child maltreatment, poverty, family violence and mental health disorders limit clients' capacity to participate in a program. In addition, organisational characteristics and systemic problems such as high caseloads and staff turnover can be limiting and resistant to change. However, wider organisational factors and workforce competence that are within the organisation's control can improve the quality of implementation.

Quality implementation processes relate to:
- Workforce competence (staff recruitment and selection, training and supervision)
- Organisational capacity (resourcing and leadership)
- Ongoing fidelity (adherence to the model)
- Outcome monitoring (evaluation)

Some factors to consider when implementing a program are program match, program quality and organisational resources [3].

Program match

Once the community's needs have been established and the program goal and objectives are outlined, a useful step is to look for existing evidence-based programs. Better outcomes may be achieved for families and children if programs have the strongest evidence base.

In recent years there has been a significant increase in the number of evidence-based programs designed to reduce individual and family problems and promote healthy development. Because each program has undergone rigorous testing and evaluation, program practitioners can reassure potential program sponsors that the program is likely to be effective under the right conditions, with the appropriate audience and with the proper implementation [3].

However, knowing which program is the “right” one for a particular setting and audience is not always easy to determine. When selecting a program, it is important to consider whether a program fits with the local agency’s goals and values, the community setting and the needs of the targeted audience.

The Hexagon Tool [4] uses six factors to evaluate new and existing interventions to help agencies identify the right evidence-based program to implement.

Program quality

Program fidelity is broadly defined as the degree to which an intervention is delivered as prescribed. A major appeal of evidence-based programs is their promise of effectiveness. These programs have shown, through rigorous evaluations, that they can significantly affect important outcomes for participants. The best of them have demonstrated positive effects in a number of different settings. For policymakers, funders, and program practitioners, that potential for effectiveness can make an evidence-based program more attractive than an unproven program.
However, we can only assume that a program will continue to have those effects if it is implemented according to the original program design. Program drift occurs when a program is adapted, whether intentionally or unintentionally, during delivery. Changes to program content, duration or delivery style may diminish a program's effectiveness.

Measures of program fidelity include:

- a program's level of integrity
- the level of confidence that can be drawn from the program outcomes, i.e. the degree to which the intervention is driving any reported changes in the process and/or impact outcomes associated with the program's evaluation.

Organisational resources

Many professionals working with children and families will already be working effectively. Nevertheless, the work can be challenging and demanding, especially given the changing and diverse nature of Australian society and family life. An effective ongoing process of professional development and support will help professionals deal with these demands and potentially improve outcomes for children and families. Different types of professionals working with young children and families have different training requirements; however, a number of training needs have been found to be common to all professionals working with young children and their families [5]. They are:
- training in communication and counselling skills
- family-centred practice
- cross-cultural competence
- inter-disciplinary teamwork
- inter-agency collaboration
- inclusive practices
- how to use natural learning environments

In determining the most appropriate method for building professional capacity, careful consideration also needs to be given to the intended purpose of the learning and development and to the needs of individuals. For example, if significant practice change is being implemented across an organisation, a training package tailored to the whole work group will be required, with methods for supporting and reinforcing learning in the workplace. This may require a combination of methods. If an individual practitioner would like to advance their skills and conceptual thinking, and qualifications hold some appeal for them, supporting them to undertake a higher degree may be appropriate.

Models of practice

When implementing a program or service, it is useful to articulate a model that guides practice. A model of practice, or practice framework, is a conceptual map that brings together an agency's approach to practice. It establishes a unified vision for work that is grounded in the realities of practice, supported by research and embedded in a set of principles and values that are essential to the work. It provides a clear understanding of what underpins the work, and how this informs interventions with children and families. As a tool for practitioners, it provides a theoretically informed intervention logic and a set of triggers to support best practice.

A model of practice supports program staff to work within a context of endorsed organisational values, and enables staff to adopt practices that support these values. This means that staff do not 'do the right thing' only because they feel a personal moral imperative or because the rules require it, but because it is endorsed by organisational values and expectations. A model of practice can promote consistency in approaches across an organisation, which can sometimes be subject to emerging trends and evidence, reactions to crises and changing leadership. A model of practice is used as a reference point for making decisions about service delivery and can help prevent program drift. It can shape organisational design, guide the content of organisational policy, inform staff training, shape quality assurance processes and set staff performance expectations.


[1] Fixsen, D. L., Naoom, S. F., Blase, K. A., & Friedman, R. M. (2005). Implementation research: A synthesis of the literature. Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, National Implementation Research Network.

[2] Mitchell, P. F. (2011). Evidence-based practice in real-world services for young people with complex needs: New opportunities suggested by recent implementation science. Children and Youth Services Review, 33(2), 207-216.

[3] University of Wisconsin Cooperative Extension: Evaluation Resources: http://www.uwex.edu/ces/pdande/ and Logic Model Course: http://www1.uwex.edu/ces/lmcourse

[4] Blase, K., Kiser, L., & Van Dyke, M. (2013). The Hexagon Tool: Exploring Context. Chapel Hill, NC: National Implementation Research Network, FPG Child Development Institute, University of North Carolina at Chapel Hill.

[5] Moore, T. (2005). What do we need to know to work effectively with young children and families? Paper presented at the 9th Australian Institute of Family Studies Conference, Melbourne, Australia. <www.aifs.gov.au/conferences/aifs9/moore2.html>

Further reading

Centres for Disease Control and Prevention: http://www.cdc.gov/eval/
Centre for Family Research and Evaluation (CFRE) www.ds.org.au
Community Tool Box, University of Kansas: http://ctb.ku.edu/
Creating Pathways to Prevention, Griffith University: https://www.griffith.edu.au/criminology-law/griffith-criminology-institute/our-programs-of-research/creating-pathways-to-prevention
Diversity https://aifs.gov.au/cfca/publications/strengthening-aboriginal-family-functioning-what-worksand
Evaluation toolkit – www.evaluationtoolbox.net.au
Evidence based programs – www.aifs.gov.au/cfca/guidebook/programs
Evidence based practice & fidelity: https://www.youtube.com/watch?v=BIl-8_R5QRQ
Harvard Family Research Project: http://www.gse.harvard.edu/hfrp/
Innovation Network: http://innonet.org
Theory of change: http://developingchild.harvard.edu/resources/building-adult-capabilities-to-improve-child-outcomes-a-theory-of-change/
University of Wisconsin Cooperative Extension: Evaluation Resources: http://www.uwex.edu/ces/pdande/ and Logic Model Course: http://www1.uwex.edu/ces/lmcourse
W.K. Kellogg Foundation: http://www.wkkf.org/Programming/Overview.aspx?CID=281
World Health Organization: www.who.int
