By Zoe Staines, Queensland University of Technology
The Closing the Gap framework sought to halve the employment gap between Indigenous and non-Indigenous Australians, among other targets. But the employment target expired unmet this year.
In remote parts of Australia, the gap has actually widened since 2011.
Governments have relied on a series of employment programs to tackle the employment gap, but these have not yielded positive outcomes. Before the new program starts in 2019, we need more evidence of what does and doesn’t work.
There has been no robust evaluation of the last two employment programs. Evidence of what does work might help us finally start closing the gap.
Although the median Indigenous income has improved overall, the income gap between Indigenous and non-Indigenous Australians is also growing, particularly in remote areas.
This is a concerning trend and does not align with government narratives around reducing Indigenous disadvantage.
Employment programs
Since the Community Development Employment Projects scheme began to be rolled back in 2007 (before it was later abolished), a series of other programs have operated in remote communities.
These have included the Job Network, Job Services Australia, the Remote Jobs and Communities Program and the current Community Development Programme.
The standard approach of these programs has been to increase pressure on job-seekers to participate in ‘work-for-the-dole’ (for example, through increased participation hours), and mete out financial penalties when job-seekers fail to abide by the program rules.
In this way, it’s hoped programs can somehow push job-seekers into employment.
The four programs are very similar in terms of their modes of delivery, funding structures and core components. However, they also differ in important ways.
For example, although Job Network and Job Services Australia included graduated support for more severely disadvantaged jobseekers, this was removed from the Remote Jobs and Communities Program and Community Development Programme.
Funding for broader community development (to create more jobs) that existed under the Remote Jobs and Communities Program was also dramatically reduced under the Community Development Programme.
Many, including program providers, participants, Indigenous leaders and academics, have argued this approach oversimplifies the challenges involved in improving remote employment.
For example, employment programs haven’t adequately addressed structural barriers to gaining employment, such as the availability of jobs and the long-term effects of poorer educational attainment, health and wellbeing.
Nevertheless, robust evidence concerning outcomes and impacts of these recent programs is scarce.
Evaluations of Job Network and Job Services Australia were undertaken, but they were not independent, and had methodological problems.
This meant they could not reliably distinguish program effects from other factors that may have also influenced results. Even so, the evaluations only uncovered minimal evidence of positive outcomes.
The subsequent program — the Remote Jobs and Communities Program (2013-2015) — was not evaluated at all.
More harm than good?
The Community Development Programme (2015-present) has been subject to a number of reviews, including by the Senate Standing Committee on Finance and Public Administration and the Australian National Audit Office.
These reviews, and other research and commentary, have pointed to anecdotal evidence that the Community Development Programme has caused harm.
For example, inflexible program rules have resulted in disproportionate fines being imposed. This has hurt income stability and food security for some jobseekers, many of whom are already living in circumstances of disadvantage.
An independent evaluation of the Community Development Programme is currently under way. However, despite the evaluation being planned for completion in mid-2018, no findings have been publicly released.
The Community Development Programme evaluation design was only developed and signed off between seven and 10 months after the program was implemented (rather than forming part of the program design).
This contradicts one of the best practice principles for evaluation in Indigenous affairs. There was also no consideration of the initial design by an evaluation reference group.
According to the Department of Prime Minister and Cabinet, the evaluation is supposed to “assess early signs of impact and explore what works for who and in what circumstances”.
However, aside from some information regarding the types of data being used, the exact methods used in the evaluation are unclear.
In particular, it’s unclear how or whether the evaluation will be able to isolate the impacts of the Community Development Programme.
A new program is planned to replace the Community Development Programme from February 1 2019. Ideally, the evaluation findings would have been available to inform ongoing consultation.
But most of this consultation has now already taken place.
The federal government has committed to improving the evidence base in Indigenous affairs. It has highlighted the importance of achieving greater transparency in the public release of evaluation reports (in line with similar calls elsewhere) and also made moves to appoint an Indigenous commissioner to the Productivity Commission.
These are positive steps. But the government must hold itself to the same standards to which it seeks to hold others.
Rigorous, well-designed evaluation is important in informing future policy-making and developing a stronger evidence base for strategies that hold true potential for closing the remote employment gap. Monitoring and evaluation are also important for ensuring programs intended to reduce disadvantage do not, instead, exacerbate it.
This article was originally published on The Conversation. Read the original article.