Table 3.1 ANAO Audit Report No. 30 2010-11
- implementation of recommendations
- early program challenges
- procurement
- value for money
- purchasing decisions
- ICT supporting infrastructure
- additional fees
- data accuracy
- progress on computer implementation
- program monitoring and evaluation
- performance indicators
- program evaluation.
Implementation of ANAO
recommendations
3.24 As noted above, the ANAO’s report contained three recommendations aimed at improving the administration of the NSSCF. Referencing these in his opening statement, the Auditor-General outlined the report’s findings, which suggested that DEEWR’s administration of the program ‘had been effective in supporting progress through a partnership approach towards the computer fund’s objective of increasing the computer-to-student ratio’. However, the Auditor-General also noted that the report found ‘there were aspects of the Department’s oversight that could have been strengthened’.[16]
3.25
In response to the statement made by the Auditor-General, DEEWR noted the ANAO’s valuable contribution to the continuous improvement of administrative processes. DEEWR also advised that it considers the first two recommendations made by the ANAO to have been fully implemented, with the third, which relates to the 2012 calendar year, to be fully implemented in due course.[17]
3.26
The Committee asked DEEWR for further detail on the implementation of
each of the three recommendations.
Implementation of Recommendation
No.1
3.27
In response to the ANAO’s recommendation to strengthen future program
funding agreements with non-government education authorities, DEEWR noted that:
…since the audit report… the basis of the program has
changed to that of a national partnership. However, under the DER funding
agreements, which are part of the national partnership arrangements,
non-government education authorities are, in fact, required to submit
six-monthly progress reports, outlining their progress to reach a
computer-to-student ratio of one to one by the end of 2011. All non-government
education authorities have complied with these requirements, and the most
recent progress report was received by us on 15 July 2011.[18]
Implementation of Recommendation
No.2
3.28
The second recommendation suggested establishing a balanced set of key deliverables and performance indicators in the portfolio budget statements to measure the effectiveness of the program.[19]
3.29
DEEWR considered that it had fully responded to this recommendation with the inclusion of additional information in the 2011-12 DEEWR Portfolio Budget Statement (PBS). The key performance indicators listed in the PBS are:
- number of schools
assisted; and
- number of computers
installed.[20]
Implementation of Recommendation No.3
3.30
The third ANAO recommendation suggested action to be taken in early 2012
aimed at increasing assurance over schools' achievement of the computer-to-student
ratio of one-to-one through an audit of a sample of schools.[21]
3.31
According to DEEWR, this recommendation is being advanced as part of the
overall evaluation strategy for the fund. A mid-program review is planned in
2012 to cover a number of aspects and ‘will incorporate an audit in conformance
with the recommendations of the Audit Office’.[22]
Early program challenges
3.32
The ANAO provided commentary on a significant funding issue arising from
the rush to meet the Government’s 100 day commitment, whereby the on-costs
associated with the deployment and support of the computers had not been
agreed. The Government initially thought that states and territories would cover these costs; however, COAG did not endorse the proposal. Dr Paul
Grimes, then Deputy Secretary of the Department of Finance and Deregulation,
was appointed to undertake a review to determine the full costs of computers
funded through the NSSCF.[23] Following the conclusion
of the review and agreement from COAG, the Government announced a further $807
million in funding for the program.[24]
3.33
In trying to determine why these costs were not agreed as part of the
initial funding announcement, the Committee asked what advice the Department
had given to the Government prior to the first COAG meeting.
3.34
Noting that the audit canvassed these events in detail, and that they are otherwise on the public record, Dr Evan Arthur, Group Manager, National Schools and Youth Partnerships Group, provided a historical account as follows:
Immediately after the election of the Rudd government, there
was a COAG meeting which agreed to the rollout of the computers, but there was
a reservation, as I recall, in terms of funding of the legitimate additional
costs. At a subsequent COAG meeting…the Commonwealth agreed that it would meet
legitimate additional costs for installation of the computers. …A process was
then agreed by which we would quantify what those legitimate additional costs
were. That was the process which was managed by Dr Grimes. … On receipt of the
Grimes report, the Government decided to accept the quantification of the costs
contained within the Grimes report.[25]
3.35
Additional questions on this matter were taken by the Department on
notice. The Department’s response to the Committee had not been provided at the
time of the publication of this report.
3.36
In regard to the overall program rollout, DEEWR asserted that beyond the
initial disagreement over funding for on-costs, there have been ‘no substantive
difficulties within the administration of the program’. While questions have arisen over the course of the program, DEEWR considers these ‘have all been handled in an entirely cooperative way’.[26]
3.37
DEEWR reinforced their view of the program’s successful progress with
the advice that ‘at no point has the program failed to meet its timetable of
pre-existing commitments’.[27]
Procurement
Value for money
3.38
The ANAO noted the concept of ‘value for money’ was raised by the
Government at the very early stages as an expectation of this program.
Following consultation with central agencies, a unit price of $1000 per
computer was established. By way of example, the ANAO report explained that based
on this unit price, a school requiring 10 computers would receive $10,000 in application
round funding.[28]
3.39
Taking into account the highly competitive ICT market in Australia, the
ANAO report noted DEEWR encouraged government education authorities to undertake
centralised purchasing processes for schools to achieve maximum purchasing
power.[29]
3.40
The ANAO found the DEEWR approach to encouraging value for money was generally
sound. The option to use residual funding for complementary ICT equipment
provided flexibility for education authorities and schools as well as a strong
incentive to achieve value for money.[30]
3.41
The Committee asked DEEWR for their view on this policy and whether they
would consider using this type of mechanism, or a similar refined version, for
future programs.
3.42
In regard to the current program, DEEWR commented that rather than
micromanage expenditure of funds at school level, this policy decision allowed
flexibility to apply the funds to a range of complementary purposes.[31]
3.43
The second part of the question relating to further promulgation of this
policy was put on notice. The Department’s response to the Committee had not
been provided at the time of the publication of this report.
Purchasing decisions
3.44
Noting the flexible arrangements, the Committee asked DEEWR whether any
data was being collected identifying school ICT purchasing decisions. The
Committee was interested in the overall benefit being realised by students.
This included whether schools are providing laptops and, if so, whether these are being made available to students outside of school hours.
3.45
According to DEEWR, the majority of deployments have been Netbooks.
However, some schools elected to install desktops and there has been an
emerging trend toward investment in slates (iPads or similar). Overall, a very
wide range of choices have been made, depending on both end-user requirements
and the procurement model used.[32]
3.46
DEEWR advised that a range of procurement and deployment models have
been established by state and territory education authorities, and that these models
in turn go some way in determining the options available to schools.[33]
3.47
DEEWR used the following examples to demonstrate the different
approaches taken:
- Using a centralised
model, the NSW education department made definitive technology choices. These
were then made available for deployment to NSW government schools. The NSW program
provides a laptop to each Year 9 student at the start of the year, which they
can access at all times and keep until they leave school. [34]
- Victorian government schools have been allowed to take a self-management approach. Schools have been able to make their own technology decisions, as well as to decide whether computers are made available to students outside of school hours. This is also largely the case for independent schools.[35]
3.48
The ANAO’s report provides additional information with two case studies
outlining the alternate models adopted by the NSW and Victorian education
departments. [36]
ICT supporting infrastructure
3.49
DEEWR claimed that the DER program ‘has also made significant changes to
the ICT environment within schools’. Having an easily accessible, supporting
network in place is a precondition of effective use of technology in education.
According to DEEWR, this has been achieved as a result of the one-off funding
injection for on-costs.[37]
3.50
The Department took questions on notice from the Committee in regard to
the ICT supporting infrastructure. Specifically:
- whether any policy work has been done, or is proposed to be done, on the nine-year rollout of the National Broadband Network; and
- whether any policy work has been done on the school becoming a community hub for internet technology and ICT improvements more generally.
The Department’s response to the
Committee had not been provided at the time of the publication of this report.
Additional fees
3.51
The Committee noted that in recent months there have been a number of
media articles claiming that parents of students in Queensland schools are
required to pay additional fees for laptop computers provided under the NSSCF.[38]
The Committee asked DEEWR to respond to these claims, and to advise whether those fees are in any way associated with the DER program or any deficiencies in the NSSCF.
3.52
DEEWR rejected the reports, stating:
…there is a very clear position from the governments on this
issue, which is that there should not be any fees associated with the provision
of computers funded by the Commonwealth. The Commonwealth, as a result of the
COAG agreements … is… meeting the total cost of ownership of the device for
four years.[39]
3.53
However, DEEWR noted that the Commonwealth Government has ‘no role in
restricting the choices schools make and how they fund their activities’.
Expanding on this, DEEWR commented on three situations in which schools may be charging parents:
- if the school wishes
to buy devices which are more expensive than the notional price funded under
the NSSCF;
- to cover costs of
providing computers to students in years other than Years 9 to 12;
or
- to support the
school’s own sustainment of the computers that they had in place at the time
that the fund was introduced in 2008.[40]
3.54
Further, DEEWR explained that if the fees are to fund the school’s ongoing maintenance of computers outside those provided under the NSSCF (either those in place prior to the establishment of the fund or those provided to years other than Years 9 to 12), the Commonwealth could not prevent that, but there is a stipulation that it has to be fully discussed with, and agreed to by, the parent body.[41]
3.55
DEEWR considers that where appropriate consultation has taken place, it would
be unreasonable for the Commonwealth Government to interfere with a school’s
internal economy beyond stipulating that ‘there should be no components of
those fees which are a cost associated with the provision of the Commonwealth
computers’.[42]
3.56
Further to DEEWR’s first point that some schools may charge fees in order to provide computers with higher specifications, the Committee wanted to confirm that the NSSCF notional price provided for technology of a suitable level for activities being undertaken in classrooms.
3.57
DEEWR’s response confirmed that the funding envelope provided caters for
a ‘very highly capable device’. DEEWR noted that as the devices are universally
sourced from overseas, the appreciation of the Australian dollar has also been
of benefit. DEEWR did note that some schools may have elected to purchase more
expensive computers to support speciality software for classes such as music or
graphic design. [43]
Data accuracy
3.58
The ANAO report noted the importance of effective and timely
identification of need and capacity to support the program’s roll-out. To meet
the Government’s ‘100 day commitment’ DEEWR moved quickly to develop and
distribute a preliminary survey. On 18 January 2008, education authorities were
advised that completed surveys were due back to DEEWR by 7 February 2008. DEEWR
acknowledged the rushed timeframe.[44]
3.59
The ANAO reported that there were 460 instances (16 per cent of 2929 schools)
where schools had provided anomalous data. However, they also noted that for
the majority of these instances, the size of data discrepancies was in the
vicinity of 10 computers. The ANAO’s report suggested where data discrepancies
exceeded 10 computers, DEEWR could have asked education authorities to review
and confirm or amend data provided.[45]
3.60
Noting the historical issues with data discrepancies, the Committee asked what assurances DEEWR could give that the data provided in the planned 2012 review will be accurate.
3.61
DEEWR acknowledged the suggestions in the ANAO report regarding
mechanisms to improve data collection. However, DEEWR stated that they ‘do not
have information that there are discrepancies in the data’. DEEWR explained that, in order to make decisions on funding, only two sets of figures were used from the data collected. These figures were then put through a number of
iterative checks with education authorities to confirm their veracity.[46]
3.62
Following DEEWR advice on the accuracy of data, the Committee sought to
confirm whether the number of computers to be provided to schools was based on
the number of students as at the end of the 2007 school year or the beginning
of the 2008 academic year.
3.63
DEEWR advised that the figures were taken from the annual Commonwealth
census data on the number of students in the Australian school system. DEEWR
noted that some schools may have considered the figures to be not entirely
accurate, but that overall within this ‘$2 billion-plus program’ there has been
evidence of ‘significant savings in the price paid for equipment’, with increased purchasing capacity as a result of exchange rate movements. In summary,
DEEWR stated:
There are more than enough dollars provided in this program
for any issues around the margins of those figures to be addressed.[47]
Installation progress
3.64
The ANAO found that educational authorities had reported solid progress on
the installation of computers. Most recently, at the 2011-2012 Supplementary
Budget Estimates hearings held on 20 October 2011, DEEWR advised that at the last
formal reporting date of 30 June 2011, installation was on target at
75 per cent. According to DEEWR ‘educational authorities have publicly stated
and repeated assurances to the Department and the Government that they will
meet the time frames’.[48]
3.65
In relation to the installation figure of 75 per cent, the Committee
asked if this meant the computers were delivered to schools and operational.
Further, noting that the 30 December 2011 target has been extended to early 2012, the Committee asked DEEWR to predict when all computers would be installed.
3.66
DEEWR confirmed that the 30 June 2011 figures were for computers that
had been delivered and were operational. In regard to all computers being in
place and functioning to bring the student-to-computer ratio to 1:1 for Years 9 to 12,
DEEWR advised that they expect students to have their computers at the start of
the new school year.[49]
3.67
The Committee asked whether more recent data on the progress of installation was available. The Member for Lyne commented that in October 2011 there appeared to have been a ‘flood of computers land[ing]’ in his electorate.[50]
3.68
DEEWR advised the Committee that while they do have more recent
information, a decision had been taken by the Government that ‘it would only
publish information derived from the six monthly consolidated reports from all
education authorities’.[51]
Program monitoring and evaluation
Performance indicators
3.69
The ANAO report notes that the National Partnership Agreement (NPA) sets out high-level governance arrangements for the delivery of the program, including performance benchmarks (KPIs), but that these primarily relate to computer installation. The ANAO noted the difficulties of evaluating a ‘multi-jurisdictional program focused on changing teaching and learning in schools’.[52]
3.70
The Committee acknowledged DEEWR’s implementation of the ANAO’s Recommendation No.2, but noted that these were quantitative measures that do not provide an indication of whether the program is meeting the stated objective of preparing students for the digital world.
3.71
The Member for Robertson commented on the important social benefits
being achieved in her electorate as a result of the program:
I do not know that you [DEEWR] get to see the faces of the
students who receive them or to knock on doors and have a mother come to you
and say, “My son has changed his whole attitude to education because this is
the first new thing he has ever had in his life”. That is the sort of testimony
to this program that I have experienced in my electorate and particularly in the
suburb of Kariong where many families have been very advantaged by this. Their
kids have got the advantages they need to progress into the future.[53]
3.72
The Committee asked DEEWR if any KPIs are being developed to indicate
that students are significantly improving their performance across a range of
subject areas because of their engagement with new technologies.
3.73
DEEWR noted that ‘technology is only a means to an end’. In support of
this statement, DEEWR drew the Committee’s attention to the documented outcomes
the Department developed for the program, which focus on educational outcomes
rather than the provision of computers.[54]
3.74
Further, DEEWR contended that, as there are multiple inputs into ‘good or bad educational outcomes’, it is extremely difficult to isolate the influence of a single factor, in this case technology. DEEWR referred to high-level studies by Becta, the former British Government agency,[55] into the correlation between the introduction of technology and results. While the results appeared positive, DEEWR cautioned against overemphasis on Becta’s findings in a ‘very fraught methodological area’.[56]
3.75
The Committee referred DEEWR to a recent article in the Courier Mail[57]
where the significant improvement in NAPLAN results of the Doomadgee State
School was in part attributed to ICT. The Committee suggested that tools such as NAPLAN may be able to support this kind of longitudinal work.
3.76
DEEWR noted that a number of studies have identified the dominant
variables influencing results as principals, leadership and teaching quality.[58]
Offering a personal perspective, Dr Arthur commented:
… if you combine those
strengths with the kind of potential that technology offers, I personally am
sure that you can get exceptionally good results from that. I am just being
cautious in the sense that I would not want to claim that we can demonstrate
that to a level of proof which would satisfy academic peer review rigour.[59]
3.77
In summary, DEEWR advised they understand the importance of the issue
and are continuing work in this area.[60]
3.78
Noting DEEWR’s response, the ANAO drew the Committee’s attention to
chapter five of the ANAO’s report, which outlines the longitudinal study being
undertaken by the NSW Department of Education and Training in partnership with
the University of Wollongong. The study is ‘looking at issues and effects from
the program in relation to pedagogy, student engagement and outcomes’.[61]
The ANAO report suggests DEEWR leverage off this work.[62]
3.79
DEEWR advised that in their six-monthly reports education authorities are
required to report on the four themes developed at the commencement of the
program: the installation of computers, leadership, teacher capability and
digital resources. Respondents are also asked to provide case studies ‘that can
be used and built on in the evaluation of good practice in the classroom’.[63]
Program evaluation
3.80
According to the ANAO’s report, the timetable for the implementation of
the DER program led to a focus on key administrative activities, leaving the
development of an evaluation framework to be considered later following
completion of more detailed program planning. At the time of the ANAO audit a
final evaluation framework had not been released. The ANAO concluded that
'earlier investment in evaluation methodologies and associated data as the
program evolved would have provided a stronger foundation for measuring the
impact of the DER program'.[64]
3.81
DEEWR informed the Committee that an evaluation strategy has been
developed in consultation with the Commonwealth, state and non‑government
authorities, and the Australian Information and Communications Technology in
Education Committee[65]. DEEWR advised it has
been agreed that the evaluation will:
- pick up mechanisms that are qualitative as well as quantitative;
- comprise a mid-program review in 2012, which will also address the audit recommended by the ANAO; and
- occur over time to look at aspects that contribute to education outcomes.[66]
3.82
DEEWR also advised the Committee that they are in the process of
identifying a service provider to undertake the mid-program review and the
audit in 2012.
3.83
Referring back to the program objective to ‘prepare students for further
education, and training, and to live and work in a digital world’, the
Committee was interested in what efforts had been made to engage the ‘digital
world’. More specifically, the Committee wanted to be sure that the skills being developed as a result of the DER meet the requirements of post-secondary education providers (universities as well as the vocational education and training sector) and potential employers.
3.84
Beyond the schools and education authorities, the Committee asked if any
consultation had been undertaken with these post-secondary stakeholders in
terms of helping to identify performance indicators that would demonstrate that there has been some development in the technology capability of students.
3.85
The Department undertook to respond to this question on notice. The
Department’s response to the Committee had not been provided at the time of the
publication of this report.
Committee comment
3.86
Overall, the Committee acknowledges that despite the early challenges,
DEEWR has managed the program effectively to meet announced implementation
timeframes. In other areas, such as KPIs and evaluation, the Committee believes
DEEWR could have done more.
3.87
In particular, the Committee does not agree with DEEWR that the ANAO’s
Recommendation No.2 regarding performance indicators has been fully
implemented. The Committee acknowledges the difficulties associated with
measuring high level qualitative achievements as a result of individual
programs, such as DER’s stated aim ‘to contribute sustainable and meaningful
change to teaching and learning in Australian schools’.
3.88
However, the Committee considers that if such high-level aims are to be stated, then it is reasonable to expect that a corresponding system of measurement be developed. If this cannot be done in full, efforts should be made to develop indicators toward the high-level outcome for the program, even if they only provide a partial gauge of the program’s contribution. Given the size of the funding allocated to the DER program, the Committee considers efforts in this regard even more important.
3.89
To assist with improved performance measurement, the Committee agrees
with the ANAO’s comments in their report that there is merit in DEEWR
leveraging off the evaluation work of state and territory education
authorities. However, the Committee feels that DEEWR should go beyond this and
also develop relationships with appropriate research bodies to study the program’s
qualitative achievements. Such bodies could include universities and other peak
representative organisations in the education sphere.
3.90
Further, the Committee concurs with the ANAO’s advice that the
evaluation mechanisms should be developed at the start of a program. While
accepting that the initial ‘100 day’ timeline placed pressure on the
Department, the Committee contends that DEEWR could have leveraged previous
program experience to produce an evaluation model earlier. The Committee notes
that DEEWR has more recently been working with stakeholders to develop an
evaluation strategy, but remains concerned that some arrangements are still
being decided so close to the deadline for the full implementation of the computer
roll-out.
3.91
The Committee was concerned with the suggestion in the ANAO’s report
that the initial payment acquittal arrangements did not adequately protect the
Australian Government’s interests. The Committee acknowledges DEEWR’s advice
that this matter has been rectified following the move to the National
Partnership Agreement on the Digital Education Revolution. The Committee trusts
that the Department has learnt from this and has processes in place that ensure
stronger future program funding agreements that include appropriate and timely
acquittal mechanisms.
3.92
In terms of DEEWR’s procurement strategy, the Committee commends the
Department’s initiative to encourage flexibility and value for money by
allowing any residual funding to be applied to complementary ICT. The Committee
would like to see this type of thinking applied to suitable similar programs
across Government.
3.93
Ten questions on notice were submitted to DEEWR. While acknowledging the
limited timeframe for responses, the Committee is nonetheless disappointed that
no responses had been received at the time of report finalisation. The
Committee had a particular interest in obtaining answers to the questions on
the broader reach of the program, for example:
- the critical area of
professional development for teachers to ensure they are able to maximise the
potential of computers and complementary ICT in classrooms;
- engagement with
post-secondary stakeholders to establish the skills expected to be required by
students upon leaving secondary school; and
- with the increase of
ICT infrastructure and complementary technology in classrooms, the possibility
of access to facilities by the community.
3.94
The Committee notes that there are a number of DEEWR initiatives underway to boost schools’ ICT infrastructure and capacity to use the technology, as well as a sizeable program administered by the Department of Broadband, Communications and the Digital Economy (DBCDE) to integrate the benefits of the National Broadband Network.[67] There are also many state and territory programs, such as the Connected Classrooms program in NSW. The Committee emphasises the importance of leveraging investments in computers or infrastructure to ensure classrooms are fully networked, and is encouraged to see
initiatives towards this end. Ensuring that classrooms are as connected as
possible is essential to maximise the educational outcomes for our children
into the future.
3.95
With this combined multi‑billion dollar investment across
government agencies and levels of government, the Parliament and the public are
entitled to be informed of the progress and outcomes in a timely manner.
Therefore, the mid-program review should be made public soon after its
completion. It is also important that there is comprehensive and transparent
reporting of the program as a whole. The Committee therefore reemphasises the
comments and recommendation made in Chapter 2 of this report - that more work
needs to be done on improved cross-agency and cross‑jurisdictional
financial reporting as part of the Commonwealth Financial Accountability Review.
Recommendation 2

The Joint Committee of Public Accounts and Audit recommends the Department of Education, Employment and Workplace Relations publicly release in full the findings from the mid‑program review scheduled for 2012 within three months of completion.