Chapter 3 The Committee’s review
Introduction
3.1
The Committee received evidence on the following issues relating to the
2011–12 Major Project Report (MPR):
- Cost
- Project financial assurance statements
- Contingency funds
- Schedule
- Slippage issues
- Project maturity scores
- Governance and business processes
- Consistency of information and system rationalisation
- Accountability for projects
- Sustainment reporting
- MPR stakeholder survey
Cost
Project financial assurance statements
3.2
The financial information included in the Project Data Summary Sheets
(PDSSs) for each MPR project is now presented in an ‘out-turned’ format.
This means that the forecast price indexation for the life of each project is now
included in the data presented. For projects approved prior to 1 July 2010,
budget variations for actual and forecast indexation continue to be itemised in
PDSSs, while for projects approved after 1 July 2010 the forecast
indexation is included within the original budget approved by the Government. Out-turned
budgeting requires the Defence Materiel Organisation (DMO) to manage indexation
costs internally to the initially approved project budgets, rather than seeking
budget supplementation as prices increase.
3.3
To help allay concerns about indexation no longer being disaggregated in
project financial information, in 2012 the Committee accepted the DMO’s
proposal to provide a financial assurance statement in each PDSS with an
overall assessment of the project’s budgetary position.[1]
In its report on the 2010–11 MPR, the Committee indicated that it would closely
monitor the reliability of these project financial assurance statements over
time and revisit the issue if necessary.[2]
3.4
The financial assurance statement for each of the 2011–12 MPR’s 29 projects
indicated that there was sufficient remaining budget for the project to be
completed. However, for six projects, factors that may impact on budgets in
the future were also identified.[3] Five of these six
projects were included in an independent third party review of the procedures
and controls in place to support the financial assurance statements, which
helped enable the DMO’s Chief Finance Officer to provide overarching assurance
for the projects.[4]
3.5
Although project financial assurance statements were excluded from the
scope of its formal review conclusion, the Australian National Audit Office
(ANAO) reviewed the assurance framework. It noted that the DMO’s Chief Finance
Officer’s overarching assessment of the project financial assurance statements
was unqualified, reflecting the DMO’s confidence that the projects would be
completed within budget. However, the ANAO also noted that project-level
assurance statements have limitations as they are based on ‘current financial
contractual obligations’ and the known risks and expenditure estimates as at 30
June 2012.[5]
3.6
At its public hearing in Canberra, the Committee sought comments from
the DMO and the ANAO on the value of project financial assurance statements,
and whether their introduction had provided the level of assurance that was
anticipated.
3.7
The DMO and the ANAO both agreed that the assurance statements were a
valuable addition. The Chief Executive Officer of the DMO informed the
Committee that:
The discipline of looking ahead and assuring ourselves—or
otherwise, of course—that we can execute the project in the planned budget is
an important piece of work and an important focus.[6]
3.8
The DMO’s Chief Finance Officer told the Committee that the statements
had proved ‘very useful’ for his management of the portfolio of projects, and
that:
The level of detail that we now have visibility of is
enhanced. It allows the projects themselves to do a risk assessment, in the
financial sense, which has been very useful.[7]
3.9
The DMO also advised the Committee that the process underpinning the
development of financial assurance statements was sustainable, and backed up by
the independent assessments by Ernst & Young, which ‘confirmed the
projects’ view’.[8]
3.10
Similarly, the Auditor-General told the Committee that the development
of project financial assurance statements was ‘a powerful discipline on project
managers’ and ‘a very worthwhile addition to the report’. He pointed out that there
were six projects identified as areas of concern which would need to be
carefully managed, but noted that the DMO’s Chief Finance Officer had provided
overall assurance that there was sufficient budget within DMO to manage the
concerns.[9]
Committee comment
3.11
The Committee was pleased to hear that there was strong agreement from
both the ANAO and the DMO that project financial assurance statements had added
value to the MPR process and, in particular, had enabled potential risks
to project budgets to be identified, assessed, and included in the report.
3.12
The process underpinning the development of the statements appears to have
been robust. The gaining of independent, third party assurance for
appropriately selected projects is of particular value to ensuring the
integrity of the process, and the Committee wishes to see this practice
continue for future reports. The Committee understands that the selection of
five major projects for external assurance review was based on specific risk
factors that had been identified for those projects' budgets. The Committee
expects that the DMO, in consultation with the ANAO, will continue to ensure future
external review of projects for which potential budget risks have been
identified.
Recommendation 1

To help ensure that project financial assurance statements continue
to be robust and meaningful, the Committee recommends that, in consultation
with the Australian National Audit Office, the Defence Materiel Organisation
continue to seek independent financial assurance during the development of future
Major Project Reports for an appropriately selected sample of projects.
Contingency funds
3.13
The MPR disclosed that approximately $1.1 billion had been drawn upon
from project contingency budgets in 2011–12 to retire project risks, equivalent
to 2.3 per cent of the total approved project budget.[10]
3.14
As was noted in Chapter 2, the change in supplementation policy
associated with out-turned budgets means that price indexation is now ‘a major
risk or issue in some projects’, and one for which contingency funds may need
to be drawn upon.[11]
3.15
The ANAO noted in its overview of the 2011–12 MPR that:
… the emergence of any indexation risk has, to some extent,
changed the nature and use of the contingency budget from dealing with project
risk management to broader project management, and requires project staff to
have a greater understanding of the factors that influence indices and their
likely movement over the life of the project.[12]
3.16
In a written question to the DMO prior to the public hearing, the
Committee asked which projects included in the MPR had utilised contingency
funds in 2011–12. The Committee also asked whether there were any barriers to
documenting, in the Project Data Summary Sheets for future MPRs, which projects
have used contingency funds and the amount of funds that have been used.
3.17
The DMO response listed fifteen major projects which had used
contingency funds in 2011–12,[13] and indicated that:
Public release of details regarding project contingency
provisions could be prejudicial to taxpayers’ interests. DMO experience
indicates that knowledge of contingency provisions encourages some contractors
to find ways to gain access to the funds, which can have negative implications
for good project governance.[14]
3.18
At the public hearing, the DMO was asked to further explain the reasons
for information on the use of contingency funds not being published. The DMO’s
Chief Executive Officer confirmed that his main concern was that ‘disclosing
the amount of money in each project provides an opportunity for people to go
after that money’.[15]
3.19
The Committee asked the DMO to explore how the utilisation of contingency
budgets could be better disclosed in the MPR, including the amount spent, why
and when it was spent, and to whom the money went. In response, the DMO undertook
to report back to the Committee with a proposal on how it could disclose the
realisation of contingency into project expenditure.[16]
3.20
The Auditor-General was also asked to comment on how the use of
contingency funds could be checked and publicly disclosed in a way that does
not cause significant harm to the Commonwealth. The Auditor‑General expressed
his understanding of the risks to taxpayer interests referred to by the DMO,
but suggested that the DMO’s offer to look at the matter was ‘probably the best
sign of progress that we have seen for quite some time in this area’.[17]
Committee comment
3.21
The Committee understands and appreciates the DMO’s concern that
disclosing too much information about contingency funds could be prejudicial to
taxpayers’ interests.
3.22
However, the Committee also recognises that the nature and use of
contingency funds will be of increasing importance as the out-turned budgets of
projects are tested over time. Ensuring an adequate level of transparency around
when contingency budgets are being drawn upon is therefore a key area of
interest for the Committee.
3.23
The Committee welcomes the DMO’s willingness to develop a proposal for
how the expenditure of contingency funds could be better disclosed in the MPR,
and looks forward to receiving this proposal.
3.24
The Committee’s initial opinion is that, while the DMO has legitimate
concerns about disclosing to contractors the amount of contingency budget
available, there would be less danger in disclosing information about funds
that have already been spent.
3.25
The Committee notes that the MPR currently discloses the total
contingency allocated across the 29 projects, but does not provide
project-specific information. In response to a written question, however, the
DMO was able to provide a list of projects for which contingency funds had been
expended in 2011–12. The Committee considers that, at a minimum and in addition
to any proposal for disclosing further information about actual contingency
expenditure, this amount of information should be routinely included in future
MPRs.
Recommendation 2

The Committee recommends that, by 20 June 2013, the Defence
Materiel Organisation submit a proposal, for incorporation into the 2013–14
Major Projects Report Guidelines, on how project-level contingency fund data
could be disclosed in future Major Projects Reports without being
significantly prejudicial to taxpayers’ interests. At a minimum, projects
that have utilised contingency funds during the previous financial year or
are anticipated to use contingency funds in the forthcoming financial year,
and the amount of such funds, should be identified in the reports.
Schedule
Slippage issues
3.26
In its overview of the 2011–12 MPR, the ANAO identified a range of
pressures that can contribute to schedule slippage, including actions by
contractors; economic conditions impacting on workforce supply and demand; and
procurement decisions by other nations which may impact on downstream purchases
in terms of time to delivery.[18]
3.27
The DMO provided a summary in this year’s MPR of actions being taken to
address slippage.[19] The JCPAA recommended
inclusion of this information in last year’s review, having regarded schedule
slippage as a ‘major concern’ and DMO’s explanations as being ‘unhelpful’.[20]
3.28
In the 2011–12 MPR, the DMO claimed that ‘since 2000, the average level
of slippage has decreased from over 50% to around 30% and … DMO project
schedules are now comparable with the private sector’.[21]
3.29
At the public hearing, the DMO was asked about the range of pressures
contributing to schedule slippage, and the role that project scope and
complexity play in this.
3.30
The DMO gave the Committee an overview of a review by a private sector
consultant, Independent Project Analysis (IPA), benchmarking the DMO’s
performance in managing major projects against a sample of projects from the
IPA’s database of over 14 000 private sector capital projects across more than
200 companies, primarily in the processing and extractive industries. The IPA
report found that the DMO performed better than the private sector on budget
and capability performance.[22]
3.31
The DMO also discussed the IPA report’s findings on schedule performance.
The IPA found that DMO projects suffered from schedule slippage of around 30
per cent, which was similar to the private sector. While there had been
statistically significant improvements since the 2003 Defence Procurement
Review (the Kinnaird Review), the DMO conceded that this was an area in which
further improvement was needed.[23]
3.32
In regard to the reasons for schedule slippage, the DMO identified the
following issues and examples:[24]
- The complexity of the technology. For example, the DMO’s project to install
developmental phased array radar onto airborne early-warning aircraft
encountered a ‘large range’ of unexpected technical problems.
- The availability of platforms for modification, particularly for projects
that are upgrading existing platforms. As availability is driven by
‘operational tempos’, the deployment of assets into theatre will mean they are
not accessible and schedules may need to be adjusted. Examples given of
projects where this has occurred included the Collins replacement combat
system and upgrades to the P-3C aircraft.
- Underestimation of the maturity of a system and the amount of development
required. For example, when the decision was made to acquire MRH 90
helicopters:

… it was assumed it had been offered to Defence as a mature
capability and it was not as mature as had been assessed. So it took longer to
mature that capability than was planned in the project at the start.[25]
Committee comment
3.33
The Committee notes that schedule slippage remains the DMO’s biggest
project management challenge.[26] There are currently few
signs of immediate improvement across the range of MPR projects, with overall slippage
for MPR projects having increased slightly, in percentage terms, between
2010–11 and 2011–12.
3.34
However, it is encouraging that there appears to have been some progress
to reduce slippage over recent years, stemming from reforms associated with the
Kinnaird and Mortimer reviews and the demerger of the DMO from Defence in 2005.
The ANAO’s analysis has shown that the largest schedule delays continue to be
associated with more developmental projects that were approved prior to 2005,
and the DMO’s overall schedule performance appears to be weighed down by a
relatively small number of older projects that have experienced excessively
long schedule delays.[27] This observation is further
evidenced by the IPA’s finding that while median DMO schedule slippage
was the same or better than industry benchmarks, mean slippage was
worse.[28] The Committee hopes that
overall schedule results will start to improve over the coming years as these older
developmental projects eventually reach Final Operational Capability and exit
the MPR.
3.35
The Committee welcomes the new section in the MPR outlining what DMO is
doing to minimise schedule slippage, as was recommended by the Committee in its
review of the 2010–11 report. Given the historical correlation between acquisition
types and final schedule outcomes, the Committee is pleased to note that, in
future MPRs, the consideration of Military Off-the-Shelf (MOTS) or Commercial Off-the-Shelf
(COTS) options prior to acquisition will be reported on for all new
developmental and ‘Australianised MOTS’ (AMOTS) projects.
3.36
The Committee appreciates that the DMO’s difficulty in accessing
platforms to perform upgrades has had an impact on schedules for some projects.
However, the Committee was not convinced that these challenges are entirely unpredictable,
and considers that the potentially limited availability of operational
platforms could be better planned for during the establishment of the original
project schedules.
3.37
The ANAO’s assessment of the reasons for schedule slippage was that ‘underestimation
of both the scope and complexity of work’ was the primary factor.[29]
The DMO’s public hearing comments about the MRH90 helicopters were that
incorrect assumptions about project maturity had been made at the time of
approval, leading to lengthy schedule slippage. Given the seriousness of these
matters, the Committee has recently written to the Auditor-General to inform
him that a performance audit of Defence’s test and evaluation processes,
including those that take place prior to acquisition, is an ‘audit priority of
the parliament’.
3.38
The Committee notes that the Government has recently agreed to
recommendations by the Senate Foreign Affairs, Defence and
Trade References Committee to require greater engagement of test and
evaluation practitioners in the early stages of acquisitions.[30]
It is hoped that, over time, these changes will reduce the chance of MRH90-type
problems occurring for future procurements.
Project maturity scores
3.39
Project maturity scores are allocated by the DMO to each project to quantify
its maturity at defined milestones. The ANAO has noted that maturity scores at
Second Pass Approval for projects in the 2011–12 MPR ‘vary and are generally
inconsistent with the presentation in the DCP [Defence Capability Plan]’.[31]
3.40
The ANAO Overview section of the MPR included graphs produced by
comparing, in percentage terms, project maturity scores to the budget expended
and time elapsed for each MPR project.[32] For some projects, the
maturity scores were vastly higher than the proportion of budget expended and
time elapsed.[33]
3.41
In a written question, the Committee asked the DMO whether, where project
maturity far exceeds time elapsed and budget expended, this should be viewed as
an indication of an overly optimistic assessment of project maturity.
3.42
The DMO response suggested that the question would be more appropriately
addressed to the ANAO, who performed the analysis. The DMO indicated that it
would not analyse project maturity scores in this way, as it would imply a
direct linear relationship between the time elapsed and the maturity score:
While it is expected that during a project's life the
maturity score indicator will increase, the indicator can also deteriorate. For
example, the commercial score could decrease in the event of a major dispute or
if the contractor runs into financial difficulties.[34]
3.43
The DMO indicated that it had informed the ANAO of its concerns about
the analysis during the preparation of the MPR.
3.44
In a separate written question, the Committee asked the DMO why the
benchmark maturity scores used for projects in the MPR did not take into
account the inherent differences in project maturity between Developmental
projects and Military Off-the-Shelf (MOTS) projects. The Committee also asked
how the DMO ensured that maturity scores were applied consistently for all
projects, noting that some Developmental projects (such as the Joint Strike
Fighter) were given higher maturity scores at Second Pass Approval than some
MOTS projects (such as the Additional Chinook project).
3.45
The DMO advised that it had promulgated an agency-wide instruction on
the use and application of maturity scores in September 2010. It added that:
Maturity scores are a helpful tool, but they are ultimately
indicative and advisory. At key points in the project lifecycle they may help
with consideration of relative risk. Where scores are lower than nominal
benchmarks [they] indicate a higher relative risk exposure—but would not
necessarily lead to a decision not to proceed with a project.[35]
3.46
The response also indicated that the application of maturity scores can
‘vary from project to project depending on the assumptions made by the project
manager that underpin the assessment’. It suggested that the Joint Strike
Fighter project’s maturity score at approval reflected ‘a very optimistic
assessment of the project at the time’—but noted that this was done prior to
the DMO’s current maturity score procedures being promulgated.
3.47
While MOTS systems would ‘typically’ have higher scores against the key maturity
attributes due to a greater level of data being available from test and
evaluation, the DMO indicated that:
… until we have confirmation that Operational Test and
Evaluation (OT&E) data is available to Australia the score may be lower
than expected for some MOTS items.[36]
Committee comment
3.48
The Committee notes DMO’s comments in response to a written question
that maturity scores should not be directly compared to time elapsed or budget
expended, and understands that there is not a direct linear relationship
between each of these aspects. However, the Committee argues that there is value
in such analysis in order to help identify projects that may be maturing slower
than expected, or where overly optimistic assessments of project maturity may
have been made. The Committee supports the ANAO’s ongoing use of these graphs
in future MPRs.
3.49
The Committee stands by its decision to ask the DMO about comparative information
which was compiled and presented by the ANAO using DMO data. The question gave
the DMO the opportunity to both respond to the ANAO’s analysis and to explain
its reasons for disagreeing with the comparisons being made. The DMO’s brief
response to the Committee’s question did not provide any specific reasons for
project maturity scores being disproportionally high in some cases.
3.50
The Committee notes the DMO’s comments that maturity scores are
‘ultimately indicative and advisory’; will ‘vary from project to project
depending on the assumptions made by the project manager’;[37]
and ‘are not precise and are not intended to enable exact comparisons across
projects’.[38]
3.51
While accepting that some level of variation will be unavoidable, it is
crucial that the DMO improve its consistency in this area. The 2012 Defence
Capability Plan indicates that maturity scores are used to assist the
Government to compare the maturity of different project options ‘as a measure
of the relative confidence associated with them at the time they are being
considered’, and that projects with higher maturity scores are considered to carry
lower risk. Off-the-Shelf projects are expected to have considerably higher
maturity scores at the time of decision-making than Developmental projects.[39]
Imprecise and inconsistent maturity scores could therefore make a difference in
determining which projects receive approval—decisions which can have
significant long-term consequences for both budgets and defence capability.
3.52
The Committee welcomes the 2010 promulgation of DMO-wide instructions on
the use of maturity scores, and hopes this has gone some way to improving
consistency in the way that maturity scores are assessed. Noting the issues that
continue to be raised by the ANAO, the Committee suggests that the DMO follow
up these instructions with further action to improve consistency in the way that
maturity scores are applied, and work towards alignment with the presentation of
maturity scores as outlined in the Defence Capability Plan.
Recommendation 3

The Committee recommends that the Defence Materiel
Organisation review its current approach to assigning maturity scores to
projects to improve the consistency of their application and their
consistency with the Defence Capability Plan.
Governance and business processes
Consistency of information and system rationalisation
3.53
The Committee noted in its review of the 2010–11 MPR that it expected to
see in the next MPR ‘concrete evidence of results and progress’ having been
made to achieve consistency of information across projects.[40]
3.54
In its overview of the 2011–12 MPR, the ANAO noted that inconsistency of
information across projects continued to be an issue. Issues with business
systems were again noted by the ANAO, particularly differences between
information technology systems in terms of risk management, financial
management and document management. Furthermore, inconsistently recorded
information has created issues for project management and decision-making.[41]
3.55
The ANAO provided an update in the 2011–12 MPR on DMO’s efforts to
improve its systems, noting that ‘the DMO has advised that limited progress has
been made overall towards rationalisation, and that further ownership and
guidance would be required within the organisation in order to achieve the
desired outcome’.[42]
3.56
The Committee asked the DMO what was being done to achieve the level of
ownership and guidance required to drive further progress towards
rationalisation. The DMO informed the Committee of its recent rationalisation
of risk management systems, and provided an update on its progress in migrating
staff to a single records management system.
3.57
In relation to financial management systems, the DMO responded that:
… the financial data kept by each project is tailored to the
individual project. Differences will occur on factors such as the number of
contracts being managed, the currencies used in each of those contracts, the
labour and material indices, and the use of Foreign Military Sales versus
commercial contracts. All major projects record their project approval values,
the expenditure to date, and remaining budget (which includes planned
expenditure and remaining contingency) in Capital Equipment Program Financial
Planning System (CEPPlan). CEPPlan is planned for redevelopment to modernise it
and better link it to the Defence Budget and Output Reporting Information
System (BORIS) system. This redevelopment is scheduled for roll out in August
2013, subject to CFO Defence approval.[43]
3.58
The response further indicated that due to large variations between
projects ‘it will not be possible to standardise the information held and
management by all projects at all levels’.[44] Addressing the ANAO’s
concerns about the consistency of presentation of information, the DMO advised
that:
… information is maintained by each project in accordance
with the relevant project plans approved for each project. These plans provide
the basis for tailoring the DMO processes to match the requirements of each
project.[45]
Committee comment
3.59
The Committee was concerned by advice provided in the MPR that ‘limited
progress’ has been made in improving the consistency of information across
projects.
3.60
The DMO’s response to the Committee’s question indicated that some
rationalisation of risk management and record management systems was underway,
but that due to large variations in the nature of projects full standardisation
of information may not be possible. However, the Committee feels that the
management systems used should be able to be consistent across projects,
despite the individual nature of the projects themselves.
3.61
Recognising these complexities and acknowledging resource constraints,
the Committee suggests that the DMO, potentially in consultation with the ANAO,
develop a business systems improvement plan which prioritises projects, assigns
completion dates and allocates senior level ownership for implementation. The
plan should identify priority areas for rectification where the most
substantial improvement is needed and is achievable.
Recommendation 4

The Committee recommends that
the Defence Materiel Organisation develop a business systems improvement plan
which prioritises projects, assigns completion dates and allocates senior
level ownership for implementation. A progress update on achievements against
the plan should be included in the 2012–13 Major Projects Report.
Accountability for projects
3.62
Accountability for the delivery of major projects is an area of ongoing interest
to the Committee.
3.63
In a written question, the Committee requested from the DMO a list of
all major projects approved by the Government after 1 March 2010, with details
of the assignment of overall responsibility, authority and accountability for
realisation of the capability system to an in-service stage. In response, the DMO
provided a table of 99 projects listing the Capability Realisation Authority
for each.[46]
3.64
In another question, the Committee asked the DMO whether the ‘project
line management’ section of each PDSS in the MPR should be taken to provide a
clear answer to the question ‘Where does the buck stop?’ The DMO advised that:
For projects assigned to the DMO, accountability and
reporting flows from the Project Director or Project Manager through line
management to the Chief Executive Officer Defence Materiel Organisation where,
ultimately, ‘the buck stops’.[47]
3.65
The DMO also provided information on the ‘typical allocation of
responsibilities and accountabilities’ of line management, including the
General Manager, the Division Head, the Branch Head and the Project Director or
Manager.[48]
3.66
In a final written question on accountability assignment, the Committee
asked the DMO for an example of a project that had exited the MPR, with details
of accountability as allocated at each stage post exit.
3.67
The DMO’s response noted that only two projects had exited the MPR since
its inception in 2007: the AIR 5376 Phase 3.2 – Hornet Refurbishment and the
AIR 8000 Phase 3 – C-17 Globemaster III – Heavy Airlift. The DMO then provided
general information about the responsibilities of the Capability Manager, who
is ‘directly accountable to the Secretary of Defence and the Chief of the
Defence Force for the successful realisation of an approved new capability’.[49]
The roles and responsibilities of Capability Managers were also illustrated in
a diagram attached to the response.[50]
3.68
At the public hearing, the Committee asked the DMO to provide a simple
diagram that tracks how the responsibility for a project changes from the Chief
Executive Officer of DMO to the Capability Manager over time.
3.69
The DMO indicated that it has now become more formalised within Defence
that the Capability Manager is ‘overall responsible for the capability’, while
the DMO is responsible for ‘the materiel elements to Final Materiel Release’.[51]
A diagram was provided on notice illustrating this arrangement.[52]
Committee comment
3.70
The allocation of responsibility across the life of a project has been
an area of ongoing interest to committee members. The Committee understands
there has been significant variability in accountability across projects, and was
pleased to hear that the allocation of responsibilities between the DMO and
Capability Managers has now become more formalised inside Defence.
3.71
Diagrams provided by the DMO to explain the accountability arrangements
were not as simple as they could have been, and as a result, were not
especially helpful. However, it is clear from the evidence as a whole that the
Capability Manager is the ‘key figurehead accountable for Whole of Capability’
after the Second Pass approval of a project. The Capability Manager’s
responsibilities include coordinating the inputs to a project, including the materiel
delivery that is managed and financed by the DMO.
3.72
The Committee notes that a recent report of the Senate Foreign Affairs,
Defence and Trade References Committee made recommendations to expand the
responsibility of Capability Managers, including by attributing budgetary
control to them for procurement and sustainment rather than the DMO.[53]
While the Government’s response did not accept key aspects of these
recommendations, it did commit to taking steps to ‘ensure the primacy of the
Capability Manager’s role is maintained’.[54]
3.73
The Committee welcomes any actions that have been or will be taken in
light of this response to improve the clarity and outward transparency of the
relationship between the DMO and the Capability Managers in regard to overall
accountability for project outcomes.
Sustainment reporting
3.74
The DMO’s overview of the MPR reported that approximately 53 per cent
of the DMO’s 2011–12 budget, or around $5.4 billion, was for its sustainment
program. This comprised around 115 sustainment products.[55]
3.75
Given the importance of sustainment and the proportion of the DMO’s
budget attributed to it, the Committee was interested in learning whether there
were any reasons why an annual consolidated report could not be produced to
bring together the information on sustainment.
3.76
The DMO replied that it reports on the top 20 sustainment products
through the Defence Annual Report, Portfolio Budget Statements and Portfolio
Additional Estimates Statements. These products accounted for an average of
around 70 per cent of the DMO sustainment budget.[56]
3.77
The Committee followed up this response at the public hearing, asking
the DMO why it was not able to provide an overview of 100 per cent of the DMO’s
sustainment activities.
3.78
The DMO informed the Committee that the ‘first and foremost’ challenge
relating to reporting on sustainment activities was that performance
information on sustainment was tightly linked to sensitive information about
Defence’s operational capability commitments.[57] The DMO elaborated that:
Where we run into highly sensitive matters in the sustainment
area is that most of the performance metrics that we have for sustainment are
against the [Chief of the Defence Force’s] preparedness directive, which is
classified. If we were to do something similar in sustainment in terms of
assessing performance against measures in a public fashion, it would be
classified. We would have the same issue in sustainment metrics as we do in the
measures of effectiveness issue so the value of that would be less so in terms
of a public document. If we were to put a consolidated report together of how
we are performing against all the sustainment activities, that would give you a
public indication of preparedness, which is something we cannot do because it
is classified.[58]
3.79
Other complexities identified by the DMO relating to reporting on
sustainment included the movement of funds between one Capability Manager’s
line of activity and another’s, and the fact that sustainment activities are
ongoing pieces of work without a clear beginning and end.[59]
3.80
Despite these challenges, the DMO indicated that it would be able to
report on some data, such as the amount of money being spent on each of its
sustainment activities. The DMO undertook to look at ways to provide more
detailed information on sustainment in order to improve its transparency.[60]
3.81
On 13 May 2013, the DMO wrote to the Committee in response to this
undertaking. The response indicated that a ‘major review along the same lines
as the Major Projects Report would prove difficult’ due to the nature of
sustainment work and the classification of information. However, the DMO
committed to expand its reporting in Portfolio Budget Statements and the
Defence Annual Report to cover the top 30 sustainment products, rather than the
top 20. This would represent reporting on around 77 per cent of current spending
on sustainment.[61]
Committee comment
3.82
The Committee understands the sensitivity of some sustainment-related information,
and the inherent difficulty of reporting on this. However, given the
increasingly large amount of public money devoted to sustainment, the Committee
considers that transparency of information is needed to the greatest extent
possible without compromising national security interests.
3.83
The Committee welcomes the DMO’s commitment to expand the number of
sustainment products reported on in the Defence Annual Report and Portfolio
Budget Statements. However, the Committee believes that more needs to be done to
increase the depth, not just the breadth, of information provided in these
reports. Current reporting provides only a high-level overview of each product,
its expenditure and its performance for the relevant financial year. The
Committee would like to see a proposal from the Department of Defence, in
consultation with the ANAO and taking into account disclosure in other
jurisdictions, on how the level of detail can be enhanced to make sustainment
information more transparent to public scrutiny.
Recommendation 5
The Committee recommends that, by 20 June 2013, the Department
of Defence reports to the Committee on how it intends to achieve greater
transparency in relation to its spending on sustainment activities.
MPR stakeholder survey
3.84
In its review of the 2010–11 MPR, the Committee recommended that the DMO
include in the 2011–12 MPR a discussion on the use by, and value of, the report
for external stakeholders such as private companies or industry associations.[62]
3.85
In response to this recommendation, the DMO contracted the independent
firm Ernst & Young to conduct a survey of external MPR stakeholders. A
summary of the findings was published in the 2011–12 MPR.[63]
3.86
In response to a written question, the DMO also provided the Committee
with a detailed copy of the survey results.[64]
3.87
The survey had been sent to 226 external stakeholders, of which 86
responded. Of the respondents, 76 per cent were from industry (including those
linked to projects mentioned in the MPR) and 23 per cent were from outside the
Defence Community (such as media). Only 41 respondents completed the survey in
its entirety after indicating they were aware of the MPR, with the remainder
either not being aware of the MPR or not fully completing the survey.[65]
Ernst & Young advised the DMO that the response rate was below the level
usually received for their industry-based consultations.[66]
3.88
Amongst those who responded, the survey provided the following key
findings:[67]
- Awareness of the MPR
was high, at 66 per cent.
- Understanding of the
intent of the report was high, at 85 per cent.
- Six attributes of the
report—clarity, accuracy, accessibility, transparency, relevance and value—were
rated at between six and seven out of ten. The lowest scores (around six out of
ten) were for transparency, clarity, and accuracy.
- 34 per cent of
respondents agreed that the quality of reports had improved over time.
- There was general
agreement that the size, number of projects and focus of the MPR were
appropriate.
- The PDSSs and the DMO
sections were of the most interest, followed by the ANAO Overview.
- 73 per cent of users
agreed that the MPR was useful for understanding the DMO’s project performance,
while around 44 per cent used it as a comparison tool and to produce further reports.
3.89
Some suggestions for improvement were included in the supporting
comments of respondents to the survey. The comments varied greatly and were at
times conflicting—for example, some comments called for more detail to be
included in the report while others called for less.[68]
A small number of comments raised concerns about the accuracy of MPR information,
including financial figures.[69] There were also
suggestions that information on the source selection process for projects should
be included in the MPR.[70]
3.90
In its response to the Committee’s question, the DMO said that the
survey ‘did not highlight any clear areas for change’, and that no changes to
the MPR were intended as a result of the feedback other than any made in
consultation with the ANAO and with the approval of the Committee.[71]
Committee comment
3.91
The Committee welcomes DMO’s commissioning of an external stakeholder
survey to fulfil the Committee’s previous recommendation to include a
discussion on the use and value of the MPR.
3.92
Although the overall response rate to the survey was disappointing, the
Committee was pleased to hear that the MPR was generally valued by the majority
of those who did respond, and that the report has an appropriate focus.
3.93
It was unsurprising, given the primarily industry demographic, that
survey respondents had the most interest in the project-specific information
provided in the PDSSs and in the overall information provided by the DMO.
3.94
The DMO has indicated that the survey did not highlight any clear areas
for change in the MPR. However, the Committee suggests that the comments of
respondents and the relatively low ratings given on the transparency, clarity
and accuracy of information in the report indicate that there remains room for
improvement. The Committee will continue to explore potential enhancements to
the format and content of the MPR during its future reviews of the report and its
Guidelines, and expects both the DMO and the ANAO to actively consider and
suggest improvements wherever possible.