Chapter 2 Major Projects Report 2007-08
Overview
2.1
The Major Projects Report 2007-08 is the pilot of an annual reporting
program that covers the cost, schedule and capability progress achieved by nine
selected DMO projects as at 30 June 2008. The MPR includes a formal review by
the Auditor-General on the information presented in the DMO Project Data
Summary Sheets (PDSSs).
2.2
The PDSSs are prepared by the DMO and have been designed to provide a
snapshot of key performance data for each of the projects included in the MPR.
The PDSSs currently provide data covering the following areas:
• Project summary;
• Financial performance;
• Schedule progress;
• Risks, issues and linked projects;
• Lessons learned; and
• Addendum (see paragraph 2.3 below).
2.3
The Project Data Summary Sheet Guidelines (prepared by the DMO in
consultation with the ANAO) required that each PDSS in the MPR 2007-08 contain
materially accurate and complete unclassified cost, schedule and capability
performance data as at 30 June 2008 together with an addendum that describes
material events occurring between 30 June 2008 and 31 October 2008.[1]
2.4
The work conducted by the ANAO on the MPR 2007-08 was undertaken in
accordance with Section 20 (Audits etc. by arrangement) of the Auditor-General
Act 1997. More specifically, the ANAO reviewed the PDSSs in accordance
with the Standard on Assurance Engagements ASAE 3000 Assurance Engagements
Other than Audits or Reviews of Historical Financial Information.[2]
The ANAO review included:
• examining each PDSS;
• reviewing the relevant procedures used by the DMO to prepare the PDSSs;
• reviewing documents and information relevant to the PDSSs;
• interviewing those responsible for preparing the PDSSs and managing the nine projects; and
• examining the certification and management representations by the DMO Chief Executive, sign-offs by DMO managers, and management representations from the Capability Managers relating to Initial Operational Capability and Final Operational Capability.[3]
2.5
It should be noted that the nature and scope of the project issues
covered, and the evidence obtained by the ANAO, are not as extensive as in the
ANAO’s individual performance audits. The level of assurance provided,
therefore, is less than that provided by the ANAO’s performance audits.[4]
2.6
Although the review of the MPR 2007-08 provided the required level of
assurance in relation to the majority of the information contained in the
PDSSs, two areas were highlighted in the Auditor-General’s review report.[5]
2.7
First, sections in the PDSS concerning major risks and issues and
references to future events were scoped out of the review (see 2.12 for further
explanation). The ANAO’s review conclusion does not therefore cover major
risks and issues included by the DMO in Tables 1.2 (Project Context), 4.1
(Major Project Risks) and 4.2 (Major Project Issues) of the PDSS, and forecasts
of a project’s expected achievement of delivery schedule and capability that
are included in Sections 3 and 4 of each PDSS.[6]
2.8
Second, the ANAO review conclusion was qualified. The qualification
related to uncertainty around the accuracy of the information contained in
Table 2.7 of the PDSSs (prime contract expenditure at base date prices) and is
attributable to the DMO’s corporate information systems. As outlined in the
MPR, the DMO relies on a number of systems to compile information to populate
the PDSSs; however, these systems are not well integrated, particularly in
relation to older projects. DMO project personnel instead used
spreadsheet-based systems, and the accuracy of that information could not be
substantiated during the review.[7]
2.9
The MPR 2008-09 will report on the following 15 projects:[8]
• Air Warfare Destroyers – SEA 4000 Phase 3;
• Airborne Early Warning and Control Aircraft – AIR 5077 Phase 3;
• Multi-Role Helicopters – AIR 9000 Phase 2;
• Super Hornet Aircraft – AIR 5349 Phase 1;
• Amphibious Ships – JP 2048 Phase 4A/4B;
• Armed Reconnaissance Helicopters – AIR 87 Phase 2;
• Air to Air Refuelling Aircraft – AIR 5402;
• F/A-18 Hornet Upgrade – AIR 5376 Phase 2;
• C-17 Heavy Lift Aircraft – AIR 8000 Phase 3;
• FFG Frigate Upgrade – SEA 1390 Phase 2;
• F/A-18 Hornet Structural Refurbishment – AIR 5376 Phase 3.2;
• Bushmaster Vehicles – LAND 116 Phase 3;
• High Frequency Modernisation – JP 2043 Phase 3A;
• Armidale Patrol Boats – SEA 1444 Phase 1; and
• Collins Submarine Replacement Combat System – SEA 1439 Phase 4A.
2.10
On the basis of evidence provided to the Committee at the public
hearings on 19 March and 19 August 2009, as well as the submissions received,
this report contains the Committee’s comments on the MPR 2007-08. These comments
address the scope reduction and the qualification, as well as potential
improvements to the PDSSs and the MPR overall.
Scope reduction and qualification
2.11
As outlined above, the ANAO’s review highlighted two key issues: the
scope reduction related to major risks and issues; and the qualification as a
result of uncertainty around the figures reflecting prime contract expenditure.
The Committee was interested in what could be done to address these issues in
the future.
2.12
In its first submission, the ANAO provided the following explanation for
the scope reduction:
For much of 2007-08, the DMO and ANAO were piloting the
development of the project data collection and assurance review policies and
processes, and as late as October 2008 DMO was making significant changes to
the major risks and issues information presented in the PDSSs. The limited
time available before the report’s tabling in late November 2008 reduced the
ANAO’s ability to assess the method used by DMO to compile the risks and issues
presented in each PDSS. As a result, this information was scoped-out of the
Auditor-General’s review.[9]
2.13
At the hearing on 19 March 2009, the Committee sought the
Auditor-General, Mr Ian McPhee’s, comments on how future scope reductions
related to major risks and issues could be avoided:
It is often a function of resources. I guess I am very
conscious [of the fact that when] we do performance audits, being the more
in-depth audits that we do, we get to understand more about the risks. In fact,
we are doing a couple of audits at the moment and you need quite a detailed
understanding to get comfort around the risks to the delivery on projects. At
this stage, I have not been comfortable to tell the committee that these risks
are the ones that have been fully identified for all of these DMO projects
reported. That is not to say that DMO is not highlighting risks and reporting
them to the best of their ability; I am just saying that it would take a lot
more time for us to get into the project detail to be able to say to you, ‘Yes,
this is the complete list of risks.’ It is a function of resources, but a
function also of how DMO compile their risks.[10]
2.14
In responding to a query about whether the DMO has sufficient systems in
place to ensure that it could provide a complete list of risks,
Dr Gumley, CEO of the DMO, made the following acknowledgement:
We do not have systems consistently right through the DMO. It
is much more on a project-by-project basis.[11]
2.15
The Committee is keen to ensure that the scope of the review is not
reduced in future MPRs. To that end, the Committee encourages the DMO, in
collaboration with the ANAO, to develop a consistent framework for compiling a
complete list of major risks and issues across projects, thereby maximising the
ANAO’s ability to assess the information appropriately.[12]
2.16
The Committee also welcomes the undertaking given by the DMO at the
hearing on 19 August that, at the Committee’s request, additional information
will be incorporated into ‘Section 4 – Risks, Issues and Linked Projects’ from
the MPR 2009-10 onwards. This information will identify whether the risks and
issues listed in Section 4 had been anticipated or whether they had emerged
over the course of the project.[13]
2.17
With regard to the qualification, at the hearing on 19 March, DMO
representatives acknowledged that more work needed to be done:
There is an improvement for us of the quality of the
historical financial data. There are issues in the qualification in respect to
the quality of specific instances of price and exchange variations sometimes
going back into the early 90s. What we have to do is look at the actual
materiality of those issues as to whether there is value for the Commonwealth
and us in going back and retrieving historical data, with some going back
10-plus years.[14]
2.18
The Auditor-General also made the following comment:
I think the one thing that we are conscious of is that some
of the systems within DMO are probably not at the same level of maturity across
the organisation and they do have an impact on the compilation of reporting information, particularly in the area of the costing information which we highlighted in our
report. We will keep an eye on that and clearly if we think it is not up to
scratch we will continue to qualify. Equally, I am conscious that DMO has got
plans afoot to try to improve the quality of information systems and information as well.[15]
2.19
As referred to in paragraph 2.8 above, data collection of the type now required
for Table 2.7 of the PDSSs (‘Prime Acquisition Contract(s) Price and Progress
Payments’) poses challenges. This is because data originally collected
for older projects has since been transferred to another system and, in
that process, has lost some of its original detail.[16]
Additionally, as there has been no requirement for this type of information in
the past, the provision of electronic historical data is an extremely
resource-intensive exercise.[17]
2.20
In a letter to the Committee dated 29 June 2009 (Exhibit 1), the DMO proposed
an amendment to the PDSSs which would see the prime contract base date
financial data being replaced by Assets Under Construction (AUC) data. This
correspondence sets out that AUC data would provide capitalised expenses, including
expenses incurred in relation to the contract price, and all other costs
directly attributable to bringing an asset to a condition ready for use.[18]
The DMO letter dated 29 June also provides the DMO rationale for moving to AUC
data, an explanation of the data together with an example of how the data would
be presented in the MPR, and a summary table comparing base date contract
expenditure information with capitalised expenses information.
2.21
To assist the Committee in its consideration of the DMO’s request, the
ANAO also provided a list of positive and negative implications for the MPR
compilation task and subsequent assurance reviews should base date financial
data be replaced with AUC data.[19]
2.22
Having considered the advantages and disadvantages outlined by both the
DMO and the ANAO of replacing base date financial data with AUC data in Table
2.7 of the PDSSs, the Committee is satisfied that the provision of AUC data is
a suitable approach for the DMO to take.
2.23
That said, the Committee intends to monitor the effectiveness of this
approach and any implications for other reporting mechanisms such as the
provision of Earned Value Management System (EVMS) data.[20]
2.24
In addition to shifting to AUC data in Table 2.7 of the PDSSs, the
Committee welcomes the undertaking provided by Dr Gumley on 19 August
2009 that the DMO will provide an additional breakdown of the project costs in
Section 2 of the PDSSs.[21] That is, for each of the major projects reported in
the MPR, the DMO will provide cost data from project inception for the five
largest contracts within that project, as well as the costs incurred by the DMO
and Defence.
2.25
At the hearing on 19 March 2009, the Committee inquired as to whether ANAO
assurance could be provided on a project-by-project basis. In response, the
Auditor-General provided the following comment:
We could have a look. We could break it down to particular
projects if we were comfortable to say that seven of the 15 legacy projects
have issues with systems and therefore we cannot provide assurance around the
cost information in relation to those. We could, over time. I agree with what Mr Gillis is saying, we need to see how significant these matters are. We have a bit more time
in this current year. We need to see how significant they are and, if we can,
we will certainly narrow down any qualification that still exists to be more
specific, and one would expect, on the basis of advice, that it would relate to
the older projects.[22]
2.26
The Committee will follow up this matter in the future.
2.27
Overall, the Committee notes that both the DMO and ANAO are working hard
to address the review scope limitations and the qualified conclusion associated
with the MPR 2007-08. The Committee will closely monitor progress in this respect.
Improvements to PDSSs
Lessons learned
2.28
In its inquiry into progress on equipment acquisition and financial
reporting in Defence, the Committee found that procedures and processes for
documenting lessons learned on all major projects were patchy and inconsistent.[23]
In reviewing the MPR 2007-08, the Committee was therefore pleased to see that a
section covering ‘Key Lessons Learned’ has been included in Section 5 of the
PDSSs.
2.29
That said, the Committee believes that more can be done to assure the
Parliament and the public that the DMO has incorporated any lessons that have
been learned into its project management systems and policy and practice.
2.30
At the hearing on 19 March 2009, in response to questioning about this
issue, Mr Kim Gillis, General Manager Systems at the DMO, referred to the
commonality of the lessons across the projects:
In the nature of the types of projects that we are talking
about, these large complex systems integrations, which is where we are with
these major projects, they all have the same issues.[24]
2.31
The Committee also notes from evidence given by the representatives of
the DMO that, in response to lessons learned on systems integration, a systems
integration cell has been established, a program of gate reviews has been
initiated whereby projects are reviewed by two general managers, and
discussions are taking place with industry about how their internal practice of
systems integration can be improved.[25]
2.32
Given the systemic and interrelated nature of many of the lessons
learned, representatives of the DMO have suggested that, rather than addressing
the same lessons learned individually for each MPR project, they could be
addressed collectively. That is, the MPR would include a section outlining the
systemic issues and interrelated issues at the front of the PDSSs, with lessons
specific to individual projects to be included in the relevant project PDSS.
2.33
While the Committee is broadly satisfied with this approach, the section
addressing the systemic issues at the front of the PDSSs must incorporate clear
plans as to how the lessons learned have been or will be incorporated into
future policy and practice.
2.34
Additionally, the Committee requires that where systemic issues have arisen
in individual projects, a cross-reference to the combined section addressing
systemic issues should be included in Section 5 of the relevant individual
PDSS. Where lessons learned are unique to individual projects, these should
continue to be reported in Section 5 of the PDSSs.
Recommendation 1
2.35
That all Major Projects Reports from the year 2009-10
onwards contain a section that clearly outlines the lessons learned on MPR
projects which are systemic and interrelated in nature. This section must
include plans for how the lessons learned will be incorporated into future
policy and practice. This section is in addition to Section 5 of the PDSSs
(i.e., ‘Lessons Learned’), which should still contain descriptions of
lessons learned that are unique to the individual projects and how they will
be incorporated into future policy and practice across the DMO. Section 5 of
the PDSSs should also include cross-referencing to the systemic issues where
relevant to individual projects.
Project maturity scores
2.36
Assigning maturity scores to projects is a way of benchmarking. A
maturity score is a quantitative measure that reflects a project’s stage of
development compared to expected benchmarks.[26] A project maturity
score is based on an assessment of seven attributes, each rated on a scale
of one to ten. These attributes are: Schedule; Cost; Requirement;
Technical understanding; Technical difficulty; Commercial; and Operations and
support.[27]
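The scoring scheme described above can be illustrated with a short sketch. The seven attribute names and the one-to-ten scale come from the report; aggregating the ratings by simple summation is an assumption made here for illustration, not the DMO’s published method.

```python
# Illustrative sketch of a project maturity score built from the seven
# attributes listed in the MPR. The attribute names are from the report;
# summing the ratings is an assumption for illustration only.

ATTRIBUTES = [
    "Schedule", "Cost", "Requirement", "Technical understanding",
    "Technical difficulty", "Commercial", "Operations and support",
]

def maturity_score(ratings: dict) -> int:
    """Aggregate the seven attribute ratings (each on a 1-10 scale)."""
    for name in ATTRIBUTES:
        rating = ratings[name]
        if not 1 <= rating <= 10:
            raise ValueError(f"{name} rating {rating} is outside the 1-10 scale")
    return sum(ratings[name] for name in ATTRIBUTES)

# Hypothetical ratings for a mid-acquisition project.
example = {
    "Schedule": 6, "Cost": 7, "Requirement": 8,
    "Technical understanding": 6, "Technical difficulty": 5,
    "Commercial": 7, "Operations and support": 4,
}
print(maturity_score(example))  # a single aggregated score, as in the MPR 2007-08
```

A per-attribute breakdown, as sought by the Committee, would simply report the `ratings` dictionary alongside the aggregate.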
2.37
The draft template of the MPR that the Committee considered in September
2007 contained a section reporting ‘Project Maturity Scores and Benchmarks’. It
was anticipated that a score for each attribute contributing to the final
maturity score would be reflected in the MPR 2007-08 as it had been in the
draft template of the PDSSs provided to the Committee in September 2007. The
MPR 2007-08, however, contains only the aggregated maturity score.
2.38
The Committee sought clarification about this omission at the hearing on
19 March 2009. At that hearing[28] and again in its
response to questions on notice[29] the DMO agreed to
provide a breakdown of the maturity scores against the seven attributes for the
2008-09 MPR.
2.39
The Committee welcomes this development and wants to ensure that all
future MPRs will contain this information.
2.40
At the hearing on 19 March 2009, the Committee also expressed some
concern that the MPR provided no explanation of how the benchmark maturity
score, as opposed to the maximum score, is determined. The Committee believes
such an explanation would improve readability and comprehension and therefore
should be included in future MPRs.
Recommendation 2
2.41
That all Major Projects Reports from the year 2009-10 onward
provide a breakdown of maturity scores against the following seven attributes
in project data: Schedule; Cost; Requirement; Technical understanding; Technical
difficulty; Commercial; and Operations and support. Additionally, that all Major
Projects Reports from the year 2009-10 onward provide a succinct and
straightforward explanation of how the DMO determines the benchmark, as
opposed to the maximum, maturity score.
Reporting cost and schedule variance
2.42
The Earned Value Management System (EVMS), under which progress is measured
against the schedule on a monthly basis, is a key mechanism for checking
cost and schedule progress. The Committee was therefore keen that EVMS data,
where available, be included in the MPR.
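As background, the cost and schedule variances that an EVMS reports are derived from standard earned value formulas. The sketch below uses those standard formulas; the figures and function names are hypothetical and are not drawn from DMO systems.

```python
# Standard earned value management (EVM) variance calculations.
#   PV: planned value (budgeted cost of work scheduled)
#   EV: earned value (budgeted cost of work performed)
#   AC: actual cost (actual cost of work performed)
# All figures below are hypothetical, in $m.

def cost_variance(ev: float, ac: float) -> float:
    """CV = EV - AC; positive means under budget."""
    return ev - ac

def schedule_variance(ev: float, pv: float) -> float:
    """SV = EV - PV; positive means ahead of schedule."""
    return ev - pv

# Cumulative monthly figures for a hypothetical project:
# (planned value, earned value, actual cost)
months = [
    (10.0, 9.0, 9.5),
    (22.0, 20.0, 21.5),
    (35.0, 31.0, 34.0),
]
for pv, ev, ac in months:
    print(f"CV={cost_variance(ev, ac):+.1f}  SV={schedule_variance(ev, pv):+.1f}")
```

A cumulative plot of these two series is the kind of graphical representation of cost and schedule variance the Committee sought for the PDSSs.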
2.43
In its questions placed on notice, the Committee inquired about the
possibility of including this information in the PDSSs. In particular, the
Committee asked the DMO and the ANAO to indicate whether the MPR could include
a graphical representation of cumulative monthly project cost and schedule
variance so as to provide the Parliament with a clear picture of where problems
may or may not be occurring.
2.44
In response to this question, the DMO expressed some concern about creating
inconsistency across the PDSSs given that not all projects have EVM
requirements:
… only selected high value DMO contracts invoke [EVM systems]
requirements. Therefore, we are unable to provide EVM data for those projects
with contracts arrangements that do not have EVM requirements; Foreign Military
Sales (FMS) procurements also fall into this category. Noting that the
objective behind the MPR is to have a standardised set of data across all MPR
projects…presenting EVM data for selected projects would not meet this
objective.[30]
2.45
While the Committee notes the DMO’s concern, it also notes the following
evidence from the ANAO about the advantages of including information on EVMS in
the PDSSs:
The ANAO agrees that there are benefits from including the
Earned Value Management System (EVMS) data in the PDSS, in instances where that
data is available in particular projects, as EVMSs provide an indication of a
project’s cost and schedule variance and emerging trends.[31]
2.46
The Committee fully appreciates that consistency across the PDSSs is the
ideal; however, the Committee wants to ensure that consistency is not achieved
at the expense of accountability and transparency.
2.47
For that reason the Committee urges the DMO and the ANAO to discuss this
matter further with a view to developing a standardised graphical
representation of each project’s cost and schedule variance that can be included
in the PDSSs. The Committee will follow up the outcome of these discussions.
Contingency budget funds
2.48
The Committee questioned the ANAO and the DMO on the possibility of
including information, where possible, about contingency budget funds in the
PDSSs, particularly as this type of information had been included in the draft
PDSS template considered by the Committee in September 2007.
2.49
The Committee notes and appreciates from the DMO’s responses to questions
on notice[32] that while the ANAO is
provided with complete access to the contingency logs of projects, the DMO does
not declare the remaining contingency budgets of projects for security
reasons.
2.50
The Committee notes, however, that the MPR 2007-08 did contain some high
level information about contingency funds.[33] The Committee therefore
welcomes the ANAO’s offer to discuss with the DMO opportunities to provide
higher level disclosures in the MPR that will not compromise security and the
Committee will follow up the outcome of those discussions.[34]
Capability performance data
2.51
The Committee is impressed with the clear information on capability
provided in the United Kingdom National Audit Office (UK NAO) and Ministry of
Defence Major Projects Report: that is, whether the Key User Requirements of
individual projects (i.e., those considered key to the achievement of the
mission and used to measure project performance[35]) are forecast to be met,
are at risk, or will not be met.[36]
Capability measures in the Australian MPR 2007-08 are reported as measures
of effectiveness (MOEs). These measures reflect key capability performance
attributes of a project which, if not satisfied, would have a significant
effect on its eventual suitability for operational service.[37]
2.52
Individual MOEs for projects were not reported in the MPR 2007-08 for
security classification reasons. Instead, a chart reflecting aggregated
information for the nine projects under review was included in the report. This
chart presented a traffic light analysis of the consolidated MOEs. Percentage
figures were provided for the following: MOEs that were unlikely to be met
(Red light); MOEs under threat but still considered manageable (Amber light);
and MOEs in which there is a high level of confidence they will be met (Green
light).
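The consolidated chart described above amounts to a simple percentage breakdown by status, which can be sketched as follows. The red/amber/green categories are those used in the MPR; the MOE statuses in the example are invented for illustration.

```python
from collections import Counter

# Percentage breakdown of consolidated Measures of Effectiveness (MOEs)
# by traffic light status, as in the MPR's aggregated chart:
#   Red: unlikely to be met; Amber: under threat but manageable;
#   Green: high confidence of being met.

def traffic_light_percentages(statuses):
    """Return the share of MOEs in each traffic light category."""
    counts = Counter(statuses)
    total = len(statuses)
    return {light: round(100 * counts[light] / total, 1)
            for light in ("Red", "Amber", "Green")}

# Twenty hypothetical MOEs aggregated across projects.
moes = ["Green"] * 14 + ["Amber"] * 4 + ["Red"] * 2
print(traffic_light_percentages(moes))
# prints {'Red': 10.0, 'Amber': 20.0, 'Green': 70.0}
```

Reporting this breakdown per project, rather than aggregated across all nine, is the interim measure contemplated in Recommendation 3.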
2.53
The Committee notes from evidence given at the hearing on
19 March 2009 and from the submissions that there is some consensus
between the DMO and the ANAO that the quality of the DMO’s capability Key
Performance Indicators is in need of improvement.[38]
2.54
The submissions indicate that the 2007-08 MPR experienced problems
related to national security classifications[39] and there appears to be
some clarification required around the appropriate way to report capability
(i.e., in system engineering terms such as Measures of Effectiveness compared
to user-based Key User Requirements terms).[40]
2.55
As alluded to above, the Committee sees the approach of the UK NAO and
Ministry of Defence Major Projects Report in presenting information on
performance against approved Key User Requirements, and the reasons for
variations from them,[41] as the ideal model. The Committee also notes the
following statement from the ANAO:
The ANAO is keen to see the inclusion in future MPRs of
unclassified and standardised capability achievement information, in terms of
risk categories to capability achievement as presented in the annual UK
National Audit Office MPR. This information would best be based on the
capability requirements set out in the Materiel Acquisition Agreements (MAAs)
between Capability Development Group and DMO.[42]
2.56
The Committee concurs with this view. While accepting that the
inclusion of this information in the MPR may take more time, the Committee
believes it will contribute significantly to the capacity of the ANAO to
present the type of analysis the Committee requires (i.e., an analysis that
presents an ANAO summary and key findings similar in format to those contained
in the UK NAO Ministry of Defence MPR).[43]
2.57
The Committee also accepts that ideally the MPR would not contain ‘quick
fixes’.[44] However, the Committee
believes that the provision of percentage data on traffic light counts for each
project as an interim measure (as suggested by the DMO) does have some benefit.
Until such time as the MPR is able to provide unclassified and standardised
capability achievement information of the kind contained in the UK NAO Ministry
of Defence MPR, the traffic light analysis provides the reader of the MPR with
a more accurate assessment of the risks to capability for each project.
Recommendation 3
2.58
That the Defence Materiel Organisation provide a traffic
light analysis of the percentage breakdown of Capability Measures of
Effectiveness for each project. This traffic light analysis should be
included in each MPR from 2009-10 onward until such time as the DMO is able
to replace this analysis with unclassified and standardised capability
achievement information.
Improved analysis
2.59
As referred to above, the Committee is keen for the MPR to include an
analysis similar to that contained in the UK NAO Ministry of Defence MPR. The
Committee is pleased to note that ‘improved analysis regarding project
management performance across all MPR projects both in year and across years’[45]
was included as an area for improvement in future MPRs. The Committee was also
pleased that it is the intention of the ANAO to provide such an analysis in
future MPRs:
The ANAO is planning to undertake this type of analysis for
inclusion in future MPRs and is currently considering ways of analysing and
presenting project cost, schedule and capability data, with the view to provide
an ANAO Summary and Key findings in the 2008-09 MPR.[46]
2.60
However, the Committee further notes:
Progress to date has been limited given the challenges with
cost and performance trends and capability outlined above.[47]
2.61
The Committee is particularly interested in the provision of trend data
in the MPR and inquired of both the DMO and the ANAO, via questions taken on
notice, how trend data will be presented and dealt with in future reports.
2.62
Responses to the Committee’s questions indicate that work towards
developing and presenting trend data is evolving, although, as outlined
earlier, the diversity across projects clearly poses challenges, as the ANAO’s
submission below indicates:
Properly maintained Earned Value Management Systems (EVMSs)
provide accurate indications of an individual project’s cost and schedule
variance and emerging trends. However, projects using Milestone-based progress
measures without an accompanying EVMS, would experience difficulty in providing
emerging trend data with regard to a contractor’s cost performance.[48]
2.63
Moreover:
The emerging trends across multiple DMO projects would need
to be obtained from the analysis of trends in similar project groups and
comparing those trends across all groups.[49]
2.64
The Committee notes that the ANAO intends to work with the DMO to
develop suitable systems for trend data collection, analysis and presentation,
including multiple-project (program) trend information.[50]
2.65
The Committee also notes the DMO’s commitment to work cooperatively in
this regard:
I entirely support the development of trend data and its
inclusion in future reports and we will engage with the ANAO on how best to
portray this information.[51]
2.66
The Committee awaits advice on the progress of these discussions and
will follow up the outcomes in due course.
MPR schedule
Project selection
2.67
The criteria for project inclusion in the 2008-09 MPR are set out in the
2008-09 Major Project Report Guidelines. These guidelines were developed by
the DMO in consultation with the ANAO.[52] As outlined in
paragraph 2.9 above, the MPR will report on 15 projects in 2008-09, with a
further eight projects being added in 2009-10.[53]
2.68
On 13 August 2009, the Committee was provided with a list of proposed
projects for the 2009-10 MPR for its consideration (Exhibit 3). In addition to
those projects that will be repeated (see paragraph 2.9 above), the Committee
has endorsed the following projects for inclusion in the 2009-10 MPR:
• Field Vehicles and Trailers – Overlander Program – LAND 121 Phase 3;
• Next Generation Satellite Program – JP 2008 Phase 4;
• New Heavyweight Torpedo – SEA 1429 Phase 2;
• Follow-on Stand Off Weapon – AIR 5418 Phase 1;
• Collins Submarines Reliability & Sustainability – SEA 1439 Phase 3;
• Anzac Ship Anti-ship Missile Defence – SEA 1448 Phase 2A;
• Maritime Patrol and Response Aircraft System – AIR 7000 Phase 2; and
• Airborne Surveillance for Land Operations – JP 129 Phase 2.
2.69
The Committee also notes the following ‘Principles for New MPR Projects’
contained in Exhibit 3:
• Projects must have at least three years of asset delivery remaining (high cost of introducing a new project – min 3 years reporting life)
• Total approved project budget >$150m (to avoid picking up insignificant projects)
• Projects must have at least $50m or 10% of their budget remaining for the next two years (for sensible financial progress reporting)
• [Defence Capability Plan] projects only admitted one year after [Year of Decision] (min time for projects to progress acquisition)
• Maximum eight new projects in any one year (capacity constraints of DMO and ANAO)[54]
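The per-project selection principles quoted above can be expressed as a simple eligibility filter. The thresholds come from Exhibit 3; the field names and the sample project are hypothetical, and the cap of eight new projects per year is a portfolio-level constraint applied across the candidate list rather than per project.

```python
# Eligibility check against the 'Principles for New MPR Projects' quoted
# above. Thresholds are from Exhibit 3; field names and the sample
# project are hypothetical.

def eligible(project: dict) -> bool:
    return (
        project["delivery_years_remaining"] >= 3        # min 3 years reporting life
        and project["approved_budget_m"] > 150          # total approved budget > $150m
        and (project["budget_remaining_m"] >= 50        # $50m, or 10% of budget,
             or project["budget_remaining_m"]           # remaining for the next
                >= 0.10 * project["approved_budget_m"]) # two years
        and project["years_since_decision"] >= 1        # admitted one year after YOD
    )

sample = {
    "delivery_years_remaining": 4,
    "approved_budget_m": 600,
    "budget_remaining_m": 120,
    "years_since_decision": 2,
}
print(eligible(sample))  # True
```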
2.70
The Committee suggests the addition of the following final principle:
• All projects for inclusion in the MPR will be proposed by the DMO in consultation with the ANAO and provided to the JCPAA for comment.
2.71
The Committee notes from submissions that the list of projects to be
included in each MPR should be settled by the end of September so as to allow
sufficient time for preparation of the PDSS.[55] To that end, the
Committee expects to be consulted on proposed projects for inclusion in the MPR
by 31 August each year.
2.72
Similarly, the Committee notes that it will be consulted when the DMO
and the ANAO have agreed that projects have reached a state of
‘practical completion’[56] and, as such, may no
longer be appropriate for reporting in the MPR. The Committee expects that,
should a decision be made to remove a project from the MPR, the ANAO and the
DMO will provide a full rationale for its exclusion and that this rationale
will be included in the MPR.
2.73
The Committee appreciates that the point at which the MPR will reach its
maximum of thirty projects is dependent upon the level of resourcing available
in both organisations. That said, the Committee anticipates that the MPR will
contain thirty projects in the year 2010-2011.
Recommendation 4
2.74
That no later than 31 August each year, the ANAO and the DMO
will consult the Committee on the projects to be included in and, where
appropriate, excluded from, the following year’s MPR.
Timetabling
2.75
Evidence provided to the Committee reinforces the point that scheduling
for the MPR is time critical and that it will become more so as the number of
projects increases to the maximum of thirty.[57]
2.76
Indeed, there is considerable evidence to suggest that where an
efficient schedule has not been agreed by the parties, less than ideal
outcomes, such as scope reductions, are likely. The timetable for
the pilot MPR appears to have placed some strain on both organisations.
2.77
As the ANAO states:
The 2007-08 MPR demonstrated that schedule management was of
critical importance to the report’s overall quality.[58]
2.78
Similarly, the CEO of the DMO, Dr Gumley, in his Foreword to the MPR,
also refers to the importance of ensuring efficient timelines:
…the time required for the projects to prepare their project
data as at the end of the financial year, and the internal clearances required
within the DMO, was extremely compressed during this pilot year. These
timelines need to be reviewed to ensure that in the future the final MPR is a
high quality product and provides surety regarding the published information.[59]
2.79
This concern was reiterated in the report itself, in ‘Lessons Learned
from the 2007-08 Major Projects Report and Intentions for Improvement’, as
follows:
n reviewing the
schedule for the MPR – populating data in the PDSS, data assurance, ANAO
assurance, and report compilation all exceeded planned pilot program schedule.[60]
2.80
The Committee notes that the ANAO requires an efficient schedule that
distributes the work the ANAO is required to complete for the MPR (i.e.,
reviewing DMO projects and evidence supporting the data and narratives provided
by the DMO) as evenly as possible from February to September each year.[61]
The Committee will monitor this issue.
2.81
The Committee also welcomes the evidence provided to it on 19 August 2009,
that it is the intention of the DMO and the ANAO to table the MPR 2008-09 on or
before 18 November 2009.[62] This will afford the
Committee an opportunity to examine the report prior to the end of the
parliamentary sitting year.
General
2.82
As referred to above, the MPR 2007-08 outlined a number of lessons
learned from its development and intentions for improvements. One of these
lessons included ‘improvements in readability and comprehension that need to be
addressed in the PDSS’.[63]
2.83
To that end, the Committee believes the readability of the document
could be significantly improved by using a consistent order of projects across
the document. For example, in the MPR 2007-08 the order in which the projects
are presented or listed differs between page 20 (list of projects selected for
review), pages 58-81 (financial analysis of MPR projects), the graphs on pages
84 and 85, and the order in which the PDSSs are presented.
Recommendation 5
2.84
That where possible the order of presentation of the projects
will remain consistent across the Major Projects Report.
Committee comments
2.85
While recognising that improvements can be made to the MPR, the Committee
is pleased with the MPR 2007-08 and it congratulates the parties involved on achieving
that outcome.
2.86
The Committee is well aware that the MPR is not a substitute for
performance audits and it welcomes the broader perspective that the report will
be able to provide across the DMO portfolio.[64] That said, the Committee
was reassured to hear evidence on 19 March 2009 that the ANAO will not be
reducing its performance audits across the Defence portfolio.[65]
2.87
The Committee notes that the relationship between the ANAO and the DMO
continues to evolve in a positive way, with representatives from both agencies
making comments to that effect.[66]
2.88
The Committee understands that it will take some time to ‘bed down’ the
elements of the MPR and is keen to make a positive contribution to the ongoing
development of the MPR and its components. It will continue to monitor the MPR
process to ensure that where improvements can be made to that process, they
will be.
2.89
The Committee also notes that it is currently undertaking an inquiry
into the Auditor-General Act 1997. That inquiry, whilst still ongoing,
is addressing, amongst other things, whether the Act’s focus on the traditional
assurance and performance audit roles should be expanded to take explicit
account of newer functions performed by the Auditor-General, such as reviewing
the Major Projects Report.
Sharon Grierson MP
Committee Chair