Recommendation 1

2.35 That all Major Projects Reports from the year 2009-10 onwards contain a section that clearly outlines the lessons learned on MPR projects which are systemic and interrelated in nature. This section must include plans for how the lessons learned will be incorporated into future policy and practice. This section is in addition to Section 5 in the PDSSs (i.e., ‘Lessons Learned’), which should still contain descriptions of lessons learned that are unique to the individual projects and how they will be incorporated into future policy and practice across the DMO. Section 5 of the PDSSs should also include cross-referencing to the systemic issues where relevant to individual projects.
2.36 Assigning maturity scores to projects is a way of benchmarking. A maturity score is a quantitative measure that reflects a project’s stage of development compared to expected benchmarks.[26] A project maturity score is based on an assessment of seven attributes that are rated on a scale between one and ten. These attributes are: Schedule; Cost; Requirement; Technical understanding; Technical difficulty; Commercial; and Operations and support.[27]
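The paragraph above does not set out how the seven attribute ratings are combined, so the following is a minimal illustrative sketch only: it assumes the aggregate maturity score is the simple sum of the seven attribute ratings (each between one and ten, giving a maximum of 70), and the figures used are hypothetical rather than drawn from any MPR project.

```python
# Illustrative sketch only. It assumes the aggregate project maturity score is
# the simple sum of the seven attribute ratings (each 1-10, maximum 70); the
# example ratings below are hypothetical and not drawn from any MPR project.

ATTRIBUTES = (
    "Schedule",
    "Cost",
    "Requirement",
    "Technical understanding",
    "Technical difficulty",
    "Commercial",
    "Operations and support",
)

def aggregate_maturity(ratings: dict) -> int:
    """Sum the seven attribute ratings into a single maturity score."""
    for name in ATTRIBUTES:
        value = ratings.get(name)
        if value is None:
            raise ValueError(f"Missing rating for attribute: {name}")
        if not 1 <= value <= 10:
            raise ValueError(f"{name} rating {value} is outside the 1-10 scale")
    return sum(ratings[name] for name in ATTRIBUTES)

example_ratings = {name: 7 for name in ATTRIBUTES}
print(aggregate_maturity(example_ratings))  # 49 out of a possible 70
```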
2.37 The draft template of the MPR that the Committee considered in September 2007 contained a section reporting ‘Project Maturity Scores and Benchmarks’. It was anticipated that a score for each attribute contributing to the final maturity score would be reflected in the MPR 2007-08 as it had been in the draft template of the PDSSs provided to the Committee in September 2007. The MPR 2007-08, however, contains only the aggregated maturity score.
2.38 The Committee sought clarification about this omission at the hearing on 19 March 2009. At that hearing[28] and again in its response to questions on notice[29] the DMO agreed to provide a breakdown of the maturity scores against the seven attributes for the 2008-09 MPR.
2.39 The Committee welcomes this development and wants to ensure that all future MPRs will contain this information.
2.40 At the hearing on 19 March 2009, the Committee also expressed some concern that the MPR provided no explanation of how the benchmark maturity score, as opposed to the maximum score, is determined. The Committee believes such an explanation would improve readability and comprehension and therefore should be included in future MPRs.
Recommendation 2

2.41 That all Major Projects Reports from the year 2009-10 onward provide a breakdown of maturity scores against the following seven attributes in project data: Schedule; Cost; Requirement; Technical understanding; Technical difficulty; Commercial; and Operations and support. Additionally, that all Major Projects Reports from the year 2009-10 onward provide a succinct and straightforward explanation of how the DMO determines the benchmark, as opposed to the maximum, maturity score.
2.42 The Earned Value Management System (EVMS), under which progress is measured against the schedule terms on a monthly basis, is a key mechanism for checking cost and schedule progress. The Committee was therefore keen that EVMS data, where available, be included in the MPR.
2.43 In its questions placed on notice, the Committee inquired about the possibility of including this information in the PDSSs. In particular, the Committee asked the DMO and the ANAO to indicate whether the MPR could include a graphical representation of cumulative monthly project cost and schedule variance so as to provide the Parliament with a clear picture of where problems may or may not be occurring.
2.44 In response to this question, the DMO expressed some concern about creating inconsistency across the PDSSs given that not all projects have EVM requirements:
… only selected high value DMO contracts invoke [EVM systems] requirements. Therefore, we are unable to provide EVM data for those projects with contracts arrangements that do not have EVM requirements; Foreign Military Sales (FMS) procurements also fall into this category. Noting that the objective behind the MPR is to have a standardised set of data across all MPR projects…presenting EVM data for selected projects would not meet this objective.[30]
2.45 While the Committee notes the DMO’s concern, it also notes the following evidence from the ANAO about the advantages of including information on EVMS in the PDSSs:
The ANAO agrees that there are benefits from including the Earned Value Management System (EVMS) data in the PDSS, in instances where that data is available in particular projects, as EVMSs provide an indication of a project’s cost and schedule variance and emerging trends.[31]
2.46 The Committee fully appreciates that consistency across the PDSSs is the ideal; however, the Committee wants to ensure that consistency is not achieved at the expense of accountability and transparency.
2.47 For that reason the Committee urges the DMO and the ANAO to discuss this matter further with a view to developing a standardised graphical representation of each project’s cost and schedule variance that can be included in the PDSSs. The Committee will follow up the outcome of these discussions.
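By way of illustration, a standardised variance series of the kind discussed above could be derived from monthly Earned Value Management data along the following lines. This is a sketch only: it uses the standard EVM definitions (cost variance is earned value less actual cost; schedule variance is earned value less planned value), and the data structure and figures are hypothetical rather than drawn from any DMO system.

```python
# Sketch of a standardised cumulative cost/schedule variance series from
# monthly EVM data. Standard EVM definitions are assumed: cost variance =
# earned value - actual cost; schedule variance = earned value - planned value.
# The structure and sample figures are hypothetical, not a DMO data format.

from dataclasses import dataclass

@dataclass
class MonthlyEVM:
    month: str
    planned_value: float  # budgeted cost of work scheduled, cumulative to date ($m)
    earned_value: float   # budgeted cost of work performed, cumulative to date ($m)
    actual_cost: float    # actual cost of work performed, cumulative to date ($m)

def variance_series(records):
    """Return (month, cumulative cost variance, cumulative schedule variance) tuples."""
    return [
        (r.month, r.earned_value - r.actual_cost, r.earned_value - r.planned_value)
        for r in records
    ]

# Hypothetical project drifting behind schedule and over cost.
data = [
    MonthlyEVM("Jul", planned_value=10.0, earned_value=9.5, actual_cost=10.2),
    MonthlyEVM("Aug", planned_value=22.0, earned_value=20.0, actual_cost=21.5),
    MonthlyEVM("Sep", planned_value=35.0, earned_value=31.0, actual_cost=33.8),
]
for month, cost_var, sched_var in variance_series(data):
    print(f"{month}: cost variance {cost_var:+.1f}m, schedule variance {sched_var:+.1f}m")
```

A series of this kind is what would underpin the graphical representation of cumulative monthly cost and schedule variance referred to in paragraph 2.43.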
2.48 The Committee questioned the ANAO and the DMO on the possibility of including information, where possible, about contingency budget funds in the PDSSs, particularly as this type of information had been included in the draft PDSS template considered by the Committee in September 2007.
2.49 The Committee notes and appreciates from the DMO’s responses to questions on notice[32] that while the ANAO is provided with complete access to the contingency logs of projects, the DMO does not declare the remaining contingency budgets of projects for security reasons.
2.50 The Committee notes, however, that the MPR 2007-08 did contain some high level information about contingency funds.[33] The Committee therefore welcomes the ANAO’s offer to discuss with the DMO opportunities to provide higher level disclosures in the MPR that will not compromise security and the Committee will follow up the outcome of those discussions.[34]
2.51 The Committee is impressed with the clear information the United Kingdom National Audit Office (UK NAO) and Ministry of Defence Major Projects Report provides on capability: that is, whether Key User Requirements (i.e., those that are considered key to the achievement of the mission and are used to measure project performance[35]) are forecast to be met, are at risk, or will not be met in individual projects.[36] Capability measures in the Australian MPR 2007-08 are reported as measures of effectiveness (MOE). These measures reflect key capability performance attributes of a project which, if not satisfied, would have a significant effect on its eventual suitability for operational service.[37]
2.52 Individual MOEs for projects were not reported in the MPR 2007-08 for security classification reasons. Instead, a chart reflecting aggregated information for the nine projects under review was included in the report. This chart presented a traffic light analysis of the consolidated MOEs. Percentage figures were provided for the following: MOEs that were unlikely to be met (Red light); MOEs under threat but still considered manageable (Amber light); and MOEs in which there is a high level of confidence they will be met (Green light).
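As a sketch of how such an aggregated presentation can be produced without disclosing individual measures, the example below classifies each MOE as red, amber or green and reports only the percentage breakdown; the classifications shown are hypothetical and are not taken from the MPR.

```python
# Sketch of the aggregated traffic-light presentation described above: each
# Measure of Effectiveness is classified red, amber or green, and only the
# percentage breakdown is reported. The sample classifications are hypothetical.

from collections import Counter

COLOURS = ("red", "amber", "green")

def traffic_light_percentages(moe_ratings):
    """Return the percentage of MOEs falling in each traffic-light category."""
    if not moe_ratings:
        return {colour: 0.0 for colour in COLOURS}
    counts = Counter(moe_ratings)
    total = len(moe_ratings)
    return {colour: round(100 * counts[colour] / total, 1) for colour in COLOURS}

# Hypothetical consolidated MOEs across the reviewed projects.
ratings = ["green"] * 14 + ["amber"] * 4 + ["red"] * 2
print(traffic_light_percentages(ratings))
# {'red': 10.0, 'amber': 20.0, 'green': 70.0}
```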
2.53 The Committee notes from evidence given at the hearing on 19 March 2009 and from the submissions that there is some consensus between the DMO and the ANAO that the quality of the DMO’s capability Key Performance Indicators is in need of improvement.[38]
2.54 The submissions indicate that the 2007-08 MPR experienced problems related to national security classifications[39] and there appears to be some clarification required around the appropriate way to report capability (i.e., in system engineering terms such as Measures of Effectiveness compared to user-based Key User Requirements terms).[40]
2.55 As alluded to above, the Committee sees the work of the UK NAO and Ministry of Defence Major Projects Report in relation to presenting information on performance against approved Key User Requirements, and reasons for variations against approved Key User Requirements[41] as the ideal model. The Committee also notes the following statement from the ANAO:
The ANAO is keen to see the inclusion in future MPRs of unclassified and standardised capability achievement information, in terms of risk categories to capability achievement as presented in the annual UK National Audit Office MPR. This information would best be based on the capability requirements set out in the Materiel Acquisition Agreements (MAAs) between Capability Development Group and DMO.[42]
2.56 The Committee concurs with this view. While accepting that the inclusion of this information in the MPR may take more time, the Committee believes that information will contribute significantly to the capacity of the ANAO to present the type of analysis the Committee requires (i.e., an analysis that presents an ANAO summary and key findings similar in format to that contained in the UK NAO Ministry of Defence MPR).[43]
2.57 The Committee also accepts that ideally the MPR would not contain ‘quick fixes’.[44] However, the Committee believes that the provision of percentage data on traffic light counts for each project as an interim measure (as suggested by the DMO) does have some benefit. Until such time as the MPR is able to provide unclassified and standardised capability achievement information of the kind contained in the UK NAO Ministry of Defence MPR, the traffic light analysis provides the reader of the MPR with a more accurate assessment of the risks to capability for each project.
Recommendation 3

2.58 That the Defence Materiel Organisation provide a traffic light analysis of the percentage breakdown of Capability Measures of Effectiveness for each project. This traffic light analysis should be included in each MPR from 2009-10 onward until such time as the DMO is able to replace this analysis with unclassified and standardised capability achievement information.
2.59 As referred to above, the Committee is keen for the MPR to include an analysis similar to that contained in the UK NAO Ministry of Defence MPR. The Committee is pleased to note that ‘improved analysis regarding project management performance across all MPR projects both in year and across years’[45] was included as an area for improvement in future MPRs. The Committee was also pleased that it is the intention of the ANAO to provide such an analysis in future MPRs:
The ANAO is planning to undertake this type of analysis for inclusion in future MPRs and is currently considering ways of analysing and presenting project cost, schedule and capability data, with the view to provide an ANAO Summary and Key findings in the 2008-09 MPR.[46]
2.60 However, the Committee further notes:
Progress to date has been limited given the challenges with cost and performance trends and capability outlined above.[47]
2.61 The Committee is particularly interested in the provision of trend data in the MPR and inquired of both the DMO and the ANAO, via questions taken on notice, how trend data will be presented and dealt with in future reports.
2.62 Responses to the Committee’s questions indicate that work towards developing and presenting trend data is evolving although, as outlined earlier, it seems clear that the diversity across projects poses challenges as is evident from the ANAO’s submission below:
Properly maintained Earned Value Management Systems (EVMSs) provide accurate indications of an individual project’s cost and schedule variance and emerging trends. However, projects using Milestone-based progress measures without an accompanying EVMS, would experience difficulty in providing emerging trend data with regard to a contractor’s cost performance.[48]
2.63 Moreover:
The emerging trends across multiple DMO projects would need to be obtained from the analysis of trends in similar project groups and comparing those trends across all groups.[49]
2.64 The Committee notes that the ANAO intends to work with the DMO to develop suitable systems for trend data collection, analysis and presentation, including multiple-project (program) trend information.[50]
2.65 The Committee also notes the DMO’s commitment to work cooperatively in this regard:
I entirely support the development of trend data and its inclusion in future reports and we will engage with the ANAO on how best to portray this information.[51]
2.66 The Committee awaits advice on the progress of these discussions and will follow up the outcomes of those discussions in due course.
2.67 The criteria for project inclusion in the 2008-09 MPR are set out in the 2008-09 Major Project Report Guidelines. These guidelines were developed by the DMO in consultation with the ANAO.[52] As outlined in paragraph 2.9 above, the MPR will report on 15 projects in 2008-09, with a further eight projects being added in 2009-10.[53]
2.68 On 13 August 2009, the Committee was provided with a list of proposed projects for the 2009-10 MPR for its consideration (Exhibit 3). In addition to those projects that will be repeated (see paragraph 2.9 above) the Committee has endorsed the following projects for inclusion in the 2009-10 MPR:
• Field Vehicles and Trailers – Overlander Program – LAND 121 Phase 3;
• Next Generation Satellite Program – JP 2008 Phase 4;
• New Heavyweight Torpedo – SEA 1429 Phase 2;
• Follow-on Stand Off Weapon – AIR 5418 Phase 1;
• Collins Submarines Reliability & Sustainability – SEA 1439 Phase 3;
• Anzac Ship Anti-ship Missile Defence – SEA 1448 Phase 2A;
• Maritime Patrol and Response Aircraft System – AIR 7000 Phase 2; and
• Airborne Surveillance for Land Operations – JP 129 Phase 2.
2.69 The Committee also notes the following ‘Principles for New MPR Projects’ contained in Exhibit 3:
• Projects must have at least three years of asset delivery remaining (high cost of introducing a new project – min 3 years reporting life)
• Total approved project budget >$150m (to avoid picking up insignificant projects)
• Projects must have at least $50m or 10% of their budget remaining for the next two years (for sensible financial progress reporting)
• [Defence Capability Plan] projects only admitted one year after [Year of Decision] (min time for projects to progress acquisition)
• Maximum eight new projects in any one year (capacity constraints of DMO and ANAO)[54]
2.70 The Committee suggests the addition of the following final principle:
• All projects for inclusion in the MPR will be proposed by the DMO in consultation with the ANAO and provided to the JCPAA for comment.
2.71 The Committee notes from submissions that the list of projects to be included in each MPR should be settled by the end of September so as to allow sufficient time for preparation of the PDSS.[55] To that end, the Committee expects to be consulted on proposed projects for inclusion in the MPR by 31 August each year.
2.72 Similarly, the Committee notes that it will be consulted when the DMO and the ANAO have reached agreement on projects that have reached a state of ‘practical completion’[56] and as such may no longer be appropriate for reporting in the MPR. The Committee expects that, should a decision be made to remove a project from the MPR, the ANAO and the DMO will provide a full rationale for its exclusion and that this rationale will be included in the MPR.
2.73 The Committee appreciates that the point at which the MPR will reach its maximum of thirty projects is dependent upon the level of resourcing available in both organisations. That said, the Committee anticipates that the MPR will contain thirty projects in 2010-11.
Recommendation 4

2.74 That no later than 31 August each year, the ANAO and the DMO will consult the Committee on the projects to be included in and, where appropriate, excluded from, the following year’s MPR.
2.75 Evidence provided to the Committee reinforces the point that scheduling for the MPR is time critical and that it will become more so as the number of projects increases to the maximum of thirty.[57]
2.76 Indeed, there appears to be a good deal of evidence to suggest that where an efficient schedule has not been agreed by the parties, less than ideal outcomes such as scope reductions are likely. The timetable for the pilot MPR appears to have put something of a strain on both organisations.
2.77 As the ANAO states:
The 2007-08 MPR demonstrated that schedule management was of critical importance to the report’s overall quality.[58]
2.78 Similarly, the CEO of the DMO, Dr Gumley, in his Foreword to the MPR, also refers to the importance of ensuring efficient timelines:
…the time required for the projects to prepare their project data as at the end of the financial year, and the internal clearances required within the DMO, was extremely compressed during this pilot year. These timelines need to be reviewed to ensure that in the future the final MPR is a high quality product and provides surety regarding the published information.[59]
2.79 This concern was reiterated in the report itself, in ‘Lessons Learned from the 2007-08 Major Projects Report and Intentions for Improvement’, as follows:
• reviewing the schedule for the MPR – populating data in the PDSS, data assurance, ANAO assurance, and report compilation all exceeded planned pilot program schedule.[60]
2.80 The Committee notes that the ANAO requires an efficient schedule that distributes the work the ANAO is required to complete for the MPR (i.e., reviewing DMO projects and evidence supporting the data and narratives provided by the DMO) as evenly as possible from February to September each year.[61] The Committee will monitor this issue.
2.81 The Committee also welcomes the evidence provided to it on 19 August 2009, that it is the intention of the DMO and the ANAO to table the MPR 2008-09 on or before 18 November 2009.[62] This will afford the Committee an opportunity to examine the report prior to the end of the parliamentary sitting year.
2.82 As referred to above, the MPR 2007-08 outlined a number of lessons learned from its development and intentions for improvements. One of these lessons included ‘improvements in readability and comprehension that need to be addressed in the PDSS’.[63]
2.83 To that end, the Committee believes the readability of the document could be significantly improved by using a consistent order of projects throughout. For example, in the MPR 2007-08 the order in which the projects are presented differs between page 20 (the list of projects selected for review), pages 58-81 (the financial analysis of MPR projects), the graphs on pages 84 and 85, and the sequence of the PDSSs themselves.
Recommendation 5

2.84 That where possible the order of presentation of the projects will remain consistent across the Major Projects Report.
2.85 While recognising that improvements can be made to the MPR, the Committee is pleased with the MPR 2007-08 and it congratulates the parties involved on achieving that outcome.
2.86 The Committee is well aware that the MPR is not a substitute for performance audits and it welcomes the broader perspective that the report will be able to provide across the DMO portfolio.[64] That said, the Committee was reassured to hear evidence on 19 March 2009 that the ANAO will not be reducing its performance audits across the Defence portfolio.[65]
2.87 The Committee notes that the relationship between the ANAO and the DMO continues to evolve in a positive way, with representatives from both agencies making comments to that effect.[66]
2.88 The Committee understands that it will take some time to ‘bed down’ the elements of the MPR and is keen to make a positive contribution to the ongoing development of the MPR and its components. It will continue to monitor the MPR process to ensure that where improvements can be made to that process, they will be.
2.89 The Committee also notes that it is currently undertaking an inquiry into the Auditor-General Act 1997. Whilst still ongoing, that inquiry is addressing, amongst other things, whether the Act’s focus on the traditional assurance and performance audit roles should be expanded to take explicit account of newer functions performed by the Auditor-General such as reviewing the Major Projects Report.
Sharon Grierson MP
Committee Chair