Award of Funding Under the Regional Jobs and Investment Packages
Entities audited: Department of Infrastructure, Transport, Cities and Regional Development
Department of Industry, Innovation and Science
Introduction
4.1
The Regional Jobs and Investment Packages (RJIP) is a competitive grants program that was established to drive economic growth and create jobs in 10 regions across five states:
Queensland: Bowen Basin, Tropical North Queensland, Wide Bay Burnett;
New South Wales: North Coast, South Coast;
South Australia: Upper Spencer Gulf;
Victoria: Geelong, Goulburn Valley, Latrobe Valley; and
Tasmania: Regional Tasmania.
4.2
The Department of Infrastructure, Transport, Cities and Regional Development (Infrastructure) is responsible for the administration of the program. Infrastructure engaged the Business Grants Hub within the Department of Industry, Innovation and Science (Industry) to administer key aspects of the program including:
Assisting with program design and guidelines;
Developing governance arrangements and implementation plans;
Designing risk assessment, reporting and evaluation frameworks;
Operationalising the assessment framework for the program; and
Executing and monitoring grant agreements.
4.3
Program guidelines covering the ten regions were released in March 2017. As a non-corporate Commonwealth entity, Infrastructure is obligated to conduct grant processes in accordance with the Commonwealth Grants Rules and Guidelines (CGRGs).
4.4
The program guidelines stipulated that a Local Planning Committee (LPC) for each region would be appointed by the Minister for Regional Development (the Minister). Each LPC was to develop a Local Investment Plan (LIP) and submit it to the Minister. Applications for funding would open in each region after the relevant LIP had been published. Applicants were advised that they needed to demonstrate that their project aligned with the industry growth sectors, new market opportunities and future workforce needs outlined in the relevant region’s LIP.
4.5
The program guidelines also stated that grant funding would only contribute up to 50 per cent of eligible project costs. Applicants were required to demonstrate their commitment to the project by providing evidence of their ability to co-fund the project.
4.6
Grant funding was available under one of three streams. Notably, each stream had slightly different merit and eligibility criteria.
4.7
The program guidelines informed applicants that Infrastructure would provide advice to a Ministerial Panel on eligible applications along with recommendations on which projects to fund. Program guidelines stated that the Ministerial Panel would decide which applications to fund, taking into account departmental recommendations and the availability of grant funding as well as ‘other factors’ including:
the balance of local infrastructure, business innovation and skills and training projects in the region;
other projects or planned projects in the region, and the extent to which the proposed project may duplicate those projects or complement them and the services that they offer; and
the level of funding allocated to an applicant in previous programs.
4.8
In total 702 grant applications were received with 233 grants approved by the Minister for funding.
4.9
In May 2018, Infrastructure engaged a consultant to assess the extent to which the RJIP was conducted in accordance with the CGRGs. The assurance review concluded that Infrastructure needed to improve how assessors tested the veracity of claims made in applications, undertake quality assurance activities in relation to assessments completed, and strengthen procedures relating to the identification and management of conflict of interest.
Overview of performance audit findings
4.10
The audit concluded that the advice provided by Infrastructure to the Minister was largely appropriate; however, assessment processes were not to the standard required by the grants administration framework. The audit found that both eligibility and merit criteria were inconsistently assessed against the program guidelines. Additionally, requests for co-funding exemptions were not appropriately considered, and conflicts of interest were not managed consistently or to an appropriate standard.
4.11
The audit concluded that the briefing approach taken by Infrastructure promoted accountability; however, improvements could be made in relation to reporting decisions of the Ministerial Panel where non-recommended projects were approved.
4.12
The audit made three recommendations that focused on improving assessment processes and increasing transparency in relation to Ministerial Panel decision-making. The Department agreed with all three recommendations and has implemented learnings from an assurance review commissioned by the Department in May 2018.
4.13
The Department also indicated that key lessons from the contracted assurance review had been implemented in other funding programs they administer, including the Building Better Regions Fund (BBRF) and the Regional Growth Fund (RGF).
4.14
In responding to all three of the ANAO’s recommendations, the Department acknowledged that they had already ‘implemented clearer and more robust procedures for assessing and briefing on co-funding exception request and applied this to regional program funding rounds’, as well as ‘implemented a quality assurance check on a sample of assessments during the process to ensure the assessments are complying with agreed procedures’.
Conduct of the audit
4.15
The ANAO stated that the audit was undertaken as grants processes are a matter of parliamentary interest. The RJIP grant was specifically examined as it was included in the Joint Committee of Public Accounts and Audit’s list of audit priorities for 2018-19.
4.16
The objective of the audit was to assess whether the award of funding under the RJIP was informed by appropriate departmental advice and whether the process complied with the grants administration framework.
4.17
To form a conclusion against the objective, the ANAO examined:
Whether applications were soundly assessed in accordance with the program guidelines; and
Whether funding decisions were supported by clear advice and consistent with requirements.
4.18
The audit scope included consideration of all applications received and assessed, and funding decisions taken, for each of the 10 regions. This included: the development of program guidelines and related material; the assessment of applications against the eligibility and merit criteria; advice provided to decision-makers; and the decision-making process of the Ministerial Panel.
4.19
The audit was conducted in accordance with ANAO auditing standards at a cost of $362 000 and utilised three ANAO staff.
Application assessment process
4.20
The audit identified concerns in relation to internal processes used to contract, train and guide assessors; the quality of assessments undertaken; and processes for quality assurance of initial assessments. The ANAO concluded that ‘assessment processes were not to the standard required by the grants administration framework’.
4.21
Industry determined that it needed to engage external contractors to complete the assessments within the program guidelines’ published timeframes. The audit states that ‘Industry did not employ competition in the selection process for engaging [a] firm to undertake RJIP assessment work or benchmark the rates being paid for assessment work’. Instead, Industry utilised an existing contract for contact centre services. Industry received legal advice approving this approach, and advised that the contractor was obliged to ensure staff undertaking assessments had relevant qualifications or experience.
4.22
The cost of contracted services was expected to be $3.15 million. As of 28 February 2020, Industry had paid the provider $1.8 million reflecting the number of applications received.
4.23
In respect of process, the audit concluded that the program guidelines were adequate to ensure a process that complied with the CGRGs. Program guidelines outlined that an assessor would undertake and record the initial assessment, which would then be reviewed by an assessment team leader.
4.24
The audit determined that Industry’s records of training were poor. This made it difficult to identify if assessors had attended or completed the training provided prior to commencing assessment activity. For example, one person who performed a ‘quality specialist’ role had been invited to attend a briefing in early August 2017 on the quality specialist role but there was no evidence of the person attending. The audit identified that this assessor had the most recorded scoring errors of any assessor.
4.25
Industry responded by acknowledging that there was no formal process in place to obtain feedback from the training participants following the completion of the initial training and at the completion of the assessments. Industry advised that participant development was ongoing and delivered in a continuous learning atmosphere throughout the assessment process. The Committee heard that training materials and reporting associated with training have been reviewed and matured since the audit was conducted.
4.26
Recommendation 5 of this report addresses the importance of ensuring all relevant entities involved in grants administration receive and complete sufficient training, with documented processes to ensure the ongoing quality assurance of assessments.
Consistency with guidelines
Eligibility assessment
4.27
The audit found that although program guidelines included clear information regarding minimum requirements for eligibility and merit criteria, ‘eligibility requirements were not applied in full, and there are indications of shortcomings in the assessment of the merit criterion most directly related to program outcomes’.
4.28
The program guidelines included eligibility criteria that applicants needed to demonstrate in order to be considered for the grants. Applicants had to be located within an eligible region, or benefit an eligible region; be a certain type of entity (requirements varied for each stream of funding); undertake a certain type of project (which was dependent upon the stream of funding sought); and demonstrate the ability to have another entity or the applicant co-fund the grant activity.
4.29
The program guidelines were clear that Infrastructure ‘cannot consider your application if you do not satisfy all eligibility criteria’. The program guidelines stated that the eligibility criteria could not be waived ‘under any circumstances’ and that only eligible applications would proceed to merit assessment, with the merit criteria scored relative to the project size, complexity and grant amount requested.
4.30
The audit found that while the electronic eligibility assessment form was well designed, with tick boxes and space for assessors to record the reasons for their assessment, the form design did not prevent process errors. The audit found that 77 forms proceeded to merit assessment despite a required field having no response recorded or, alternatively, both the yes and no boxes being ticked instead of the required single answer. These errors remained after review by an assessment team leader, with incorrectly completed forms proceeding to merit assessment.
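The process errors described above (a required field left blank, or both boxes ticked) are the kind of defects a mechanical check would catch before a form proceeded to merit assessment. A minimal sketch, assuming a simple dictionary representation of the form; the field names and data are invented for illustration, not drawn from the audit:

```python
# Hypothetical validation pass over an eligibility assessment form,
# represented as a mapping from required question to the set of boxes
# ticked. A form is flagged if a required field has no response, or if
# both "yes" and "no" are ticked. Field names are illustrative only.

REQUIRED_FIELDS = ["located_in_eligible_region", "eligible_entity_type",
                   "eligible_activity", "co_funding_evidence"]

def form_errors(responses):
    """Return a list of (field, problem) pairs for an eligibility form."""
    errors = []
    for field in REQUIRED_FIELDS:
        ticked = responses.get(field, set())
        if not ticked:
            errors.append((field, "no response recorded"))
        elif ticked == {"yes", "no"}:
            errors.append((field, "both yes and no ticked"))
    return errors

form = {"located_in_eligible_region": {"yes"},
        "eligible_entity_type": {"yes", "no"},  # both boxes ticked
        "eligible_activity": {"yes"}}           # co_funding_evidence missing
print(form_errors(form))  # flags the double-tick and the missing field
```

A check of this kind, run before team-leader review, would have surfaced the 77 forms that proceeded to merit assessment with incomplete or contradictory responses.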
4.31
In total 634 applications were assessed against eligibility requirements, with 12 applications assessed as ineligible. Five organisations were ineligible due to their status as a Registered Training Organisation (RTO) or registration as a further or higher education body. Four were ineligible because they were individuals and sole traders; two because their activities were ineligible; one because the applicant was not a trading entity; and one because it was a tertiary education facility.
4.32
All ineligible applicants were given the opportunity to submit additional documentation to prove that the initial assessment was incorrect. Two applicants took this opportunity. The ANAO stated that neither of these applications was successful in demonstrating eligibility following the provision of additional information.
4.33
Ineligible applications were not ranked by Infrastructure, although they were included in the funding brief prepared for the Ministerial Panel. The reason for the application’s ineligibility was also provided to the Ministerial Panel.
4.34
The audit found that one ineligible application was merit assessed and received funding despite the applicant being an RTO. In June 2019, Infrastructure advised the ANAO that the application:
…was merit assessed because the Minister’s Office showed a strong preference to fund this project, as it deemed the RTO element of the proponent’s business to be incidental to the project seeking funding under RJIP. The Department requested the assessment to better understand the project in order to appropriately advise the Minister.
4.35
The audit found that no other RTOs were examined to determine if their training status was also incidental to the primary activity of the business.
4.36
To demonstrate the ability of the applicant to co-fund the grant activity, the eligibility criteria required proof of Chief Executive Officer or board support for the grant application and a declaration from an accountant that the entity could meet its co-funding obligations. The audit stated that Industry ‘relaxed’ eligibility requirements for 300 applications in respect of ‘evidence of support from the board or equivalent, Accountant Declaration, verification of funding contributions from other contributors or project plans’. Industry advised the ANAO that this occurred as ‘if the guidelines were applied strictly a large number of applications would have been assessed as ineligible’.
4.37
The audit found that while 127 applications did not provide a signed letter of support from the board or equivalent, only 34 applications were marked as having failed to meet the eligibility criteria; and of the 193 applications that did not fully comply with the requirement for an Accountant Declaration only 16 were marked as failing to fulfil the requirements. In respect of both criteria the audit highlighted that there ‘was no threshold established to determine whether a discrepancy was significant or not’. Follow-up was inconsistent with some assessors contacting applicants to seek correct documentation and others making comments on the assessment.
4.38
The audit included an example of an application that failed to demonstrate its ability to meet the co-funding requirement of the project. The application proceeded to the merit assessment stage and was awarded a grant. The applicant later declined the grant offer as it was unable to secure the funding required from other sources.
Merit assessment
4.39
There were four published merit criteria used to determine whether a proposed project represented value for money. Assessors determined whether the project aligned with regional priorities, would provide economic benefits to the region and offered value for money, and whether the organisation had the capacity and resources to carry out the project. In total, 92 per cent of applications met the minimum passing score for each of the criteria.
4.40
The individual scores for each criterion were added up to total a score out of 100 for each application. The overall score was used to rank competing eligible applications for each region. The audit found that 30 applications had incorrect scores recorded on the form compared to the working spreadsheet scoring sheet. Industry informed the ANAO that ‘where feedback from the quality specialists identified a revision was needed to a particular score, minor adjustments could be calculated without the assistance of the scoring template.’
4.41
The audit stated that assessment ‘procedures alone did not provide a strong basis for contracted staff to conduct merit assessments’. For example, the procedures did not include any guidance specific to assessing an application’s job creation claims. Testing the veracity of claims was not specifically addressed in the training materials either. The assurance review commissioned by Infrastructure reached a similar conclusion.
4.42
Infrastructure recommended 232 applications to the Ministerial Panel for approval. The Ministerial Panel did not approve 28 per cent of these applications, and instead approved 17 per cent of applications that had not been recommended. The ANAO examined the reasons provided by the Ministerial Panel for this high rate of change and concluded that the Panel considered applications had been incorrectly scored. According to the recorded reasons of the Ministerial Panel:
100 applications had been incorrectly scored against the second criterion, equally divided between overstated scores and understated scores;
53 applications had been incorrectly overscored against the third criterion; and
57 applications had been incorrectly scored against the fourth criterion.
4.43
In almost all instances the Panel considered the scoring had understated the extent to which the application met the criterion.
4.44
The audit found that there were 51 instances where the Ministerial Panel considered the application had been incorrectly scored against a single criterion. For 70 applications the Ministerial Panel considered the application had been incorrectly scored against two criteria, and there were six instances where the Ministerial Panel considered the application had been incorrectly scored against three criteria.
4.45
Infrastructure did not inform Industry of the Ministerial Panel’s view that there was a high rate of over- or under-scored assessments. As such, there was no discussion between Infrastructure and Industry about improving quality assurance processes for future assessments. Consequently, no examination was undertaken to determine whether the issue related to training gaps, a particular assessor, or a systemic issue with the assessment process.
Decision-making process regarding funding
4.46
In addition to the assessment process, the program guidelines set out the approval and decision-making processes for the award of funding. The guidelines outlined that Infrastructure would provide advice to the Ministerial Panel on eligible applications and provide recommendations on which applications to fund based on the assessments undertaken by Industry. Ultimately, however, decision-making on funding was the responsibility of the Panel.
Advice and recommendations
4.47
Under the CGRGs, ‘officials must provide written advice to Ministers, where Ministers exercise the role of an approver’. Such advice must outline the application and selection process followed, address the key principle of achieving value with relevant money, and identify which applications fully meet, partially meet or do not meet the selection criteria.
4.48
To this end, Infrastructure provided the Ministerial Panel with a funding recommendations brief for each region that identified the projects that Infrastructure recommended be awarded funding. This brief included information about the score required for an application to be assessed as value for money, identified any applications assessed as ineligible, provided a summary ranking spreadsheet of each application and the scores for each criterion, and individual assessment ‘snapshots’ for each eligible application. The audit found that ‘decisions taken on the award of grant funding were supported by clear advice’.
4.49
Infrastructure’s approach to making funding recommendations met the requirements of the CGRGs. Nevertheless, the ANAO found inconsistencies in the approach taken by Infrastructure to rank the applications and subsequently provide a list of recommended applications for funding. In some regions, all eligible applications were ranked according to the result of the merit assessment scoring. In other regions, only the applications that had been scored as meeting each of the merit criteria were ranked, excluding those assessed as failing one or more of the criteria. In one region, only those applications that were recommended for funding were ranked. No other applications were ranked, ‘including those that had scored higher than the recommended applications but that were unable to be accommodated within the funding available for the region’.
4.50
This highlighted an inconsistency between the ranking of the applications and the recommendations for funding. In most instances, applications were ranked on their total score, with applications awarded the same total assessment score given the same rank. The recommendations, however, did not simply follow the rankings until the region’s budget was exhausted. Rather, Infrastructure worked down the ranked list and
…where the requested grant amount could not be accommodated within the funding cap, the department would ‘skip’ over that (and any other applications whose requested grant amount could also not be accommodated) to recommend the next highest ranked application(s) that could be afforded within the funding cap.
4.51
This meant the process for recommending funding skipped higher ranked applications in favour of lower ones that were closer to the amount left in the budget for that region. The ANAO highlighted one instance where, with $83 876 remaining in the budget, Infrastructure skipped an application ranked 13 (requesting $60 000) along with applications ranked 14, 15, 16 and 17. Instead, the application ranked 18 which requested $80 000 was recommended for funding.
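The stated rule — walk down the ranked list and skip only those applications whose requested amount no longer fits within the funding cap — can be sketched as a simple greedy pass. The data below is invented, loosely echoing the ANAO’s example, and illustrates why the highlighted instance was inconsistent with the stated rule: the rank-13 application’s $60,000 request fitted within the $83,876 remaining, so a mechanical application of the rule would have recommended it rather than rank 18.

```python
# Hypothetical sketch of the "skip" approach described in the audit:
# work down the ranked list, recommend each application whose requested
# grant amount still fits within the region's remaining funding cap,
# and skip any that do not fit. All figures are illustrative.

def recommend(ranked_apps, budget):
    """ranked_apps: list of (rank, requested_amount), ordered by rank."""
    recommended, remaining = [], budget
    for rank, amount in ranked_apps:
        if amount <= remaining:
            recommended.append(rank)
            remaining -= amount
        # otherwise skip this application and keep working down the list
    return recommended, remaining

# Invented figures echoing the ANAO's example: $83,876 remaining,
# rank 13 requesting $60,000, ranks 14-17 requesting more than remains,
# rank 18 requesting $80,000.
apps = [(13, 60_000), (14, 90_000), (15, 100_000), (16, 95_000),
        (17, 85_000), (18, 80_000)]
picks, left = recommend(apps, 83_876)
print(picks, left)  # → [13] 23876
```

Applied mechanically, the rule selects rank 13 and leaves $23,876 unallocated, whereas the brief recommended rank 18; the stated process alone does not account for the outcome the ANAO highlighted.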
4.52
Further inconsistencies in Infrastructure’s processes were found in the advice to the Panel on requests for a co-funding exemption. The program guidelines emphasised the importance of organisations leveraging funding from other sources as grant funding was only available for up to 50 per cent of eligible project costs. The guidelines did, however, provide that applicants from the non-business streams could apply for a co-funding exemption if they could demonstrate ‘exceptional circumstances’. Such approval could only be granted by the Ministerial Panel.
4.53
The provision of advice from Infrastructure to the Panel regarding the co-funding exemptions differed between cohorts. What was consistent, however, is that all briefings on exemptions were missing key information, such as the case that had been made by each applicant in support of its request. In all briefings, Infrastructure failed to advise the Panel whether the applications had demonstrated ‘exceptional circumstances’. In the briefing for Cohort 2, Infrastructure advised that the case had not been made for the exemptions, but proceeded to recommend to the Panel that the exemptions be approved.
4.54
Out of the 15 eligible applications that sought a co-funding exemption, Infrastructure recommended to the Panel that all be approved. Written approval was granted by the Panel for 14 exemptions. There is no record of approval being granted for the one co-funding exemption request for Cohort 3. Of the 15 seeking exemption, four applications were approved for funding, two of which were recommended for funding by Infrastructure. This inconsistency in advice resulted in a recommendation from the ANAO that Infrastructure provide clear advice to decision-makers on ‘whether any exemption requests permitted under grant program guidelines should be granted, and why’. Infrastructure agreed to the recommendation.
Funding decisions
4.55
The ANAO found that there were 132 applications where the Panel’s funding decision differed from Infrastructure’s recommendation. The ANAO reiterated the decision-making authority of the Ministerial Panel, and stated:
It's not their job just to approve what is recommended to them and to not approve those that aren't recommended. It's their job to turn their minds to the merits of the applications they're considering against the program guidelines.
4.56
Each of the briefings prepared by Infrastructure for the Ministerial Panel explained that the Panel ‘must satisfy the requirements of the CGRGs by ensuring that, where the Panel did not agree with the department’s funding recommendations, the reasons for those overturn decisions are properly documented’. The ANAO emphasised the importance of this documentation, and stated:
Appropriately documenting and reporting decisions relating to grant opportunities are key elements of probity and transparency in grants administration. Under the CGRGs, the Minister must record in writing the basis for the approval and must report annually on all instances where they have decided to approve a grant which the relevant official has recommended be rejected.
4.57
The ANAO found that the reasons recorded for funding decisions that differed from departmental recommendations were provided in writing to Infrastructure by staff in the office of the Panel Chair. The predominant reasons for not approving a recommended project were that the Panel considered that the assessment of claims against the second and third criteria had been overstated by the Industry assessors. Similarly, the Panel considered that many of the assessments of the second and third criteria had been understated and therefore awarded funding to a number of applications that were not recommended. This was confirmed by a statement from the ANAO:
What the ministerial panel consistently recorded there was that they didn't think the assessment work, either the ones which scored highly against the relevant criteria and those which scored lowly, was properly reflecting the extent to which this project would actually create and sustain jobs, which is what it was really about.
4.58
The ANAO observed, however, that although the records identified which criteria the Panel considered had been incorrectly scored, there was no corresponding re-scoring of the applications. According to the ANAO, this gap in documentation did not provide an appropriate level of transparency in the decision-making processes of the Ministerial Panel. It also did not allow for any assurance by the audit team that decision-making was performed consistently and appropriately within the guidelines:
Again, it seemed that the panel was saying, 'The errors are very great in some instances’, but by not rescoring them we couldn't satisfy ourselves as to whether that was truly the case or not.
4.59
This led to a recommendation by the ANAO that Infrastructure ‘implement processes for decision-makers to re-score grant applications in circumstances where they disagree with the scoring presented by the department’. Although Infrastructure agreed with the recommendation, their response focused on improving assessment processes and implementing a stronger quality assurance program. According to Infrastructure, ‘[t]hese assurance arrangements are likely to minimise the need to re-score applications’.
4.60
In terms of the records kept by the Ministerial Panel, the Auditor-General stated that the record of the outcome of decisions by the Panel was not inconsistent with the CGRGs. However, the Auditor-General suggested that there would have been more transparency in decision-making had the Panel kept records which better explained the process of the decision taken. The Auditor-General stated:
If the decision-making process wasn't consistent with the rule framework you would've read it in the audit report. The audit report doesn't say that. It talks about how some of that documentation could be improved. Part of the conversation you're talking about went to issues around how probity issues were dealt with, conflicts of interest—whatever. There isn't any documentary record of how that was dealt with. Normally you would expect in the decision-making process that some of those things that sit around the decision, and the evidence base for the decision, may be available to reviewers of it. So, yes, there's a record of the outcome of the decision. There's not a record of the process of the decision.
4.61
The audit also found that one application that was assessed as ineligible was merit assessed and later approved for funding by the Ministerial Panel after the Panel ‘showed a strong preference to fund this project, as it deemed the RTO element of the proponent’s business to be incidental to the project seeking funding under RJIP’. Inconsistencies in the assessment process were identified, as not all ineligible applications with the same circumstances were afforded the same treatment; however, the decision was duly reported to the Finance Minister. The ANAO analysis found that ‘there was no bias clearly evident in the assessment and decision-making processes.’
Managing conflicts of interest
4.62
The CGRGs require that, when administering grants programs, accountable authorities and officials ‘ensure that entity policy and management processes for conflict of interest are published to support probity and transparency’.
4.63
The RJIP guidelines contain information in relation to conflicts of interest for all officers connected with the delivery of the program. The RJIP Guidelines state that conflicts ‘may arise with our staff, technical experts, advisory committee members and others delivering the program between … their program duties, roles and responsibilities and … their private interests’.
4.64
Infrastructure advised:
The principles around handling conflicts of interest are that it depends on the degree of closeness or the degree of the perception around conflict. My understanding of the principles around conflict of interest is that some of those potential or perceived conflicts can be addressed by declaring a conflict so that there is transparency amongst decision-makers about interests and where they lie. In some circumstances, there may be more action that needs to be taken where a decision-maker may absent themselves from making a particular decision. There are orders of degree depending on the circumstances of a particular case, but there are well-worn practices and very standard practices for managing conflicts of interest in decision-making processes.
4.65
The ANAO report provided analysis on three aspects of conflict-of-interest management: applicants declaring conflicts of interest, conflict-of-interest policies applied to Local Planning Committees (LPCs), and staff working on the administration of the RJIP declaring potential conflicts of interest.
Grant Applicants
4.66
The CGRGs provide that conflicts of interest may arise ‘where a potential grantee has a direct or indirect interest, which may influence the selection of their proposed grant during the application process’. The CGRGs also state that conflicts of interest may arise during the course of carrying out the grant activity. Recommended mechanisms to manage conflicts of interest include: creating procedures for applicants to provide information regarding potential conflicts, and clearly articulating and defining conflict of interest policy in guidelines for applicants.
4.67
The RJIP guidelines, issued in March 2017, advised applicants on what constitutes a conflict of interest and warned that such conflicts could ‘affect the awarding or performance of your grant’. The guidelines also state under ‘Your conflict of interest responsibilities’ that:
We will ask you to declare, as part of your application, any perceived or existing conflicts of interests or that, to the best of your knowledge, there is no conflict of interest … If you later identify that there is an actual, apparent, or potential conflict of interest or that one might arise in relation to your grant, you must inform us in writing immediately.
4.68
The ANAO audit, however, found that applicants were not required to provide information on potential conflicts of interest on application forms. Infrastructure was reported to have coordinated with Industry to amend application forms so that future applicants would be required to provide a declaration regarding conflicts of interest.
4.69
Infrastructure further advised that improvements to its communication regarding conflict of interest management had been made in other grants programs it manages. Infrastructure provided an example of the Building Better Regions Fund, where in Round 4 applicants were provided with enhanced guidance material and application forms which better addressed conflict of interest issues.
Local Planning Committees
4.70
Local Planning Committees (LPCs) were designed as part of the RJIP to provide advice to Government about where funding would be best targeted in the ten regions. The Minister was to appoint an LPC in each region, which would subsequently develop a LIP to be submitted to the Minister and Infrastructure. The guidelines describe the intended function of the LPCs:
A Local Planning Committee will be responsible for developing their Local Investment Plan that outlines industry growth sectors, new market opportunities and future workforce needs. Grants will be available in targeted competitive funding rounds to projects that align with the priorities in each region’s Local Investment Plan.
4.71
The LPC Terms of Reference provide information for Committee members, including the definition of a conflict of interest and required actions for members. Actions included Committee members providing Infrastructure with a Declaration of Personal Interest and a Confidentiality Agreement prior to their appointment, and disclosure of interests of ‘immediate family members and close associates’.
4.72
The ANAO found that there were a number of issues in relation to the LPC members’ conflict of interest documentation. The ANAO stated that 84 per cent of personal interest declarations by LPC members demonstrated problems such as forms not being signed, dated, or otherwise being incomplete. The ANAO also identified issues such as LPC members from multiple regions signing forms for the wrong region, and declarations of consent by immediate family members not being provided.
4.73
In one instance raised in the ANAO’s report, the Chair of one LPC did not declare a conflict of interest in relation to a grant made to a company part-owned by an immediate family member.
4.74
When queried during a Senate estimates hearing, Infrastructure did not advise whether this constituted a breach of the conflict of interest policy. The ANAO further reported that, where conflicts relating to family or personal interests were declared, Infrastructure did not provide advice on how those conflicts should be managed.
4.75
The ANAO reported that, while the guidelines gave the LPCs no role in assessing or deciding on applications, Infrastructure suggested that the Ministerial Panel consult the LPC Chairs of nine regions in regard to applications. No such consultations, however, were reported to have taken place.
4.76
The assurance review commissioned by Infrastructure recommended that a number of changes be made in relation to the role and terms of LPC members, particularly that members’ terms expire at the acceptance of the region’s Local Investment Plan, and that they have no role in decision-making or assessment. The assurance review further suggested that Infrastructure:
…’tighten’ the conflict of interest requirements in any future rounds, and also ensure that an appropriate ‘air gap’ exists between members of the [LPC] and the consideration of grant applications, specifically by ruling out in the program guidelines any contact between the Ministerial Panel and the Committee Chair.
Departmental staff and contractors
4.77
The ANAO raised concerns in relation to the application of conflict of interest policies to departmental staff and contractors working on the RJIP. Infrastructure required that Senior Executive Service (SES) staff working on the RJIP be subject to conflict of interest management procedures, including the provision of an annual declaration of potential interests. The ANAO also examined whether any non-SES staff were required to make a declaration, but identified no officers required to comply with this process.
4.78
The guidelines refer to the Australian Public Service Code of Conduct, and state that RJIP program officers are bound by a conflict of interest policy which involves a declaration procedure and appropriate controls for officers identified as having a relevant conflict of interest.
4.79
The ANAO was advised that Infrastructure amended its policy in 2019 to require that all Executive Level 2 staff make annual declarations regarding conflicts of interest, and that all staff and contractors in the Regional Programs Branch were required to provide declarations from mid-2018.
4.80
Industry did not require departmental officers or contractors working on the RJIP to provide declarations on the basis that the departmental policies regarding conflict of interest management and insider trading would apply. The Industry policy also requires continuous reporting of conflicts as they arise. During the audit, Industry advised the ANAO that officers involved in the RJIP administration were ‘reminded of the need to declare conflicts because the assessment procedures contained instructions that assessors cannot be involved in an assessment if they have a conflict of interest’. The audit also stated that the assessment procedures advised assessors to ‘identify any potential conflict of interest during any stage of the assessment’, but no conflicts were reported. Industry further advised that contractors are now provided with conflict of interest management training and are subject to continuous disclosure requirements.
4.81
Infrastructure advised that the AusIndustry Business Grants Hub had made a number of improvements to its conflict of interest management, including:
Staff working on the RJIP program were covered through departmental policy which requires an annual disclosure each year and specific disclosure as matters arise[.]
From December 2017 the Hub requires employees and contractors to formally acknowledge they understand their conflict of interest obligations at the launch of each program[.]
This is in addition to the requirements of the departmental policy[.]
Contractors are also required to undertake mandatory training on conflict of interest processes.
Ministerial Panel
4.82
Infrastructure provided the Ministerial Panel with advice on its responsibilities under the CGRGs and on managing conflicts of interest. Infrastructure stated that members of the Ministerial Panel were expected to declare any relevant interests to the rest of the Panel.
4.83
The absence of documentation of the Ministerial Panel’s meetings or deliberations, however, means that it cannot be determined whether conflicts of interest were disclosed. The ANAO observed that this practice was a significant departure from the conduct of other grants programs, where the deliberations of ministerial meetings are recorded.
Communication with stakeholders
4.84
As outlined above, communication between Infrastructure and the Ministerial Panel consisted of a funding recommendations brief for each region. These briefings were found by the ANAO to have provided clear advice and were consistent with the requirements of the grants administrative framework. The audit found, however, that Infrastructure’s communication with Industry was deficient in several key respects.
4.85
Most significantly, Industry advised the ANAO that it had not been asked by Infrastructure to re-examine any of its assessment procedures, especially for criteria two and three, in light of the Ministerial Panel frequently recording that the Industry assessors were significantly over- or under-scoring the applications.
4.86
No steps were taken, moreover, to examine whether the Panel’s disagreement could be traced to the work of specific assessors or reflected a systemic issue with the assessment process.
4.87
As a result, the ANAO recommended that in the event there is a high incidence of decision-makers disagreeing with the assessment and scoring of applications, Infrastructure should require that any service provider review the assessment procedures to identify any improvements or adjustments required. Infrastructure agreed to the recommendation.
4.88
Industry advised the ANAO that it did not receive any advice from Infrastructure regarding co-funding exemption approvals. Industry did not follow the agreed procedures regarding co-funding exemptions: it assessed all applications against the merit criteria before approval for the exemptions had been granted, but it had not been informed of the exemption approvals. Had Industry followed the agreed procedures, however, assessment of applications may have been slower. Infrastructure advised that its records showed it had informed Industry of the approval of Cohort 1 exemption requests four weeks after the approval had been given; the audit noted, however, that at the time the report was finalised, Infrastructure had not located these records. Industry further advised the ANAO that it ‘assumed that all co-funding requests had been approved’.
4.89
Recommendation 5 of this report addresses improvements to the timely announcement and communication to stakeholders of grant opportunities and outcomes of grant programs.
Committee comment
4.90
In regard to the re-scoring of applications in the event that the decision-maker disagrees with the recommendations, the Committee observed that although Infrastructure agreed with the ANAO’s recommendation, its response was not focused on implementing the recommendation itself, but on improving assessment processes and implementing a stronger quality assurance program so that re-scoring would be unnecessary. The Committee considers that the re-scoring element of the recommendation should be implemented in addition to revisions to the assessment procedures. This would ensure that applications requiring re-scoring in future could be subject to appropriate review processes.
4.91
Although the Ministerial Panel recorded the outcome of decisions, the record-keeping of the Panel did not, according to the Auditor-General, allow for sufficient transparency in decision-making. By recording more information, including probity issues such as conflicts of interest and the decision-making process used, questions about the bias or otherwise of the Ministerial Panel could have been avoided and public confidence improved.
4.92
The ANAO also advised that it was a departure from usual practice for a Ministerial Panel performing a decision-making role in a grants program not to keep records of its deliberations. Given these considerations, the Committee found that it would be appropriate to require the Ministerial Panel to keep records of meetings, in line with prior or similar current programs that use Ministerial Panels in decision-making.
4.93
The absence of clear communication between Infrastructure and Industry regarding key elements of the administration of the program is concerning. Despite the Ministerial Panel’s significant disagreement with the assessment and scoring of the applications, Infrastructure did not engage with Industry to adjust or amend the assessment process. Infrastructure advised that it had subsequently improved this part of the grants process and implemented its own quality assurance program.
4.94
The Committee also observed that policies regarding conflict of interest management were not in line with best practice guidance. In particular, it is not clear whether Infrastructure has improved its management or administration of the conflict of interest process for LPCs. The Committee encourages Infrastructure to strengthen its management of conflict of interest.
4.95
Additionally, the Committee is concerned by Industry’s use of an existing contract with a contact centre provider to carry out grant application assessments. It is not evident to the Committee that the provider was suited to performing such functions, as highlighted by the ANAO’s finding that a number of applications were scored incorrectly. This raises concerns that assessors were not appropriately trained or qualified to conduct application assessment work. Industry’s use of this contract also did not appear to be sufficiently cognisant of risk management or value for money principles.
4.96
The Committee reminds departments that tendering for services is a critical element of transparent and effective program management. Further, the Committee considers it inappropriate to use existing contracts to carry out program management functions, especially where the contracted services are not suited to the nature of the program.