Part 3—Annual performance statements

Results by performance criteria

Performance criterion 1—Number and types of visitor interactions

DPS is the custodian of Parliament House as the home of the Parliament, as the working symbol of Australian democracy, and as a significant destination for citizens and international visitors alike. Part of DPS’ purpose is to make Parliament House and the important activity that takes place within it accessible to the public. Visitors to Parliament House are encouraged to view proceedings of both the Senate and the House of Representatives from the public galleries or via ParlView on the Parliament of Australia website, to witness democracy in action. They can also learn about the work of the Parliament through a range of tours.

DPS measures visitor numbers for four types of visitor interactions which reflect the different modes of access to the building and the activities that take place. These are:

  • number of visitors
  • number of virtual visitors
  • number of visitors for DPS school tours, and
  • number of participants in DPS organised tours and events.

Enhancing the Parliament’s engagement with the community is the strategic theme which links this performance criterion to the achievement of our purpose. The relevant intended results for this performance criterion are to:

  • enhance our visitor experience and community engagement including the use of social media and emerging technologies, and
  • enhance electronic access to parliamentary information for the community to engage easily with the parliamentary process.

Criterion source

  • Program 1, 2017–18 Portfolio Budget Statement, p15
  • Program 1, 2017–18 Corporate Plan, p21

Results against performance criterion

Table 3: Number and types of visitor interactions
Target—Equal to or greater than the same period last year

                                                   2015–16     2016–17     2017–18
Number of visitors                                 725,992     750,005     742,049
Number of virtual visitors                       4,706,404   5,190,519   5,989,850
Number of visitors for DPS school tours            127,292     127,176     132,040
Number of participants in DPS organised
tours and events                                    74,829      73,879      78,593

Methodology

Number of visitors

The number of visitors to Parliament House is the number of people experiencing the building as general and business visitors. It is calculated as the total magnetometer count at the main entrance of Parliament House minus the number of pass holder swipes.

DPS defines visitors to include school tours, visitors signed in at the front entrance and international, interstate and local tourists. DPS excludes Australian Parliamentary/Public Servants, retail and service workers and building occupants who hold an active pass.
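
Expressed as a minimal sketch, the calculation is a simple subtraction; the figures and variable names below are illustrative only, not actual DPS counts:

    # Illustrative sketch of the visitor-count calculation described above.
    # All figures are hypothetical, not actual DPS data.
    magnetometer_count = 850_000   # total people through main-entrance screening
    pass_holder_swipes = 108_000   # swipes by occupants holding an active pass

    visitors = magnetometer_count - pass_holder_swipes
    print(f"Number of visitors: {visitors:,}")   # Number of visitors: 742,000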

Number of virtual visitors

The number of virtual visitors is the number of people who visit the Parliament of Australia website. The calculation is based on the number of unique ‘Users’ as measured by Google Analytics. In previous Annual Performance Statements the methodology attributed the result to only the Visit Parliament section of aph.gov.au, but it represents visitation to the whole website.

Number of visitors for DPS school tours

The number of visitors for DPS school tours includes all students and any accompanying adults. School tour participants are manually counted by our Visitor Services Officers.

DPS defines school tours as the school tour bookings managed by the Serjeant-at-Arms’ Office through the Venue Management System.

Number of participants in DPS organised tours and events

The number of participants in DPS organised tours and events measures the number of people that actively participate in both paid and free tours and events available to the public organised by DPS. The total number of people participating in the tours and events is based on manual counting by Visitor Services Officers as well as information from The Parliament Shop, and internal and external booking systems (such as Ticketek and Canberra Ticketing).

Analysis

Number of visitors

Visitor Services Officers actively engaged visitors with the work, stories and collections of Parliament House through a tour program comprising nine different tours, and through eight high quality exhibitions and events throughout the year. Ongoing building works negatively affected visitor numbers, as construction activity reduced both the visual appeal of, and access to, some areas of the building. Despite this, DPS achieved an annual result of 742,049 visitors to Parliament House, only a slight decrease on 2016–17 (7,956, or one per cent).

Number of virtual visitors

The range of services and information provided by the Parliament of Australia website—from information on parliamentarians through to parliamentary recordings—continued to drive strong virtual visitor numbers. This information is contributed by all four parliamentary departments and is administered by DPS. The Visit Parliament area, which features content targeted at visitors, received 395,743 visitors, which was a five per cent increase on 2016–17. DPS achieved an annual result of 5,989,850 visitors to the Parliament of Australia website, a 15 per cent increase on the 2016–17 figures.

Number of visitors for DPS school tours

School tours of Parliament House are available to all primary and secondary Australian schools. Bookings are managed by the Serjeant-at-Arms’ office in the Department of the House of Representatives. The number of participants on school tours grew by four per cent or 4,864 on 2016–17 figures, to be near the peak of school visits in the 2014–15 financial year.

Figure 4: Annual schools visitation figures

A stacked bar chart showing annual school visitation figures by financial year: 124,357 (2011–12); 126,128 (2012–13); 123,507 (2013–14); 132,781 (2014–15); 127,292 (2015–16); 127,176 (2016–17); 132,040 (2017–18).

Annual schools visitation figures include students, teachers, carers and accompanying parents.

Number of participants in DPS organised tours and events

An engaging program of exhibitions and events throughout the year, along with the events in May to commemorate the 30th anniversary of the official opening of Parliament House by Her Majesty Queen Elizabeth II on 9 May 1988, contributed to a positive result for event visitation. DPS tour and event numbers were 78,593, six per cent above the 2016–17 result. A breakdown of the key tours and events is below.

Daily tours

Parliament House visitors participated in a number of daily guided tours in 2017–18, including:

  • Welcome Tours—offered five times a day to introduce visitors to the most significant features of Parliament House. The tours include a visit to the Chambers of Parliament on non-sitting days and viewing of the extensive Parliament House Art Collection on show, including in the Great Hall, the Marble Foyer and Members Hall, and
  • Behind the Scenes Tours (Discovery Tours on sitting days)—offered three times a day to give visitors an exclusive chance to visit some of the private spaces of Parliament House. Visitors have the opportunity to stand beneath the Australian flag and to hear of the events that have shaped Australia and Parliament House. On sitting days the Discovery Tours run instead, without access to the private spaces.

Seasonal and subject-based tours

Parliament House visitors participated in various tours held in 2017–18. These included:

  • Spring Glory Tours in September and October 2017, which focused on the hidden courtyards and landscapes of Parliament House. These tours highlighted the courtyards on the Senate and House of Representatives sides of the building. They also featured the springtime foliage of the large and small trees in the courtyards of Parliament House
  • as part of the Enlighten festival, APH Catering and Events presented two events, a Masquerade Party and a Food, Fun and Fireworks event, both on the Members and Guests Terrace. Both events sold out
  • Autumn Colours in the Courtyards Tours in April and May 2018, which highlighted the spectacular changing landscape in the hidden courtyards of Parliament House
  • NAIDOC and Reconciliation weeks saw the delivery of Indigenous Collections and Connections tours and the Canberra Design and Heritage weeks saw the delivery of Design for Democracy tours, and
  • a number of tours conducted for ministers, parliamentarians and international delegations.

Events

Parliament House visitors participated in various events held in 2017–18. These included:

  • the annual ‘Christmas comes to APH’ program, launched by the Presiding Officers, which featured free public performances of Christmas carols and the Giving Tree in the Marble Foyer. Eleven school choirs participated in the performances, and funds were raised for the charities Life Without Barriers and the Australian Indigenous Governance Institute
  • as part of the 30th anniversary celebrations, on 5 May 2018, a panel of special guests led by Barrie Cassidy reflected on the architectural, social, historical, political and cultural legacy of Parliament House over the past 30 years, followed by a special concert by an ensemble from the Canberra Symphony Orchestra, and
  • a ceremony on the Forecourt attended by His Excellency General the Honourable Sir Peter Cosgrove AK MC (Retd) and Her Excellency Lady Cosgrove for the 30th anniversary of the opening of Parliament House.

Performance criterion 2—Visitor satisfaction with the APH Experience

DPS aims to offer engaging and innovative programs to enhance visitor experience and community engagement, making Parliament House a destination of choice and a showcase for the best products of the surrounding region. Regular and ongoing feedback is essential to understanding visitor satisfaction with the Parliament House experience.

DPS measures visitor satisfaction for four types of visitor interactions which reflect the different modes of access to the building and the activities that take place within it. These are:

  • percentage of visitor feedback indicating their visit met or exceeded expectations
  • percentage of virtual visitor feedback indicating their visit met or exceeded expectations
  • percentage of school visitor feedback indicating their visit met expectations, and
  • percentage of participants attending DPS tours and events indicating their visit met or exceeded expectations.

Enhancing the Parliament’s engagement with the community is the strategic theme which links this performance criterion to the achievement of our purpose. The relevant intended results for this performance criterion are to:

  • enhance our visitor experience and community engagement including the use of social media and emerging technologies, and
  • enhance electronic access to parliamentary information for the community to engage easily with the parliamentary process.

Criterion source

  • Program 1, 2017–18 Portfolio Budget Statement, p15
  • Program 1, 2017–18 Corporate Plan, p21

Results against performance criterion

Table 4: Visitor satisfaction with Australian Parliament House
Target—85% satisfaction

                                                             2015–16  2016–17  2017–18
% of visitor feedback indicating their visit met or
exceeded expectations                                          97%      97%      98%
% of virtual visitor feedback indicating their visit
met or exceeded expectations                                   81%      86%      78%
% of school visitor feedback indicating their visit
met expectations                                               99%      97%      98%
% of participants attending DPS tours and events
indicating their visit met or exceeded expectations            99%      98%      99%

Methodology

Visitor feedback

Visitors’ satisfaction with their experience at Parliament House is measured through the percentage of visitor feedback indicating the visit met or exceeded expectations. The feedback is collected through visitor comment cards which are available at a number of locations in the building. We ask visitors to rate their overall experience from 1 to 5, 1 being poor and 5 being excellent. A score of 3 or above indicates visitor satisfaction.

Visitors also have the opportunity to provide comments, which are monitored and addressed by the Parliamentary Experience Branch. These responses are not included in the calculation as the information is not quantifiable.
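
As a minimal sketch of how the satisfaction percentage can be derived from the comment-card ratings (the ratings below are hypothetical):

    # Hypothetical comment-card ratings on the 1 (poor) to 5 (excellent) scale.
    ratings = [5, 4, 5, 3, 2, 5, 4, 4, 5, 3]

    # A score of 3 or above indicates the visit met or exceeded expectations.
    satisfied = sum(1 for r in ratings if r >= 3)
    satisfaction = satisfied / len(ratings) * 100
    print(f"Visitor satisfaction: {satisfaction:.0f}%")   # Visitor satisfaction: 90%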

Virtual visitor feedback

The percentage of virtual visitors’ feedback indicating their visit met or exceeded expectations measures virtual visitor satisfaction with their interaction with the website. A visitor satisfaction survey is available on the ‘Visit Parliament’ webpage and prompts ‘Visit Parliament’ visitors to answer the question ‘Did you find the information you wanted easily?’. The rating is from 1 (strongly disagree) to 5 (strongly agree). A score of 3 or above indicates virtual visitor satisfaction with their interaction.

School visitor feedback

The percentage of school visitors’ feedback indicating their visit met expectations is measured by capturing the satisfaction of school educators with the experience of Parliament House. Comment cards are provided to the teacher to rate the experience from 1 to 5 against two statements. A score of 3 or above indicates visitor satisfaction. Each statement is considered a separate response for the purpose of calculating the result. The two statements are:

  • the tour engaged students, and
  • the information in the tour will assist students with their learning.

Tours and events feedback

The percentage of participants attending DPS tours and events indicating their visit met or exceeded expectations is measured by capturing the satisfaction of participants attending DPS tours and events. Comment cards are available from Visitor Service Officers leading the tours and events. We ask visitors to rate their overall experience from 1 to 5, 1 being poor and 5 being excellent. A score of 3 or above indicates visitor satisfaction. Visitors are also provided with the opportunity to provide comments which are monitored and addressed as part of a separate process.

Tours include both free and paid tours led by Visitor Services Officers. Events include ticketed events, community events and other events run by DPS that are available to the public.

Analysis

Visitor feedback

DPS continued to receive strong positive feedback, both directly to staff and through the rating cards handed out to visitors, demonstrating the quality and relevance of our programs and customer service. This is supported by Parliament House ranking eighth of the ‘254 things to do in Canberra’ on TripAdvisor as at 1 July 2018, a ranking based on the quantity and quality of visitor reviews. During 2017–18 Parliament House received the 2018 TripAdvisor Travellers’ Choice and Certificate of Excellence awards. The visitor satisfaction result for 2017–18 was 98 per cent, corroborated by a variety of independent sources.

Virtual visitor feedback

Virtual visitor feedback continues to be limited: only 196 virtual visitors provided feedback via the website, from a total of 5,989,850 virtual visitors. During 2017–18 DPS reviewed and analysed the data and processes behind this performance measure and found that its reliability is undermined by the lack of data; it has been removed as a reportable performance measure for 2018–19. DPS will continue to collect and action feedback on the website while redesigning data collection to obtain a more representative sample. Virtual visitor satisfaction was 78 per cent for 2017–18.

School visitor feedback

During 2017–18, 132,040 school students and teachers were provided with a tour of Parliament House, with school tour satisfaction results of 98 per cent demonstrating that these tours are highly valued by participants.

DPS tours and events feedback

Interaction with visitors by DPS tour guides and the strong result from visitor feedback cards continue to validate DPS’ confidence in our tours and events offerings. Staff have focused on ensuring that visitors are aware of the opportunity to provide feedback and have tripled response rates from participants. The satisfaction rate for tours and events in 2017–18 was 99 per cent.

A sample of visitor comments for tours and events held in 2017–18 appears below:

Welcome Tour:
  • Sascha was an eloquent and knowledgeable guide, giving an enjoyable and interesting tour.
  • Scott was great! Highly entertaining and extremely informative. Great tour. Very interesting! Made even more interesting with humour. Clear and easy to understand for all including non-native English speakers. Thank you!
  • Thanks for a wonderful visit, Lori was an exceptional host, knowledgeable guide and a warm ambassador for our Parliament.
  • We took the tour. The guide was engaging and interesting and the kids listened and asked questions. It was the right level of detail with enough anecdotes and open questions to capture everyone’s attention. The view from the roof is worth seeing. (Via TripAdvisor)

School Tour:
  • Stephen was very engaging and worked with the students in a positive and informative manner.
  • Kirsten was very knowledgeable and aimed the information very appropriately, prompting students when they were unsure.
  • The students thoroughly enjoyed their experience today—it consolidated their learning and fueled their interest in the political process.
  • Very clear, informative, and patient. Shared some interesting and relevant stories. Great experience for students.

Behind the Scenes Tour:
  • Behind the Scenes tour with Nathan was outstanding. His knowledge and delivery of it was second to none.
  • First time in Canberra so we thought we would do the behind the scenes tour. This proved to be a good decision as the tour guide was very knowledgeable and able to provide lots of anecdotes and history which made the whole experience more fulfilling. In particular I didn’t realise the extent of cultural symbolism and thought that went into seemingly nearly every aspect Parliament’s architectural design. Tour also gives access to some areas not available to an unstructured tour. (Via TripAdvisor)
  • Gina was a very knowledgeable and interesting guide. Thoroughly enjoyed the tour.
  • Went on the ‘behind the scenes’ tour at Parliament House @$25 per head. This only runs when Parliament is not sitting. Well worth it. The knowledge and enthusiasm of our guide Rosie was the main contributor—recommended. (Via TripAdvisor)

Spring/Autumn Tour:
  • The guide Marie was excellent. She knew the names of all the plants and trees and shared her enthusiasm.
  • Kevin and the courtyard tour were wonderful. Very informative. Beautiful gardens and a friendly, knowledgeable guide.
  • It was a real privilege to visit Parliament House and also the beautifully kept courtyard gardens. Thank you Monique for the wonderful Autumn Courtyard Tour this morning. Loved the spring tour very much as well. (Via Facebook)

Performance criterion 3—Hansard Service KPIs are achieved

DPS performs a critical function supporting the work of the Parliament through the Parliamentary Recording and Reporting Branch (PRRB). This performance criterion demonstrates timely reporting of chamber and committee proceedings through the production of Hansard.

Responding to the changing needs of the Parliament is the strategic theme which links this performance criterion to the achievement of our purpose. The relevant intended results for this performance criterion are to:

  • implement efficient and effective infrastructure, systems and services to respond to the changing needs of the Parliament and our parliamentarians, and
  • explore and develop innovative technology and systems for the delivery of timely information and services to parliamentarians.

Criterion source

  • Program 1, 2017–18 Portfolio Budget Statement, p16
  • Program 1, 2017–18 Corporate Plan, p21

Results against performance criterion

Table 5: Hansard Service KPIs are achieved

                                                     Target  2015–16  2016–17  2017–18
% of individual draft speeches delivered within
two hours of speech finishing                          85%     86%      92%      94%
% of electronic proof Hansard reports delivered
within agreed timeframes                               95%     93%      90%      97%
% of committee transcripts delivered within
agreed timeframes                                      95%     93%      99%      82%

Methodology

The Hansard Production System (HPS) continuously tracks the status of draft transcripts. The HPS records and reports when Hansard documents are delivered compared to the target timeframe. An individual draft speech (known as a ‘pink’ or ‘green’) is considered to have been delivered on time if the entire speech reaches the office of the parliamentarian within two hours of the speech finishing. Proof draft transcripts are reported as on time if published in full within three hours of the chamber rising. Committee transcripts are reported as on time if published within the timeframe agreed with the committee.
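
A sketch of the two-hour timeliness test for individual draft speeches is below; the timestamps and record structure are assumptions for illustration, not the HPS data model:

    from datetime import datetime, timedelta

    # Hypothetical records: when a speech finished and when the draft
    # ('pink' or 'green') reached the parliamentarian's office.
    speeches = [
        (datetime(2018, 5, 8, 10, 30), datetime(2018, 5, 8, 12, 10)),  # 1h40m: on time
        (datetime(2018, 5, 8, 11, 0),  datetime(2018, 5, 8, 13, 30)),  # 2h30m: late
    ]

    on_time = sum(1 for finished, delivered in speeches
                  if delivered - finished <= timedelta(hours=2))
    print(f"Delivered on time: {on_time / len(speeches):.0%}")   # Delivered on time: 50%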

Analysis

Hansard exceeded the performance targets for timely delivery of individual draft speeches and electronic proof Hansard reports. Hansard did not achieve its target for committee transcripts, due to a spike in committee activity between July and October 2017. In particular, the July–August winter recess saw an increase in workload of 41 per cent relative to the last comparable winter break (2015).

Hansard adjusted its workforce composition and its strategy for outsourcing transcription work in response and, for the remainder of 2017–18, achieved a result of 92 per cent against the committee timeliness target. The committee offices of the Senate and House of Representatives were consulted on expected delays in committee transcription, to ensure priority transcripts were delivered on time.

Performance criterion 4—Building occupant satisfaction with timeliness and quality of DPS services

DPS is responsible for the delivery of a broad range of services directly and through facilitated arrangements. To continue to improve our services, it is important to gauge building occupant satisfaction with the timeliness and quality of DPS services.

DPS measures building occupant satisfaction with timeliness and quality of DPS services across a number of service categories. These are:

  • food and beverage/catering services
  • retail/sporting services
  • maintenance/cleaning services (including gardens and landscaping)
  • security services
  • parliamentary reporting and recording services
  • ICT services
  • visitor/art services
  • nurses centre services, and
  • loading dock services.

Responding to the changing needs of the Parliament is the strategic theme which links this performance criterion to the achievement of our purpose. The relevant intended result for this performance criterion is to:

  • implement efficient and effective infrastructure, systems and services to respond to the changing needs of the Parliament and our parliamentarians.

Criterion source

  • Program 1, 2017–18 Portfolio Budget Statement, p15
  • Program 1, 2017–18 Corporate Plan, p18

Results against performance criterion

Table 6: Building occupant satisfaction with timeliness and quality of DPS services
Target—75% satisfaction

                                                             2015–16  2016–17  2017–18
% of building occupant feedback indicating a satisfied
or neutral rating with timeliness and quality of DPS
services (by DPS service category)                             89%      90%      92%

Breakdown by service category
Food and beverage/catering services                            76%      88%      91%
Retail/sporting services                                       93%      95%      97%
Maintenance/cleaning services (including gardens and
landscaping)                                                   89%      89%      88%
Security services                                              94%      93%      93%
Parliamentary reporting and recording services                 96%      98%      97%
ICT services                                                   92%      86%      90%
Visitor/art services                                           94%      95%      99%
Nurses centre services                                           -      81%      95%
Loading dock services                                            -      94%      94%

Methodology

Satisfaction with the timeliness and quality of DPS services is measured annually through a Building Occupant Satisfaction Survey. SurveyMonkey is used to allow building occupants to anonymously rate their level of satisfaction with DPS services over the past 12 months; they are also encouraged to use the free text fields to comment or suggest how the services could be improved.

To calculate the satisfaction rating, the ‘Very Satisfied’, ‘Satisfied’ and ‘Neutral’ (neither satisfied nor dissatisfied) responses are totalled and expressed as a percentage of the total response count.
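
As a worked sketch of this calculation (the response counts below are hypothetical):

    # Hypothetical survey responses for one service category.
    responses = {"Very Satisfied": 120, "Satisfied": 300, "Neutral": 90,
                 "Dissatisfied": 35, "Very Dissatisfied": 12}

    # 'Very Satisfied', 'Satisfied' and 'Neutral' count towards the rating.
    counted = sum(responses[k] for k in ("Very Satisfied", "Satisfied", "Neutral"))
    rating = counted / sum(responses.values()) * 100
    print(f"Satisfaction rating: {rating:.0f}%")   # Satisfaction rating: 92%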

Analysis

DPS is proud of the positive feedback it receives through the Building Occupant Satisfaction Survey. The results of the survey, including both satisfaction ratings and individual comments, are provided to the relevant DPS Assistant Secretary. Where necessary, action plans have been developed in response to the survey results and comments, and all actions associated with the Building Occupant Satisfaction Survey for the current and previous years will be tracked by the Executive Committee over the coming year.

For example, cleaning has received negative feedback over the past two years. In 2017–18 DPS finalised the new cleaning contract for Parliament House, which came into effect on 1 July 2018. Feedback from building occupants informed the statement of requirements for this procurement.

We received a total of 557 responses to the building occupant survey from a distribution list of approximately 5,500 email addresses, a return of 10 per cent. The majority of respondents (74 per cent) worked for one of the four parliamentary departments. Participation by parliamentarians and their staff increased by five percentage points to 24 per cent of total responses (131 people).

The target for building occupant satisfaction with timeliness and quality of our services is 75 per cent. The overall rating of building occupant satisfaction for the 2018 survey was above the target, at 92 per cent. The target was achieved for all individual service categories in the 2018 survey.

In previous Annual Performance Statements, DPS commented on the appropriateness of the 75 per cent satisfaction target and indicated we would look to increase the target. In 2018–19 the target has been raised to 80 per cent.

DPS continues to supplement this information with other sources of feedback such as the Parliament House Art Collection Feedback Cards and feedback collected by APH Catering and Events.

Performance criterion 5—Parliamentary Library Service KPIs are achieved

The Parliamentary Library Service metric is an index capturing all of the service standards, or key performance indicators, for the Parliamentary Library approved by the Presiding Officers in the annual Library Resource Agreement between the Secretary of DPS and the Parliamentary Librarian.

The office and functions of the Parliamentary Librarian are established by the Parliamentary Service Act 1999 (PS Act) (sections 38A and 38).

In the DPS Corporate Plan, the Library’s activities fall under the strategic theme ‘Respond to the changing needs of the Parliament’. The relevant objective for this performance criterion is to ‘continue to build the Library’s reputation for high quality advice’ through: ensuring high and consistent quality in services; increasing digital access and service; supporting the Parliament’s engagement with the community; and initiatives to help develop parliamentary democracy in our region.

Criterion source

  • Program 1, 2017–18 Portfolio Budget Statement, p15
  • Program 1, 2017–18 Corporate Plan, p19

Results against performance criterion

Table 7: Parliamentary Library Service KPIs
Target—90%

                                                             2015–16  2016–17  2017–18
% of Library Service KPIs set out in the annual Library
Resource Agreement that are achieved                           93%      90%      93%

Methodology

Key priorities and performance indicators for the Parliamentary Library are outlined in the Library’s annual Resource Agreement (PS Act, section 38G). The KPIs in each Resource Agreement set out the outcomes and key deliverables for that year and also measure:

  • percentage of clients using the Library’s services
  • customer satisfaction
  • number of completed client requests
  • number of publications produced
  • number of online uses of the Library’s publications
  • attendance at training courses and events
  • timeliness of research and library services
  • number of items added to the Library’s Electronic Media Monitoring Service (EMMS) and ParlInfo Search databases
  • number of new titles added to the catalogue
  • percentage of the collection available online, and
  • use of the Library’s collections and databases and media portal.

The Library uses the RefTracker Information Request Management System to manage client requests and other client related work. This provides a rich array of client related data, including number of requests, usage, and timeliness. Satisfaction data is derived primarily from a formal evaluation of the Library’s services conducted once in every Parliament, the most recent being undertaken in 2017. Data regarding the number of publications produced and the number of items added to the EMMS and ParlInfo Search databases is obtained from ParlInfo Search.

Data relating to visits to the Library client portal (intranet) are captured by Sitecore’s engagement analytics. The Parliamentary Library currently uses Google Analytics and the Splunk web-analytics application to analyse statistics on the use of publications and collection items. A manual count is used to report on attendance at training courses and events and on new titles added to the Library catalogue. Reports generated from the Integrated Library System provide information on the percentage of titles in the Library’s collection available online in full text. Statistics on the use of the Library’s collections, databases and media portal are compiled from Integrated Library System reports, Splunk data and vendor-provided usage statistics.
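
The headline result in Table 7 is the proportion of the agreed KPIs that were met. A minimal sketch, with invented KPI outcomes:

    # Hypothetical KPI outcomes drawn from a Resource Agreement; the real
    # agreement contains many more entries.
    kpi_outcomes = {
        "client usage": True,
        "client satisfaction": False,
        "completed client requests": False,
        "timeliness of research services": True,
        "collection available online": True,
    }

    achieved = sum(kpi_outcomes.values())
    print(f"{achieved / len(kpi_outcomes):.0%} of Library KPIs achieved")   # 60%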

Analysis

In 2017–18 the Library met 93 per cent of its key deliverables and targets.

Significant initiatives in the reporting period included: the client evaluation of Library Services for the 45th Parliament; a review of the Library’s KPIs; a review of the Library collection; completion of the new ParlMap and Wadsworth systems (the latter underpinning the online Parliamentary Handbook); digitisation of the Parliamentary Papers Series; the commencement of a remediation project for historic Hansard files; launch of the First Eight project; the expansion of news services; and assistance to the Parliamentary Institute of Cambodia and the Parliament of the Solomon Islands.

In regard to service benchmarks, the Library met its client usage target of 100 per cent (consistent with the previous financial year) and received two complaints (again, consistent with the previous financial year). The Library achieved a rating of 94 per cent for client satisfaction among parliamentarians against its target of 95 per cent (based on data from the most recent client evaluation). It completed 11,656 individual client requests against its target of 13,000 (a demand-driven measure); however, hours spent on client requests increased from 45,656 in 2016–17 to 47,747, reflecting their increasing complexity. There were a little over 3.946 million uses of the Library collection and databases, very slightly below the target of four million; the Library will continue to monitor usage closely. The Library did not meet its target of digitising three million pages from its historic newspaper clippings collection, as the fragility of the material caused processing delays.

The Library met or exceeded its targets for its remaining client service KPIs, including: timeliness; use of online publications; attendance at training courses and events; number of research publications released; number of items added to the EMMS service and ParlInfo Search databases and the Library catalogue; client use of the mediaportal and social media monitoring services; and percentage of collection available online.

Detailed discussion of the Library’s performance is contained in the Parliamentary Librarian’s Annual Report, which is included in the DPS Annual Report as required by section 65(1)(c) of the PS Act.

Performance criterion 6—ICT Service Standards are achieved

The ICT Service Standard is an index composed of 15 individual service standards, each of which is measured monthly. Each service standard measures the delivery of key services in support of the effective and efficient operations of the Parliament, electorate offices, the parliamentary departments and Parliament House.

Responding to the changing needs of the Parliament is the strategic theme which links this performance criterion to the achievement of our purpose. The intended results for this performance criterion are to:

  • implement efficient and effective infrastructure, systems and services to respond to the changing needs of the Parliament and our parliamentarians, and
  • explore and develop innovative technology and systems for the delivery of timely information and services to parliamentarians.

Criterion source

  • Program 1, 2017–18 Portfolio Budget Statement, p15
  • Program 1, 2017–18 Corporate Plan, p19

Results against performance criterion

Table 8: ICT Service Standards are achieved
Target—90%

                                                             2015–16  2016–17  2017–18
% of ICT Service Standards outlined in the ICT SLA that
are achieved                                                   92%      88%      93%

Methodology

Information Services uses the ServiceNow IT Service Management System to capture and manage client interactions received via telephone, email, self-service and face-to-face contacts. Client interactions are classified and prioritised appropriately before being assigned to the relevant support group for resolution. Data specifically relating to the management and handling of telephone calls to the 2020 Service Desk is obtained from the Alcatel-Lucent Call Management System.

Availability statistics for key ICT systems and infrastructure are obtained directly from event logging and monitoring software. Because Broadcasting Services rely on analogue systems, their availability is calculated manually, through a combination of regular scheduled testing, monitoring and incidents raised by clients directly with the 2020 Service Desk. Availability of the Whole of Government Secure Internet Gateway is reported to DPS by the vendor.
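
Because each of the 15 service standards is assessed monthly, the annual index in Table 8 is the share of the 180 monthly assessments that met their target; the 2017–18 result was 167 of 180 (see the Analysis below). A sketch with assumed pass/fail data:

    # Hypothetical monthly pass/fail grid: 15 standards x 12 months (Jul-Jun).
    monthly_results = [[True] * 12 for _ in range(15)]
    monthly_results[2][6:] = [False] * 6   # e.g. one standard missing Jan-Jun

    passes = sum(sum(months) for months in monthly_results)
    total = sum(len(months) for months in monthly_results)
    print(f"ICT Service Standards achieved: {passes / total:.0%}")   # 97%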

Analysis

In 2017–18, 167 of the 180 monthly ICT Service Standard assessments were achieved (see Table 9), lifting the overall result five percentage points above the 2016–17 figure. A combination of factors contributed to some of the ICT Service Standards not being achieved at points throughout the year.

Several ICT outages experienced by third party vendors and ICT Services during 2017–18 directly affected the availability of key systems and services, flowing through to the performance measures for chamber services, file and print services, percentage of calls answered in less than 30 seconds, information services, user accounts, and email within APH and electorate offices. These outages are not typical of the stability of ICT systems within the Parliament. Corrective actions have been implemented to work towards achieving the performance criteria in the 2018–19 reporting period.

In addition, a higher than normal call volume and demand for ICT services occurred following the resumption of Parliament after the summer recess. The ‘percentage answered in less than 30 seconds’ service standard was adversely affected by this significant increase in call volumes. Despite this, customers were still able to reach the 2020 Service Desk, including on the busiest day in February, when calls increased by approximately 350 per cent over the daily average.

The Information Services Division delivered a series of major infrastructure upgrades during Q3 and Q4 2017–18 to improve the performance and supportability of critical Parliamentary services. These activities were scheduled and implemented with careful consideration of the impact to clients. The completion of the upgrades and the improvements in the underlying infrastructure better position DPS to meet the future needs of Parliamentary Computing Network users.

Table 9: 2017–18 ICT results
All 15 service standards are outlined below, along with their monthly results (July to June) for the year.

1. First Call Resolution—70 per cent of calls to the ICT 2020 Service Desk are resolved on first contact.
   Monthly target achieved: 11/12 (target 70%)
   Jul 75%, Aug 74%, Sep 73%, Oct 72%, Nov 71%, Dec 73%, Jan 75%, Feb 73%, Mar 70%, Apr 69%, May 75%, Jun 73%

2. Incident Resolution—95 per cent of all incidents reported to the ICT 2020 Service Desk are resolved within the agreed resolution times.
   Monthly target achieved: 12/12 (target 95%)
   Jul 96%, Aug 96%, Sep 98%, Oct 97%, Nov 97%, Dec 97%, Jan 96%, Feb 95%, Mar 96%, Apr 96%, May 98%, Jun 97%

3. % of calls answered in 30 seconds—90 per cent of calls made to the ICT 2020 Service Desk are answered within 30 seconds.
   Monthly target achieved: 6/12 (target 90%)
   Jul 92%, Aug 90%, Sep 93%, Oct 90%, Nov 92%, Dec 91%, Jan 83%, Feb 88%, Mar 75%, Apr 67%, May 85%, Jun 81%

4. Gateway availability—Gateway services are available 24 hours a day, seven days a week (excluding scheduled downtime). The target availability for gateway services is 99 per cent.
   Monthly target achieved: 12/12 (target 99%)
   100% in every month

5. Customer Engagement model—on response time of the three timeframes. A customer request is a ‘request’ for a new solution (application or hardware) that is not currently part of the existing DPS service offering. This criterion requires that 100 per cent of customer requests are responded to within six days of lodgement and that 100 per cent of customer requests have undergone a detailed assessment within a timeframe agreed with the customer.
   Monthly target achieved: 12/12 (target 100%)
   100% in every month

6. User accounts—100 per cent of all new user accounts are created within 24 hours.
   Monthly target achieved: 10/12 (target 100%)
   Jul–Apr 100%, May 99%, Jun 98%

7. Telephone (technical)—Each telephone handset is operational 24 hours a day, seven days a week.
   Monthly target achieved: 12/12 (target 99%)
   100% in every month

8. Broadcasting services—The target availability for the Chamber Sound Reinforcement, Division Bells and Lights, Clocks and ‘Live to Air’ broadcasts is 100 per cent during sitting times.
   Monthly target achieved: 12/12 (target 100%, sitting periods only)
   100% in every month

9. Information services—Availability of the EMMS and ParlView services. The target availability is 99 per cent.
   Monthly target achieved: 11/12 (target 99%)
   Jul 100%, Aug 100%, Sep 100%, Oct 98%, Nov–Jun 100%

10. PABX availability—Target availability of the core telephone system (PABX) is 100 per cent.
    Monthly target achieved: 12/12 (target 100%)
    100% in every month

11. Internet—Target average availability of internet services for our customers is 99 per cent.
    Monthly target achieved: 12/12 (target 99%)
    100% in every month

12. Email—Target average availability for email and mail exchange services provided by DPS is 99 per cent.
    Monthly target achieved: 11/12 (target 99%)
    Jul 100%, Aug 99%, Sep 99.97%, Oct–May 100%, Jun 98%

13. File and Print—Target availability for file and print services provided to our customers is 99 per cent.
    Monthly target achieved: 11/12 (target 99%)
    Jul–May 100%, Jun 98%

14. Chamber Services—Availability of specific systems used by the House departments to ensure the effective and efficient operation of the parliamentary chambers, including the Table Office Production System, Dynamic Red and Live Minutes. The target average availability of these systems is 100 per cent during sitting periods.
    Monthly target achieved: 11/12 (target 100%, sitting periods only)
    Jul–Apr 100%, May 99%, Jun 100%

15. Mobile Device Management—A mobile device management application is used to provide security to mobile devices. This criterion requires that this service is available 24 hours a day, seven days a week (excluding scheduled downtime).
    Monthly target achieved: 12/12 (target 99%)
    100% in every month

Performance criterion 7—Design Integrity Performance

Through the design integrity consultation process, established during 2016–17, DPS is maintaining the design integrity of Parliament House by integrating consideration of the architect’s design intent into all aspects of the planning, implementation and review processes for capital works and maintenance projects.

A fundamental element of the design integrity consultation process is the ongoing dialogue with Mr Giurgola’s moral rights administrators and key members of the original Parliament House design team, Ms Pamille Berg AO Hon FRAIA and Mr Harold (Hal) Guida LFRAIA AIA. Ms Berg and Mr Guida are a key source of information on the original design intent of Parliament House and how to interpret and apply it as the building changes. This assistance helps staff understand and manage effectively the building and its surrounds.

Effective stewardship of APH is the strategic theme which links this performance criterion to the achievement of our purpose. The relevant intended result for this performance criterion is to:

  • ensure adaptations of the building uses are strategic, appropriate and reference design integrity principles.

Criterion source

  • Program 1, 2017–18 Portfolio Budget Statement, p16
  • Program 1, 2017–18 Corporate Plan, p23

Results against performance criterion

Table 10: Continuity of Design Integrity
Target—Effective

                                                             2015–16  2016–17  2017–18
The level at which the design integrity process is
functioning                                                     -        -      Effective
The extent and effectiveness of consultation with moral
rights holders and DPS regarding the process for design
integrity and moral rights matters                              -        -      Effective
% of projects that have a material impact on design
integrity of the building where design integrity is
maintained or improved                                          -       90%        -

Methodology

A qualitative assessment of the effectiveness of the design integrity process, both within DPS and with the moral rights administrators, is undertaken by the Design Integrity and Archives Unit. The unit provides secretariat support for the quarterly design integrity meetings and other meetings (as necessary) and facilitates consultation between DPS staff and the moral rights administrators. The assessment of how effective the process has been is based on an analysis of the number and type of interactions with staff, including on important capital works and maintenance projects, and with the moral rights administrators.

Importantly, DPS also requests annual feedback from the moral rights administrators on their views about the extent and effectiveness of our consultation with them about design integrity and moral rights matters. This provides an external measure of the effectiveness of DPS’ consultation with these key stakeholders.

Analysis

Overall, the design integrity consultation process is now working reasonably effectively. Prior to the introduction of the design integrity consultation processes in 2016–17, communication and interaction with the moral rights administrators had become sporadic and internal departmental liaison was considered to be ad hoc. However, effecting cultural change takes time. Since the introduction of the new consultation protocols and the set-up of the Design Integrity and Archives Unit, including the engagement of an architectural historian in February 2018, improvements have been noticeable. It is acknowledged that as the process matures and staff become more familiar with it, regular liaison will become more routine.

During 2017–18, DPS held three day-long quarterly meetings with Ms Berg and Mr Guida and had more than 22 ad hoc meetings, presentations (including three all-staff presentations on the design intent of the architecture and art of the Forecourt, Foyer and Great Hall) and round-table discussions with one or both of the moral rights administrators. These meetings covered a variety of design integrity issues, including (but not limited to):

  • major capital works, including for example, the proposed public car park and Forecourt renewal projects, working at heights measures, lift upgrades, and the art/craft program
  • potential information and telecommunication matters
  • events to celebrate the 30th anniversary of the opening of Parliament House
  • furniture, fittings and fixtures
  • accessibility matters, and
  • landscape and gardening issues.

Given their importance, design integrity matters are a standing agenda item at the fortnightly Executive Committee meetings. In addition, the Design Integrity and Archives Unit is represented at various departmental fora (such as the quarterly risk management forum and the weekly building maintenance meetings), participates in stakeholder groups and management boards established for various capital works projects, and regularly meets with project managers and officers to discuss specific design integrity matters as they arise. The unit has also responded in writing to requests for information, with many of the responses involving considerable research through original files held by the National Archives of Australia and other sources. Anecdotally, the unit has received positive feedback from staff regarding this work.

The overriding principle of the new consultation process is the early and regular involvement of the Design Integrity and Archives Unit and Ms Berg and Mr Guida (as necessary) on design integrity during the entire life of a capital works project or maintenance program. In this way, timely intervention can be achieved if required to ensure that Mr Giurgola’s design intent is maintained for future generations and unnecessary or abortive work is eliminated to the greatest extent possible. Collectively, the moral rights administrators have expressed confidence in the new consultation processes and are pleased with the level of engagement to date.

Performance criterion 8—Building Condition Rating

DPS measures the Building Condition Rating (BCR) by the percentage of areas within the building that are assessed as being in good or better condition.

Effective stewardship of APH is the strategic theme which links this performance criterion to the achievement of our purpose. The relevant intended results for this performance criterion are to:

  • maintain Parliament House and the precinct as befits its status as an iconic building and location of national significance
  • ensure that adaptations of the building’s uses are strategic, appropriate and reference design integrity principles, and
  • effectively manage all assets within APH including collections.

Criterion source

  • Program 1, 2017–18 Portfolio Budget Statement, p16
  • Program 1, 2017–18 Corporate Plan, p23

Results against performance criterion

Table 11: Building Condition Rating
Target—80%

                                                             2015–16  2016–17  2017–18
% of building areas reviewed that are assessed as being
in good or better condition                                     -       81%      88%
% of building areas reviewed that are assessed as being
in fair or better condition                                    88%       -        -

Methodology

The BCR measures the current condition of Parliament House’s building fabric, expressed as a percentage of the original condition. The BCR is determined by a visual inspection of the building and fabric surfaces for deterioration and damage caused by general wear and tear.

For the purposes of the BCR, the building is divided into eight zones and, over the course of the 12-month reporting period, an inspection is carried out using the BCR methodology. Each area within a zone has up to 31 elements which are scored from zero (disrepair) to 100 (excellent).

A percentage is calculated by dividing the total of any given score by the potential optimum score for each zone.
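
As a minimal sketch of the zone calculation (the element scores below are invented):

    # Hypothetical element scores for one zone, each out of 100.
    element_scores = [85, 90, 78, 95, 88, 70, 92]

    optimum = 100 * len(element_scores)
    zone_rating = sum(element_scores) / optimum * 100
    print(f"Zone condition rating: {zone_rating:.0f}%")   # Zone condition rating: 85%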

All recommendations from the 2016–17 audit have now been implemented and the verification methods improved. To fully address audit concerns, the rating scale for the BCR was aligned so that an assessment of ‘Good’ (i.e. 80 per cent) meets the target. DPS recognised that this change would contribute to a slight rise in the overall score.

In 2017–18, DPS undertook a project to implement all BCR measurements in the SAP system. Readings can now be recorded directly while conducting inspections, and SAP can report on historical data and on subsets of its stored information, such as carpet condition.

Analysis

In 2017–18, a total of 660 rooms, suites and areas were inspected. The resulting performance measure was 88 per cent, an increase of seven percentage points compared to 2016–17.

As well as the change to the rating scale mentioned in the methodology, the improvement is attributed to:

  • a number of office area refurbishments
  • a replacement program to upgrade staff office furniture
  • carpeting and repainting of 40 suites
  • retiling of 20 toilets, and
  • re-glazing of a number of link-ways and skylights.

Due to current construction activities, a small number of areas were not measured in 2017–18. These will be subject to inspection in the coming year.

DPS will continue to seek opportunities for improving BCR processes, including refining the present methodology, developing targeted inspection strategies and reviewing maintenance arrangements. DPS is considering whether building users’ experiences of Parliament House would be better reflected by a system that gave greater weighting to core elements of the areas (such as paint and carpet). For example, under the current methodology, cleanliness is one element of up to 31 elements. A low cleaning score, which is a major factor in building occupants’ perceptions of Parliament House, only contributes three per cent of an area’s total score.

Performance criterion 9—Engineering Systems Condition Rating

The Engineering Systems Condition Rating (ESCR) measures the current operation and condition of the engineering systems in Parliament House against the expected decline of those systems through their lifecycles.

Effective stewardship of APH is the strategic theme which links this performance criterion to the achievement of our purpose. The relevant intended results for this performance criterion are to:

  • ensure adaptations of the building uses are strategic, appropriate and reference design integrity principles
  • ensure a secure environment while maintaining public accessibility, and
  • effectively manage all assets within APH including collections.

Criterion source

  • Program 1, 2017–18 Corporate Plan, p23

Results against performance criterion

Table 12: Engineering Systems Condition Rating

                                                      Target  2015–16  2016–17  2016–17      2017–18
                                                                                (converted)
% of critical engineering systems reviewed that are
assessed as being in good condition                     70%      -       50%       52%         53%
% of critical engineering systems reviewed that are
assessed as being in fair or average condition          95%      -       87%       88%         85%
% of critical engineering systems reviewed that are
assessed as being in good or better condition            -       -       50%        -           -
% of critical engineering systems reviewed that are
assessed as being in fair or better condition            -      89%       -         -           -

Methodology

As in 2016–17, DPS engaged a third party (ESBS Pty Ltd) to assist in undertaking the ESCR. This third party:

  • reviewed the status of engineering assets that were replaced or refurbished through capital works projects in 2017–18
  • reset the condition ratings of these assets as appropriate, and
  • surveyed specific engineering assets selected on a risk basis by DPS. Emphasis for the surveyed assets was placed on those that were assessed in 2016–17 to be in poor condition and critical items of engineering plant rated as in fair or average condition.

In 2017–18 DPS reviewed the list of existing engineering system subcategories as part of an ongoing improvement process. To better capture this data at the system level, DPS regrouped it into more logical, systems-based subcategories; for example, what was recorded as six individual boilers has been consolidated into a single subcategory, Boiler—Central Energy. Consistent with the revised categorisation, this year’s ratings were applied over nine system categories and 283 subsystem categories. Refinement of last year’s 410 subsystems was undertaken by functional grouping rather than by location, to further improve the accuracy of this performance measure. Last year’s ESCR scores have been converted to the new subcategorisation to enable comparison with this year’s performance data. The scores have also been rounded to the nearest percentage point rather than reported to two decimal places.

The two performance measures use the same calculation steps but different assessments. Once a score was allocated to the three factors for each subcategory, an overall score for the category was generated for each factor by counting the subcategories assessed as ‘good’ or as ‘fair or average’, depending on the performance measure. This count was converted to a percentage for each factor. The overall result for each system category was then determined by averaging the percentage scores across the three factors, and the total result calculated by averaging the overall results across the system categories. Using this methodology, each of the nine system categories can be assessed individually against the target, as well as providing a total result across all engineering systems.
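
A sketch of these steps for the ‘good condition’ measure is below; the category names, factor assessments and pass/fail values are assumptions for illustration:

    # Hypothetical assessments: for each system category, each subcategory is
    # rated against three factors; True means the factor was assessed as
    # 'good' (the 'fair or average' measure uses the same steps with a
    # different test).
    categories = {
        "Lifts":      [(True, False, True), (False, False, True)],
        "Electrical": [(True, True, True), (True, False, True)],
    }

    overall = []
    for subs in categories.values():
        # Percentage of subcategories passing each factor, then averaged.
        factor_pcts = [sum(f[i] for f in subs) / len(subs) * 100 for i in range(3)]
        overall.append(sum(factor_pcts) / 3)

    print(f"Total result: {sum(overall) / len(overall):.0f}%")   # Total result: 67%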

Analysis

The 53 per cent of assets rated as being in good or better condition represents a slight improvement over the 52 per cent result for 2016–17. While this is substantially below the 70 per cent target, further improvements should be realised as major capital works are delivered over the next two years. A review of the rankings for individual system categories shows that the condition of lifts, electrical assets and the Building Management System is having a particularly adverse impact on the ratings.

The three percentage point decline on last year in items assessed as fair or average has resulted from deterioration in the condition of Variable Volume and Packaged Air Handlers, Area Main Switchboards and Uninterruptible Power Supplies (UPS) since the previous inspection.

Improvements expected to 2021

DPS has mobilised resources for a staged upgrade of the Building Management System over the next two to three years, with the upgrade expected to be substantially complete by 2021.

The following is an overview of the capital works projects that are also expected to have a positive effect on performance in this area over the next two years:

Electrical Services
  • electrical distribution boards replacement (due for completion in June 2021)
  • emergency lighting monitoring (due for completion in June 2022)
  • light fittings upgrade to low energy luminaires (due for completion in June 2022)
  • lightning control upgrade (due for completion in June 2022), and
  • review and planning of electrical essential power requirements, high voltage electrical distribution, auxiliary power upgrades (including load shedding) and replacement/upgrade of emergency power generators expected by December 2018 (capital works expected to be committed and completed by June 2020).

Fire Services
  • sliding fire door replacement (due for completion September 2020), and
  • fire sprinkler services upgrade, including piping (expected to commence in 2018–19).

Mechanical Services
  • replacement of all mechanical switchboards (due for completion June 2020)
  • investigation and design of replacement/upgrades of all major air handling plant (for heating, cooling and ventilation) under way with capital works expected to be completed by June 2021
  • design of dedicated ventilation systems upgrades under way with capital works expected to be completed by December 2020, and
  • replacement of the boiler plant (due for practical completion December 2018).

Lifts
  • the progressive replacement of the 42 lifts over four years
  • the first four lifts were refurbished in 2017–18. Of the remaining 38, 10 are scheduled for completion in 2018–19, 14 in 2019–20 and the final 14 in 2020–21.

Performance criterion 10—Landscape Condition Rating

The Landscape Condition Rating (LCR) measures the current condition of the landscape surrounding Parliament House.

Effective stewardship of APH is the strategic theme which links this performance criterion to the achievement of our purpose. The relevant intended results for this performance criterion are to:

  • maintain Parliament House and the precinct as befits its status as an iconic building and location of national significance
  • ensure adaptations of the building uses are strategic, appropriate and reference design integrity principles, and
  • effectively manage all assets within APH including collections.

Criterion source

  • Program 1, 2017–18 Portfolio Budget Statement, p16
  • Program 1, 2017–18 Corporate Plan, p22

Results against performance criterion

Table 13: Landscape Condition Rating
Target—85%
2015–16 results 2016–17 results 2017–18 results
% of landscaped areas reviewed that are assessed as being in good or better condition - 88% 77%
% of landscaped areas reviewed that are assessed as being in fair or better condition 83%14 - -

Methodology

The LCR is expressed as a percentage and is measured annually. For the purposes of the assessment process, the landscape is divided into 10 zones which include up to 10 separate elements such as lawns, trees and hard surfaces. Each element is manually assessed by a team of five Landscape Services staff. The assessment takes into account variables such as the intended purpose, lifecycle, planned maintenance levels and seasonal variations. The agreed scores are provided against each element for each zone and the total score achieved (across all elements and zones) is expressed as a percentage of the total possible score.

The methodology is designed to give a fair representation of the overall landscape condition.
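As a rough illustration of that calculation, the sketch below scores hypothetical elements within hypothetical zones. The zone names, element names and five-point scale are assumptions for demonstration; they are not DPS's actual scoring scheme.

```python
# Minimal sketch of the LCR calculation described above. Zone and element
# names and the scoring scale are illustrative assumptions, not DPS data.

MAX_SCORE = 5  # assumed top score per element

def landscape_condition_rating(zone_scores):
    """Total achieved score across all zones and elements, expressed as a
    percentage of the total possible score."""
    achieved = sum(score for elements in zone_scores.values()
                   for score in elements.values())
    possible = MAX_SCORE * sum(len(elements) for elements in zone_scores.values())
    return 100 * achieved / possible

zone_scores = {
    "forecourt": {"lawns": 4, "trees": 3, "hard surfaces": 5},
    "senate ramp": {"lawns": 2, "trees": 4},
}
print(f"LCR: {landscape_condition_rating(zone_scores):.0f}%")  # LCR: 72%
```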

Analysis

The security capital works had a major effect on the LCR result, mainly due to the new fence being installed on all four ramps and at the Senate and House of Representatives sides of the building. Building users will have noticed that there was no shrubbery or turf around the Senate and House of Representatives slip roads.

Projects are also in development and under way to address:

  • a leak in the Members and Guests Terrace Garden, which has resulted in all plants being removed
  • the leaking pond and condition of grouting in the Forecourt, and
  • the poor condition of the trees at the entrances to the Senate and the House of Representatives.

As expected, given the events that occurred in 2017–18, the Landscape Condition Rating was 11 percentage points lower than the previous year. The score is expected to meet the target of 85 per cent next year, when the project works are completed and the landscape is rehabilitated.

Performance criterion 11—Security KPIs are achieved

The Security KPIs measure the Parliamentary Security Service’s (PSS) performance and reflect the robustness of its policies, processes and operational capability.

Effective stewardship of APH is the strategic theme which links this performance criterion to the achievement of our purpose. The relevant intended result for this performance criterion is to:

  • ensure a secure environment while maintaining public accessibility.

Criterion source

  • Program 1, 2017–18 Portfolio Budget Statement, p16
  • Program 1, 2017–18 Corporate Plan, p23

Results against performance criterion

Table 14: Security KPIs are achieved
Target—100%
2015–16 results 2016–17 results 2017–18 results
% of security incidents that are handled in accordance with policy and process - - 93%15
% of PSS Officers compliant with mandatory training requirements - - 97%16

Methodology

Handling of security incidents

The Security Operating Policy and Procedure 10.7–Parliamentary Security Service (PSS) Security Incident Reporting states that ‘[s]ecurity incident reports provide an official record of events relating to breaches of security or other security matters.’ Incident reports are used by Security Branch to document information on, and provide management visibility of, a range of events and interactions involving PSS staff. A security incident is an incident that impacts on the integrity of the security arrangements at Parliament House. Not all incident reports relate to security incidents. Security incidents include:

  • non-compliance with security screening
  • denial of entry to the building or galleries
  • disruptive behaviour that requires any level of security response
  • reported lost or stolen items, and
  • instances where an escorted visitor is found without a pass holder to escort them or where an unauthorised person is found in the private areas.

Security reports are examined to determine whether the incident was a security incident and whether it was due to a systemic cause (e.g. an unattended item) or to other factors, such as a failure to follow procedures. The handling of each security incident is assessed against the protocols stated in the Security Operating Policies and Procedures (OPPs), and the final percentage is calculated as the proportion of incidents handled in compliance with the OPPs.
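A hedged sketch of that final figure follows. The record fields are hypothetical; the real assessment is a manual review against the Security OPPs, not an automated check.

```python
# Illustrative sketch of the incident-compliance percentage described
# above. Field names and sample records are hypothetical assumptions.

from dataclasses import dataclass

@dataclass
class IncidentReport:
    is_security_incident: bool   # not all incident reports are security incidents
    handled_per_opps: bool       # outcome of the review against the OPPs

def compliance_percentage(reports):
    """Percentage of security incidents handled in accordance with the OPPs."""
    incidents = [r for r in reports if r.is_security_incident]
    compliant = sum(r.handled_per_opps for r in incidents)
    return 100 * compliant / len(incidents)

reports = [
    IncidentReport(True, True),
    IncidentReport(True, False),
    IncidentReport(False, True),   # excluded: not a security incident
]
print(f"{compliance_percentage(reports):.0f}%")  # 50%
```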

Mandatory PSS training

The mandatory training for PSS Officers comprises Initial Security Training (IST) on recruitment and annual Competency Maintenance Training (CMT). This does not include the generic mandatory training that applies to all DPS employees (such as Fraud Awareness and WHS training).

PSS Officers do not commence operational duties until they have successfully completed IST. IST is a six-week program that provides new recruits with the basic training they require to fulfil their roles and obligations as uniformed PSS Officers, covering topics such as Communications, Access Control, Screening, Defensive Tactics, and Powers and Responsibilities.

CMT is ongoing. It covers areas such as First Aid, defensive tactics, first response fire fighting and parliamentarian recognition. All areas have specific requalification windows, which are tracked by the Security Branch Learning and Development Section, including exemptions that allow training to be rescheduled for operational reasons such as parliamentary sittings being extended. The result is calculated as at 30 June each year.
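The sketch below illustrates one way such requalification windows could be tracked. The training areas, window lengths and officer records are assumptions for demonstration only, not the Security Branch's actual system.

```python
# Illustrative sketch of tracking CMT requalification windows. Dates,
# window lengths and officer records are assumptions for demonstration.

from datetime import date, timedelta

REQUAL_WINDOWS = {            # assumed requalification intervals
    "first aid": timedelta(days=3 * 365),
    "defensive tactics": timedelta(days=365),
}

def is_compliant(last_completed, as_at, exemptions=()):
    """An officer is compliant if every CMT area is within its window or
    covered by an approved exemption (e.g. extended sittings)."""
    return all(
        area in exemptions or as_at - completed <= REQUAL_WINDOWS[area]
        for area, completed in last_completed.items()
    )

officers = [
    {"first aid": date(2017, 9, 1), "defensive tactics": date(2017, 8, 1)},
    {"first aid": date(2014, 1, 1), "defensive tactics": date(2018, 2, 1)},
]
as_at = date(2018, 6, 30)  # result calculated as at 30 June
compliant = sum(is_compliant(o, as_at) for o in officers)
print(f"{100 * compliant / len(officers):.0f}%")  # 50%
```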

Analysis

Handling of security incidents

The target for the correct handling of security incidents is 100 per cent. For 2017–18, the actual achievement was 93 per cent. Factors such as human error and shortcomings in performance management and supervision prevented the target from being achieved. The objective is to document all security incidents and to identify those that occurred as a result of non-compliance with the OPPs. The root causes of any non-compliance, and any systemic issues, are then used for performance improvement purposes, training updates or recommended changes to security protocols.

Mandatory PSS training

The target for this performance indicator is 100 per cent; the achieved result was 97 per cent. The indicator records the number of PSS Officers who have undertaken all compulsory training and reports, by exception, those who have not. PSS Officers are required to complete all mandatory training specified in the DPS Enterprise Agreement 2017 or any other training identified by the DPS Security Branch. An analysis of this data indicated that 97 per cent of the 169 officers (as at 30 June 2018) had completed the required training.

The remaining three per cent of officers (five officers) were unable to complete their training for the following reasons:

  • three officers were unable to requalify in defensive tactics in the agreed timeframe due to delays in the program whilst it was being reviewed, and
  • two officers had been working extensively on rotating shifts and were not able to be released from duties for scheduled training.

All five officers had been scheduled for their training during July 2018.

Performance criterion 12—Parliament House Works Program KPIs are achieved

The Parliament House Works Program KPIs measure project delivery.

Effective delivery of the APH works program is the strategic theme which links this performance criterion to the achievement of our purpose. The relevant intended results for this performance criterion are to:

  • effectively manage a Capital Works program so that APH functions effectively as a safe and accessible workplace, and
  • deliver a security upgrade capital works program that meets the needs of the Parliament.

Criterion source

  • Program 2, 2017–18 Portfolio Budget Statement, p17
  • Program 2, 2017–18 Corporate Plan, p25

Results against performance criterion

Table 15: Parliament House Works Program KPIs are achieved
Target—80%
2015–16 results 2016–17 results 2017–18 results
% of Capital Works Branch projects in delivery phase - - 87%17
% of Capital Works Branch budget spent in the financial year - - 82%18
% of Security Upgrade Implementation Plan projects in delivery phase - - 100%19
% of Security Upgrade Implementation Plan budget spent in the financial year - - 54%20

Methodology

Projects in delivery phase—Capital Works Branch

Any project in delivery phase during the financial year counts towards this KPI. Projects are considered to be in delivery from commencement of design through to completion of works on site. Projects do not need to have been planned at the start of the year to count towards this KPI.

The projects in the Administered Capital Report (ACR) as agreed at the beginning of the financial year provide the baseline for calculating this result.

Budget spent in the financial year—Capital Works Branch

The percentage of the Capital Works Branch budget spent in the financial year as recorded by the Chief Finance Officer.

Projects in delivery phase—Security Upgrade Implementation Plan

Any project in delivery phase during the financial year counts towards this KPI. Projects are considered to be in delivery from commencement of design through to completion of works on site. Projects do not need to have been planned at the start of the year to count towards this KPI.

The projects in the Administered Capital Report (ACR) as agreed at the beginning of the financial year provide the baseline for calculating this result.

Budget spent in the financial year—Security Upgrade Implementation Plan

The percentage of the Security Upgrade Implementation Plan budget spent in the financial year as recorded by the Chief Finance Officer.
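For illustration, the sketch below shows the two KPI calculations that apply to both the Capital Works Branch and the Security Upgrade Implementation Plan. All figures are hypothetical; the real results use the agreed ACR baseline and the Chief Finance Officer's expenditure records.

```python
# A minimal sketch of the two works-program KPI calculations described
# above. Counts and dollar figures are hypothetical assumptions.

def percent_in_delivery(projects_in_delivery, acr_baseline):
    """Projects in delivery during the year, expressed against the ACR
    baseline agreed at the start of the year. Unplanned projects that
    entered delivery also count, so the result can exceed 100 per cent."""
    return 100 * projects_in_delivery / acr_baseline

def percent_budget_spent(spent, budget):
    """Expenditure for the financial year against the agreed budget."""
    return 100 * spent / budget

print(f"{percent_in_delivery(7, 10):.0f}%")          # 70%
print(f"{percent_budget_spent(3.0e6, 4.0e6):.0f}%")  # 75%
```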

Analysis

DPS follows an annual cycle of programming capital works to address key infrastructure risks and accommodate the evolving requirements of building occupants. DPS made good progress on capital works in 2017–18. Accommodating the requirements of parliamentary sittings and a wide range of stakeholders continues to put pressure on timeframes. DPS is focused on being flexible while driving towards the required outcomes.

Considerable progress was made on both the physical and electronic components of the security works. The budget underspend is temporary and primarily due to contractor-related delays and the requirement to respond to the needs of building occupants.

  • The fencing and associated civil works were substantially completed with a small amount of re-turfing and planting remaining.
  • Work on the Senate, House of Representatives and main public entrances progressed more slowly than expected for these reasons.
  • Upgrades to the security of skylights and rectification of leaks in the Main Committee Room and Members Hall were substantially completed. The third main skylight over the Great Hall will be replaced in 2018–19.
  • The roll-out of electronic security measures—including improved CCTV coverage and the Electronic Access Control System for selected areas—is well advanced.

The non-security works program accelerated dramatically in 2017–18 and is delivering good results.

  • Significant improvements were made to the climate control system with major upgrades to plant.
  • Major upgrades were undertaken to the roof structure to rectify water ingress, including the Senate Chamber skylight.
  • Six of the building’s 42 lifts were substantially upgraded by the replacement of most of the mechanical and electrical equipment and the refurbishment of the lift car interiors.
  • During the year the Emergency Warning Intercommunication System was replaced, with improved fire-resistant cabling and a sophisticated electronic fire management system.
  • Improvements were also made to the safety and reliability of the electrical network with upgrades to major circuit breakers and some distribution boards.

Footnotes:

4 As detailed in Appendix F, this figure has been corrected from the 2016–17 Annual Report.

5 This number was incorrectly reported in the 2015–16 and 2016–17 Annual Reports. Appendix F provides further details.

6 This number was incorrectly reported in the 2015–16 and 2016–17 Annual Reports. Appendix F provides further details.

7 These new performance measures follow the 2016–17 review of the Design Integrity performance measures.

8 These new performance measures follow the 2016–17 review of the Design Integrity performance measures.

9 This performance measure was discontinued in 2017–18.

10 Previously referred to as the moral rights holders.

11 This performance measure was discontinued in 2015–16.

12 This performance measure was discontinued in 2016–17.

13 This performance measure was discontinued in 2015–16.

14 This performance measure was discontinued in 2015–16.

15 This is a new performance measure in 2017–18.

16 This is a new performance measure in 2017–18.

17 This is a new performance measure in 2017–18.

18 This is a new performance measure in 2017–18.

19 This is a new performance measure in 2017–18.

20 This is a new performance measure in 2017–18.