Effectiveness of safety management

1. Contact
1.1 Contact organisation EUROCONTROL: The European organisation for the safety of air navigation.
1.2 Contact organisation unit Directorate Single Sky - Performance Review Unit (DSS/PRU)
1.3 Contact name Performance Review Unit - EUROCONTROL
1.5 Contact mail address 96 Rue de la Fusée
1130 Brussels
BELGIUM
1.6 Contact e-mail address NSA-PRU-Support@eurocontrol.int
1.7 Contact phone number +32 2 729 39 56
2. Metadata update
2.1. Metadata last certified not applicable
2.2. Metadata last update 01 November 2016


3. Statistical presentation

3.1. Data description

ref. EASA Annex to ED Decision 2014/035/R

  • AMC1 SKPI


GENERAL DESCRIPTION : The Effectiveness of Safety Management (EoSM) indicator should be measured by verified responses to questionnaires at State/competent authority and service provision level. For each question the response should indicate the level of implementation, characterising the level of performance of the respective organisation.

A Management Objective (MO) has been derived for each of the elements of the ICAO State Safety Programme (SSP) and Safety Management System (SMS) as described in ICAO Document 9859 ‘Safety Management Manual’.

EFFECTIVENESS LEVELS AND EFFECTIVENESS SCORE : When answering the questions, one of the following levels of implementation should be selected:

- Level A which is defined as ‘Initiating’ — processes are usually ad hoc and chaotic;

- Level B which is defined as ‘Planning/Initial Implementation’ — activities, processes and services are managed;

- Level C which is defined as ‘Implementing’ — defined and standard processes are used for managing;

- Level D which is defined as ‘Managing & Measuring’ — objectives are used to manage processes and performance is measured;

- Level E which is defined as ‘Continuous Improvement’ — continuous improvement of processes and process performance.
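As an illustrative aid only, the sketch below shows one way the five implementation levels could be represented in code; the labels follow the definitions above, while the numeric values attached to each level are an assumption made for this sketch and not the weightings laid down in Appendix 2 to AMC2 SKPI and AMC3 SKPI.

```python
from enum import Enum

class EffectivenessLevel(Enum):
    """The five levels of implementation (AMC1 SKPI)."""
    A = "Initiating"                       # processes are usually ad hoc and chaotic
    B = "Planning/Initial Implementation"  # activities, processes and services are managed
    C = "Implementing"                     # defined and standard processes are used for managing
    D = "Managing & Measuring"             # objectives are used to manage processes; performance is measured
    E = "Continuous Improvement"           # continuous improvement of processes and process performance

# Hypothetical numeric values used only in this sketch; the real scoring
# weights are defined in Appendix 2 to AMC2 SKPI and Appendix 2 to AMC3 SKPI.
LEVEL_SCORE = {level: rank for rank, level in enumerate(EffectivenessLevel, start=1)}

if __name__ == "__main__":
    answer = EffectivenessLevel.D
    print(answer.name, answer.value, LEVEL_SCORE[answer])  # D Managing & Measuring 4
```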


ref. EASA Annex to ED Decision 2014/035/R

  • AMC2 SKPI — Questionnaire for Measurement of Effectiveness of Safety Management KPI — State level
  • AMC3 SKPI — Questionnaire for Measurement of Effectiveness of Safety Management SKPI — ANSP level


For each question, States should provide to the Agency information on the level of effectiveness (or level of implementation) and evidence to justify their answer. Likewise, for each question, ANSPs should provide to their NSA/competent authority information on the level of effectiveness (or level of implementation) and evidence to justify their answer.


ref. EASA Annex to ED Decision 2014/035/R

The results of the States’ filled-in questionnaires are to be verified by means of EASA standardisation inspections.

The coordination between EASA and the competent authority should be done through the national coordinator appointed by the State in accordance with Article 6 of Commission Regulation (EC) No 736/2006.

The national coordinator should be responsible for coordination within the State authorities and for coordination with the ANSPs to provide the Agency with the responses to the questionnaires (both competent authority and ANSP, aggregated where required).

The competent authority/NSA may allocate the detailed verification task to a qualified entity or to another entity.

The verification of the ANSP questionnaires by the NSA/competent authority should take place before the questionnaires and their results are submitted to EASA. ANSPs should assign a focal point for the purpose of the verification process.

A detailed description of the EoSM indicator and data requirements are available from the EASA Acceptable Means of Compliance and Guidance Material (EASA AMC/GM).

3.2. Classification system

ref. EASA Annex to ED Decision 2014/035/R

  • AMC2 SKPI - State level
  • AMC3 SKPI - ANSP level
  • Appendix 2 to AMC2 SKPI
  • Appendix 2 to AMC3 SKPI

See EASA Acceptable Means of Compliance and Guidance Material (EASA AMC/GM).


Two sets of scores are computed:

  • The overall effectiveness score which takes all answers into account; and,
  • An effectiveness score for each Management Objective.

The scores are computed at two levels:

  • State/competent authority (the NSA) level; and,
  • ANSP level.


A snapshot of the formula is available at Media:Formula_State.jpg
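As a minimal sketch only (per-question numeric scores as in the level mapping sketched in section 3.1, a plain average instead of the actual weighted formula, and illustrative field names), the aggregation of the two sets of scores could look like this:

```python
from collections import defaultdict
from statistics import mean

# Each answer records the Management Objective (MO) the question belongs to and
# a numeric score derived from the selected effectiveness level (illustrative data).
answers = [
    {"mo": "Safety oversight", "score": 4},
    {"mo": "Safety oversight", "score": 3},
    {"mo": "Enforcement policy", "score": 5},
]

def overall_score(answers):
    """Overall effectiveness score over all answers (plain mean in this sketch;
    the actual formula applies the Appendix 2 weightings)."""
    return mean(a["score"] for a in answers)

def per_mo_scores(answers):
    """Effectiveness score per Management Objective."""
    by_mo = defaultdict(list)
    for a in answers:
        by_mo[a["mo"]].append(a["score"])
    return {mo: mean(scores) for mo, scores in by_mo.items()}

# The same aggregation is run once per State/competent authority questionnaire
# and once per ANSP questionnaire.
print(overall_score(answers))   # 4
print(per_mo_scores(answers))   # {'Safety oversight': 3.5, 'Enforcement policy': 5}
```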

3.3. Sector coverage

The measures pertain to the Air Transport and Air Traffic Management Sector of the economy.

3.4. Statistical concepts and definitions

ref. EASA Annex to ED Decision 2014/035/R

A Management Objective (MO) has been derived for each of the elements of the ICAO State Safety Programme (SSP) and Safety Management System (SMS) as described in ICAO Document 9859 ‘Safety Management Manual’.

For each Management Objective, a question (or questions) has been derived and the levels of effectiveness have been described.

For both State and ANSP levels, EASA and the PRB will monitor the performance regarding this indicator based on the received answers and on the results of the verification process by the States and by EASA.

The questionnaires’ sole intent is to monitor the performance (effectiveness) of Member States/competent authorities and ANSPs regarding ATM/ANS safety management.
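Purely as an illustration of the concepts above (the class and field names are assumptions, not part of the AMC/GM material), a questionnaire answer can be thought of as a small record tying a question to its MO, the selected level and the supporting evidence:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class QuestionResponse:
    """One answered question of an EoSM questionnaire (illustrative model only)."""
    management_objective: str                          # the MO the question was derived from
    question_id: str                                   # identifier of the question
    level: str                                         # selected level of implementation, "A" to "E"
    evidence: List[str] = field(default_factory=list)  # justification provided with the answer
    verified: bool = False                             # set once NSA/EASA verification has taken place

# Example record as a State/competent authority or ANSP might submit it.
response = QuestionResponse(
    management_objective="Safety oversight",
    question_id="Q-3.1",
    level="C",
    evidence=["Oversight programme", "Annual audit plan"],
)
print(response.level, response.evidence)
```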

ref. EASA Annex to ED Decision 2014/035/R

  • AMC2 SKPI - State level

The Management Objectives of the SSP framework are grouped into 5 components:

Component 1 State safety policy and objectives
  • State safety legislative framework
  • State safety responsibilities and accountabilities
  • Accident and incident investigation
  • Enforcement policy
  • Management of related interfaces
Component 2 Safety risk management
  • Safety requirements for the air navigation service provider’s SMS
  • Agreement on the service provider’s safety performance
Component 3 Safety assurance
  • Safety oversight
  • Safety data collection, analysis and exchange
  • Safety-data-driven targeting of oversight of areas of greater concern or need
Component 4 Safety promotion
  • Internal training, communication and dissemination of safety information
  • External training, communication and dissemination of safety information
Component 5 Safety culture
  • Establishment and promotion of safety culture
  • Measurement and improvement of safety culture

ref. EASA Annex to ED Decision 2014/035/R

  • AMC3 SKPI - ANSP level

The Management Objectives of the SMS framework are also grouped into 5 components:

Component 1 ANSP safety policy and objectives
  • Management commitment and responsibility
  • Safety accountabilities - Safety responsibilities
  • Appointment of key safety personnel
  • Coordination of emergency response planning/contingency plan
  • SMS documentation
  • Management of related interfaces
Component 2 Safety risk management
  • Safety risk assessment and mitigation
Component 3 Safety assurance
  • Safety performance monitoring and measurement
  • The management of change
  • Continuous improvement of the SMS
  • Occurrence reporting, investigation and improvement
Component 4 Safety promotion
  • Training and education
  • Safety communication
Component 5 Safety culture
  • Establishment and promotion of safety culture
  • Measurement and improvement of safety culture
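For illustration, the grouping above can be held as a simple mapping from component to Management Objectives (shown here for the ANSP/SMS framework; the structure and variable names are assumptions for this sketch), which is what a per-component or per-MO aggregation would iterate over:

```python
# Partial mapping of the SMS framework components (ANSP level) to their
# Management Objectives, transcribed from the listing above.
SMS_COMPONENTS = {
    "ANSP safety policy and objectives": [
        "Management commitment and responsibility",
        "Safety accountabilities - Safety responsibilities",
        "Appointment of key safety personnel",
        "Coordination of emergency response planning/contingency plan",
        "SMS documentation",
        "Management of related interfaces",
    ],
    "Safety risk management": [
        "Safety risk assessment and mitigation",
    ],
    "Safety assurance": [
        "Safety performance monitoring and measurement",
        "The management of change",
        "Continuous improvement of the SMS",
        "Occurrence reporting, investigation and improvement",
    ],
    "Safety promotion": [
        "Training and education",
        "Safety communication",
    ],
    "Safety culture": [
        "Establishment and promotion of safety culture",
        "Measurement and improvement of safety culture",
    ],
}

# Example: find which component a given MO belongs to.
mo_to_component = {mo: comp for comp, mos in SMS_COMPONENTS.items() for mo in mos}
print(mo_to_component["Safety communication"])  # Safety promotion
```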

3.5. Statistical unit

ref. EASA Annex to ED Decision 2014/035/R

  • GM 2 SKPI - General

The statistical units are States and ANSPs.

3.6. Statistical population

The statistical population comprises the States and ANSPs in the Single European Sky Area.

3.7. Reference area

The reference area is the Single European Sky Area.

3.8. Time coverage

2012 is the first year for which data is presented.

3.9. Base period

Not applicable.

4. Unit of measure

Based on the responses, the following scores should be derived:

  • The overall effectiveness score should be derived from the combination of the effectiveness levels selected by the relevant entity (ANSPs or Member State/competent authority) against each question, with the weightings described in Appendix 2 to AMC2 SKPI and Appendix 2 to AMC3 SKPI;
  • An effectiveness score for each Management Objective for the State/competent authority and for each study area for the ANSP.


More information is available at EASA Acceptable means of compliance and guidance material.
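A minimal sketch of such a weighted combination is given below; the per-question weights and the numeric value assigned to each level are placeholders, the actual values being those of Appendix 2 to AMC2 SKPI and Appendix 2 to AMC3 SKPI.

```python
# Placeholder level-to-number mapping and weights, for illustration only.
LEVEL_VALUE = {"A": 1, "B": 2, "C": 3, "D": 4, "E": 5}

def weighted_overall_score(answers):
    """answers: iterable of (selected_level, question_weight) pairs.
    Returns the weighted average of the level values."""
    answers = list(answers)
    total_weight = sum(weight for _, weight in answers)
    weighted_sum = sum(LEVEL_VALUE[level] * weight for level, weight in answers)
    return weighted_sum / total_weight

# Three questions answered at levels C, D and B, with assumed weights 2, 1 and 1.
print(weighted_overall_score([("C", 2.0), ("D", 1.0), ("B", 1.0)]))  # 3.0
```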

5. Reference period

  • The first reference period (RP1) covers the calendar years 2012 to 2014 inclusive.
  • The second reference period (RP2) covers the calendar years 2015 to 2019 inclusive.

Unless decided otherwise, subsequent reference periods shall cover five calendar years each.

6. Institutional Mandate

6.1. Legal acts and other agreements

Provisions are established in:


7. Confidentiality


8. Release policy

8.1. Release calendar

The information is released annually in June of the following year.

8.2. Release calendar access

Not applicable.

8.3. User access

Information is disseminated to the general public via the EUROCONTROL website.

9. Frequency of dissemination

Information is disseminated on a yearly basis as soon as the data is verified.

10. Dissemination format

The information is available on the EUROCONTROL website.

11. Accessibility of documentation

Documentation on methodology is available at EASA Acceptable means of compliance and guidance material.

12. Quality Management

The legal framework for quality management is established by EU legislation.

The national competent authorities (i.e. the NSAs) assess the quality of the questionnaires completed by the ANSPs. There are at least two types of NSA arrangement for verifying the questionnaires: either a specific organisation is set up, or the existing oversight organisation is used to verify the questionnaire. The uniformity of NSA verification output across Europe is maintained through EASA verification.

EASA verifies both the NSA and the ANSP questionnaires by means of standardisation inspections, specifically:

  • Light verification through desktop audits;
  • Thorough verification during Standardisation inspections using Checklists and based on Findings and UNCs; and,
  • Trustful verification through desktop audits based on reports on closure of Corrective actions, further verified during follow-up inspections.

The PRB is involved in some EASA verifications through temporary participation of PRU staff.
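As a small illustrative summary of the three verification modes listed above (labels transcribed from the list; the enum itself is not part of the AMC/GM material):

```python
from enum import Enum

class VerificationMode(Enum):
    """EASA verification of the questionnaires (illustrative encoding of the list above)."""
    LIGHT = "desktop audit"
    THOROUGH = "standardisation inspection using checklists, based on findings and UNCs"
    TRUSTFUL = "desktop audit based on reports on closure of corrective actions"

print(VerificationMode.THOROUGH.value)
```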

12.1. Quality assurance

For light and trustful verification: once the questions (i.e. MOs) subject to verification have been selected through sampling, the first step is to check the consistency between the selected level of effective implementation, the justifications provided and the evidence. In a second step, the level of effective implementation is verified against external information available to EASA and the PRB. Clarification questions may be addressed to the NSA concerned by mail or telephone.

For thorough verification: each MO is linked to a protocol question of the EASA inspection questionnaire. When EASA conducts a standardisation inspection, the level of effective implementation is verified during the on-site audit through the assessment of the protocol questions linked to the MOs.
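A minimal sketch of the two-step check for light and trustful verification described above, assuming hypothetical function names and a simplified representation of an answer (in practice the consistency judgement is made by the verifier, not by code):

```python
import random

def sample_mos_for_verification(mo_list, k, seed=None):
    """Step 0: select the MOs (questions) subject to verification through sampling."""
    return random.Random(seed).sample(mo_list, k)

def consistency_check(answer):
    """Step 1: flag answers where a high level of effective implementation is
    claimed without any supporting evidence (a crude proxy for the real review)."""
    claims_high_level = answer["level"] in ("D", "E")
    return bool(answer["evidence"]) or not claims_high_level

# Illustrative answers; step 2 (cross-checking against external information
# available to EASA and the PRB) is not modelled here.
answers = {
    "Safety oversight":   {"level": "D", "evidence": ["Audit reports"]},
    "Enforcement policy": {"level": "E", "evidence": []},  # would trigger a clarification question
}
for mo in sample_mos_for_verification(list(answers), k=2, seed=1):
    status = "consistent" if consistency_check(answers[mo]) else "needs clarification"
    print(mo, "->", status)
```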

See also: EASA verification processes; Belgium example of how the verification process of the ANSP questionnaires is conducted.

12.2. Quality assessment

Standard quality criteria to be defined.

13. Relevance

The information is published for performance monitoring purposes in accordance with the relevant EU legislation.

13.3. Completeness

The data is collected for the entire Single European Sky Area.

14. Accuracy and reliability

14.1. Overall accuracy

The accuracy depends on the ability of competent authorities and ANSPs to correctly understand the various questions (Management Objectives) and the different levels of effective implementation (five levels, from A to E). Sufficient means have been put in place to control accuracy.

14.2. Sampling error

There is no sampling and therefore no sampling error.

15. Timeliness and punctuality

15.1. Timeliness

The EoSM refers to a status of the indicator measured between nine months and one year before its publication in the PRB annual report. When published in the online performance sheet, the data refer to a status measured less than nine months before publication.

15.2. Punctuality

The data is provided by the States/ANSPs before January of the following year and is displayed to the public in June the following year.

16. Comparability

Data are comparable, being measured in the same way across European Member States. However, benchmarking is not permitted in the safety domain.

16.1. Comparability — geographical

Geographical comparability is valid.

16.2. Comparability over time

Comparability over time is valid.

17. Coherence

17.1. Coherence — cross domain

It can be expected that:

  • The NSA scores have a high correlation with the Lack of Effective Implementation indicator published in ICAO iSTARS [1]; and,
  • The ANSP scores have a high correlation with the CANSO maturity survey [2].

17.2. Coherence — internal

The data is assumed to be internally coherent as provided.

18. Cost and burden

Not available.

19. Data revision

Data could change over time for two reasons:

  • to reflect a change in managing safety within an organisation (i.e. the competent authority or the ANSP changed the data); and,
  • to reflect an improved verification (e.g. the questionnaire was first verified with a "light verification" and later with a "thorough verification").

20. Statistical processing

20.1. Source data

Questionnaires completed by States/competent authorities and ANSPs (see also Statistical concepts and definitions).

20.2. Frequency of data collection

After the first completion, ANSPs and competent authorities are requested to update the questionnaires at least once a year, during the period between June of the year in progress and January of the following year.

20.3. Data collection

The State and ANSP questionnaires are collected by EASA through a web application.

20.4. Data validation

The competent authorities are responsible for verification of the questionnaires completed by ANSPs. They should report results of the verification to EASA through the web-interface.

Once the competent authorities have completed their own questionnaires and verified the ANSPs' questionnaires, EASA performs its own verification.

EASA verifies the updated questionnaires by March of the following year at the latest. After the EASA verification, the data is sent to the PRB, also by March of the following year at the latest.

The verification process may conclude that some questionnaires do not achieve an acceptable standard. In this case the data are not published.

(see also Quality Management)

20.5. Data compilation

The EASA web-interface integrates an algorithm which computes the scores immediately after the completion of the questionnaires. After verification and analysis by EASA, the following data and analysis are provided to the PRB:

  • the computed scores and the verification results for the questionnaires completed by each ANSP and each competent authority,
  • the analysis of scores taking into account the verification results.

20.6. Adjustment

Not applicable.

21. Comment

No comment

Disclaimer

This data is published by the Performance Review Body (PRB) of the Single European Sky. Every effort has been made to ensure that the information and analysis contained on this website are as accurate and complete as possible. Despite these precautions, should you find any errors or inconsistencies we would be grateful if you could please bring them to the Performance Review Unit’s attention.

The information may be copied in whole or in part providing that the copyright notice and disclaimer are included. The information may not be modified without prior written permission from the PRB. The views expressed herein do not necessarily reflect the official views or policy of EUROCONTROL or of the European Commission, which make no warranty, either implied or express, for the information contained on this website, neither do they assume any legal liability or responsibility for the accuracy, completeness or usefulness of this information. The PRB reserves the right to change or amend the information provided at any time and without prior notice.

Information contained in this website is checked and updated with due diligence on a regular basis. This notwithstanding, data may become subject to changes during the intervening periods. Note that the content of this page and the information provided might change once the actual monitoring cycle starts and the first safety data is received.

The PRB does not assume any liability or guarantee for the accuracy and completeness of the information provided on the linked third-party websites. The PRB does not control the linked sites and is not responsible for the contents of any linked site or any link in a linked site, or any changes or updates to such links. The PRB provides these links to users of this website as a convenience alone.

Information provided on this page should not replace EASA Acceptable Means of Compliance and Guidance Material documents.
