Evaluation of the Airports Capital Assistance Program (ACAP)


March 2015


Executive Summary

An evaluation of the Airports Capital Assistance Program (ACAP) was conducted in 2013-2014. The evaluation assessed relevance and performance for 2010-2011 through 2013-2014. Where necessary, the evaluation drew on information collected since the beginning of the Program in 1995-1996.

Relevance

Evaluation findings demonstrate that the Program remains relevant.

ACAP was launched in association with the devolution of airports to non-federal, largely public owners in 1995. The evaluation found that the Program addresses a continuing need for safety-related infrastructure funding among small to medium-sized airports. However, between 2010-2011 and 2013-2014, the Program was under-subscribed and did not disburse 45% of its budget. This was explained by uncertainty regarding renewal of the Program, delays in processing applications, and decisions not to attempt to re-profile funds to subsequent years. Evaluators noted that between 2000 and 2009, TC obtained authorization to re-profile funds, and as a result the Program awarded effectively its entire budget over those ten years. In addition, over the life of the Program, recipients have left roughly 10% of the amounts awarded on all projects unspent.

Performance

Evaluation findings confirm that the program is largely meeting its objectives. No ACAP-eligible airport has lost its certification due to asset condition since 2009. Projects addressing the Program's highest priority continue to receive the bulk of the funding (i.e. 86% between 2009 and 2014). Analysis of funded projects indicated that the forecasted extension in asset life due to funding was realized. Case studies conducted at airports across Canada demonstrated how safety was improved on the ground as a result of the project work. These findings lead to the conclusion that the Program has contributed to the safe condition of the infrastructure at funded airports. Supporting this conclusion is the fact that fewer than 3% of all accidents at ACAP-eligible airports (reported from 1995 to 2013) included any mention of the infrastructure (e.g. snow, an animal or a bump on the runway) in relation to the event. The cost of the Program was estimated at $2.00 per passenger when the future benefit of the capital improvements was considered. Management is currently implementing changes in Program administration to reduce overhead costs, which represented 7.0% of contribution funding in the present term and 6.5% historically (i.e. from 1995 to 2013).

The one negative performance finding concerned systematic errors in the Program's recording of the total O&M savings to clients.

Recommendations

Evaluation recommends improvements to the management of the Program: implementing a formal mechanism through which airport operators can express their near- and medium-term requirements for project funding (such as a three-year rolling plan based on a call for letters of intent to apply), revising application guidance material, and updating the Program's performance measurement strategy.

Introduction

An evaluation of the Airports Capital Assistance Program (ACAP) was conducted by the Evaluation & Advisory Services (EAS) of Transport Canada (TC) in accordance with Section 42.1 of the Financial Administration Act and in response to commitments made to Treasury Board upon program renewal in 2009-2010.

Program Profile

The Airports Capital Assistance Program (ACAP) was created in conjunction with the National Airport Policy in 1994. ACAP is a contribution program that is funded from the department's existing reference levels. Currently, ACAP provides up to $38 million per year to successful applicant airports.

ACAP's purpose is to assist airports in financing safety-related capital projects. Eligible airports are those outside of the National Airport System (NAS) that are not owned by the federal government; provide year-round, regularly scheduled commercial passenger service Footnote 1; and meet the requirements for certification under the Canadian Aviation Regulations (Part III, Subpart 2). The number of eligible airports has fluctuated modestly over the years. When the evaluation commenced, 194 airports were eligible for ACAP funding.

For most of the evaluation period, ACAP was located under the Aviation Safety Program within the departmental Program Alignment Architecture and was expected to contribute to the strategic outcome – a safe transportation system. This evaluation will focus on measuring the contribution of the program to the safety of the Aviation sector.

Delivery

ACAP is managed by the Authorities Stewardship branch of the Air and Marine Programs directorate, Programs Group. Headquarters staff members are responsible for overall program management, policy and funding. Staff members in the five regions (i.e. the Atlantic, Québec, Ontario, Prairie and Northern, and Pacific regions) have been responsible for supporting program design, managing and monitoring contribution agreements and providing support to applicants/recipients up to the present year. A re-organization took place concurrently with this evaluation; as a result, two regions (Prairie and Northern, and Ontario) will be responsible for delivery across the country in future years.

Regional staff members distribute funding to airports following review and approval of project applications submitted by airport operators/owners. As of 2010, eligible projects were assessed according to three funding priorities. Projects assigned the highest priority level (Priority 1) are airside safety projects such as runway or apron rehabilitation. Priority 2 projects are airside safety mobile equipment projects such as the purchase of snow ploughs and sweepers. Priority 3 projects are safety-based groundside projects such as the removal of asbestos insulation from terminal buildings. Airports are expected to share in the cost of the project. The amount of the airport's contribution depends on the number of enplaned and deplaned passengers per year over the three years preceding the application. The table below presents the project priorities and the airport contribution formula.

Table 1: ACAP Project Priorities and Cost Contributions
Priority Description
Priority 1 Airside safety projects
Priority 2 Airside safety-related mobile equipment*
Priority 3 Air terminal/groundside safety-related projects

ACAP's Cost Contribution (by annual passenger volume)
Fewer than 50,000 passengers 100%
50,000 to 74,999 95%
75,000 and above 5% less for each additional 25,000 passengers**

* Exceptions: The maximum contribution amount available for airside mobile equipment shelters is 50%, while 100% of the costs associated with Emergency Response Services regulatory requirements, such as fire truck shelters, are funded.   

** Exception: For airports north of the 60th parallel, ACAP contributes no less than 85% to the project.
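
Read as a sliding scale, the contribution formula in Table 1 can be summarized in a few lines of code. The sketch below (in Python, for illustration only) is one plausible interpretation of the published scale; the exact bracket boundaries at and above 75,000 passengers are an assumption, as the source material does not spell them out, and the equipment-shelter exceptions are ignored.

```python
def acap_share(passengers: int, north_of_60: bool = False) -> float:
    """Illustrative reading of the Table 1 sliding scale.

    `passengers` is the annual enplaned/deplaned count, averaged over
    the three years preceding the application. This is a sketch of the
    published scale, not an official formula.
    """
    if passengers < 50_000:
        share = 1.00                      # fewer than 50,000 passengers: 100%
    elif passengers < 75_000:
        share = 0.95                      # 50,000 to 74,999: 95%
    else:
        # 75,000 and above: 5% less for each additional 25,000 passengers
        share = 0.90 - 0.05 * ((passengers - 75_000) // 25_000)
    if north_of_60:
        share = max(share, 0.85)          # northern airports: no less than 85%
    return max(share, 0.0)

print(acap_share(40_000))         # 1.0
print(acap_share(60_000))         # 0.95
print(acap_share(130_000))        # 0.8
print(acap_share(130_000, True))  # 0.85
```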

ACAP Logic Model

According to the 2009 Results-based Management and Accountability Framework, ACAP's immediate outcomes are the completion of capital projects at ACAP-eligible airports; the protection of assets and extension of their useful life at these airports; and reduced operating costs at these airports.

Through achievement of these near-term outcomes, the program is expected to attain its intermediate outcomes – that safety levels are maintained at ACAP-eligible airports and that these airports meet safety standards required for continued operation. Predicated on the achievement of these outcomes, the program is expected to contribute to its ultimate objective – a safe civil aviation system.

Evaluation Scope

In accordance with Treasury Board Secretariat's Policy on Evaluation, this evaluation assessed program relevance and performance. The current term of the program (i.e. from 2010-2011 through to 2014-2015) was the main period under study, although Evaluation drew upon information from prior terms to assess results and to use as a non-reactive measure of demand. Planning, data collection, analysis and reporting to senior management occurred between September 2013 and September 2014.

Details of the evaluation methodology can be found at the end of this report.

Findings on Relevance

To assess the relevance of the ACAP, the evaluation examined its policy rationale and the extent to which it addresses a demonstrable need as well as alignment with government priorities, departmental strategic outcomes and federal roles and responsibilities.

Finding 1. The rationale for ACAP has remained the same since its inception: small to medium airports continue to need capital assistance to fund necessary safety-related infrastructure work. However, the extent of that need is not well known.

ACAP provides funds to non-federally owned airports for capital projects that are necessary to keep the infrastructure of those airports in safe condition for use. The program was designed in 1994-1995 when Transport Canada undertook to devolve the airports it owned to other levels of government, airport authorities or private interests. The federal government created the Airports Capital Assistance Program (ACAP) to provide financing for safety-related airside capital projects. No end date was stipulated for ACAP at its inception, nor has one been set since. The financial argument for the program was that such capital support was necessary because activity levels at the smaller airports (i.e. those typically covered by ACAP) would likely not allow them to cover both their operating and capital costs. Senior management interviewed for the evaluation asserted that the rationale for ACAP remains essentially unchanged today. A series of studies and independent assessments reported in the 2009 evaluation confirmed this rationale.

However, as noted in previous evaluations, the extent of this need for capital is not well known. The 2009 evaluation recommended that Program management develop estimates of capital needs and a five-year forecast of capital needs in all regions. Interviews with staff suggest that a plan is kept in one region and that a second region keeps a running list of anticipated projects at local airports. All regions indicated that at least one staff member has in-depth knowledge of the state of the clients' airports and their need for ACAP assistance. As well, program staff attempt to anticipate need based on a review of the capital plans received from airports when they apply for funding. The currency of this information diminishes over time, however, and as such the Program manages need based on demand expressed by airports at the time of application. Nevertheless, the amount of time required for the Program to assess applications, and for recipients to plan for and implement projects, calls for some way of identifying upcoming need as early as possible. The need for this intelligence is becoming even greater as reductions in regional ACAP staff promise to reduce the time available for outreach and pre-consultations with airports, and such intelligence can only be had from the airports themselves. For this reason, it is suggested that the Program consider developing a three-year rolling plan for project funding based on expressions of interest provided by operators through an on-line, non-binding letter of intent (LOI) that airports would use to flag near- and medium-term needs. The call for expressions of interest would be issued to all eligible airports in a given year. The vehicle is meant to replace informal discussions with airports as a first step, so that all airports can be sure to express their needs, and do so earlier and in a manner acceptable to owners.

Recommendation: It is recommended that the Program develop a formal mechanism requiring Operators to express their anticipated near- and medium-term requirements for project funding.

Finding 2. From 2010-2011 to 2013-2014, the Program was undersubscribed and did not disburse approximately half (45%) of its allocation. This is explained mostly by the uncertainty regarding the program renewal, delays in processing applications and the inability to re-assign unspent budget to future years. Based on applications received in early 2013, however, it appears that demand is sufficient to consume future annual budgets into 2016-2017.

TC budgets $38 million annually to spend on contribution funding for ACAP. In the first four years of the program's current term, approximately half of the total amount allocated to the Program ($69.3M or 45%) was not transferred to eligible recipients. During roughly that same period Footnote 2, agreements were signed in the amount of $79M, or $76M when adjusted for expected payments pertaining to later years, representing 68% of available funds for these years.

Table 2: Funds disbursed and variance from 2010-2011 to 2013-2014
Year Budget ($000) Disbursed ($000) Variance ($000; %)
2013-2014 38,000 12,870 25,130 (66%)
2012-2013 38,000 29,823 8,177 (22%)
2011-2012 38,000 30,403 7,597 (20%)
2010-2011 38,000 9,616 28,384 (75%)

However, there seems to be sufficient demand to consume the annual budget into 2016-2017. There are previous commitments recorded in the ACAP database of $7.9M to be added to the $74.4M in new applications received in the spring and summer of 2014. Assuming that contribution agreements are signed to satisfy all these requests, the entire budget could be allocated in 2014-2015 and 2015-2016 with $6.3M remaining to be spent in 2016-2017. Interviews with operators confirmed impressions that demand for ACAP in 2014-15 and 2015-16 could be robust.
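
In round figures: $7.9M in prior commitments plus $74.4M in new applications totals $82.3M. Set against two annual budgets of $38M ($76M for 2014-2015 and 2015-2016 combined), this leaves $82.3M - $76M = $6.3M to be spent in 2016-2017.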

Three reasons were suggested by interviewees to explain why the program did not use approximately half of its allocation in the first four years of the current term:

  • Both operators and program staff interviewed for this evaluation commented on the uncertainty about the future of the program that prevailed in 2010, which made some clients reluctant to apply and caused some staff to hesitate to encourage new applications. At the same time, the program was said to have cleared off the backlog of projects at the end of the prior term (2009), so there were no outstanding applications to respond to.
  • Staff and operators noted that the time for review, approval and other aspects of the process related to project tendering had increased in the present term as compared to the prior term. When too protracted, this can cause operators to delay project work, possibly causing planned payments to be missed as a result. Operators interviewed were concerned that projects being awarded for 2014-2015 would not start as planned, given the length of the review process in 2013-2014. Analysis of the project activity for the term (up to data extraction in October 2013) reveals that although a large number of new agreements were put in place, funds were slow to flow. Relatively little was paid out in the first year of the term and funds unspent in any of these years lapsed.
Table 3: Payments made against agreements signed in the present term (2010-2011 to 2013-2014)
Year of signature Agreements signed (n) Paid in 2010-2011 ($000) Paid in 2011-2012 ($000) Paid in 2012-2013 ($000) Paid in 2013-2014 ($000)
2010-2011 49 7,180 25,921 952 81
2011-2012 29 – 4,293 14,965 6
2012-2013 63 – – 13,906 10,617
2013-2014 21* – – – 2,166
Paid against all Term 4 agreements – 7,180 30,214 29,823 12,870
Paid against all agreements – 9,616 30,403 29,823 12,870
Paid against all agreements from prior terms – 2,436 189 0 0

* Includes seven projects signed after extract taken from ACAP database.

  • Since 2010-2011, the department has experienced difficulties when seeking to re-profile unspent budget to future years. This is seen by program staff and managers as the main factor explaining the lapsing of funds allocated to the Program. To verify that explanation, evaluators reviewed the spending profiles of previous terms, when such re-profiling occurred. Between 2000-2001 and 2009-2010, the totality of the budget ($380M) was committed and the vast majority of it (93%) was spent. A review of the total expected contribution (TEC) amounts recorded for projects signed in these ten years reveals that staff over-committed funds in five of the years, anticipating that some projects would not be undertaken. This, coupled with care in managing over-commitments, held slippage from all causes to 7% over these ten years.
Table 4: Spending against budget for Term 2 and Term 3 (2000-2001 to 2009-2010)
Agreements signed in Disbursements made in Term 2 and Term 3 ($000)
Term 1 18,244
Term 2 168,754
Term 3* 165,192
Total spending for period 352,190
Program budget for period 380,000
Budget less disbursements 27,810
As percent of period budget 7.3%

* Net of payments to be made in Term 4 in amount of $2,625K.

Finding 3. In the present term, recipients did not use approximately 12% of the money awarded to them. Between 1995-1996 and 2012-2013, recipients did not use, on average, 11% of the funding they were awarded. Under-spending on multi-year airside construction projects accounted for almost two thirds of the amount not used by recipients.

In the present term, recipients did not use approximately 12% of the money awarded to them for the projects for which Evaluation had full information. It was suggested that the commitment amount indicated in the program database may not reflect the amount awarded to clients in the contribution agreement, and that differences between the amounts awarded and committed led to this finding of slippage. However, Evaluation took several steps to validate the contents of the ACAP database, including comparison of critical fields against paper documents received for 32 projects at 11 airports. Comparison of the total expected contribution value in the database against the award amount on the contribution agreement revealed that the figures were identical for 29 projects and minimally different in the other two cases (i.e. differences of 0.02% and 2% respectively). This was considered a high degree of accuracy. It is therefore concluded that clients genuinely did not spend a significant portion of the funds awarded to them in the present term.

Table 5: Spending against the amount awarded under the agreement (TEC) for Term 4 Footnote 3
Projects Awarded in: Total Spending ($000) Adjusted TEC ($000) TEC – Spending ($000) Unspent (as % of TEC)
Term 3 2,625 2,811* 185 6.6
Term 4 69,443 78,999 9,556 12.1
TOTAL   81,810 9,741 11.9
Term 4 incomplete information 10,644** na na na
TOTAL 82,712      

* Remaining progress payments indicated in database.
** Includes an initially rejected project in amount of $201K.

Analysis of the ACAP database for the years for which full project information was available (i.e. from 1995-1996 to 2012-2013) indicates that, on average, clients did not use 10% to 11% of the amount awarded to them. The client spending analysis took the difference between the adjusted expected contribution / award amount and actual disbursements to airports for the 722 Footnote 4 projects funded. Clients did not use $60.5M of the $563.5M awarded to them over these years. Nothing indicates that under-spending was more or less pronounced in certain years or terms.

Table 6: Amounts awarded, funds spent and not spent according to project category
Category Funds Awarded ($000) Disbursements ($000) Funds not spent ($000) Funds not spent (% of award)
Airfield Lighting 52,515 44,664 7,852 15.0
Apron Rehabilitation 27,122 23,986 3,136 11.6
Airside Fire Truck (CARS 303) 15,863 12,950 2,913 18.4
Field Electrical System 11,173 10,198 976 8.7
Mobile Equipment 81,533 73,510 8,023 9.8
Runway Rehabilitation 301,039 270,644 30,395 10.1
Sand Storage 4,603 4,090 513 11.1
SNOWTAM 164 124 40 24.3
Taxiway Rehabilitation 52,552 49,080 3,472 6.6
Wildlife Fence 11,805 9,100 2,705 22.9
Various 5,130 4,683 447 8.7
Total 563,500 503,029 60,471 10.7

Operators indicated that coming in under budget was not uncommon. For those paying a percentage of the project cost, it was desirable. A review of the close-out documentation obtained for case study projects uncovered material savings on two airside construction projects and on the purchase of a fire truck to comply with the new Aircraft Rescue and Firefighting requirement (CAR 303). Analysis of the construction projects funded over the history of the program reveals that collectively apron projects came in under the contribution amount by 12%, taxiway projects came in under budget by 7% and runways by 10%. Since spending on these projects amounted to 68% of total disbursements (at $343.7M), the associated under-spending explains roughly two-thirds of the $60M not disbursed to recipients. Given that these projects typically involve multiple payments, closer monitoring of disbursements against forecasted contributions might enable the program to anticipate and manage under-spending on such projects.
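
As an illustration of what such closer monitoring could look like, the sketch below flags projects whose disbursements lag the forecast cash flow. It is a minimal example only: the data layout, field names and threshold are assumptions for illustration, not features of the ACAP database.

```python
from dataclasses import dataclass

@dataclass
class Project:
    """Hypothetical per-project cash-flow roll-up (figures in $000)."""
    name: str
    forecast_to_date: float  # contributions expected to be claimed by now
    paid_to_date: float      # disbursements actually made to date

def flag_underspending(projects, threshold=0.10):
    """Return projects whose payments lag forecast by more than `threshold`."""
    flagged = []
    for p in projects:
        if p.forecast_to_date <= 0:
            continue
        lag = 1 - p.paid_to_date / p.forecast_to_date
        if lag > threshold:
            flagged.append((p.name, round(100 * lag, 1)))
    return flagged

# Invented figures for illustration:
sample = [
    Project("Runway rehabilitation, Airport A", 4_200, 3_300),
    Project("Sweeper purchase, Airport B", 450, 440),
]
print(flag_underspending(sample))  # [('Runway rehabilitation, Airport A', 21.4)]
```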

Table 7: Amounts awarded, funds spent and not spent according to project category, with each category's share of total unspent funds
Category Funds Awarded ($000) Disbursements ($000) Funds not spent ($000) Share of total funds not spent (%)
Airfield Lighting 52,515 44,664 7,852 13.0
Apron Rehabilitation 27,122 23,986 3,136 5.2
Airside fire truck (CARS 303) 15,863 12,950 2,913 4.8
Field electrical system 11,173 10,198 976 1.6
Mobile Equipment 81,533 73,510 8,023 13.3
Runway Rehabilitation 301,039 270,644 30,395 50.3
Sand Storage 4,603 4,090 513 0.8
SNOWTAM 164 124 40 0.1
Taxiway Rehabilitation 52,552 49,080 3,472 5.7
Wildlife Fence 11,805 9,100 2,705 4.5
Various 5,130 4,683 447 0.7
Total 563,500 503,029 60,471 100.0

Finding 4. There is a potentially unmet demand among ACAP-eligible non-recipients, but the magnitude of that demand is unknown.

From 2010-2011 through 2012-2013, ACAP funded 57 airports or 29% of the airports eligible for ACAP. They were located across the regions thus: 9% in Atlantic, 33% in Ontario, 21% in Pacific, 26% in PNR and 10% in Quebec Footnote 5.

Over the life of the program, approximately twenty percent Footnote 6 of airports eligible for ACAP have not received funding. Over half of these (23) are found in the North and one third (14) in Ontario. The remainder are located in Pacific (1), Atlantic (1) and Quebec (6). Non-recipients predominantly serve small, Northern or remote communities.

Table 8: Client population for ACAP up to October 2013
Status Number of Airports
Airports that have received funds 169*
Airports approved for funding that have not yet received funds 5
Airports with applications under review (not yet funded) as of October 2012 3
Total applicants 177
Eligible airports as of January 2012 that have never applied 45
Total known population as of January 2012 222

* Includes 26 airports that have received ACAP funding but were ineligible in January 2012.

Table 9: Non-recipient airports by region
Region Airports
Atlantic Natuashish
Ontario Bearskin Lake, Deer Lake, Fort Albany, Fort Hope, Fort Severn, Muskrat Dam, Ogoki Post, Peawanuck, Pickle Lake, Pikangikum, Poplar Hill, Round Lake (Weagamow), Sandy Lake, Summer Beaver
Pacific Gillies Bay / Texada Island
Prairie and Northern Déline, Fort Good Hope, Fort Simpson, Fort Smith, Hay River, Holman Island, Lutselk'e, Paulatuk, Rae Lakes (Gamètì), Snare Lake (Wekweeti), Tuktoyaktuk, Wha Ti (Lac La Martre), Cambridge Bay, Clyde River, Hall Beach, Igloolik, Kimmirut (Lake Harbour), Nanisivik, Pangnirtung, Pond Inlet, Repulse Bay, Resolute Bay
Quebec Bagotville, Bonaventure, Îles-aux-Grues, La Tabatière, Montmagny, Umiujuaq

Regional staff members were asked why certain eligible airports in their region had not applied for funding. Representatives of Ontario Region indicated they were aware of several airports owned by the province that had not required assistance to date but that were expected to apply collectively in the near future. This was confirmed in a subsequent interview with the operator for these airports. Representatives of the Prairie and Northern Region office did not have insights into specific airports but suggested that the territories tend to decide which airports will apply for funding under the various programs available, and when. Case study discussions and operator interviews confirmed this observation. There were relatively few non-recipients in the other regions.

There appears to be a high level of awareness among non-participants, but in a few cases their knowledge may be insufficient to allow them to apply or engage with staff with ease. Evaluation sampled five non-recipient airports for the operator interviews. The five airports were represented by four individuals, as operators can manage several airports. (One of the operators oversaw 28 airports and a second oversaw five. So, while the number of individuals contacted is small, the number of airports covered in these discussions is relatively large.) Evaluators found that awareness of ACAP among non-recipients was high, with only one interviewee indicating that s/he had no knowledge of the program. Two other operators were aware of the program but had no direct experience with it. Three of the operators said they would like additional information on ACAP when this offer was made, and observed that it sounded like a program they could use. ACAP staff told evaluators that the approach to outreach would be changing: it would be done exclusively via the website or through national/regional councils or associations. Case study discussions with operators of large or medium-sized airports suggest that this would suffice for airports such as theirs, where managers tend to be active in the airport associations. Operators of remote or small airports said that their time for activities outside of airport operations was very limited. As a result, operators of the smallest, and sometimes remote, airports may be least able to attend council or association events or scan association notices for ACAP information.

Case study operators argued that although the need is significant they are strategic when applying to the program, because they realize that program funds are limited or because they believe regional quotas are in place. Some indicated they had held back on applications for certain projects to ensure that they would be considered for others. Some operators informed us that they had simply gone ahead with an equipment purchase, because their need was immediate and they believed that their application would fail. These operators did note that they would not be able to self-finance major airside projects.

Finding 5. ACAP recipients and eligible non-recipients have benefited from other federal capital investment programs. At the time of the evaluation, however, there appeared to be few federal alternatives to ACAP and access to regional funding for airports varies across the country.

Interviews and documentary sources suggested that there has been other public funding for airports in the past decade. Documents received from departmental sources revealed that Canadian airports received $341M through various Infrastructure Canada programs between 2001 and 2012. From project-level information available for these programs, Evaluation determined that 61% ($209.1M) of the funding went to 32 airports that are or have been eligible for ACAP. This amount is the equivalent of half of ACAP's budget for the same 11 years.

Table 10: Other federal programs that provided funding to airports in 2001 through 2012
Infrastructure Canada Programs with various cost-sharing arrangements (2001 to 2012) Funding* provided to all airports ($M) Funding* provided to ACAP-funded and presently eligible airports ($M) Ongoing (as of 2014-2015)
Building Canada Fund-Communities Component 36.3 24.5 Yes
Canada Strategic Infrastructure Fund 15.5 15.5 Yes
Infrastructure Canada Program 12.3 2.8 No
Infrastructure Stimulus Fund 115.6 59.4 No
Municipal Rural Infrastructure Fund 8.3 4.9 No
Total 188.1 107.2  
Federal Gas Tax Fund (2005-2011) 1.1 0.3 Yes
Provincial-Territorial Base Funding Initiative 152.2 101.6 Yes
Grand Total 341.4 209.1  
Spending on airside (runway, apron, taxiway)   117.4
Spending on other than airside   91.7

* Includes all funding from the federal, provincial, municipal governments and other contributing parties.

Over half of the funding ($117M) provided through these programs supported airside projects at 12 ACAP recipient and/or eligible airports. Nine ACAP recipients received $94M for projects ranging from $1M to $27M. The projects have been referred to as expansionary by interviewees and include airside widening, upgrades, extensions, rehabilitation and the construction of a new asset. Of the ACAP airports that benefited from these programs, one had not received support from ACAP for airside work.

In addition, Canadian airports received $62.8M from the Regional Economic Development Authorities (REDAs) between 2002-2003 and 2010-2011. How much capital funding was provided to ACAP-eligible airports through the regional REDAs could not be determined from the material provided.

There appear to be fewer sources of funding for airports in 2014-2015 than in previous years. Interviewees alluded to more funding opportunities in certain regions of the country than in others. Eleven operators said they had local alternatives to ACAP; two of these said they intended to apply to local programs. It was generally held that these other programs supported projects ineligible for ACAP. Evaluators conducted an on-line search for other funding programs that might accommodate ACAP-eligible airports. Twenty-two programs were uncovered that provide support for infrastructure. Of the nine federal programs that appear to be ongoing, three might fund ACAP airports: the Capital Facilities and Maintenance Program (Aboriginal Affairs and Northern Development Canada, for an airport on reserve), the Gas Tax Fund (Infrastructure) and the Public-Private Partnership Fund (Infrastructure). The regional programs found that did support airports tended to exclude ACAP-eligible airports, save possibly the Northern Ontario Heritage Fund Corporation (Ontario Ministry of Northern Development and Mines). However, the search criteria may not have identified broader regional funding programs such as the Regional Economic Development Authorities (REDAs).

Finding 6. ACAP continues to be aligned with the strategic outcomes of the department and with federal priorities and roles and responsibilities.

Airports eligible to receive ACAP funding are owned and operated by others. Ultimately, these airport owners and operators are responsible for maintaining their assets in a condition that is safe for users and that meets federal safety regulations. Transport Canada may choose to assist these owners and operators in becoming informed of and complying with its regulations in any number of ways, including by providing financial aid.

Delivery of ACAP falls within the powers of the Minister of Transport. Under the Aeronautics Act, "The Minister is responsible for the development and regulation of aeronautics and the supervision of all matters connected with aeronautics and, in the discharge of those responsibilities, the Minister may... provide financial and other assistance to persons, governments and organizations in relation to matters pertaining to aeronautics." [R.S., 1985, c. 33 (1st Supp.), s. 1] Over the years, the program has supported the departmental strategic objective pertaining to the safety of Canada's transportation system (S.O. 3.0). ACAP will be found under S.O. 1.0 beginning in 2014-2015. The rationale is that only safe (i.e. certified) facilities are permitted to operate; hence, the program ultimately contributes to ongoing access and so supports an efficient transportation system. ACAP is aligned with the Government of Canada's social outcome – a safe and secure Canada.

Findings on Performance

Program performance is analyzed by assessing the extent to which the program meets its expected outcomes (impact) and the extent to which it is efficient and economical in the resources used to produce its outputs and achieve its outcomes.

Impact

ACAP's logic model implies that the completion of capital projects at selected airports will enable eligible airports to continue to meet the safety standards required for their continued operation. The cumulative impact of these capital projects is their contribution to the safety of the civil aviation system. Other outcomes associated with the program include the protection of assets and extension of their useful life, and the reduction of operating costs at ACAP recipient airports.

The evaluation will first assess the performance of the Program with regard to its safety-related outcomes. Other outcomes will be discussed at the end of this section.

Safety outcomes

In this section, the evaluation will present findings that demonstrate the impact of the program in addressing safety needs of the airports.

Finding 7. Over two-thirds of the projects awarded in Term 4 had been completed by the time of data extraction for this evaluation. All regions were successful in disbursing final payments, although this activity was least evident in Quebec at the time. Consistent with historical precedent, over three-quarters of the funds disbursed on completed projects went to Priority 1 work. The multi-year approach to funding was evident as final payments and cash flow increased in the later years of the term.

From 2010-2011 through to the day of the database extraction Footnote 7, 155 projects were funded. Of these, 107 (69%) are considered to have been completed, with all disbursements made to clients Footnote 8. This includes 48 projects awarded in 2010-2011, 25 awarded in 2011-2012, and 34 awarded in 2012-2013. In addition, eight projects from Term 3 received their final payments in Term 4; Term 3 projects are not discussed further here.

The greatest number of projects awarded and completed within the fourth term (up to October 2013) was recorded in Ontario Region, with the fewest recorded in Quebec (see table below). The Prairie and Northern Region, however, disbursed the most funds, on a range of Priority 1 and Priority 2 projects.

Table 11: Projects funded and completed between 2010-2011 and 2013-2014 by Region
Region Projects (n) Total Spending ($000) Percent of Spending (%)
Atlantic 12 13,036 18.8
Ontario 36 17,671 25.4
Pacific 22 13,924 20.1
Prairie and Northern 23 20,160 29.0
Quebec 14 4,652 6.7
Total 107 69,443 100.0

As can be seen in the table below, Priority 1 projects consumed over three-quarters of disbursements, with runway projects alone representing over half of all disbursements. This is in keeping with historical funding patterns and reflects the relatively high cost of airside rehabilitation versus equipment purchases. Mobile equipment projects were nevertheless numerous, representing over half of the total number of projects.

Project Category Projects (n) Total Spending ($000) Percent of Spending (%)
Priority 1 Projects:
Airfield Lighting 11 3,494 5.0
Apron Rehabilitation 5 4,402 6.3
Fire Truck (CARS 303) 1 687 1.0
Field Electrical 10 4,757 6.8
Runway Rehabilitation 12 44,080 63.5
Sand Storage 2 503 0.7
Signage 1 17 0.0
Wildlife Fence 4 1,500 2.2
Total Priority 1 46 59,440 85.6
Priority 2 Projects:
Mobile Equipment 55 9,576 13.8
SNOWTAM 5 124 0.2
Total Priority 2 60 9,700 14.0
Priority 3 Projects:
Asbestos Removal 1 304 0.4
Grand Total 107 69,443 100.0

Finally, the flow of funds on completed projects indicates the use of a multi-year approach, for construction work as well as for mobile equipment (almost a third of equipment projects). Roughly half of the projects were delivered over two or more years. As the term wore on, an increasing number of projects were completed, and the funds disbursed on terminating projects accumulated accordingly.

Finding 8. All ACAP-eligible airports maintained their certification in 2012-2013. The vast majority of airports that were eligible for ACAP in 2008 remained eligible at the beginning of 2012. The few that became ineligible in this period did so for reasons other than loss of certification.

The Program began reporting on the maintenance of certification by ACAP-eligible airports in 2012-2013. According to the Departmental Performance Report, all ACAP-eligible airports maintained their certification in that year.

A comparison of the lists of eligible airports in 2008 and those eligible as of January 2012 revealed that nine airports had become ineligible in the intervening years. According to the Canada Flight Supplement (CFS, valid to April 3, 2014), seven of these were still certified. The eighth airport had closed. The inspector who provided the CFS information indicated that the operator of the ninth airport had elected to de-certify. Passenger volume statistics for these airports show that the majority had insufficient passenger volumes to remain eligible for ACAP support. It is unclear why one airport with volumes exceeding 1,000 became ineligible in this period. Were these passengers predominantly on charter as opposed to scheduled commercial flights, this would account for its ineligibility, but the statistics obtained do not distinguish between charter and scheduled commercial traffic.

Finding 9. The preponderance of funding has gone to the highest priority projects. Processes are in place to ensure that ACAP-funded projects respond to a safety-related need and the proposed restoration or replacement properly addresses the need. The effectiveness of Priority 1 projects is also evident in that they typically respond to deviations from regulatory safety standards.

The first near-term objective of the program is the funding of capital projects, which when completed will maintain or improve safety levels at ACAP-eligible airports.

Towards meeting this objective, the program established funding priority levels according to how critical projects were perceived to be to maintaining safety. Therefore, effectiveness is suggested by the extent to which funding goes towards higher versus lower priority projects.

In the first three years of the present term, 85% of all funds disbursed by ACAP went to Priority 1 projects. That the preponderance of funding was dedicated to Priority 1 projects is consistent across all regions save Quebec, where adoption of the new version of the contribution agreement is requiring more time than elsewhere. In previous terms, however, Quebec's investments in Priority 1 projects have represented approximately 85% of total spending, as have those of the other regions.

Table 12: Proportion of spending on Priority 1 Projects
Region 2010-2011 to 2012-2013: All projects ($M) 2010-2011 to 2012-2013: Priority 1 projects ($M) 2010-2011 to 2012-2013: Priority 1 share (%) 1995-1996 to 2012-2013: All projects ($M) 1995-1996 to 2012-2013: Priority 1 projects ($M) 1995-1996 to 2012-2013: Priority 1 share (%)
Atlantic 13 12 92.3 42 36 85.7
Ontario 20 17 85.0 117 100 85.5
Pacific 14 12 85.7 77 67 87.0
PNR 18 16 88.9 183 162 88.5
Quebec 5 3 60.0 84 71 84.5
Total 70 60 85.7 503 436 86.7

To some extent, this is a function of the relatively greater cost of Priority 1 projects (i.e. airside projects that include the rehabilitation of runways, taxiways and aprons, as well as firefighting vehicles and shelters). On average, a Priority 1 project costs over five times as much as a project at any other priority level.

Alignment of ACAP-funded projects with safety-related issues and needs is verified through various processes. Specifically, these processes are in place to demonstrate that a critical asset is either missing or in unsatisfactory condition; to ensure that the proposed replacement or rehabilitation is an appropriate solution; and to verify post-completion that the purchase or work done was to specifications. For major projects, third party professional engineers typically provide their expert opinion on all or some of the original asset condition, sufficiency of the proposal and quality assurance of the work done.

Further, it appears that Priority 1 projects have been effective in addressing existing safety needs since these projects typically respond to deviations from regulatory safety standards. Interviews with staff and airport operators indicate that this is not the case for Priority 2 projects, as fleet requirements are not stipulated in the regulations. An internal guidance document has been developed for the assessment of equipment projects, but a scan of the website and program application material revealed no details that could assist prospective applicants. A number of operators were frustrated by the lack of clear guidelines for eligible fleet and some criticized the program's equipment funding decisions (e.g. lack of support for back-up vehicles).

In order to demonstrate that a Priority 2 project responds to a safety need, the program and applicants would benefit from guidelines outlining, to the extent possible, the size and composition of the fleet required to serve an airport of a given size and operating environment. The anticipated adoption of streamlined applications for equipment projects also suggests that such guidance would be useful.

Recommendation: It is recommended that the Program explore the extent to which there are gaps in the guidance to Operators regarding the support available for mobile equipment under ACAP (e.g. regarding the limits on the number and type of vehicles), and that it then produce and disseminate guidance material to address those gaps, especially gaps that could impede the streamlining of mobile equipment project applications.

Finding 10. From 1995-1996 to 2012-2013, fewer than 3% of accidents reported at ACAP-eligible airports appear to have been related to the infrastructure at these airports. This suggests that ACAP has contributed to the maintenance of safety levels at airports.

Analysis of accident reports reveals that occurrences at ACAP-funded and presently eligible airports were rarely related to the condition of the infrastructure at the airport.

To test for the possible impact of ACAP on the safety of airports, evaluators obtained detailed information on accidents at ACAP-funded and currently eligible airports through the Civil Aviation Daily Occurrence Reporting System (CADORS) for the history of the program (January 1, 1995 to May 15, 2014). The query returned 6,639 accidents for the period. This compares reasonably well to formal accident statistics from the Transportation Safety Board. The records were then matched to the airports presently eligible for ACAP or that had received funding in the past. A subset of 781 events emerged. Two evaluators independently reviewed the 781 events, seeking those in which the accident might be associated with the state of the infrastructure at these airports. In all, 25 were thought to possibly relate to the state of airport assets. Another evaluator reviewed the 25 events and eliminated two for which there was a reference to assets but no likely cause and effect (e.g. no reference to the condition of the airside surface when a plane overran the runway). A third event was eliminated based on the Transportation Safety Board report which made it clear that there was no causal link to the state of the infrastructure.
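
A first pass of this kind of screening can be automated before manual review. The sketch below illustrates the idea with a keyword filter over occurrence narratives; the keyword list and record format are assumptions for illustration only and do not reproduce the evaluators' actual protocol, which relied on independent review by two evaluators.

```python
import re

# Terms suggestive of an infrastructure link, loosely mirroring the
# categories in Table 13 (assumed list, for illustration only).
INFRA_HINTS = re.compile(
    r"\b(hole|bump|crack|ice|snow|slush|deer|moose|animal|wildlife|fence)\b",
    re.IGNORECASE,
)

def first_pass(narratives):
    """Keep CADORS-style narrative texts that merit a manual look."""
    return [text for text in narratives if INFRA_HINTS.search(text)]

reports = [
    "Aircraft struck a deer on the runway during the landing roll.",
    "Pilot reported a bird strike on climb-out.",
    "Aircraft veered after hitting a bump on the runway surface.",
]
print(first_pass(reports))  # keeps the deer and runway-bump reports
```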

As a result, fewer than 3% of accidents (22 of the 781 events) reported at ACAP-funded or currently eligible airports were found to be possibly related to the state of the infrastructure at the airports. Over the past 18 years, only two accidents (0.3% of the 781 events) were associated with asset degradation; neither occurred between 2010-2011 and 2013-2014. Over half of the accidents were attributed in part to snow, ice or water on airside surfaces. Events involving wildlife accounted for almost 30% of the accidents. Consequences were largely limited to damage to the aircraft (20 accidents). In one case, there was damage to the airport infrastructure. In two cases, minor injuries were reported. No fatalities are associated with these events.

Table 13: Number of accident events at ACAP-eligible airports in which asset condition was implicated from 1995-1996 to 2012-2013
Asset Issue Funded Airports Currently eligible but not funded Total Percent
Animal on runway 5 1 6 27%
Airside surface (bump, hole) 2 – 2 9%
Snow, ice or water on airside surfaces 9* 4** 13 59%
Trespassing – 1 1 5%
Total 16 6 22 100%

* 3 occurred in the present term
** 2 occurred in the present term

That so few accidents have been found to be related to the state of the infrastructure at ACAP-funded or eligible airports suggests that the program may indeed be contributing (along with other funding programs and owners investing in these airports) to the safe operating condition of these airports. Over the 18 years in which the program provided $503M in funding, in response to requests for support to obtain assets that were lacking or to improve the condition of deteriorated assets, reported accidents rarely implicated airport infrastructure. Further, none of the events discovered were repeated. In some cases, ACAP funding was subsequently provided that could be seen to address the issue (e.g. wildlife fencing where incidents involved animals, or snow removal equipment where incidents involved snow or ice on airside surfaces). It is also true that in other cases ACAP had funded assets that could have been expected to address weather or intrusions prior to the event, suggesting that accidents are just that, and not always preventable.

Finding 11. Case studies demonstrated that completed projects have been effective in improving asset condition or the adequacy of airport infrastructure.

Evaluators conducted case studies to probe the impact of ACAP funding on the ground. A heterogeneous set of funded projects and airports was selected in order to represent the diversity of the situations addressed.

The eight airports studied were located in Ontario (3), Atlantic (2), PNR (1), Pacific (1) and Quebec (1). Thirty projects were discussed across these airports consisting of six airside lighting/sign, three fire trucks, two sand storage sheds, two wildlife fences, nine mobile equipment and eight runway/apron/taxiway projects.

The evidence gathered through program staff interviews, operator interviews and document and data review indicated the effectiveness of implemented projects in many respects and, by extension, of program success. Across project categories, it can be said that ACAP-funded projects enabled these recipients to meet safety standards and/or improve safety levels at the airport through:

  • Rectifying non-compliant situations such as meeting the new CAR 303 requirement for enhanced firefighting capacity.
  • Restoring assets in poor condition such that potential risks to users were not realized. Runways were rehabilitated, such that there were no holes or other signs of degradation on visual inspection. Sand sheds were re/constructed, such that the granular and chemicals were dry to the touch and staff attested to improved surface cleaning times or effectiveness. Fences were improved or erected, such that the incidence of trespassing or animal incursions dropped in the period following construction of the barrier. (It is important to note that by taking positive action to tangibly improve asset condition and increase remaining years of useful life, funded parties mitigate real threats to safety and in doing so raise safety levels at the airport.)
  • Purchasing mobile equipment to replace old vehicles, such that interviews and records indicated reduced frequency of equipment down-time and in some cases improved performance.
  • Upgrading lighting and airside signage to comply with current regulations. In some cases the replacement of old technology with new technology also resulted in improved safety levels due to the features of the new products.

The table below provides some key observations of the situation at case study airports prior to project implementation and following completion of projects funded entirely or in part through ACAP.

Table 14: Summary of projects' impact (before and after project work)
Situation prior to project work Situation following project work
Airfield Lighting
Airport #3: Reduced Visibility Operating Plan (RVOP) required by NavCanada due to runway incursions. CADORs reports showing 22 incursions and requirement of CAP for airport. Operator tracking sheet showing incursions of 10, 10, 9, 14 for years 2008 to 2011 respectively. RVOP lifted. No such restriction noted in the Aerodrome Facilities Directory 2014. Operator data showing incursions following project work of 10 in 2012 and 7 in 2013, while traffic volumes have climbed from 97K in 2012 to 107K in 2013.
Airport #5: Airfield lighting did not conform to regulations. CADORs reported 8 events where airfield lighting was unserviceable between 2003 and 2010. Installation of ODALs brought airport up to standards and provided improved safety due to improved technology of new lighting system. Departmental inspectors and operators remarked on benefit to pilots realized with change to ODALS. From product description: "The omni-directional horizontal beam pattern, bright flashes and a sequential strobing flash pattern that rolls towards the runway threshold helps the pilot identify the runway in use." No CADORs reports of lighting failures following project work.
Fire Truck (CARs 303)
Airport #2: ARFF fire truck required to meet new regulatory requirements. Purchase of required fire truck brought airport into compliance with regulations (CARs 303). Test results provided by airport operator demonstrates that the truck met required safety performance standards (in terms of response time, distance travelled, fluid discharged).
Field Electrical
Airport #5: Application documentation includes evidence of 21 repair events throughout 2010. CADORs documents power failures: 2007 (1), 2008 (3), 2010 (1), 2011 (1). Following project completion, there was evidence of routine maintenance (3 events) but no evidence of major repair work being done. Power failures were reported but with less frequency (i.e. 2 in 2012 and 1 in 2013). Note that not all CCR panels were replaced with this project.
Mobile Equipment
Airport #6: With the air temperature hovering around 0˚C, the time to clear ice from the runway using the old spreader extended from 8:06 a.m. to 9:13 a.m. the following day. This led to the closure of the airport over this time, causing 2 flight cancellations and 2 flight delays. Under the same weather conditions (i.e. around 0˚C with icy conditions), the time to clear the ice from the runway with the new spreader was around 1 hour (i.e. from 6:18 a.m. to 7:29 a.m. the same day). Delays were minimal and the airport remained open.
Airport #7: VOHL sweeper experienced frequent breakdowns and extensive repair work, despite being within useful years of life (i.e. 109 repairs required between 2010 and 2013). Major issues were noted in the records, including overheating, clutch replacement, splined shift replacement, emergency shut downs, fan replacement, new D-shaft required, rebuild gear box, engine random shutdown. Following replacement, maintenance on new sweeper consisted of minor service and numbered 22 items in 2013-2014.
Airport #7: Chemical spreader originally in use was an adapted pesticide applicator with a sprayer arm spanning 12 feet. A hand pumped mechanism was employed to empty the 300 gallons it carried. Staff reported increased performance and reduced delays following the acquisition of a manufactured chemical spreader, able to cover a 45 foot swath and carrying 1100 gallons of chemicals (as documented in product material).
Runway Rehabilitation
Airport #3: Deteriorated condition indicated by high crack sealing costs in the years just prior to the rehabilitation (from 2007 to 2011 respectively = 138K, 20K, 20K, 100K, 100K). Improved surface condition of runway was apparent on visual inspection and indicated by lower crack sealing costs for the years following the rehabilitation (from 2012: 68K, 70K, 45K).
Airport #4: Foreign object debris (FOD) was noted as an issue related to the deterioration of pavement in the application made to TC. A CADORS report for 2008 expressly identified the lifting of loose sealant as the source of FOD. Following rehabilitation of the runway, no CADORS report has noted lifting asphalt or sealant for this runway.
Sand Storage
Airport #2: Granular and chemicals stored in an old hangar with limited space and damp conditions. Airport was unable to bulk purchase urea, which exposed the airport to the risk of being unable to obtain more urea later in the winter. Following construction of sand shed, improved storage conditions enabled airport to bulk purchase urea and thus ensure that it would have sufficient supply to last the winter. Invoices provided by the airport showed purchases of 750 kg bags before the construction and 5.25MT to 17MT after the construction. As well, operators report on increased efficiency and reduced delays due to reduced time to load new chemical spreader.
Airport #5: Rotting shingles on the old sand shed let in wind, rain and snow, and birds had taken to roost inside the roof. Before the re-shingling, the AIRMAN reports for the airport indicated 10 bird-sighting events, a safety issue given that the sand shed was located beside the runway. Following re-shingling, no pigeons were sighted on or around the sand shed (again, as demonstrated through an AIRMAN record search). Operators report granular in dry condition and available for use as needed.
Wildlife Fence
Airport #5: In its application for funding for a wildlife fence, the airport cited AIRMAN records of 73 large mammals within the airport boundaries. Following construction of the fence, AIRMAN records indicated 33 animal incursions. Further steps were being taken to prevent deer from slipping through the gates with vehicular traffic.
Airport #8: The airport applied for funding to install new 8 foot fencing around 3 sides of the airport. Before the construction, 44 large mammals and turkeys were sighted inside the perimeter of the old fence, versus 112 sighted outside the perimeter. After construction, the airport manager documented 4 deer and 12 moose within the perimeter of the fence. The airport is applying to install the 4th side to secure the entire perimeter.

Other outcomes

This section presents findings related to other outcomes of the Program.

Protection of assets / increase in useful life at ACAP-eligible airports

Finding 12. Estimates of the extension of useful life of airport assets supported through ACAP funding are valid and reliable.

Between 2010-2011 and 2012-2013, 103 funded projects (96% of those funded in these years) provided an estimate of the "added years of asset life" attributable to the project work. Since 2004-2005, operators seeking funds have provided such estimates for 328 projects (88% of those funded since that time). Funded projects, however, were often multi-purpose: when different categories of assets were handled within the same project, different life expectancies were given, yet the database records just one estimate. This section therefore focuses on projects where a single purpose is noted in the project title.

The table below shows the range of "added years of life" by major project category for the sample (119) of single-purpose projects funded between 2004-2005 and 2012-2013. Regions with the highest (H) and lowest (L) estimates per project category are shown where there are differences to remark.

Table 15: Range of added years of asset life per project category by region
Project Category (n) Low (L) High (H) Atlantic Ontario Pacific PNR Quebec
Apron (10) 15 25 L, H     L L
Fire truck (7) 20 20          
Snow Plough (27) 10 20   H H H,L  
Runway (21) 10 20         L
Sand Storage (11) 25 50 H H L    
Sweeper (31) 15 20          
Taxiway (2) 20 20          
Wildlife Fence (10) 15 30   L   H H

This sample represents 36% of the projects with asset life extension forecasts. There are no systematic regional differences. Further, the extension of asset life supported by ACAP for these asset classes concurs with accepted accounting standards.

Comparing the ACAP figures to publicly available depreciation schedules suggests that the funded projects were meant to extend asset life by between 50% and 100%. For example, the Listing of Suggested Life for Depreciation of Capital Assets for the government of North Dakota suggests the following useful life spans:

  • Buildings of brick or stone (50 years) and garages (33 years).
  • Vehicles, including trucks, snowplows, front end loaders and fire engines (10 years).
  • Lighting and transformers (40 years) and electrical poles and towers (33 years).
  • Concrete runway and apron (30 years) and asphalt overlay (20 years).
  • Wire (15 years) or wood fence (10 years).

Most staff and operator interviewees were knowledgeable about the origin of the estimates provided in the projects and considered them valid. A few operators relied on regional support for this information. Staff and most operators, however, stated that these figures are well known in the industry. Life extension estimates for major construction projects are typically provided by engineering consultants, while estimates for manufactured products are provided by the supplier. The validity of the estimates is also attributable to challenge processes at the review stage and to the requirement for various construction or manufacturer warranties during implementation.

The asset life measure as stated in the performance measurement framework (RMAF, 2009), however, does not allow the Program to demonstrate its success as well as it might. The issue is that the added number of years is meaningful only in terms of the life of a specific asset Footnote 9. The ideal would be a generic measure that serves equally well across asset types. For instance, the Program may consider reporting on the number of funded projects that improve the condition of the asset to satisfactory or better. The second component of this immediate outcome might be better measured by the number of funded projects that extend the life of the asset by a certain percentage (e.g. 50% or more). Finally, given the association between asset condition and access, the Program might consider introducing an outcome and measure that capture the notion of access, such as "access at ACAP-eligible airports is maintained or improved". This outcome might be operationalized through the lifting or absence of restrictions, imposed due to the state of the infrastructure, noted in the Aerodrome Facilities Directory.

Finding 13. Analysis revealed no cases of premature re-funding of projects by ACAP, or indeed by other known funding programs. This suggests that the asset improvements have endured or are on track to achieve the lifespan that was forecast.

A related measure of ACAP's success in protecting assets would be the durability of the projects it has funded. Analysis of ACAP's database since its beginning indicates that the airports rarely repeated projects within a project category, except for the mobile equipment category. Seventy-five airports completed more than one mobile equipment project. However, this category is difficult to assess for duplication, since there are several types of equipment in the category (snow ploughs, loaders, chemical spreaders, graders, etc.). Therefore, the analysis focuses on the other project categories.

Table 16: Number of airports that received funding by number of projects
Number of Projects Number of Airports Running Total Percent of Total
1 46 46 27.2%
2 33 79 46.7%
3 18 97 57.4%
4 12 109 64.5%
5 11 120 71.0%
6 8 128 75.7%
7 9 137 81.1%
8 13 150 88.8%
9 3 153 90.5%
10 3 156 92.3%
11 1 157 92.9%
12 3 160 94.7%
13 4 164 97.0%
14 0 164 97.0%
15 3 167 98.8%
16 1 168 99.4%
21 1 169 100.0%

Excluding mobile equipment, there is scant evidence of airports having received ACAP funding to repeat a specific project over the history of the program. The table below shows the number of airports that have received funding for two or more projects within a single project category. Airfield lighting projects are most likely to contain evidence of repeat work (at 22 airports). All wildlife fence, sand shed and reporting system projects are unique.

Table 17: Number of airports with more than one funded project within a project category
Category # airports with more than 1 project in this category
Airfield Lighting 22
Apron Rehabilitation 1
ARFF (Fire Truck) 4
Field Electrical 2
SNOWTAM 0
Other 4
Runway Rehabilitation 15
Sand Storage 0
Taxiway Rehabilitation 3
Wildlife Fence 0
Mobile Equipment 75

Project descriptions were reviewed in an attempt to identify duplicate work among them. A comparison of the project details for sets of projects within the same category uncovered four airports where duplicate work might possibly have occurred within the expected lifespan of the asset. A conservative approach was taken. Specifically, new fire trucks at Hamilton and Toronto were purchased within the lifespan of the original trucks and so were considered repeat projects; they may, however, have been acquired in response to a regulatory change and hence may not represent duplicated work. Similarly, runway projects at Moosonee and Fort Chipewyan are considered potentially duplicate work in the absence of details that would distinguish the earlier projects from runway projects occurring a few years later. By contrast, five apron projects at Sydney were rejected as duplicates based on each one's low dollar value; the costs suggest that these projects represented phased work on a single asset.

Table 18: Possible repeat projects per airport by fiscal year funded
Airport Fiscal Year Project Title Spending ($)
Fire Truck (ARFF)
Hamilton 1999-2000 Crash/Fire Rescue Vehicle Replacement 421,200
Hamilton 2007-2008 CAR 303 – Purchase Fire Truck & Exhaust System 785,603
Toronto Centre 2001-2002 Replace Fire Truck 211,110
Toronto Centre 2007-2008 Purchase Firefighting Vehicle (CAR 303) 739,477
Toronto Centre 2009-2010 Construction of an ARFF Equipment Shelter 871,193
Runway
Fort Chipewyan 1996-1997 Improvements to Runway Including Crack Repairs and Reshaping & Regrading of Drainage Ditches 62,674
Fort Chipewyan 1999-2000 Rehabilitation of Runway, Taxiway & Apron Surfaces and Associated Drainage Systems 2,132,144
Moosonee 1996-1997 Runway Restoration 673,014
Moosonee 1998-1999 Airport Pavement Rehabilitation (Runway 14-32) 2,460,500
Moosonee 2006-2007 Resurfacing Runway 06-24, Taxi A & Apron 1,089,487

The conclusion drawn from the database review was that ACAP might have funded duplicate work totalling $1.4M, representing 0.3% of spending on funded projects excluding mobile equipment. Subsequent communications with program managers confirmed the uniqueness of these projects, buttressing Evaluation's conclusion as to the longevity of the funded work and the success of the program.

Table 19: Spending on possible repeat projects
Category Projects Total Spending ($M) Unique Spending ($M) Possible Duplication of Work ($M) Duplication as Percent of Spending (%)
Airfield Lighting 111 44.7 44.7    
Apron Rehabilitation 25 24.0 24.0    
ARFF (Fire Truck) 18 12.9 12.3 0.6 4.9
Field Electrical 31 10.2 10.2    
SNOWTAM 5 0.1 0.1    
Other 33 4.7 4.7    
Runway Rehabilitation 122 270.6 269.9 0.7 0.3
Sand Storage 14 4.1 4.1    
Taxiway Rehabilitation 25 49.1 49.1    
Wildlife Fence 24 9.1 9.1    
Sub-total 408 429.5 428.1 1.4 0.3
Mobile Equipment 314 73.5      
Grand Total 722 503.0      

Evaluation considered whether the airports themselves might have financed repeat work. Interviews with operators indicate that while some might self-finance mobile equipment purchases, none could readily absorb the cost of major construction. It might also be suggested that other public funders supported repeats of projects previously funded by ACAP. A review of the infrastructure programs described earlier suggests that these programs did not repeat ACAP-funded work. Finally, whether the REDA programs might have supported such work is unknown, but this represents a narrow possibility.

Table 20: Funding to ACAP-eligible airports for runway, taxiway and apron projects (RTA) from other federal programs with various cost-sharing arrangements and from ACAP
  Other program support for airside (RTA), 2001 to 2012 ACAP support for airside (RTA), 1995-1996 to 2012-2013
Airport Projects Spending ($M) Title of RTA project funded by other program* Projects Spending ($M) Title of RTA projects funded by ACAP
ACAP Recipient (airside or other project type)
Abbotsford 1 30.0 Parallel Taxiway and Apron Widening 3 3.9 Runway/apron/taxiway rehabilitation
Arviat 1 5.0 Runway Upgrades 0 0.0  
Campbell River 1 7.9 Runway Extension 2 1.0 Taxiway/apron rehabilitation
Deer Lake 1 9.0 Runway Extension Project 1 0.9 Taxiway rehabilitation
Rankin Inlet 1 27.0 Runway Upgrades 1 7.7 Runway Rehabilitation
Smithers 1 4.9 Runway Extension 1 1.8 Taxiway Rehabilitation
Terrace-Kitimat 1 2.8 Runway Extension 3 4.4 Runway/apron/taxiway rehabilitation
Wawa 1 0.8 Airside Pavement Rehabilitation 1 2.8 Runway Rehabilitation
Windsor 1 7.0 Construction of a taxiway 2 3.6 Runway/taxiway rehabilitation
ACAP Non-recipient
Cambridge Bay 1 16.0 Runway Upgrades 0 0.0  
Pangnirtung 1 1.2 Runway Resurfacing 0 0.0  
Taloyoak** 1 5.7 Runway Upgrades 2 0.0  
Total Spending   117.3     26.1  
Total Projects 12     14    

* Excludes helipad in Gillies Bay.
** ACAP-funded project planned for 2012-2013; no funds provided at time of evaluation.

Reduced operating costs at ACAP-eligible airports

Finding 14. Estimates of the reduction in operating costs at ACAP-eligible airports are unreliable and the extent of the realized savings could not be confirmed.

Analysis of the ACAP database indicates forecast total savings from net operating cost reductions of $34M for 185 projects awarded from 2010-2011 to October 2013. This anticipated level of savings represents 21% of the total expected contribution amount awarded on these projects ($160M). This compares to anticipated O&M savings of $80M on the 102 projects for which such expectations were expressed in the prior three terms, amounting to 23% of the total expected contribution for those projects Footnote 10.

Interviews with operators and staff members suggest that estimates of operational cost savings are likely inaccurate and inconsistently derived. Operators reported many considerations in making this estimate. Some operators said they relied on their history and experience with the old asset. Other operators obtained the figure from experts or engineers working at the airport. One operator had sought detailed performance information from the manufacturer to compare against the performance records of the old asset. Finally, a number of operators remarked that there were really no savings, since the money not put into maintaining the new asset would be used to conduct maintenance that would otherwise have been deferred; put this way, the benefit is more safety-related than financial. In most cases, it appears that funds freed from the upkeep of one asset are not banked.

Operators said that they may or may not discuss the estimate with regional staff, and staff confirmed this. Typically, staff do not challenge the estimate, and it does not figure in the decision to approve the application. By exception, one region derived the estimate for the client in order to complete the performance measurement sheet submitted with the application for review.

A comparison of the contribution agreements and proposals collected for the case studies with the figures recorded in the database for these projects revealed errors in the database. The source of most errors appears to be the approach taken to calculate the total savings field: total savings was calculated by multiplying the "increase in years of asset life" field by the "annual operating cost reduction" field. However, it was clear from the contribution agreement and/or application documents, and from discussions with operators, that operating cost savings were expected to be enjoyed only in the early years following purchase or construction; as the replacement or rehabilitated asset aged, it was itself expected to require increasingly expensive repairs. So, when annual savings that operators had forecast for the five years after project completion were multiplied by a 20-year life extension, operating savings were inflated.
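
The effect of this calculation error can be illustrated with a minimal sketch. The field names and dollar amounts below are hypothetical, not drawn from the database; the point is only to show how multiplying a short-lived annual saving by the full life extension overstates the total.

# Sketch of the inflated "total savings" calculation (hypothetical figures).
added_years_of_asset_life = 20     # life extension recorded for the project
annual_operating_saving = 5_000    # operator's forecast annual O&M saving ($)
savings_window_years = 5           # years the operator expected the saving to last

# Calculation implied by the database: the annual saving is assumed to recur
# over every added year of asset life.
recorded_total_saving = annual_operating_saving * added_years_of_asset_life   # $100,000

# Calculation consistent with the applications: the saving is realized only
# in the early years, before the new asset itself requires costlier repairs.
supported_total_saving = annual_operating_saving * savings_window_years       # $25,000

print(recorded_total_saving, supported_total_saving)

In this hypothetical case, the database approach overstates total savings by a factor of four.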

Given the issues surrounding the data and the fact that operating cost savings are no longer germane to the program's objectives or its performance measurement, the Program might cease requesting this information from applicants.

Recommendation: It is recommended that the Program update its performance measurement indicators to reflect the recent move from S.O.3 to S.O.1 and strengthen its overall performance measurement. For instance, the Program may consider introducing an outcome that reflects ACAP's impact on access to eligible airports through the rehabilitation of existing airside infrastructure. As well, it is recommended that the Program consider strengthening its asset life measure and removing the "O&M cost reduction" outcome and measure.

Efficiency and Economy

Finding 15. Overhead costs represented 7% of contribution funding in the first three years of the present term. Historical overhead costs are estimated at 6.5% of contribution disbursements for 2000-2001 through 2012-2013.

The table below shows actual spending on salaries and associated overhead charges against the contribution funding delivered by the department. For these three years together, overhead represented 7.4% of the funds delivered. Within the period, and historically, there has been little variation in the number of staff responsible for delivering the program; swings in the relative overhead cost reflect differences in project load and project budgets over time and across regions. This carrying cost assessment does not take into account time and effort spent on awareness-building activities, on pre-consultations, or on assessing proposals from airports that did not receive funding at the time.

Table 21: Overhead costs and contribution disbursements (2010-11 to 2012-13)
Year Overhead (O&M in $000) Contribution Funds (G&C in $000) O&M / G&C (%) Regional High O&M / G&C (%) Regional Low O&M / G&C (%)
2012-2013 1,567 29,823 5.3% 8.7 0.6
2011-2012 1,792 30,403 5.9% 17.9 3.5
2010-2011 1,809 9,616 18.8% 33.9 5.8
Total 5,168 69,842 7.4%    

By way of comparison, the historical overhead cost was estimated at 6.5% of disbursements. Since actual overhead spending was available for the current term only, trend data for the previous terms were derived through an estimation procedure in which salary costs were deflated according to average rates of pay for the PM-02 and PM-05 classifications. A 15% charge was then applied to the estimated salaries for other overhead costs (OOC), as this level was typical for the years 2009-2010 through 2011-2012. No adjustments were made for staff levels, as these have been flat historically.
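
The following sketch illustrates the estimation procedure with assumed figures; the dollar amounts and pay rates are hypothetical, and only the method reflects the description above.

# Sketch of the historical overhead estimate (hypothetical figures).
current_salary_cost = 1_200_000   # salary spending in a current-term year ($)
pay_rate_current = 70_000         # assumed average PM-02/PM-05 rate today ($)
pay_rate_past = 52_000            # assumed average rate in the earlier year ($)

# Deflate salaries to the earlier year's pay scale (staff levels held flat).
estimated_salary_past = current_salary_cost * (pay_rate_past / pay_rate_current)

# Apply the 15% charge for other overhead costs (OOC).
estimated_overhead_past = estimated_salary_past * 1.15

print(round(estimated_overhead_past))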

Program restructuring commenced around the time that the evaluation launched and ramped up throughout the period of data collection, but was not complete by the end of data collection. As such, this evaluation will not assess the decisions, actions or consequences of the efficiency measures taken.

Interviews and documents identified the following actions to reduce overhead costs and improve the consistency of delivery across the regions:

  • Delivery by two of the five regions;
  • Development of an electronic project database and standardized tools;
  • Program promotion only via regional airport councils;
  • Limited pre-consultations and site visits mostly for high risk projects; and
  • Streamlined application process including web-based applications, a reduced process for mobile equipment, accepting preliminary (rather than final) project designs at the application stage, and a simplified financial review.

Evaluation considered undertaking a benchmarking analysis. The alternative funding programs identified for the period 2002-2003 to 2010-2011 were reviewed for possible benchmarking purposes. No airport funding program of reasonably comparable size was found, ACAP being 4.5 times the size of the Regional Economic Development Authorities (REDAs), which collectively were the next-largest capital investment program over that period. Other transportation funding programs comparable in size to ACAP were deemed unsuitable due to broad differences in the nature of the businesses they support.

Finding 16. The cost per passenger flying in or out of ACAP-funded airports is estimated at approximately $5.50 per passenger for 1995-1996 through 2012-2013, and at $2.00 when expected benefits to future years' passengers are taken into account.

To explore cost-effectiveness, evaluators measured the cost per person travelling through airports funded by ACAP.

To conduct this analysis, the number of enplaned and deplaned passengers at ACAP-funded airports was obtained from the Electronic Collection of Air Transportation Statistics (ECATS) for calendar years 1995 through 2012. Passenger volumes were then identified per airport for the final year in which the airport had received funding and for the years in which the benefit would accrue (i.e. the added years of asset life for the new or rehabilitated asset) over the history of the program. The median increase in asset life for the project category was used for projects lacking this information. Finally, the number of passengers flying in these years was totalled across all airports and all years.

It was estimated through this process that 91M passengers had used ACAP-funded airports over the history of the program. Based on the total disbursements of $503M, an approximate cost of $5.50 per passenger was computed.

However, many of the asset improvements made with ACAP contributions had not reached the end of their forecast increased life; future passengers using these airports would therefore also benefit from the ACAP investments made between 1995 and 2012. The additional years of benefit were calculated per airport by multiplying the estimated remaining years of useful life by the average enplaned/deplaned passenger volume recorded for the airport in the three most recent years (2010 to 2012). The total future passenger volume obtained through this process was 169M. When added to the passengers who had used ACAP-funded airports in the past, a total of 260M passengers expected to benefit from ACAP investments was obtained. Based on the total disbursements of $503M, and past and future benefits to travellers, an approximate cost of $2.00 per passenger was derived.
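
The two cost-per-passenger figures can be reproduced from the totals reported above (passenger counts and disbursements in millions). This is a back-of-envelope check, not the evaluation's actual computation.

# Cost per passenger from the evaluation's reported totals.
total_disbursements_m = 503.0   # total ACAP disbursements, $M
past_passengers_m = 91.0        # passengers at funded airports, 1995-2012, millions
future_passengers_m = 169.0     # forecast future passengers over remaining asset life

cost_past_only = total_disbursements_m / past_passengers_m
# 503 / 91 is roughly 5.53, reported as approximately $5.50 per passenger

cost_past_and_future = total_disbursements_m / (past_passengers_m + future_passengers_m)
# 503 / 260 is roughly 1.93, reported as approximately $2.00 per passenger

print(round(cost_past_only, 2), round(cost_past_and_future, 2))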

This relatively modest cost is attributable to the ongoing benefits derived from these capital projects and from the broad reach of the program. Since the program has typically supported a small number of projects per airport, a large number of eligible airports have reaped some benefit from ACAP. As a result, the number of passengers touched by the program is relatively great despite the modest size of any one of the recipient airports.

Conclusions

This evaluation presents conclusions regarding the continuing relevance and the performance of the Program.

Relevance

Providing funds to support infrastructure work needed for the safe operation of airports not owned by the federal government is clearly within the authority of the Minister of Transport and is in line with the department's policy framework and strategic outcomes.

Historically, the Program encountered difficulties in transferring all of its annual allocation to recipients. In trying to determine whether the allocated budget was too large for actual needs, Evaluation concluded that lapsing could be explained by both systemic and circumstantial factors. The decision not to attempt to re-profile funds to subsequent years also seems to have had a major impact. Between 2000 and 2009, TC obtained authorization to re-profile funds and, as a result, the Program awarded effectively its entire budget over those ten years.

Performance

Multiple lines of evidence converge to demonstrate that the Program was effective in maintaining the safety levels at eligible airports.

The Program has been relatively effective at reaching eligible participants. Evaluation found that only a very small proportion of eligible airports were not aware of the program and that the lack of participation of certain airports is not driven by a lack of awareness. Nevertheless, it would be relatively easy to improve the reach of the program through simple communication activities.

Between 2008 and 2012, no airport lost its certification, and the few that became ineligible during this period did so for reasons not related to the Program's mandate. Evaluators also used available information and reports on accidents at eligible airports to assess the long-term impact of the program. They found the safety performance of eligible airports to have been consistently good. Finally, a series of case studies demonstrated that funded projects are having a tangible impact on the safety of recipients' operations.

Evaluation also demonstrated that the Program is successful in increasing the useful life of assets at recipient airports. Evaluation cannot, however, confirm that ACAP has any impact in reducing operating costs at funded airports; indeed, evaluators are of the opinion that this expected result should be abandoned.

Management Action Plan

To address the recommendations presented in this evaluation, the following action plan will be implemented:

# Recommendations Proposed Actions Forecast Completion Date OPI
1 It is recommended that the Program develop a formal mechanism requiring Operators to express their anticipated near- and medium-term requirements for project funding. We accept the recommendation and a process whereby a 3-year rolling plan is populated and kept current will be formalized nationally. We will also work with TC IT specialists to explore the feasibility of developing an online portal into which Operators can submit non-binding Letters of Intent with respect to anticipated near and medium-term funding needs. April 1, 2016 Air and Marine Programs
2 It is recommended that the Program explore the extent to which there are gaps in the guidance to Operators regarding the support available for mobile equipment under ACAP (e.g. regarding the limits on the number and type of vehicles). It is then recommended that the Program produce and disseminate guidance material to address the gaps, especially those that could impede the streamlining of mobile equipment project applications. We accept the recommendation. New guidelines are under development. It is expected that the new guidelines and revised application process will be ready by January 31, 2016 for use in the 2016-2017 project application cycle. January 31, 2016 Air and Marine Programs
3 It is recommended that the Program update its performance measurement indicators to reflect the recent move from S.O.3 to S.O.1 and to strengthen its overall performance measurement. For instance, the Program may consider introducing an outcome that reflects ACAP's impact on access to eligible airports through the rehabilitation of existing airside infrastructure. As well, it is recommended that the Program consider strengthening its asset life measure and removing the O&M cost reduction outcome and measure. We accept the recommendation. Existing performance indicator documents will be updated to reflect the performance measures, activities, outputs, immediate outcomes, intermediate outcomes, and ultimate outcomes identified under SO1 – An Efficient Transportation System. October 2015 Air and Marine Programs


Addendum


Summary of the Analysis of ACAP Spending

Database analysis was conducted to explore reasons for the Program's experience with lapsing funds. The analysis covered spending for the years 1995-1996 through 2012-2013 since this was the period for which complete project information was available from the ACAP project database.

Over these years, the ACAP budget was $669M, of which $503M was disbursed to recipients. The $166M that was not spent is explained both by a failure to award contribution agreements to the maximum that the program budget would allow and by recipients' under-spending of the funds awarded to them.

Table 22: Summary of the Analysis of ACAP Spending
  Component Funds ($M) Total Funds ($M) Percent of ACAP Budget (%)
ACAP Contribution Budget   669.0 100
Contribution Agreements Awarded (adjusted) 563.5   84
Funds not Disbursed on Awarded Contribution Agreements 60.5   9
Disbursements on Awarded Contribution Agreements   503.0 75
Total Funds Lapsed   166.0 25

Adjustments: For this analysis, adjustments were made to the project contribution figures to ensure that the forecast contributions aligned in time with the disbursement of actual project progress payments. Specifically, since projects in the database included payments forecast for 2013-2014 and beyond, the following adjustments were made:

  • Of the 772 funded projects, 50 were not to receive funds until 2013-2014. These projects were excluded, reducing the number of projects to 722 and the total expected contribution (TEC), or award amount, to $566M.
  • By the end of 2012-2013, 8 multi-year projects had not been fully funded. The TEC associated with future payments ($2.5M) was deducted from the TEC for the period of study. Following these adjustments, a TEC of $563.5M results for the period.
  • Upward adjustments were made in the TEC for 23 projects for which actual spending slightly exceeded the expected contribution amount. These adjustments were of such low dollar amounts that they did not alter the adjusted TEC when expressed in millions of dollars (i.e. adjusted TEC of $563.5M for this analysis).

Determination of the under-subscription amount: The extent of the under-subscription was determined by deducting the adjusted total expected contribution (TEC) of $563.5M from the program contribution budget for the period ($669M), revealing that $105.5M (or 16% of the program budget) was not committed by the Program to project work. Under-subscription constitutes 64% of the funds lapsed. Related analysis revealed that the failure to award contribution agreements was more pronounced in the first and fourth terms of the program, and elaborated on the reasons for the difficulties in the fourth period especially.

Determination of the amount of awarded contribution funds that were not disbursed to clients: The client under-spending analysis took the difference between the adjusted expected contribution ($563.5M) and disbursements ($503M) to airports for these 722 projects. From this analysis it was determined that clients did not use $60.5M (i.e. 9% of the program budget) of the amount awarded to them for the period. Under-spending by clients constitutes 36% of the amount lapsed. Related analysis revealed that the under-utilization of funds on runway projects constituted over half of the total amount lapsed for this reason.
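
The decomposition of the lapsed funds can be verified from the figures above (all amounts in $M). This is a simple arithmetic check, not the evaluation's worksheet.

# Lapsed-funds decomposition from the Table 22 figures ($M).
budget = 669.0         # ACAP contribution budget, 1995-1996 to 2012-2013
adjusted_tec = 563.5   # adjusted total expected contribution (awards)
disbursed = 503.0      # actual disbursements to recipients

under_subscription = budget - adjusted_tec   # 105.5: never committed to projects
under_spending = adjusted_tec - disbursed    # 60.5: awarded but unused by clients
total_lapsed = budget - disbursed            # 166.0

assert abs(total_lapsed - (under_subscription + under_spending)) < 1e-9

print(f"{under_subscription / budget:.0%} of budget, "
      f"{under_subscription / total_lapsed:.0%} of lapse")   # 16%, 64%
print(f"{under_spending / budget:.0%} of budget, "
      f"{under_spending / total_lapsed:.0%} of lapse")       # 9%, 36%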

Annexes

Annex 1: About the Evaluation

Evaluation Methodology

The evaluation employed multiple lines of evidence consisting of:

  • A review of program and other federal government documents;
  • Program administrative data analysis;
  • Project database analysis;
  • Key informant interviews; and
  • Case studies of recipient airports.

A data matrix showing the evaluation questions and lines of evidence is provided in Annex 3 at the end of this document.

Document Review

A document review was conducted to enable Evaluation to directly respond to questions of the current rationale for the program, its alignment with federal government priorities and departmental strategic outcomes and the role of the federal government as provider of financial support to airports outside the NAS. Documents consulted to this end include: departmental acts, regulations, policies and reports (e.g. Aeronautics Act, National Airport Policy, TP312 and Civil Aviation Regulations (CARs), Departmental Reports on Plans and Priorities, Departmental Performance Reports and departmental financial planning documentation). As well, insights were obtained from other federal government documents (Budget 2011; Speech from the Throne 2010, 2011, 2013), reports of other bodies of government (e.g. the Senate Committee Reports of 2012, 2013) and reports produced by industry representatives and/or non-profit policy organizations (such as the Canadian Civil Aviation and Conference Board's 2013 report on the Economic Impact of the Air Transportation Industry).

Understanding of the operation of the ACAP was obtained through a review of such program documents as the application guide and the evergreen record of decision entitled Interpretations and Definitions. In preparation for the case study site visits, program management provided Evaluation with project case files for three exemplar projects. An Internet search was conducted to locate information on capital asset management systems (including the life cycle approach to asset management, airport infrastructure maintenance and depreciation schedules for airport assets). As well, Evaluation conducted a media scan for news items via three search engines [i.e. the Canada News Centre (GOC), the Canadian Press, TC Info Media] and a search of airport incidents directly or potentially related to infrastructure issues in reports from the Civil Aviation Daily Occurrence Reporting System (CADOR) for the case study sites. The CADOR search was conducted for the two years prior to and following the project work sampled for this evaluation, and filtered for types of occurrences that could be attributed to the infrastructure issue being addressed through the funded project work.

In addition, CADOR reports were pulled for all accidents recorded by the operator or other stakeholders (notably airlines) at ACAP-eligible airports (i.e. presently eligible airports and all past recipients) over the history of the program (i.e. from 1995-1996 to 2013-2014) in order to identify whether any accidents could be associated with the state of the airport infrastructure. The CADOR reports contain details of the events, and their numbers were compared with the official statistics made public by the Transportation Safety Board of Canada.

As well, Evaluation conducted an on-line search to describe alternative funding programs for which ACAP's clients might also be eligible. Possible alternatives were identified through various means: the 2009 evaluation of ACAP was consulted to highlight alternative funding programs active at that time; airport operators and departmental staff interviewed for the present evaluation were asked to name any programs they were aware of that currently funded airport infrastructure work; and, in the course of Evaluation's on-line review of these programs, other possible alternatives were located and their details sought. The evaluator prepared a summary table of each program's selection criteria, objectives, budget and other characteristics to facilitate comparison with ACAP. The exercise was meant to address relevance; as such, emphasis was placed on verifying that Canadian airports were eligible for funding or had received funding through the program at some point in the past.

Administrative Data

Administrative data were obtained from various sources. Data required for the assessment of program performance were obtained from the financial manager, financial planning documents and the Annual Reports of Transportation in Canada. Some historical data were obtained from prior evaluations (i.e. ACAP program staff levels, eligible airports at the time of the evaluation). Inflation factors for salaries were approximated based on historical union settlements for the project manager category. CANSIM on-line resources furnished time series data for the industrial product price index, the energy and petroleum products index and the construction union wage rate index. Information on other departmental grants and contributions programs, necessary for their consideration as prospective benchmarking programs, was obtained from corporate planning sources.

ACAP Project Database

The project database was analyzed to address evaluation questions concerning uptake and ongoing need, effectiveness and cost-effectiveness. It also served as a sampling resource for the identification of participants in the airport case studies and airport operator interviews.

The ACAP database is an electronic archive that contains project data from the start of the program and is updated periodically. As program officers constructed the electronic database relatively recently (i.e. within the past year or so), some fields contain limited information while others are quite complete. Fields contain descriptive information about the airport applicant/recipient (e.g. name, aerodrome status, operator, owner) and both qualitative and quantitative project-level information (e.g. date agreement signed, contribution amount, years in which disbursements were made, federal contribution percentage, project priority, project title and category, increased asset life estimate, operating cost estimate, approved budget and disbursements). Fields were also set up to track the dates of various communications with the airport (e.g. date of notification letter). For internal purposes, the ACAP database enables authorized users to access information from other departmental databases, such as Statistics Canada figures for passenger volumes. Evaluation employed passenger volume data to aid in the selection of case study airports and airport operators to be interviewed.

The database extraction for this evaluation occurred on October 17, 2013. At the time of extraction, there were 819 projects in the database. Of these, 722 were classed as funded and some funds had been disbursed to the client. The evaluation largely focused on this subset of the total projects in the database for the period 1995-1996 through 2012-2013.

Database Validation

Evaluators met with the program analyst responsible for the ACAP database over the summer and fall of 2013 to acquaint themselves with the data fields and the extraction of information from the database. Following the October extraction, evaluators took a number of steps to validate the data in the fields that would be analyzed for the evaluation. These fields relate to the project (e.g. year of signing the contribution agreement, program funding years, the expected contribution, project priority level, project category, urgent project, federal share, asset life increase and operating cost savings) and airport (name, location, aerodrome status, operator and owner).

Categorical variables were reviewed for values that were out of range or potentially miscoded when compared to the variable containing the information being classed (e.g. type of project work compared to priority level of project work, project category compared to project title and owner compared to coding as provincial owner or not). Some questions were raised with the Program regarding values for project priority and federal share and corrections were made to both the ACAP database and evaluation extract as a result. Variables that the Program identified as having partial or limited data were excluded from analysis for this evaluation. Notably, the evaluation excluded fields pertaining to the tracking of communications with the client and unfunded projects.

Database values were validated for 33 projects for which Evaluation had received paper files (i.e. three example project files and 30 case study project documents). The comparison of paper file to database record was performed for several key fields. Qualitative fields included project title, dates, owner, operator, aerodrome status and eligibility for ACAP. The exercise validated the qualitative data in the database where comparison information was available in both sources. Quantitative fields included passenger statistics, change in years of asset life, change in operating costs, federal contribution percentage, the total expected contribution (TEC) and annual project disbursements.

Qualitative data were largely confirmed through this exercise. Discrepancies were found for the estimated increase in years of asset life and the reduction in operating costs, which are discussed in the findings section of this report. Where updates had not been made to the total expected contribution, the adjustments made to the budget in order to conduct various analyses are described in the pertinent findings section. Most importantly, while the years in which disbursements were made were consistent across the two sources of information, the disbursement figures in the ACAP database were found to have been rounded. Evaluation therefore used these rounded figures, along with the field showing the years in which disbursements were planned (both contained in the original database), to guide the insertion of the actual spending information into the spreadsheet containing the original extract of October 2013. The actual spending data were taken from the financial management reports for 1998-1999 through 2012-2013 and from the Public Accounts of Canada for the first three years of the program (1995-1996, 1996-1997 and 1997-1998) Footnote 11. The accuracy of the insertion was verified by corroborating the new values against the rounded values and the stream of funding years recorded in the original database. Tallies were then made by region and by year to confirm that the revised spending information in the ACAP database extract agreed with the financial reports and Public Accounts.
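
The corroboration and tallying steps can be sketched as follows. The airports, regions and figures below are hypothetical; only the logic mirrors the description above.

# Sketch of reconciling actual disbursements with the rounded database figures.
from collections import defaultdict

# Hypothetical extract rows: (airport, region, year, rounded database amount).
extract = [
    ("Airport A", "Atlantic", 2005, 400_000),
    ("Airport A", "Atlantic", 2006, 300_000),
    ("Airport B", "Quebec", 2006, 900_000),
]

# Hypothetical actual disbursements from financial reports, keyed by (airport, year).
actuals = {
    ("Airport A", 2005): 401_238,
    ("Airport A", 2006): 298_771,
    ("Airport B", 2006): 903_414,
}

tallies = defaultdict(int)
for airport, region, year, rounded in extract:
    actual = actuals[(airport, year)]
    # Corroborate the insertion: the actual figure should sit close to the
    # rounded database figure for the same airport and funding year.
    assert abs(actual - rounded) < 10_000, (airport, year)
    tallies[(region, year)] += actual

# Regional and annual tallies for comparison against the financial reports.
for (region, year), total in sorted(tallies.items()):
    print(region, year, total)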

Key Informant Interviews – TC Staff 

Staff interviews were conducted between August 2013 and March 2014. Early scoping interviews contributed to the development of the Terms of Reference and identified sources of information (documentary, administrative and project databases, and interviewees) for the evaluation. Formal interviews designed to address questions in the evaluation matrix followed in late fall and winter. Departmental interviewees responded to questions on the rationale and ongoing need for the program and its alignment with departmental strategic outcomes. In terms of performance, departmental interviewees responded to questions about delivery and resources.

Interviewees represented executive and management, program and financial officers from the national capital region (11), regional management and program officers and technical experts (12) and Civil Aviation inspectors responsible for the airports featured in the case studies (8).

A short list of interviewees was provided by the program officer in the national capital region. Evaluation supplemented the list by consulting the previous evaluation's sources and the departmental staff directory, and by employing a snowballing technique once interviews had commenced. The final set of interviewees included all regional and national capital region managers and technical experts from every region. Some former management and staff were also interviewed, since the program was being restructured over the course of the evaluation.

Key Informant Interviews – Airport Operator

Airport operators were interviewed to obtain the perspective of the funding recipient on key issues. Interviews were conducted with 20 ACAP recipient airports and five non-recipient airports between January 20 and February 28, 2014. (Note that four individuals were contacted for non-recipient interviews as one of the non-recipient operators interviewed was responsible for 28 airports in Northern Ontario and another was responsible for five airports in Nunavut.) The focus of the recipient interviews was the operators' recent application experience, the capital asset management processes of the airport and the safety-related impact that the ACAP projects were perceived to have had at the airports. Interviews with non-recipients focused on the operators' awareness of ACAP.

A purposive sampling methodology was employed to identify airport operators to be interviewed. In the case of recipient operators, the selection criteria included: the geographic region of the airport (organized according to the five regions used by Transport Canada), whether the airport had completed one or more ACAP projects in the past five years and the categories of projects completed (e.g. runway rehabilitation, wildlife fencing). The goal was to select approximately the same number of airports from each region that when considered together would contain projects representing all 10 of the discrete project categories. Twenty airport operators were selected to be interviewed, with five coming from the Pacific Region, four from the Prairie and Northern Region, five from the Ontario Region, four from the Quebec Region and two from the Atlantic Region. Roughly half of the sample were small airports (10 were under 50,000 passengers) and the other half were medium (six were from 50,000 to 149,000) to larger airports (four were up to 337,000 passengers).

Due to the small number of the ACAP-eligible non-recipient airports across Canada, they were selected based on geographic location alone. Of the five non-recipients interviewed, one was from the Atlantic Region, two were from the Ontario Region, one was from the Prairie and Northern Region and one was from the Pacific Region. All five were small in terms of scheduled passenger volumes.

Before calling the airport operators, evaluation staff verified their contact information with regional management and relayed Evaluation's intent to contact program clients. In three instances, regional management requested that Evaluation not contact a client because the region was in the midst of negotiations or resolving other issues with the airport. In these instances, the evaluation team sought replacement interviewees with the same critical characteristics as the airport originally selected. Subsequently, all of the airport operators contacted (both funding recipients and non-recipients) agreed to speak with Evaluation.

Case Studies

Eight case studies, including two pilot case studies, were conducted in December 2013 through March 2014. Thirty projects were examined across these airports representing 10 runway, apron or taxiway rehabilitation projects; seven mobile equipment projects; five lighting projects; three firefighting vehicle, equipment and/or garage projects; two sand storage facility projects; two wildlife fence projects; and one runway sign project.

The main purpose of the case studies was to gather evidence on the impact of the funded projects. Case study interviewees also responded to questions on the capital asset maintenance system in place at the airport and ACAP's role within it, ACAP applications intended for 2014-2015 or so and the support they received from TC staff. The pilot case studies were conducted before the main case studies, during which interview guides were tested. Pilot case studies also clarified the type of outcome data relating to ACAP projects that airports would be able to provide.

Selection of Case Study Sites

Airports were selected for the case study component via purposive sampling based on location, project category, completion date of project work and airport size. The sample contains at least one airport from each geographic region. The sample collectively contains two or more projects from each project category. Case study airports ranged in size (as defined by traffic volume) from airports with under 10,000 passengers to approximately 300,000 passengers. Finally, Evaluation sought airports with projects that had been completed at least two years before data collection was to commence for the evaluation, to allow for the accumulation of evidence of post-project impact. At the same time, in anticipation of airport staff turnover, relatively recently completed projects were targeted (i.e. those funded between 2008-2009 and 2010-2011). All eight airports had at least one project funded in this period. Once airports for site visits were identified, regional managers were advised of the clients chosen for the case studies. Managers indicated that one of the airports selected was now too large to receive ACAP assistance and that another could be difficult to access during winter. Therefore, Evaluation selected two substitute airports with much the same characteristics in terms of region, size and available project categories.

Limitations and Mitigation

The evaluation used multiple lines of evidence for each evaluation question to the extent possible. Questions pertaining to relevance tended to rely more heavily on documentation and to a lesser extent on interview data. Questions pertaining to performance relied less on documentary evidence and more heavily on the other lines of evidence. By and large the documentary evidence was sufficient for the purposes of this study. Evaluation identified and attempted to address issues related to sampling for key informant interviewees and case study sites, and with respect to the quality of the data in the ACAP project database.

With respect to the staff interviews, the main issue was addressed as follows: program restructuring began roughly simultaneously with the evaluation launch. There was the risk that corporate knowledge would be lost to the evaluation. Evaluation's response was to interview retiring staff and key ACAP personnel who had moved to other programs early in the scoping and data collection periods. The snowballing technique enabled Evaluation to identify key former staff as well as key current staff. Had gaps or inconsistencies been found in the analysis of the interview data that might have been explained by characteristics of the sample, the intention was to seek out additional interviewees. Subsequent analysis suggested that this was not necessary.

With respect to the airport operator interviews and case study sites, the main issues were addressed as follows: for both the operator interviews and airport case studies, relatively small samples were chosen vis-à-vis the population of eligible airports (i.e. the operator sample was 13% of eligible operators and the case study sample was another 4% of eligible airports). The sample size was restricted because labour-intensive data collection methods were employed; had operator surveys been conducted, input from more operators could have been obtained in the same time frame. Given that the samples would be small, sampling for heterogeneity was preferred over random selection to ensure that the final samples provided coverage of factors that could explain differences in responses across participants. This approach did produce samples that over-represented some regions and under-represented others. Ultimately, however, the intent was that Evaluation could generalize from the responses of the few to the population of airport operators, not by virtue of the principles of randomization but because participants' responses emanate from a philosophy or mind-set that is universally held among capital asset managers.

Regional managers were informed of the clients that Evaluation proposed to contact and suggested that Evaluation exclude a few of them, which Evaluation did. There is therefore a risk of sampling bias; specifically, a bias that would produce samples overly favourable to the program. First, it is important to note that regional managers provided reasonable (i.e. operational) justifications for the exclusion of certain clients. Second, even participants that might be viewed as friendly to the program offered criticism and suggestions for program improvement. Given the mix of "pro and con" responses received from operators, acting on the Program's suggestions does not appear to have unduly compromised the information collected from these lines of evidence.

In discussions regarding the selection of airports for case study, regional managers cautioned Evaluation that small or single-operator airports would have difficulty participating due to time pressures, the lack of staff available to meet with evaluators and the limited availability of electronic records. In recognition of the burden that this line of evidence presented, Evaluation chose a disproportionate number of medium and large airports for the site visits. To make up for the imbalance, Evaluation drew a relatively greater number of small airports for the operator interview component. As well, Evaluation paid shorter visits to the two smaller airports in the case study component. In one case, the airport was used to scope this line of inquiry. The other airport was visited at the end of the case study component so that the evaluators would be as efficient as possible (i.e. knowing what material was needed from the operators on site and what questions were most fruitful) and would need less time on site.

As noted in the discussion above, the program analyst had made it clear that the ACAP database was a work in progress. Some fields were considered reliable and complete, while others were not. Evaluation identified the fields that would be useful in addressing the evaluation questions and attempted to validate the information in these fields. With respect to the ACAP project database, the main issues were addressed as follows:

Incomplete fields (e.g. dates on which actions related to delivery were taken; amendments; status other than funded) were dropped from the analysis. Where this would limit the analysis that could be done or conclusions that could be drawn, the appropriate qualifications were made in the body of this report.

Inaccurate data were corrected where possible. Specifically, the program analyst was contacted to inquire about possible miscoding, and explanations were given or corrections were made by both the Program and Evaluation (e.g. priority level assignment, project category assignment). Where Evaluation could find more accurate sources for important fields, it used them; notably, actual annual disbursements to ACAP recipients were taken from the departmental financial system and the Public Accounts. Where it was not feasible to improve the accuracy of data from other records [e.g. increase in asset life years, change in operating costs and instances where actual spending is less than the final total expected contribution (TEC) recorded], the appropriate qualifications were again made in the findings section of this report.

Annex 2: Logic Model for the Airports Capital Assistance Program (ACAP)

Inputs Activities Outputs Immediate Outcomes Intermediate Outcomes Ultimate Outcomes
19 FTEs Review applications Funding of capital projects at ACAP-eligible airports Completion of capital projects at ACAP-eligible airports Safety levels are maintained at ACAP-eligible airports A safe civil aviation system
$190M in contribution funds for 2010-2011 through 2014-2015 Fund approved projects        
  Manage contribution agreement   Protection of assets / increase in useful life at ACAP-eligible airports    
Clients: Airports not owned by federal government, meet TP 312, receive regularly scheduled service Assess program performance   Reduced operating costs at ACAP-eligible airports ACAP-eligible airports meet safety standards required for continued operations  
  Forecast the need for ACAP funding        

Source: ACAP Results-Based Management and Accountability Framework (December 2009)
