Training Failures Among Newly Hired Air Traffic Controllers
By Daniel Baxter
On June 12, 2009, the Department of Transportation Office of Inspector General (OIG) released its report on training failures among newly hired air traffic controllers. OIG conducted this review at the request of Representative Jerry F. Costello, Chairman of the House Subcommittee on Aviation. Specifically, the Chairman expressed concern that the Federal Aviation Administration (FAA) was facing a high number of training failures among newly hired air traffic controllers, especially at some of the busiest and most complex facilities in the National Airspace System.
The audit objectives were to determine (1) the training failure rate among newly hired air traffic controllers and (2) the common causes and factors that are contributing to this rate.
OIG conducted this audit between December 2007 and January 2009 in accordance with generally accepted government auditing standards prescribed by the Comptroller General of the United States. The audit included site visits to 30 Air Traffic Control en route and terminal facilities, the FAA Academy in Oklahoma City, Oklahoma, and FAA Headquarters in Washington, DC.
FAA plans to hire and train nearly 17,000 new controllers through 2017 to replace the large pool of controllers who were hired after the 1981 strike and are now retiring. Training success among these newly hired controllers is critical as more veteran controllers leave the system. New air traffic controllers must complete an arduous training program that includes learning the basic concepts of air traffic control at the FAA Academy, followed by extensive on-the-job training at their assigned facilities.
Those controllers who are unable to pass the training process are either transferred within their assigned facilities to a new area of operations, transferred to a lower level facility to begin the training process again, or terminated from employment with FAA. While certification times for individual controllers may vary, FAA’s goal is to have terminal candidates complete the training process in 2 years and en route candidates in 3 years.
FAA uses a system, known as the National Training Database (NTD), to track the progress of newly hired controllers in training. FAA also uses the NTD to determine the number and rate of controllers who do not complete training because they either lack the skills needed to certify at their assigned facility (training failures) or leave the Agency for other reasons.
RESULTS IN BRIEF: According to FAA, the training attrition rate for all controllers in fiscal year (FY) 2007 was approximately 7 percent. However, OIG found that FAA’s reported rate is not accurate for several reasons.
First, FAA’s reported rate includes training failures of newly hired air traffic controllers and transferring veteran controllers who are training on the airspace of their new facility. FAA does not track training failures among newly hired controllers as a separate metric. Combining these two categories of trainees into one overall, system-wide failure rate obscures the data FAA needs to effectively carry out its plans to train 17,000 newly hired controllers through 2017.
Second, FAA’s rate was based on FY 2007 training failure data; however, in FY 2007 most new controllers were still in the early stages of training. FAA only began implementing its plans to address the surge in attrition in 2005. From FY 2005 to FY 2006, FAA hired 1,635 new controllers. In contrast, from FY 2007 to FY 2008, FAA hired 4,011 new controllers. Since it takes new controllers 2 to 3 years to certify, most controllers had not been in training long enough in FY 2007 to accurately assess a realistic rate of training failures. As FAA hires more new controllers, the number and rate of training failures will likely increase.
Third, FAA does not have a uniform definition of training failures and other types of attrition. Consequently, it is largely left to managers to determine how to classify controller attrition, which is not done consistently. For example, Headquarters managers told OIG that transfers to lower-level facilities are not counted as training failures. However, several facility managers told OIG they classify all candidates who fail to make satisfactory training progress as training failures, regardless of whether they transfer or are terminated from FAA. Developing a clear and consistent definition is important since OIG found that many new controllers left training for reasons other than failing (e.g., medical disqualification, compensation issues, and misconduct).
Fourth, OIG found that FAA’s data in the NTD, its primary source for tracking training progress and failures, were incomplete, inaccurate, or understated. For example, OIG compared data from the NTD with source data from 30 facilities. Seven of the 30 provided data that differed, often significantly, from what OIG found in the NTD for the same time periods. One reason for these discrepancies is that FAA does not uniformly enforce requirements for facilities to enter data into the NTD. As a result, OIG found several facilities were not entering data on a timely basis and some not at all. For example, one large terminal location failed to enter any data into the NTD for 3 non-consecutive months.
With the influx of new controllers planned through 2017, FAA must ensure that facilities and Headquarters properly collect and categorize training failure data so that the Agency has accurate information to effectively, efficiently, and expediently manage its efforts to train the next generation of air traffic controllers. This will be particularly important as FAA begins implementing its new controller training contract, the Air Traffic Control Optimal Training Solution (ATCOTS), a $900 million contract with Raytheon Corporation that will manage controller training throughout the Agency, including new controller training at the Academy and at air traffic facilities. FAA will need accurate data on training failures in order to assess the performance of the contractor.
OIG also found that it was premature to make conclusions regarding trends and potential root causes of training failures. This was because, at the time of the review, FAA was still in the early stages of its hiring and training efforts. OIG briefed Chairman Costello’s staff on this issue at the conclusion of the review. Based on those discussions, the Chairman requested that OIG review FAA’s process for selecting and placing newly hired controllers. OIG began that review in November 2008.
OIG visited 7 facilities during the early phase of the audit and subsequently chose a statistical sample of 20 facilities (Level 10-12) to project a rate of training failures among newly hired controllers. FAA Air Traffic facilities are organized by complexity into levels (Level 4 through 12, least complex to most complex) based on traffic levels, airspace configurations, and other related factors.
OIG identified a series of factors that could indicate trends and potential root causes of training failures. Those include: (1) facility type (en route versus terminal and facility complexity and level), (2) time or stage in training prior to failure, (3) hiring source and previous experience, (4) Air Traffic Selection and Training (AT-SAT) test scores, and (5) Academy Performance Verification (PV) scores. Although at the time of OIG's review there were not sufficient data on these factors to identify trends or determine conclusive causes, aggregated data on these types of factors will become increasingly important as FAA increases its hiring efforts in coming years. As more newly hired controllers enter the system over the next several years, FAA should monitor these types of factors to identify trends and potential root causes of training failures. This will also be an important management tool for FAA to assess the performance of the ATCOTS contractor.
OIG's recommendations to FAA focus on developing procedures to obtain accurate data on training failures in order to effectively monitor and improve its controller training program. FAA concurred with OIG's recommendations, which are listed on page 12.
FINDINGS: FAA’s Reported Training Failure Rate for Newly Hired Controllers Is Not Accurate. According to FAA, the training attrition rate for all controllers in FY 2007 was approximately 7 percent. However, OIG found that this rate does not reflect actual training failures among newly hired controllers for four reasons: (1) FAA does not track training failures among the new controllers as a separate metric; (2) FAA’s current rate is based on FY 2007 training failures and will likely increase as FAA hires more new controllers between FY 2009 and FY 2017; (3) FAA does not have a uniform definition of training failures and other types of attrition; and (4) FAA’s training failure data in the NTD were incomplete, inaccurate, or understated.
FAA Does Not Track Training Failures Among Newly Hired Controllers as a Separate Metric
Although FAA tracks training failures for all controllers, this rate includes both newly hired air traffic controllers and transferring veteran controllers who are training on the airspace of their new facility. Because FAA does not track training failures among newly hired controllers as a separate metric, OIG collected data (for the period FY 2005 through mid-2008) on training failures for newly hired controllers who had failed training at the FAA Academy and at 20 randomly selected en route and terminal facilities. Specifically, OIG's sample found the following:
Between the beginning of FY 2005 and June 2008, 3,177 newly hired controller candidates received training at the FAA Academy. Of these, only 98 candidates (3.1 percent) failed Academy training.
At 20 randomly selected (Level 10 and above) air traffic control facilities, between the beginning of FY 2005 and April 2008, 944 newly hired controllers were in facility training. Of these, only 48 (about 5.1 percent) failed training. Based on this sample, OIG projected (with a 95-percent confidence level) that the failure rate at Level 10 and above facilities is between 2.4 percent and 7.7 percent, with a best estimate of 5 percent.
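As a rough consistency check on those figures, the sketch below computes a simple binomial confidence interval around the 48-of-944 point estimate. This is an illustrative approximation only, not OIG's estimator: the report's wider 2.4 to 7.7 percent band reflects a facility-level (clustered) sample design that a plain binomial model ignores.

```python
import math

# Sample figures from the OIG report: 48 training failures among
# 944 newly hired controllers at 20 sampled Level 10+ facilities.
failures, n = 48, 944
p = failures / n  # point estimate, about 5.1 percent

# Naive normal-approximation (Wald) 95% interval. OIG's published
# 2.4-7.7 percent band is wider because it accounts for the
# facility-level sample design; this check only confirms the
# point estimate sits sensibly inside both intervals.
se = math.sqrt(p * (1 - p) / n)
lo, hi = p - 1.96 * se, p + 1.96 * se

print(f"point estimate: {p:.1%}")
print(f"naive 95% interval: {lo:.1%} to {hi:.1%}")
```

The naive interval (roughly 3.7 to 6.5 percent) is narrower than OIG's projection, which is what one would expect once clustering by facility is taken into account.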
FAA Does Not Clearly Define Controller Training Failures or Other Types of Attrition
OIG found that FAA does not uniformly apply definitions for training failures and other forms of non-failure training attrition. Two Agency documents, FAA Order 3120.4L and Employment Policy 1.14, prescribe procedures for new controller training and for the disposition of training failures. In addition, the NTD provides users with several categories by which controllers who do not complete facility training can be classified.
However, during interviews with FAA personnel, OIG found they had differing definitions of what constitutes a training failure. For example, FAA’s Director of Technical Training & Development stated that, “transfers from higher- to lower-level facilities are not counted as training failures.” Conversely, managers at the Potomac Terminal Radar Approach Control facility (TRACON) told OIG that they classify all candidates who fail to make satisfactory training progress as training failures, regardless of whether they are transferred to lower-level facilities or terminated from FAA.
This has led to inconsistencies in how FAA facilities classify training failures and other forms of attrition in the NTD. For instance, during the initial phase of the review, the NTD showed that since FY 2005 only one controller at the Atlanta Center had been classified as a training failure. However, three additional controllers had failed training at the facility and were allowed to transfer to other facilities. The facility had incorrectly classified these three controllers as “Transferred Prior to Completion of Training” in the NTD.
Developing a clearer definition of training failures and other types of attrition is important since some new controllers did not complete training for reasons other than failure. While this type of attrition was low at the FAA Academy, it was nearly double the failure rate (of 5.1 percent) at sampled air traffic facilities. The subjectivity of these classifications and the variety of other attrition reasons make the reported numbers suspect and underscore the need for a consistent definition.
At the FAA Academy, of the 3,177 newly hired controllers, only 37 controllers, or 1.2 percent, did not complete training for other reasons. However, of the 944 newly hired controllers who were in training at 20 randomly selected air traffic control facilities, 89 controllers, or about 9.4 percent, did not complete training for other reasons. Of these 89 controllers:
21 (about 24 percent) resigned for unspecified personal reasons.
19 (about 21 percent) resigned due to candidate hardships.
12 (about 13 percent) resigned due to compensation issues.
6 (about 7 percent) resigned due to retirement and medical issues.
31 (about 35 percent) left for other various reasons including career change, misconduct, and security issues.
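Those five categories account for all 89 cases. A quick arithmetic check of the reported breakdown (figures as stated above, percentages rounded to whole numbers):

```python
# Non-failure attrition at the 20 sampled facilities, per the report.
breakdown = {
    "personal reasons": 21,
    "candidate hardships": 19,
    "compensation issues": 12,
    "retirement and medical issues": 6,
    "other (career change, misconduct, security)": 31,
}
total = sum(breakdown.values())  # should equal the 89 reported cases

for reason, count in breakdown.items():
    print(f"{reason}: {count} ({count / total:.0%})")
print(f"total: {total}")
```

Each rounded share matches the article's "about X percent" figures, and the counts sum to the stated 89 controllers.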
FAA’s Primary Source of Training Failure Data, the NTD, Contains Outdated or Inaccurate Data
OIG found that while FAA does have a database, the NTD, in place to track the training progress of newly hired controllers, the database contained inaccurate information. The NTD is the main system used by FAA to track the progress of developmental controllers at air traffic facilities. Originally designed to track the training time for developmental controllers, the database was expanded to capture data relating to the various categories of training attrition, including training failures.
In the case of Miami Center, OIG found three different sets of figures for training failures and other attrition among newly hired controllers. NTD figures pulled at FAA Headquarters on February 3, 2008, showed 13 cases; a list compiled by the facility on February 14, 2008, showed 21 cases; and information pulled from the NTD on February 14, 2008, showed 19 cases.
OIG also found that some facilities were not entering any information into the NTD. The NTD produces a report showing which facilities have not entered their data. A February 2008 report showed several facilities were not up to date in entering information into the NTD. One facility, the Southern California TRACON, had yet to enter data for September, October, or December of 2007. When asked about the discrepancy, officials at the TRACON noted that they were more concerned with training their controllers than with entering information into the NTD.
As a result of these data discrepancies, FAA cannot ensure the accuracy of the information it provides to Congress and other stakeholders regarding controller training. As such, current issues with the NTD, combined with the collection and categorization issues OIG identified, raise questions regarding the accuracy of FAA’s training attrition statistics.
At the Time of OIG's Review, There Were Insufficient Data To Identify Root Causes of Training Failures Among Newly Hired Controllers
To address the Chairman’s request regarding possible root causes, OIG examined a series of factors that could indicate potential trends or root causes of training failures. The factors OIG identified were: (1) facility type (en route versus terminal and facility level), (2) time or stage in training prior to failure, (3) hiring source and previous experience, (4) Air Traffic Selection and Training (AT-SAT) test scores, and (5) Academy Performance Verification (PV) Scores.
OIG found that it was premature to make conclusions regarding trends and potential root causes of training failures. This was because, at the time of the review, FAA was still in the early stages of its hiring and training efforts. For example, about 62 percent of the 944 new hires from the 20-facility sample were still in training at the time of analysis, and only 48 new controllers had failed training at sampled facilities. Additionally, there were not enough data on those factors to identify trends or determine conclusive causes of training failures.
OIG briefed Chairman Costello’s staff on this issue at the conclusion of their review. Based on those discussions, the Chairman requested that OIG review FAA’s process for selecting and placing newly hired controllers.
Time or Stage of Training
OIG also reviewed data on training stages to determine if there was a common stage during training in which most training failures occurred. Overall, in their sample of air traffic facilities, OIG found that the average time newly hired controllers spent in training prior to failure was 1.86 years. In addition, 37 of the en route candidates in the sample failed during the radar portion of on-the-job training (OJT). During OJT, each en route candidate must certify on three operational positions that they will control when fully certified: (1) flight data (primarily responsible for entering flight data into the Agency’s computer systems); (2) radar associate (assists the radar controller in planning, organizing, and expediting the flow of traffic); and (3) radar position (initiates control instructions, monitors and operates radios, and has responsibility for separation between aircraft).
Hiring Source and Previous Experience
OIG also reviewed data on hiring sources to determine if a candidate’s background was indicative of their ability to complete training. FAA draws newly hired controllers from three main sources:
Previous controllers: Individuals who have previous air traffic control experience with the Department of Defense (DOD) (civilian or military) or FAA.
Collegiate Training Initiative (CTI) Program: Individuals who have successfully completed an aviation-related program of study from a school in FAA’s CTI program.
General Public: Individuals who have limited or no aviation knowledge or experience and apply for positions via vacancy announcements.
Of the 944 developmental controllers reviewed in the sample, 890 controllers, or about 94 percent, came from the CTI or ex-military hiring pools. Consequently, of the 48 total training failures in the sample, 25 candidates (52 percent) came from CTI schools, and 23 candidates (48 percent) were ex-military or civilian DOD controllers. As part of OIG's ongoing audit of FAA’s process for selecting and placing newly hired controllers, OIG is reviewing current data on FAA hiring sources and the type and level of facility where newly hired controllers from those sources are placed. Data on hiring sources will become increasingly important as FAA has recently begun hiring new controllers from the general public.
Air Traffic Selection and Training Test Scores
OIG reviewed data on AT-SAT test results to determine if a candidate’s score was indicative of their ability to complete training. While FAA has cited the AT-SAT test as its main tool for screening potential controllers, only a small number of training failures had actually taken the test. Of the 48 controller candidates in the sample who failed during facility training, OIG received only 5 corresponding AT-SAT scores from the FAA Academy. Of the 5 candidate scores received, 4 candidates qualified with scores between 70 and 84, while the remaining candidate was classified as “well qualified” with a score of 85 or above.
As for the remaining 43 training failures:
22 candidates (51 percent) had previous military controller experience and, per FAA’s guidelines, were exempt from taking the AT-SAT test.
14 CTI candidates (33 percent) lacked AT-SAT scores; rather, they had Office of Personnel Management test scores that were used as their qualifying scores. Most of these candidates were hired in FY 2005 or earlier.
The FAA Aeronautical Center Human Resource Management Office and the Civil Aerospace Medical Institute were unable to locate AT-SAT scores for the remaining seven training failures in the sample (16 percent).
As part of its ongoing audit of FAA’s process for selecting and placing newly hired controllers, OIG is reviewing current data on AT-SAT scores and the type and level of facility at which newly hired controllers are placed to determine if there is any correlation.
Academy Performance Verification Scores
OIG also reviewed Academy PV scores to determine if a candidate’s score was indicative of their ability to complete training. Student performance is verified at the end of the Academy training program with a PV test to ensure that developmental controllers are ready to proceed to facility training. FAA’s Air Traffic Controller Training and Development Group (AJL-11) was able to provide PV scores for only 21 of the 48 facility training failures in the sample. Only six candidates failed the first PV, with one student failing two PVs. This student was given a third chance, which he successfully completed, and was then allowed to proceed to facility training.
As with AT-SAT scores, some candidates were not required to take the PV test—20 of the 48 facility training failures in the sample population (42 percent) were ex-military controllers and were not required to attend the FAA Academy. Consequently, they had no PV results. Additionally, the PV results for seven CTI candidates (15 percent) could not be located by FAA Academy officials.
While FAA is still in the early stages of replacing its controller workforce, it is important that the Agency begin improving how it collects and categorizes data on newly hired controllers. As large numbers of new controllers enter training in the next several years, this information will be critical for FAA, Congress, and other stakeholders to (1) determine FAA’s progress toward addressing attrition, (2) assess training problems at individual facilities, (3) identify needed changes to the overall training program, and (4) monitor the performance of the ATCOTS contractor.
OIG recommends that FAA:
1. Develop and implement procedures for tracking training failure rates of newly hired and transferring veteran air traffic controllers separately.
2. Develop a comprehensive and uniform definition of controller training failures and other types of attrition and ensure that the definition is consistently applied at FAA Headquarters, the FAA Academy, and air traffic facilities.
3. Enforce procedures requiring facilities to enter training data into the National Training Database accurately and in a timely fashion.
4. Develop a process for conducting periodic reviews to determine if facilities are complying with National Training Database data entry requirements.
5. Identify factors that could indicate trends or potential root causes of training failures and require FAA and contractor officials to retain sufficient data on those factors to assess the performance of the ATCOTS contractor. Factors should include data on: (a) facility type (en route versus terminal and facility level), (b) time or stage in training prior to failure, (c) hiring source and previous experience, (d) Air Traffic Selection and Training (AT-SAT) test scores, and (e) Academy Performance Verification (PV) Scores.