Chapter 5. Assessment
Maneuver or Procedure “Grades”
For example, a student can describe a landing and can tell the flight instructor about the physical characteristics and appearance of the landing. On a good day, with the wind straight down the runway, the student may be able to practice landings with some success while still functioning at the rote level of learning. However, on a gusty crosswind day the student needs a deeper level of understanding to adapt to the different conditions. If a student can explain all the basic physics associated with lift/drag and crosswind correction, he or she is more likely to practice successfully and eventually perform a landing under a wide variety of conditions.
Single-Pilot Resource Management (SRM) “Grades”
In SRM, the student may be able to describe basic SRM principles during the first flight. Later, he or she is able to explain how SRM applies to different scenarios presented on the ground and in the air. When the student actually begins to make quality decisions based on good SRM techniques, he or she earns a grade of manage-decide. The advantage of this type of grading is that both the flight instructor and the student know exactly how far the student's learning has progressed.
Let’s look at how the rubric in Figure 5-4 might be used in the flight training scenario at the beginning of this chapter. During the postflight debriefing, CFI Linda asks her student, Brian, to assess his performance for the day, using the Replay – Reconstruct – Reflect – Redirect guided discussion questions described in the Collaborative Assessment subsection. Based on this assessment, she and Brian discuss where Brian’s performance falls in the rubrics for maneuvers/procedures and SRM. This part of the assessment may be discussed verbally or, alternatively, Brian and Linda may separately complete an assessment sheet for each element of the flight.
When Brian studies the sheet, he finds “Describe, Explain, Practice, and Perform.” He decides he was at the “Perform” level since he had not made any mistakes. The flight scenario had been a two-leg Instrument Flight Rules (IFR) scenario to a busy Class B airport about 60 miles to the east. Brian felt he had done well in keeping up with programming the GPS and MFD until he reached the approach phase. He had attempted to program the Instrument Landing System (ILS) for runway 7L and had actually flown part of the approach until air traffic control (ATC) asked him to execute a missed approach.
When he compares the sheet he has completed to Linda’s version, Brian discovers that most of their assessments appear to match. An exception is the item labeled “programming the approach.” Here, where he had rated the item as “Perform,” Linda had rated it as “Explain.” During the ensuing discussion, Brian realizes that he had selected the correct approach, but he had not activated it. Before Linda could intervene, traffic dictated a go-around. Her “explain” designation tells Brian that he did not really understand how the GPS worked, and he agrees.
This approach to assessment has several key advantages. One is that it actively involves the student in the assessment process and establishes the habit of healthy reflection and self-assessment that is critical to being a safe pilot. Another is that these grades are not tied to self-esteem, since they do not describe a recognized level of prestige (such as A+ or “Outstanding”), but rather a level of performance. The student cannot flunk a lesson. Instead, he or she can only fail to demonstrate a given level of flight and SRM skills.
Both instructors and students may initially be reluctant to use this method of assessment. Instructors may think it requires more time, when in fact it is merely a more structured, effective, and collaborative version of a traditional postflight critique. Also, instructors who learned in the more traditional assessment structure must be careful not to equate or force the dimensions of the rubric into the traditional grading mold of A through F. One way to avoid this temptation is to remember that evaluation should be progressive: the student should achieve a new level of learning during each lesson. For example, in flight one, the automation management area might be a “describe” item. By flight three, it is a “practice” item, and by flight five, it is a “manage-decide” item.
The student may be reluctant to self-assess if he or she has not had the chance to participate in such a process before. Therefore, the instructor may need to teach the student how to become an active participant in the collaborative assessment.