Borrowing an Idea from Poverty Literature to Explain SAEs
In Ruby Payne's (2003) book A Framework for Understanding Poverty, the concept of "hidden rules" among social classes is introduced. The concept is simple, yet has far-reaching implications. Those from lower socioeconomic classes struggle in school and society partially because they do not understand the "rules" of how to act or interact in a middle-class society. These hidden rules are common knowledge among the middle and upper socioeconomic classes; hence their opportunities for success increase compared to those who do not know the rules. Let's transfer that concept to SAEs. Are there hidden rules out there that define success for SAEs that some know but others do not? If so, what are those rules, who created them, and why do we continue to follow them? We will examine how SAEs are defined through a recent research study that sought to establish quality indicators for SAE. Next, we'll discuss what some of these hidden rules are and end with some conversations the profession should have if we want SAEs to be a prominent component in our secondary classroom model.
The Research Study
Over the past several years, the 10 x 15 initiative has really made us scratch our heads - not only about how to build toward 10,000 quality programs, but about what "quality" actually means. In an effort to figure that out, we decided to look at how experts in the field of agricultural education defined quality in instruction, FFA and SAE (Jenkins & Kitchel, 2009). The article we are citing here focused on FFA and SAE. The findings were interesting: the panel of experts could agree on 19 quality indicators of FFA, but only 6 quality indicators of SAE. The panel had a little over 40 indicators to review for both FFA and SAE.
The research process aimed to define quality indicators of FFA and SAE using the Delphi method, which seeks to develop consensus, or full agreement, among a group of experts. In the first round, we asked experts (agriculture teachers, state staff and teacher educators, all of whom had national prominence) to list quality indicators of FFA and SAE. In round two, the responses from round one were compiled and experts were asked to state their level of agreement with each item being accepted as a quality indicator. All responding panelists had to agree at some level (a "4" meaning "agree" or "5" meaning "strongly agree" on a 5-point scale) that the item should be a quality indicator. Items with less than 75% agreement (fewer than 75% of the experts marked a 4 or 5) were discarded. Items with 100% agreement were considered quality indicators. The remaining items were taken to round three. In this round, participants were informed of the mean agreement score for each item (again, on a 5-point scale). This helped the participants understand how others thought about that particular item. The question was then simplified to a yes-or-no answer: should this item be considered a quality indicator? Again, those items with 100% agreement (in this case, all panelists marking "yes") were added to the list of quality indicators.
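For readers who think procedurally, the round-two filtering rules described above can be sketched in a few lines of code. This is only an illustration: the item names and rating values below are hypothetical (two item names are drawn from the indicators reported later in this article), not the panel's actual data.

```python
# A minimal sketch of the round-two Delphi filtering rules.
# Each item maps to hypothetical panelist ratings on a 5-point scale,
# where 4 = "agree" and 5 = "strongly agree".
ratings = {
    "Student has up-to-date records on SAE": [5, 5, 4, 5, 4],  # unanimous
    "SAEs involve goal-setting":             [4, 5, 5, 4, 5],  # unanimous
    "SAE generates income":                  [3, 4, 2, 4, 3],  # under 75%
    "SAE is production-based":               [4, 4, 5, 3, 5],  # 80% agree
}

accepted, round_three, discarded = [], [], []
for item, scores in ratings.items():
    # Share of panelists marking a 4 or 5 for this item.
    share_agreeing = sum(s >= 4 for s in scores) / len(scores)
    if share_agreeing == 1.0:        # 100% agreement: quality indicator
        accepted.append(item)
    elif share_agreeing >= 0.75:     # 75-99%: carried to round three
        round_three.append(item)
    else:                            # below 75%: discarded
        discarded.append(item)

print(accepted)     # the two unanimous items
print(round_three)  # ['SAE is production-based']
print(discarded)    # ['SAE generates income']
```

Round three then reduces each surviving item to a yes-or-no vote, with the same 100%-agreement bar for acceptance.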
From that process, the following items reached consensus among the panelists and were identified as quality indicators of SAE:
* SAEs are assisted (e.g. in the planning process) by instructors, parents, employers and other partners
* Student is satisfied with SAE
* Teacher has supervision time for SAE
* Student has up-to-date records on SAE
* SAEs involve goal-setting
* A diversity/variety of SAE types is promoted
Again, the panel could agree on 19 quality indicators of FFA, but only 6 quality indicators of SAE. Only 6 of the 46 items proposed by panelists (13.04%) reached consensus and were accepted as SAE quality indicators. The question is - so what? There are many questions about why the profession seems clear on quality indicators of FFA, yet disjointed when it comes to indicators of SAE. …