Improving Retention and Fit by Honing an Honors Admissions Model

Article excerpt

For over a century, admissions officers and enrollment managers have relied on external validation of merit in selective admission of undergraduates. A main criterion used for selection is standardized testing, i.e., the SAT and ACT. Because these tests have long been suspected, and have since been shown, to contain class and race biases while failing to predict retention accurately (Banerji), the Schedler Honors College at the University of Central Arkansas (UCA) shifted to a holistic, multi-criterion selection process that de-emphasized standardized tests, and then analyzed the outcomes. The statistical analysis served two goals. The first was to test whether variables in the admissions model, developed in 2007, predicted retention; the results led to changes in the weighting of variables for a revised rubric that we have used since 2010. The second goal was to improve enrollment of a more racially diverse population of students. Our findings demonstrated that most variables used in typical higher education admissions protocols did not accurately predict retention in the Schedler Honors College at UCA. Only one variable correlated with retention in honors: high school grade point average (hsGPA). By increasing the weight of hsGPA in the revamped selection rubric, UCA was able to increase both retention rates and the diversity of incoming students.

Although the ACT and the SAT are widely accepted as indicators of college success by enrollment managers, the College Board states that standardized tests predict only 42% of academic success within the first year of college (Chenowith). Colleges nevertheless continue to base admissions and scholarship decisions on tests with such limited predictive power. Gilroy claims that the ACT and SAT are among the few ways that colleges can compare students from all over the world on a predetermined scale in a cost-effective manner. A key fallacy in this logic, however, is that the SAT and ACT were not meant to be used interchangeably (Syverson). The two tests measure different characteristics in students: the ACT measures mastery of basic high school material, while the SAT tests abstract and critical thinking skills (Syverson).

Because of these inconsistencies, as well as concerns about bias in standardized tests, more than 800 institutions (including Texas Tech University, Central Bible College, Cambridge College, Texas Women's College, University of Arizona, and University of Memphis) have chosen to be test-free institutions, meaning that these colleges do not use the SAT or ACT in their admissions decisions (FairTest). Preliminary research conducted on institutions not using standardized testing has demonstrated that their selection methods have been just as effective (Banerji).

Using grade point average and class rank for selective admissions has its own problems: methodologies used to calculate hsGPA vary from school to school; neither grade point average nor class rank is standardized (Sadler et al.); and the scale for reporting hsGPA varies, with some high schools refusing to report class rank altogether. If high schools do not rank students, then the university bears the burden of interpreting hsGPA in context (Sadler et al.).

Honors programs and colleges with selective admissions typically rely on criteria used more generally in higher education, including standardized tests, despite the fact that honors education in the United States started as a reaction to excessive standardization. Frank Aydelotte, president of Swarthmore College, noticed that the education system was not challenging its top students. A former Rhodes Scholar familiar with the Oxford methodology, he drew on it to found the first American honors program at Swarthmore in 1922 (Rinn). Honors programs have broadened teaching and learning practices since then, largely because of shared information among participants in the National Collegiate Honors Council (NCHC). …
