Opinion on ETS Report on "Cut
Homeschoolers in Georgia have reason to be concerned about the recommendations in the ETS report. In particular, it is not unreasonable to question both the methodology used to develop the recommendations and the results themselves.

With regard to the methodology:
- The most disturbing aspect is that there is no empirical validation of the "equivalent scores," nor is there any announced plan on the part of either ETS or the Board of Regents to do any such validation.
- Homeschoolers have good reason to question the impartiality and freedom from bias of a group of evaluators whose largest representative organization (GAE) has made clear its dislike of homeschooling. Moreover, these teachers were asked, in effect, to evaluate their own success in teaching the material--the better a "C" student "would have done" on the test, the better their performance as teachers.
- Finally, the lack of any input from the homeschool community in either the process or its execution is also troubling, given the wealth of data indicating that the homeschooling community has an excellent track record in education.

With regard to the results, quantitative data indicates that the ETS recommendations are consistent neither with the stated goals of the study nor with the stated admissions requirements for the University System of Georgia.
The Angoff method, as used in this case, addresses two groups of students: one, a group of "minimally qualified" graduates of accredited high schools in Georgia; the other, the group of students taking the SAT II subject tests. The data used for the analysis presented here comes from two sources:
- for the group of minimally qualified students, the source of data is the group of panelists convened by the Board of Regents and ETS, who were asked to imagine how these students "would have performed" had they actually taken the SAT II subject test, as reported by ETS to the Board of Regents;
- for the group of students who actually took the SAT II subject tests, the distribution of scores achieved by all SAT II test takers, as reported by ETS and the College Board for the 1996-97 administrations.

To understand the basic capabilities of these two groups of students, consider how they perform on the SAT I test--widely accepted as the most reliable single-factor predictor of college success. As defined by the Board of Regents in the instructions to the panelists, the "minimally qualified" student scores 430 on the verbal part of the SAT I and 400 on the math part, for a composite score of 830. An 830 composite corresponds to the 19th percentile, meaning that only 19% of all test takers score lower and 81% score higher. For the group of students who took the SAT II subject tests, the College Board provides data on how they performed on the SAT I. For students who took at least one SAT II test, the median composite score was 1195, or roughly the 79th percentile of all SAT I test takers; only 21% of test takers scored better. In other words, 60 percentile points separate the imagined "minimally qualified" student from the average student taking at least one SAT II test. As Dr. Fullerton told the House Education Committee on July 29, "Only very, very good students take the SAT II tests." To appreciate just how large this gap is: if a student took the SAT I and achieved an 830 composite, and then took it again and achieved an 1180 composite, ETS would automatically investigate the second score as potentially fraudulent.
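The percentile arithmetic above can be sketched in a few lines. The normal-distribution approximation, and the mean and standard deviation fitted from it, are assumptions made purely for this illustration; they are not figures from the ETS report, which quotes only the two score/percentile pairs used below.

```python
from statistics import NormalDist

# Two (score, percentile) points quoted in the text for the 1996-97
# SAT I composite distribution:
#   830  -> 19th percentile (the "minimally qualified" student)
#   1195 -> 79th percentile (median for students taking an SAT II test)
low_score, low_pct = 830, 0.19
med_score, med_pct = 1195, 0.79

# The 60-percentile-point gap cited in the text.
gap = round((med_pct - low_pct) * 100)

# Purely as an illustration, assume composite scores are roughly
# normally distributed and solve for the mean and standard deviation
# implied by the two quoted percentiles.
std_normal = NormalDist()
z_low = std_normal.inv_cdf(low_pct)   # about -0.88
z_med = std_normal.inv_cdf(med_pct)   # about +0.81
sd = (med_score - low_score) / (z_med - z_low)
mean = low_score - z_low * sd

print(gap)                     # 60
print(round(mean), round(sd))  # roughly 1020 and 217
```

Under this (assumed) normal fit, the 365-point score gap between the two groups amounts to roughly 1.7 standard deviations of the overall test-taking population.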
Keeping in mind the fundamental differences in ability between these two groups of students, and the fact that the "minimally qualified" student achieved only a "C" average in the college prep curriculum (and might have scored below a C in some courses), consider the ETS-recommended passing scores. (For some tests the report lists two scores without explaining how to reconcile them, so an analysis is given for each score.)
- English writing: 550. The minimally qualified student would have scored better than 42% of the "very, very good students" who took this subject test.
- American history: 590. The minimally qualified student would have scored better than 49% of the "very, very good students" who took this subject test.
- Math IC: 550 (510). The minimally qualified student would have scored better than 39% (25%) of the "very, very good students" who took this subject test.
- Biology: 510 (460). The minimally qualified student would have scored better than 20% (9%) of the "very, very good students" who took this subject test.
- Chemistry: 570. The minimally qualified student would have scored better than 36% of the "very, very good students" who took this subject test.
- Physics: 650. The minimally qualified student would have scored better than 57% of the "very, very good students" who took this subject test.
The ETS recommendations simply do not pass the test of common sense. It is ludicrous to suggest that a minimally qualified student, as defined by the Board of Regents, could outperform almost half the "very, very good students" who take the SAT II in American history, or 57% of those who take the SAT II in Physics. The recommendations for English writing and Chemistry are no more believable.
The staff of the Board of Regents has chosen to ignore the obvious lack of credibility in its plan to use the ETS recommendations. However, it has not escaped the staff's notice that the recommended passing scores are likely to be unachievable for large numbers of students, so they have decided to "adjust" the passing scores downward--for the time being, by the completely arbitrary margin of one standard error. As Dr. Fullerton stated to the Committee on July 29, the Board has no plans to validate any of these recommendations using actual test performance data.
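The size of the proposed one-standard-error adjustment can be sketched as follows. ETS's actual standard-error-of-measurement (SEM) figures are not quoted here, so the 30-point SEM used below is a hypothetical placeholder, not a figure from the report.

```python
# Hypothetical SEM of 30 scaled-score points, assumed for illustration
# only; the actual ETS figures are not quoted in the report discussed here.
SEM = 30

# The primary ETS-recommended passing scores listed above.
recommended = {
    "English writing": 550,
    "American history": 590,
    "Math IC": 550,
    "Biology": 510,
    "Chemistry": 570,
    "Physics": 650,
}

# The Regents' proposed adjustment: lower each cutoff by one SEM.
adjusted = {test: score - SEM for test, score in recommended.items()}

for test in recommended:
    print(f"{test}: {recommended[test]} -> {adjusted[test]}")
```

Whatever SEM value is ultimately used, the adjustment shifts every cutoff by the same fixed margin; it does nothing to connect the cutoffs to actual student performance data.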
Even if it were possible to establish a "grade equivalent" score--which it is not--the Board has implemented its policy for homeschoolers on a "minimum" rather than "average" basis. In other words, while graduates of accredited programs need achieve only an average grade at or above the standard, homeschoolers would have to achieve every grade at or above the standard. Thus, homeschoolers are treated differently, and the standard for them is more stringent.
The Board's process for establishing a "standard" is arbitrary. It has not been validated, nor is there any intention that it be. It most certainly is not fair to homeschooled children. The homeschooling community has had no role in establishing the policy; homeschoolers have not been informed about the policy in a timely way; nor have they been given any meaningful way to present arguments opposing the policy or suggesting improvements. Dr. Fullerton informed HEIR that the Board of Regents does not hold public hearings and does not confer with special interest groups. Although HEIR has been in contact with Dr. Fullerton's office on an almost weekly basis since late March, his statements to the Committee on July 29 regarding the adjustment to the ETS recommendations, and the change in requirements for the language test (not an SAT II but some other test, yet to be specified), came as a complete surprise.
Whether or not the SAT II subject tests are an appropriate mechanism for
making admissions decisions for homeschoolers, the evidence seems to indicate
the ETS-recommended passing scores are, to put it mildly, unfair. They set the
bar much higher for homeschoolers than it is set for other students in Georgia.
If the Board of Regents implements a policy based on "grade equivalent" scores developed through the Angoff method, it will, without question, discriminate against homeschoolers in admissions decisions.
This opinion was contributed by Leon McGinnis.
posted: 7/28/97, updated 8/04/97
© 1997 HEIR