Gene Gotwalt anticipated his ECON 101 students would show better-than-average improvement on the Test of Understanding of College Economics at the completion of the fall semester. But the bump from the 52nd to 82nd national percentile in their pre- and post-course scores turned the assistant professor’s expectant satisfaction into hard-to-suppress delight.
Gotwalt administered the nationally recognized TUCE test, which colleges and universities across the country have relied on as an assessment tool for 40 years, to measure the effectiveness of the department’s Principles of Microeconomics course. In its fourth edition, the TUCE microeconomics test consists of 30 multiple-choice questions. The same test is given on the first and last days of the class.
As an assessment, the national percentile rankings on the pre- and post-tests are compared. If the school’s percentile remains unchanged, its students improved their scores roughly as much as the national sample did on average (3,255 students took both the pre- and post-test). Ranking in a higher percentile on the post-test, however, means the program did a better job of educating students in economics than the national average.
Twenty-four Sweet Briar students completed the test both times. On the pre-test, Gotwalt’s students averaged 8.7 correct answers. This result was below the national mean of 9.39, but above the median, placing Sweet Briar students in the 52nd percentile.
To remain in the same percentile, the students needed a mean score of 12 on the post-test. The mean score on the post-test was 16.61, which ranked the students in the 82nd percentile nationally.
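The comparison described above can be sketched in a few lines of Python. The national sample here is a small made-up placeholder, not actual TUCE norming data, and the helper function is purely illustrative; only the logic — locating a mean score within a national distribution before and after the course — mirrors the article.

```python
from bisect import bisect_left

def percentile_rank(score, national_scores):
    """Percent of the national sample scoring strictly below `score`."""
    ordered = sorted(national_scores)
    below = bisect_left(ordered, score)  # count of scores below `score`
    return 100.0 * below / len(ordered)

# Hypothetical national samples -- illustrative only, not TUCE data.
national_pre = [7, 8, 9, 9, 10, 10, 11, 12, 13, 14]
national_post = [10, 11, 12, 12, 13, 13, 14, 15, 16, 17]

pre_rank = percentile_rank(8.7, national_pre)     # class mean on the pre-test
post_rank = percentile_rank(16.61, national_post)  # class mean on the post-test

# A post-test rank above the pre-test rank suggests above-average gains.
improved_vs_national = post_rank > pre_rank
```

The key point is that the two percentiles are computed against different distributions (pre-test norms vs. post-test norms), so an unchanged rank means gains in line with the national average, and a higher rank means larger-than-average gains.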
“The scores were close to one standard deviation from the mean, and with a sufficient sample size to have confidence in the results …,” Gotwalt wrote in a summary of the results.
“On the pre-test, Sweet Briar students ranked below the mean; on the post-test, they far exceeded the national average. There is real information contained in the jump in the percentile ranking. Sweet Briar does far better in educating students than the majority of the competitors.”
Gotwalt reported that Sweet Briar students posted a higher mean than students in each of the following national categories: female students (13.37), white students (14.76) and students at other baccalaureate colleges (13.15). They also outperformed the highest-ranking subgroups: students at doctoral/research universities (14.76), older students (15.18) and male students (14.77).
Reflecting on the results during a later conversation, he noted that what makes them so pleasing is that TUCE is an objective measure of how well the College is teaching ECON 101. “I thought the students were learning and this was just someone outside the College saying it.”
Dean of the College Jonathan Green agreed with that point of view and noted that because ECON 101 is an introductory class, a number of the students would have been taking it to fulfill general education requirements.
“The TUCE is a standard instrument that has been in use for decades. As with many standardized tests, I am not surprised that our students performed as well as they did,” Green said. “These kinds of results should attract students interested in any major. It is a testament to the quality of instruction at the College.”
Gotwalt had expected significant upward movement in the Sweet Briar students’ percentile ranking, but the 30-point climb still surprised him a little. At some point, Gotwalt said, it struck him that “This was really good.”