Multidimensional Scaling, Hierarchical Cluster Analysis, Exploratory Factor Analysis, Item Response Theory, Differential Item Functioning, Multiple Linear Regression, Placement Testing, STEM Education, Gifted Education, Mathematics Education, Psychological Tests
Educational institutions at all levels must justify their use of placement testing and confront questions about its impact on students’ educational outcomes, assuring all stakeholders that students are enrolled in courses appropriate to their ability in order to maximize their chances of success (Linn, 1994; Mattern & Packman, 2009; McFate & Olmsted III, 1999; Norman, Medhanie, Harwell, Anderson, & Post, 2011; Wiggins, 1989). The aims of this research were to (1) provide evidence of Content Validity, (2) provide evidence of Construct Validity and Internal Consistency Reliability, (3) examine the item characteristics and potential bias of the items between males and females, and (4) provide evidence of Criterion-Related Validity by investigating the ability of the mathematics placement test scores to predict future performance in an initial mathematics course.
Students’ admissions portfolios and scores from the mathematics placement test were used to address the aims of this research. Content Validity was evidenced through a card-sorting task completed by internal and external subject matter experts. Results from Multidimensional Scaling and Hierarchical Cluster Analysis revealed a congruence of approximately 63 percent between the two groups’ configurations. Next, an Exploratory Factor Analysis was used to investigate the underlying factor structure of the mathematics placement test. Findings indicated a three-factor structure of PreCalculus, Geometry, and Algebra 1, with moderate correlations between factors.
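The factor-extraction step described above can be illustrated with a minimal sketch. The snippet below is not the analysis from this study; it uses a simulated, hypothetical item-response matrix and a crude principal-axis-style extraction (eigendecomposition of the item correlation matrix with the Kaiser eigenvalue-greater-than-one retention rule) to show how a factor structure is recovered from dichotomous item data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 200 examinees x 6 dichotomously scored items (1 = correct).
# Two correlated latent abilities generate the responses, mimicking a
# multi-factor placement test.
ability = rng.normal(size=(200, 2))
true_loadings = np.array([[1, 0], [1, 0], [1, 0],
                          [0, 1], [0, 1], [0, 1]], dtype=float)
latent = ability @ true_loadings.T + 0.5 * rng.normal(size=(200, 6))
X = (latent > 0).astype(float)  # dichotomize into right/wrong responses

# Inter-item correlation matrix.
R = np.corrcoef(X, rowvar=False)

# Principal-axis-style extraction: eigendecomposition of R,
# eigenvalues sorted in descending order.
eigvals, eigvecs = np.linalg.eigh(R)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Kaiser criterion: retain factors with eigenvalues greater than 1.
k = int(np.sum(eigvals > 1.0))
loadings = eigvecs[:, :k] * np.sqrt(eigvals[:k])
print(f"factors retained: {k}")
```

In practice an EFA would use a proper estimation method (e.g., principal-axis factoring or maximum likelihood) with an oblique rotation to allow the moderate between-factor correlations reported here; the eigendecomposition above is only the simplest runnable stand-in.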
Third, an item analysis was conducted to explore the item parameters (i.e., item difficulty and item discrimination) and to test for gender bias. Results from the item analysis suggested that the Algebra 1 and Geometry items were generally easy for the population of interest, while the PreCalculus items presented more of a challenge. Furthermore, the mathematics placement test was optimized by removing eleven items from the Algebra 1 factor and two items from the PreCalculus factor. All Internal Consistency Reliability estimates remained strong, ranging from .736 to .950.
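The classical item statistics named above (difficulty, discrimination, and internal consistency) can be sketched as follows. The scored response matrix is hypothetical and serves only to show the computations, not the study's data:

```python
import numpy as np

# Hypothetical scored response matrix: rows = examinees, columns = items
# (1 = correct, 0 = incorrect).
X = np.array([
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 1, 1],
    [1, 1, 1, 1],
    [0, 1, 0, 0],
    [1, 0, 0, 0],
], dtype=float)

# Item difficulty (classical p-value): proportion answering correctly.
# A high p-value indicates an easy item, as reported for the Algebra 1
# and Geometry items.
difficulty = X.mean(axis=0)

# Item discrimination: corrected item-total (point-biserial) correlation,
# correlating each item with the total score over the remaining items.
total = X.sum(axis=1)
discrimination = np.array([
    np.corrcoef(X[:, j], total - X[:, j])[0, 1] for j in range(X.shape[1])
])

# Cronbach's alpha: internal consistency of the item set.
k = X.shape[1]
item_var = X.var(axis=0, ddof=1)
alpha = k / (k - 1) * (1 - item_var.sum() / total.var(ddof=1))

print("difficulty:", np.round(difficulty, 2))
print("discrimination:", np.round(discrimination, 2))
print("alpha:", round(alpha, 3))
```

An IRT-based analysis, as referenced in the keywords, would instead estimate difficulty and discrimination parameters from a latent-trait model (e.g., a 2PL), but the classical statistics above convey the same two ideas.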
Finally, Hierarchical Multiple Linear Regressions were used to examine the relationship between students’ total and factor scores from the mathematics placement test and students’ performance in their first semester mathematics course. Findings from the four Hierarchical Multiple Linear Regressions demonstrate that the total score students receive on the mathematics placement test predicts their achievement in their initial mathematics course, above and beyond the contributions of their demographic information and previous academic background. More specifically, the Algebra 1 Factor Score from the mathematics placement test was the strongest predictor of student success in the lower-level mathematics courses (i.e., Mathematical Investigations I or II). Similarly, both the Algebra 1 and PreCalculus Factor Scores from the mathematics placement test were significant predictors of students’ grades in their first upper-level mathematics course (i.e., Mathematical Investigations III or IV), providing evidence of Predictive Validity. Given the empirical evidence this research study demonstrated regarding the psychometric properties of the exam, the current mathematics placement test and procedures appear appropriate for the population of interest.
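The "above and beyond" logic of a hierarchical regression can be shown with a short sketch: fit a baseline model on background variables, add the placement score in a second step, and examine the change in R². The variable names and simulated data below are hypothetical, standing in for the demographic and academic-background covariates used in the study:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 150

# Hypothetical data: a background covariate (e.g., prior GPA) and a
# placement score that carries predictive information beyond the covariate.
gpa = rng.normal(3.0, 0.5, n)
placement = 0.4 * gpa + rng.normal(0.0, 1.0, n)
grade = 0.5 * gpa + 0.6 * placement + rng.normal(0.0, 0.8, n)

def r_squared(X, y):
    """Ordinary-least-squares R^2, with an intercept column added."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    ss_res = resid @ resid
    ss_tot = ((y - y.mean()) ** 2).sum()
    return 1 - ss_res / ss_tot

# Step 1: background covariate only. Step 2: add the placement score.
r2_base = r_squared(gpa[:, None], grade)
r2_full = r_squared(np.column_stack([gpa, placement]), grade)

# The R^2 change quantifies the incremental predictive validity of the
# placement score over the background information.
delta_r2 = r2_full - r2_base
print(f"R2 base = {r2_base:.3f}, R2 full = {r2_full:.3f}, delta = {delta_r2:.3f}")
```

In the study itself, the significance of this R² change would be evaluated with an F test for the nested models; the sketch reports only the effect-size side of that comparison.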
The continued use of the revised mathematics placement test in the course placement decision-making process is advisable.
Anderson, Hannah Ruth, "A Psychometric Investigation of a Mathematics Placement Test at a Science, Technology, Engineering, and Mathematics (STEM) Gifted Residential High School" (2020). Publications & Research. 10.