The development and validation of effect size measures for IRT and CFA studies of measurement equivalence
Nye, Christopher D.
Permalink
https://hdl.handle.net/2142/26174
Description
Issue Date
2011-08-25
Director of Research
Drasgow, Fritz
Doctoral Committee Chair(s)
Drasgow, Fritz
Committee Member(s)
Roberts, Brent W.
Rounds, James
Newman, Daniel A.
Hong, Sungjin
Department of Study
Psychology
Discipline
Psychology
Degree Granting Institution
University of Illinois at Urbana-Champaign
Degree Name
Ph.D.
Degree Level
Dissertation
Keyword(s)
Measurement Equivalence
Structural Equation Modeling
Item Response Theory
Abstract
Evaluating measurement equivalence is a necessary first step before comparisons can be made across groups or over time. As a result, techniques for evaluating equivalence have received much attention in the literature. Given their many advantages, item response theory (IRT) and confirmatory factor analytic (CFA) techniques are the most appropriate methods for assessing measurement equivalence. For both methods, the identification of biased items typically relies on statistical significance tests based on the chi-square distribution or on empirically derived rules of thumb for determining nonequivalence. Because of the disadvantages of these criteria, however, it may also be informative to use effect size estimates to judge the magnitude of the observed effects. As such, the present work proposed the development and evaluation of effect size measures for CFA and IRT studies of measurement equivalence. First, simulation research was used to illustrate the advantages of effect size measures and to develop guidelines for interpreting the magnitude of an effect. Next, these indices were used to evaluate nonequivalence in both cognitive and noncognitive data. In sum, the results show the benefits of evaluating the effect size of differential item functioning (DIF) in addition to assessing its statistical significance.
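To make the idea of a DIF effect size concrete, the sketch below computes one simple, illustrative index: the density-weighted average absolute difference between the reference- and focal-group item characteristic curves of a two-parameter logistic (2PL) IRT item. This is a generic unsigned-area-style measure assuming a standard normal ability distribution; it is not the specific index developed in this dissertation, and the function names and parameter values are hypothetical.

```python
import math

def p_2pl(theta, a, b):
    """2PL item response function: probability of a correct response at ability theta,
    with discrimination a and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def normal_pdf(theta, mu=0.0, sigma=1.0):
    """Standard normal density used to weight the ability continuum."""
    return math.exp(-0.5 * ((theta - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def dif_effect_size(a_ref, b_ref, a_foc, b_foc, lo=-4.0, hi=4.0, n=801):
    """Density-weighted average of |P_ref(theta) - P_foc(theta)| over a theta grid:
    an unsigned DIF effect size on the probability metric (0 = no DIF)."""
    step = (hi - lo) / (n - 1)
    total = 0.0
    weight = 0.0
    for i in range(n):
        theta = lo + i * step
        w = normal_pdf(theta)
        total += w * abs(p_2pl(theta, a_ref, b_ref) - p_2pl(theta, a_foc, b_foc))
        weight += w
    return total / weight

# Identical item parameters in both groups -> effect size of exactly zero (no DIF).
print(dif_effect_size(1.2, 0.0, 1.2, 0.0))
# A difficulty shift of 0.5 in the focal group -> a nonzero effect size,
# whose magnitude can then be judged against interpretive guidelines.
print(dif_effect_size(1.2, 0.0, 1.2, 0.5))
```

A statistical test answers only whether the two curves differ reliably; an index like this quantifies how far apart they are on the response-probability metric, which is what guidelines for "small," "medium," or "large" DIF would be anchored to.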