Dissertation/Thesis Abstract

Examining Negative Wording Effect in a Self-Report Measure
by Xia, Xiaoyan, Ph.D., University of Pittsburgh, 2018, 134; 13872037
Abstract (Summary)

Researchers often include both positively and negatively worded items in one survey to reduce acquiescence bias. However, incorporating negatively worded items can raise concerns about internal-consistency coefficients, validity evidence for criterion relationships, and the internal structure of the measure. This study investigates the impact of misspecifying the model when negatively worded items are used. Simulated datasets were generated from three models: (1) a CFA with two correlated factors, (2) a bi-factor CFA with two specific factors for positive and negative wording effects, and (3) a bi-factor CFA with one specific factor for the negative wording effect. These models were compared with each other and with a unidimensional model with respect to model fit and their estimation of internal-consistency coefficients, criterion-related validity coefficients, and internal structure validity.
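The data-generation design described above can be sketched in code. The snippet below is a minimal illustration, not the dissertation's actual simulation: it generates item responses from a bi-factor structure with a general factor plus one specific factor loading only on the negatively worded items (model 3 above). All numeric values — sample size, number of items, and loadings — are assumed for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

N, J = 1000, 10                  # respondents and items (assumed values)
neg = np.arange(J // 2, J)       # assume the last half are negatively worded

lam_g = np.full(J, 0.6)          # general-factor loadings (assumed)
lam_s = np.zeros(J)
lam_s[neg] = 0.4                 # wording-specific loadings on negative items only

g = rng.standard_normal(N)       # general factor scores
s = rng.standard_normal(N)       # specific (negative-wording) factor, orthogonal to g

# Residual SDs chosen so each item has unit total variance.
psi = np.sqrt(1.0 - lam_g**2 - lam_s**2)
e = rng.standard_normal((N, J)) * psi

# N x J matrix of continuous item responses.
X = g[:, None] * lam_g + s[:, None] * lam_s + e

def cronbach_alpha(X):
    """Coefficient alpha, one of the internal-consistency estimates at issue."""
    J = X.shape[1]
    item_var = X.var(axis=0, ddof=1).sum()
    total_var = X.sum(axis=1).var(ddof=1)
    return J / (J - 1) * (1.0 - item_var / total_var)
```

Because alpha treats the wording-specific variance as if it were general-factor variance, computing it on data like `X` (and comparing it with model-based homogeneity coefficients) is one way to see the kind of bias the study examines.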

Approximate and comparative model fit indices were not informative for model comparison because they indicated similar fit among the three multidimensional models, although they tended to correctly identify the misfit of the unidimensional model under some conditions. Misspecifying the model for the negative wording effect resulted in biased estimates of internal-consistency coefficients. When data were generated from the bi-factor model with two specific factors, the under-fitted bi-factor model with only the negative wording effect overestimated the homogeneity coefficient. When both positive and negative wording effects were present, omitting one or both specific factors resulted in underestimated criterion-related validity coefficients and biased factor loadings. However, over-fitting with an additional specific factor did not affect the estimation of criterion-related validity coefficients or the factor loadings of the general factor and the other specific factor.

Results suggest that model fit indices provide limited information for selecting models for negatively worded items. Evaluation of internal consistency reliability, criterion-related validity, and internal structure validity is recommended when selecting an approach for modeling negatively worded items. Researchers still need to rely on substantive and conceptual grounds when examining the nature of negatively worded items.

Indexing (document details)
Advisor: Stone, Clement A.
Committee: Lane, Suzanne, Stone, Clement, Ye, Feifei, Yu, Lan
School: University of Pittsburgh
Department: Psychology in Education
School Location: United States -- Pennsylvania
Source: DAI-A 80/08(E), Dissertation Abstracts International
Subjects: Educational tests & measurements, Educational psychology
Keywords: Bi-factor models, Internal consistency, Model fit, negative wording effect
Publication Number: 13872037
ISBN: 978-1-392-04259-5
Copyright © 2021 ProQuest LLC. All rights reserved.