Learning progressions (LPs) are “descriptions of the successively more sophisticated ways of thinking about a topic that can follow one another as children learn about and investigate a topic over a broad span of time” (National Research Council, 2007). One challenge that arises in LP research is the collection of evidence to ensure that ordered levels of an LP are an adequate representation of how learning occurs for all students. These LP validation studies involve the identification, accumulation, and interpretation of evidence collected using assessment tasks to make claims about the development of student thinking hypothesized by the LP.
This dissertation develops a novel method using item response theory (IRT) to evaluate the order of LP levels with items that have answer choices mapped to the levels of an LP. I use two IRT models – the nominal response model (NRM; Bock, 1972) and the partial credit model (PCM; Masters, 1982). The NRM is used to evaluate and revise partial credit scoring schemes. The PCM is used to explore whether a quantitative scale can be developed that maps onto the LP hypothesis. Empirical data from the Diagnoser assessment system (Thissen-Roe, Hunt, & Minstrell, 2004) are analyzed to illustrate how to apply this method with data collected from items with responses mapped to levels in a model of student thinking.
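For readers less familiar with these two models, their category probability functions can be sketched as follows. This is a minimal illustration, not code from the dissertation; the function names and parameter values are hypothetical. Under the NRM, each answer choice k has its own slope a[k] and intercept c[k]; under the PCM, each score step has a step difficulty delta.

```python
import math

def nrm_probs(theta, a, c):
    """Nominal response model: probability of each answer choice at ability theta.

    Each category k has slope a[k] and intercept c[k]; probabilities come
    from a softmax over a[k]*theta + c[k].
    """
    z = [a_k * theta + c_k for a_k, c_k in zip(a, c)]
    m = max(z)  # subtract the max for numerical stability
    e = [math.exp(v - m) for v in z]
    s = sum(e)
    return [v / s for v in e]

def pcm_probs(theta, deltas):
    """Partial credit model: probability of each score category at ability theta.

    deltas are the step difficulties for moving from category j-1 to j;
    the cumulative sums of (theta - delta_j) define the category logits.
    """
    cum = [0.0]  # category 0 has a logit of zero by convention
    for d in deltas:
        cum.append(cum[-1] + (theta - d))
    m = max(cum)
    e = [math.exp(v - m) for v in cum]
    s = sum(e)
    return [v / s for v in e]
```

For example, with NRM slopes ordered to match hypothesized LP levels, a high-ability examinee should be most likely to select the highest-level answer choice; the functions above make that pattern easy to inspect directly.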
The findings illustrate how analyzing patterns among NRM response option curves relative to the content of the items reveals how partial credit could be assigned to the answer choices, and how analysis of a PCM item-person map can be used to evaluate the quality of a latent ability scale. These results also uncover additional areas for investigation, including further research on the psychometric characteristics of items with answer choices mapped to models of student thinking. The use of the NRM and PCM advanced in this dissertation provides a new source of psychometric information for LP validation studies.
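One simple diagnostic in this spirit, sketched below under illustrative assumptions, is to check whether an item's estimated PCM step difficulties increase in the same order as the hypothesized LP levels; the helper name and inputs are hypothetical, and disordered steps are a debated but commonly inspected signal in this kind of analysis.

```python
def steps_ordered(step_difficulties, tol=0.0):
    """Return True if each PCM step difficulty exceeds the previous one.

    step_difficulties: estimated step parameters for categories 1..m,
    listed in the order of the hypothesized LP levels. Disordered steps
    can flag answer choices whose hypothesized ordering is not supported
    by the response data.
    """
    return all(later - earlier > tol
               for earlier, later in zip(step_difficulties,
                                         step_difficulties[1:]))
```

An item whose steps come back as, say, [-1.2, 0.3, 1.8] is consistent with the hypothesized level order, while [-0.5, 1.1, 0.4] would prompt a closer look at the content of the affected answer choices.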
|Advisor:||Briggs, Derek C.|
|Committee:||Shepard, Lorrie A., Furtak, Erin M., Shear, Benjamin R., Perkins, Katherine, Minstrell, Jim|
|School:||University of Colorado at Boulder|
|School Location:||United States -- Colorado|
|Source:||DAI-A 82/6(E), Dissertation Abstracts International|
|Subjects:||Educational tests & measurements, Science education, Educational administration, Educational leadership, Education Policy|
|Keywords:||Facet clusters, Item Response Theory, Learning progressions, Ordered Multiple Choice Items, Physical sciences|
Copyright in each Dissertation and Thesis is retained by the author. All Rights Reserved