Computer-based assessments allow practitioners to collect rich process data by logging students’ interactions with assessment tasks. Beyond final responses to test questions, these log files promise additional evidence to support claims about what a student knows and can do. However, because log file data often mix useful information with noise, such as unpredictable actions or emergent behaviors, culling the useful information is challenging. Data-driven approaches (such as educational data mining) have recently been used to identify patterns in process data, but the identified patterns are often difficult to interpret.
There is a crucial need to interpret process data based on cognitive theories to provide valid evidence for claims made in large-scale summative assessments. Evidence-centered design (ECD; Mislevy, Almond, & Lukas, 2004; Mislevy, Almond, Steinberg, & Lukas, 2006), a framework for educational assessment development, advocates that evidence identification and interpretation should be linked to the competencies/skills to be assessed and the design of a task used in an assessment. Guided by the ECD framework, this study employs a theory-driven approach to analyze and interpret rich process data generated from one interactive task used in the 2012 Programme for International Student Assessment (PISA) complex problem-solving assessment.
To develop principles for identifying evidence of complex problem-solving competencies/skills, this study examined the structure of the interactive task and reviewed cognitive theories of complex problem solving; on that basis, a theory-based coding scheme was proposed. Using this coding scheme, process data were coded to identify specific behaviors relevant to complex problem solving. An exploratory factor analysis revealed multi-dimensional cognitive abilities involved in complex problem solving. The 2012 PISA complex problem-solving scoring rubric was then evaluated in terms of whether evidence from process data supports it. Finally, the incremental predictive value of student competency/skill evaluations based on behavioral coding was examined. Specifically, the study aimed to answer three research questions: (a) Can student process data generated from one interactive task in the 2012 PISA complex problem-solving assessment reflect distinct cognitive competencies/skills? (b) To what extent can evidence from process data support the ratings of student performance evaluated from their final responses on the PISA complex problem-solving assessment? (c) To what extent do student groups sorted by a few behavioral indicators show different levels of academic achievement on the 2012 PISA assessments of complex problem-solving, science, mathematics, and reading? The results showed that, under the theory-based coding scheme, process data generated from the 2012 PISA complex problem-solving task can be meaningfully interpreted and reflect distinct cognitive competencies/skills. The study also found that process data provide supporting evidence for the 2012 PISA scoring rubric. Furthermore, the study identified behavioral indicators of targeted cognitive competencies/skills that differentiate student performance on complex problem-solving and other academic assessments.
The study demonstrates how a theory-driven approach to evaluating student performance through process data could contribute to establishing an evidence model for a complex problem-solving task.
|Advisor:||Yoon, Susan A.|
|Committee:||Baker, Ryan S., Jia, Yue|
|School:||University of Pennsylvania|
|Department:||Teaching, Learning and Curriculum|
|School Location:||United States -- Pennsylvania|
|Source:||DAI-A 81/2(E), Dissertation Abstracts International|
|Subjects:||Educational tests & measurements, Educational technology|
|Keywords:||Complex problem solving, Computer-based assessment, Evidence model, Large-scale assessment, log files, Process data|