Assessment is a powerful tool for enhancing teaching and learning. Within a response-to-intervention (RtI) framework, data collected through assessment are used to measure student proficiency, track progress, and align instruction with student learning needs. The purpose of this program evaluation was to explore the use of a computer-adaptive testing (CAT) measure, i-Ready Diagnostic, as part of a rural middle school’s assessment process within an RtI model. The evaluation sought to gather evidence regarding the ability of i-Ready reading and math measures, administered in the fall, winter, and spring, to predict student performance on end-of-year high-stakes tests (the NYS ELA and Math exams). In addition, it examined variability in student performance on i-Ready measures by demographic characteristics, including grade, gender, socio-economic status, ethnicity, and educational program. The evaluation was conducted using archival student data from the 2016–2017 school year, analyzed with multivariate statistical methods.
Analyses revealed that, for this population of students, i-Ready scores demonstrated a strong relationship with NYS exam scores, yet the repeated measures administered in fall, winter, and spring provided little new information about variability in student performance beyond a single administration. Regarding differences in i-Ready performance across disaggregated subgroups, the results suggest that significant differences exist across time based on grade, gender, socio-economic status, ethnicity, and educational program. These findings are consistent with the literature and provide important context for the school to consider when interpreting assessment results. Implications for the school are discussed, and recommendations for developing and improving assessment practices, to enhance their efficiency and effectiveness, are provided. These results are setting-specific but may spur future research and evaluation of these assessment measures in the field.
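As a hypothetical illustration only (not the study's actual analysis or data), the kind of predictive relationship described above is often quantified with a correlation between a fall screening score and the end-of-year exam score, with a cut score used to flag students for intervention within an RtI model. The scores, cut score, and student counts below are all invented for the sketch.

```python
# Hedged sketch: Pearson correlation between invented fall i-Ready scale
# scores and invented NYS exam scale scores, plus a hypothetical screening
# cut score. None of these numbers come from the evaluation itself.
from math import sqrt


def pearson_r(x, y):
    """Pearson correlation between two equal-length lists of scores."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)


# Invented scores for six students
fall_iready = [450, 470, 480, 500, 520, 540]
nys_exam = [280, 290, 295, 305, 315, 330]

r = pearson_r(fall_iready, nys_exam)

# A hypothetical cut score flags students for Tier 2 support
CUT_SCORE = 480
flagged = [score for score in fall_iready if score < CUT_SCORE]
```

In practice, an evaluation like this one would also ask whether winter and spring administrations improve prediction beyond the fall score alone, which is where repeated-measures multivariate methods such as RM-MANOVA come in.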
Committee: Kostelnik, Callen; Spaulding, Dean; Saddler, Bruce
School: State University of New York at Albany
School Location: United States -- New York
Source: DAI-A 81/2(E), Dissertation Abstracts International
Subjects: Educational psychology; Special education; Educational tests & measurements; Middle school education
Keywords: Academic screening, i-Ready, Middle school, Program evaluation, Response to intervention, RM-MANOVA
Copyright in each Dissertation and Thesis is retained by the author. All Rights Reserved