Dissertation/Thesis Abstract

Assessment of the newly developed methods for accessibility evaluations of e-textbooks
by Sun, Yu Ting, M.S., California State University, Long Beach, 2017, 55; 10253601
Abstract (Summary)

Digital materials have become popular and have attracted a diverse group of users, such as college students, who benefit from low-cost and portable access to the materials. However, college students with disabilities might have trouble accessing electronic materials. Laws and standards provide guidance on making digital documents accessible, but they are being implemented slowly, and published materials on the market might still have accessibility issues. Efforts have been made to produce accessibility evaluation methods for eBooks. Automated tools have been examined in multiple studies, but automated tools alone are not sufficient for evaluating the accessibility of electronic materials; human evaluators are also needed. This study assessed a newly developed accessibility evaluation methodology designed for e-textbooks and examined whether being rated as highly accessible makes a difference in user experience and performance. The study recruited 6 visually impaired students and 6 students with normal or corrected-to-normal vision and asked them to interact with the eBooks. User experience and performance were measured using subjective questionnaires, time, and accuracy. Results showed differences between the high and low accessibility levels in user experience but not in user performance.

Indexing (document details)
Advisor: Strybel, Thomas Z.
Committee: Miles, James D., Vu, Kim-Phuong
School: California State University, Long Beach
Department: Psychology
School Location: United States -- California
Source: MAI 56/03M(E), Masters Abstracts International
Source Type: DISSERTATION
Subjects: Educational tests & measurements, Experimental psychology
Keywords: Accessibility, Accessibility evaluations, E-textbooks, Usability, Usability evaluations
Publication Number: 10253601
ISBN: 9781369512069
Copyright © 2019 ProQuest LLC. All rights reserved.