Dissertation/Thesis Abstract

Crowd sourced product reviews: A study of evaluation standards used
Manga, Alexander W., Ph.D., Western Michigan University, 2016. 200 pp.; 10297219
Abstract (Summary)

People are relying on online product reviews and evaluations more than ever (Chandler et al., 2013). As the use of online reviews persists and more smartphone applications are created, the demand for online product reviews continues to increase; yet there is no indication of the quality of these reviews. In fact, some online reviews have been found to be fraudulent and misleading. Many online product reviews come from Internet-based crowdsource organizations. Few studies have explored evaluation practices among these organizations, and as a result it is unclear what, if any, evaluation standards are used by crowdsource reviewers, particularly those found on open, self-serve sites such as MTurk. The purpose of this study, therefore, was to determine (a) what, if any, evaluation standards are used by crowdsource organizations and their requesters, and (b) to what extent these standards adhere to the Joint Committee on Standards for Educational Evaluation (JCSEE) Program Evaluation Standards (Yarbrough et al., 2011).

Descriptive survey data were collected from 454 MTurk product reviewers. Findings indicate that these product reviewers do not appear to use any standards. The MTurk product reviewers who participated in this survey use personal, experience-based opinions as the basis for their online reviews. The literature tells us, however, that such opinions are not reliable, as they change with the provider's experience and knowledge of the product.

Results further indicate that participants do not appear to follow systematic procedures; document management seemed to vary from reviewer to reviewer. Moreover, responses to open-ended follow-up questions reveal that, when asked whether they used more technical review designs, the majority of participants answered “often,” while simultaneously indicating their reviews were based on personal experience. This result conflicts with the survey results and further points to a misperception that MTurk product reviewers are providing reliable online product reviews.

Indexing (document details)
Advisor: Schroeter, Daniela
School: Western Michigan University
School Location: United States -- Michigan
Source: DAI-A 78/04(E), Dissertation Abstracts International
Source Type: DISSERTATION
Subjects: Business administration, Marketing, Management
Keywords: Amazon.com, Crowdsourcing, MTurk, On-line, Product, Reviews
Publication Number: 10297219
ISBN: 9781369409246