Consumers rely increasingly on online product reviews and evaluations (Chandler et al., 2013). As the use of online reviews persists and more smartphone applications are created, demand for online product reviews continues to grow; yet there is no indication of the quality of these reviews. In fact, some online reviews have been found to be fraudulent and misleading. Many online product reviews come from Internet-based crowdsource organizations. Few studies have explored evaluation practices among these organizations, and as a result it is unclear what evaluation standards, if any, are used by crowdsource reviewers, particularly those found on open, self-serve sites such as MTurk. The purpose of this study, therefore, was to determine (a) what evaluation standards, if any, are used by crowdsource organizations and their requesters, and (b) to what extent these standards adhere to the Joint Committee on Standards for Educational Evaluation (JCSEE) Program Evaluation Standards (Yarbrough et al., 2011).
Descriptive survey data were collected from 454 MTurk product reviewers. Findings indicate these reviewers do not appear to use any standards. The MTurk product reviewers who participated in this survey base their online reviews on personal, experience-based opinions. The literature tells us, however, that such opinions are not reliable, as they change with the provider's experience with and knowledge of the product.
Results further indicate that participants follow few formal procedures; document management appeared to depend on the individual reviewer. Moreover, open-ended follow-up questions revealed a contradiction: when asked whether they used more technical review designs, the majority of participants answered "often," while simultaneously indicating their reviews were based on personal experience. This conflict with the survey results further points to a misperception that MTurk product reviewers are providing reliable online product reviews.
School: Western Michigan University
School Location: United States -- Michigan
Source: DAI-A 78/04(E), Dissertation Abstracts International
Subjects: Business administration, Marketing, Management
Keywords: Amazon.com, Crowdsourcing, MTurk, On-line, Product, Reviews
Copyright in each Dissertation and Thesis is retained by the author. All Rights Reserved.