
Dissertation/Thesis Abstract

A Comparison of Variance and Renyi's Entropy with Application to Machine Learning
by Peccarelli, Adric M., M.S., Northern Illinois University, 2017, 38; 10603911
Abstract (Summary)

This research explores parametric and nonparametric similarities and disagreements between variance and the information theoretic measure of entropy, specifically Renyi's entropy. The history and known relationships of the two uncertainty measures are examined. Then, twenty discrete and continuous parametric families are tabulated with their respective variance and Renyi entropy functions, ordered so as to understand the behavior of these two measures of uncertainty. Finally, an algorithm for variable selection using Renyi's Quadratic Entropy and its kernel estimation is explored and compared to other popular selection methods using real data.
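The abstract does not spell out the estimator, but variable-selection methods built on Renyi's Quadratic Entropy typically rely on the standard Parzen-window (kernel) estimate: with a Gaussian kernel, the integral of the squared density has a closed form as a double sum over pairwise sample differences (the "information potential"). The sketch below illustrates that standard estimator under assumed function and parameter names; it is not the thesis's own implementation.

```python
import numpy as np

def renyi_quadratic_entropy(x, sigma=1.0):
    """Parzen-window estimate of Renyi's quadratic entropy
    H2(X) = -log( integral p(x)^2 dx ) for a 1-D sample.

    With a Gaussian kernel of bandwidth sigma, the squared-density
    integral has the closed form
        V = (1/N^2) * sum_{i,j} G(x_i - x_j; sigma*sqrt(2)),
    the so-called information potential, and H2 = -log(V).
    (Function and argument names here are illustrative.)
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    diffs = x[:, None] - x[None, :]      # all pairwise differences x_i - x_j
    s2 = 2.0 * sigma**2                  # variance of the convolved kernel
    gauss = np.exp(-diffs**2 / (2.0 * s2)) / np.sqrt(2.0 * np.pi * s2)
    info_potential = gauss.sum() / n**2  # estimate of integral p(x)^2 dx
    return -np.log(info_potential)
```

Because a more spread-out sample has a smaller information potential, the estimate behaves like variance in the sense that rescaling the data upward increases it, which is the kind of parallel between the two uncertainty measures the thesis investigates.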

Indexing (document details)
Advisor: Ebrahimi, Nader
Committee: Polansky, Alan; Ryu, Duchwan
School: Northern Illinois University
Department: Statistics
School Location: United States -- Illinois
Source: MAI 57/01M(E), Masters Abstracts International
Subjects: Statistics
Keywords: Entropy, Information theory, Machine learning, Nonparametric, Parametric, Renyi
Publication Number: 10603911
ISBN: 978-0-355-29901-4