Dissertation/Thesis Abstract

Deep Learning-Based Exploration Path Planning
by Reinhart, Russell, M.S., University of Nevada, Reno, 2020, 56; 28022208
Abstract (Summary)

In this thesis, two deep learning-based path planning methods for autonomous exploration of subterranean environments using aerial robots are presented. The first approach utilizes imitation learning, where training samples are generated by a state-of-the-art sampling-based exploration path planner, to construct a model that proposes trajectories comparable to those of the expert planner across many underground tunnel environments. This imitation learning-based method uses a small window of recent LiDAR measurements to infer trajectories at a fraction of the computational cost of the expert planner, while also removing the requirement for an online map reconstruction of the environment. The second approach utilizes a deep reinforcement learning algorithm applicable to continuous state and action spaces and partially observable Markov decision processes; the agent's reward is contingent upon efficient exploration of the environment. The proposed methods are evaluated in simulated and real-world environments.
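The imitation-learning setup described above reduces to supervised regression from recent sensor observations to expert trajectories. The following is a minimal sketch of that idea; the linear expert, the 40-dimensional LiDAR feature window, the 6-coordinate waypoint output, and the least-squares student are all illustrative assumptions, not the thesis's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in "expert planner": here a fixed linear map from a window of
# LiDAR range features to a short trajectory (6 waypoint coordinates).
# The real expert in the thesis is a sampling-based exploration planner.
W_true = rng.normal(size=(40, 6))

def expert_plan(lidar_window):
    return lidar_window @ W_true

# Collect (observation, expert trajectory) training pairs offline.
X = rng.normal(size=(500, 40))   # 500 windows of recent LiDAR measurements
Y = expert_plan(X)

# Fit a cheap student model by least squares: the learned "planner".
W_student, *_ = np.linalg.lstsq(X, Y, rcond=None)

# At deployment the student proposes a trajectory directly from the
# LiDAR window, with no map reconstruction or sampling step.
x_new = rng.normal(size=(1, 40))
trajectory = x_new @ W_student
print(np.allclose(trajectory, expert_plan(x_new), atol=1e-6))  # → True
```

Because the toy expert is exactly linear and the training set is overdetermined, the student recovers it to numerical precision; a real planner would instead use a deep network and imperfect imitation.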

Indexing (document details)
Advisor: Alexis, Konstantinos
Committee: Hand, Emily, Schmidt, Deena
School: University of Nevada, Reno
Department: Computer Science
School Location: United States -- Nevada
Source: MAI 82/3(E), Masters Abstracts International
Source Type: DISSERTATION
Subjects: Robotics, Computer science
Keywords: Aerial robotics, Autonomous systems, Deep learning, Path planning
Publication Number: 28022208
ISBN: 9798672110769
Copyright © 2020 ProQuest LLC. All rights reserved.