Electrical and Computer Engineering ETDs

Publication Date

Fall 11-2019

Abstract

The advances introduced by Unmanned Aerial Vehicles (UAVs) are manifold and have paved the path for the full integration of UAVs, as intelligent objects, into the Internet of Things (IoT). This thesis brings artificial intelligence into the UAVs' data offloading process in a multi-server Mobile Edge Computing (MEC) environment by adopting principles and concepts from game theory and reinforcement learning. Initially, each UAV autonomously selects a MEC server for partial data offloading, based on the theory of stochastic learning automata. A non-cooperative game among the UAVs is then formulated to determine the amount of data each UAV offloads to its selected MEC server, and the existence of at least one Nash Equilibrium (NE) is proven by exploiting the theory of submodular games. A best response dynamics framework and two alternative reinforcement learning algorithms are introduced that converge to an NE, and their trade-offs are discussed. The performance of the overall framework is evaluated via modeling and simulation, in terms of its efficiency and effectiveness, under different operation approaches and scenarios.
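The server-selection step described above can be illustrated with a minimal sketch, assuming a Linear Reward-Inaction (LRI) update rule, which is one common realization of stochastic learning automata; the reward model, learning rate, and function names below are illustrative assumptions and not the thesis's actual algorithm.

```python
import numpy as np

def lri_update(probs, chosen, reward, step=0.1):
    """One Linear Reward-Inaction (LRI) update of a UAV's server-selection
    probability vector (a sketch, not the thesis's exact scheme).

    probs  : current probability vector over MEC servers (sums to 1)
    chosen : index of the MEC server selected in this round
    reward : normalized feedback in [0, 1] (e.g., derived from experienced delay)
    step   : learning rate controlling the exploration/convergence trade-off
    """
    probs = probs.copy()
    # Reinforce the chosen server proportionally to the received reward...
    probs[chosen] += step * reward * (1.0 - probs[chosen])
    # ...and decrease the probabilities of the remaining servers accordingly.
    others = np.arange(len(probs)) != chosen
    probs[others] -= step * reward * probs[others]
    return probs

# Hypothetical usage: one UAV choosing among 3 MEC servers.
rng = np.random.default_rng(0)
probs = np.full(3, 1.0 / 3.0)            # start from a uniform strategy
for _ in range(200):
    server = rng.choice(3, p=probs)      # stochastic server selection
    # Placeholder reward model: server 1 is assumed to give the best feedback.
    reward = 0.9 if server == 1 else 0.3
    probs = lri_update(probs, server, reward)
print(probs)                             # probability mass concentrates on server 1
```

Under this reward-inaction scheme the strategy is only reinforced on positive feedback, so the probability vector gradually concentrates on the server that yields the best observed outcome, which is the behavior the abstract attributes to the autonomous MEC server selection.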

Keywords

Artificial Intelligence, Game Theory, Reinforcement Learning, Mobile Edge Computing

Document Type

Thesis

Language

English

Degree Name

Electrical Engineering

Level of Degree

Masters

Department Name

Electrical and Computer Engineering

First Committee Member (Chair)

Eirini Eleni Tsiropoulou

Second Committee Member

Xiang Sun

Third Committee Member

Michael Devetsikiotis
