Publication Date

Fall 11-15-2022

Abstract

Debiased Sinkhorn divergence (DS divergence) is a dissimilarity measure from regularized optimal transport that quantifies the discrepancy between two probability measures. This thesis analyzes the advantages of DS divergence over the more computationally expensive Wasserstein distance and the classical Euclidean norm. Specifically, theory and numerical experiments show that DS divergence retains geometrically desirable properties, such as convexity, after data normalization. Data normalization is often needed to compute Sinkhorn divergence and Wasserstein distance, since these formulas accept only probability distributions as inputs and do not apply directly to signed data such as time signals and seismic waves; in normalizing, however, one may lose or distort information about the original signal. The investigations in this thesis show that for high-frequency signal inputs, the Wasserstein distance may require a much more drastic normalization than DS divergence in order to preserve convexity, leading to a loss of information about the original signal, the amplification of noise, and possible numerical overflow; this makes the DS divergence method the more desirable choice.
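For context, the debiased Sinkhorn divergence between probability measures α and β is standardly defined as

S_ε(α, β) = OT_ε(α, β) − (1/2) OT_ε(α, α) − (1/2) OT_ε(β, β),

where OT_ε denotes the entropy-regularized optimal transport cost. Below is a minimal NumPy/SciPy sketch of this quantity for two histograms supported on a common 1-D grid; it is an illustrative example, not the thesis's implementation, and the function names, the shift-and-rescale normalization, and the parameters eps and n_iters are assumptions chosen for the sketch.

import numpy as np
from scipy.special import logsumexp

def sinkhorn_cost(a, b, C, eps=0.05, n_iters=300):
    # Entropy-regularized OT cost OT_eps(a, b) for histograms a, b and
    # ground-cost matrix C, via log-domain Sinkhorn iterations (stable
    # against the exponential overflow that plain Sinkhorn can suffer).
    log_a, log_b = np.log(a), np.log(b)
    f = np.zeros_like(a)  # dual potential for a
    g = np.zeros_like(b)  # dual potential for b
    for _ in range(n_iters):
        f = -eps * logsumexp((g[None, :] - C) / eps + log_b[None, :], axis=1)
        g = -eps * logsumexp((f[:, None] - C) / eps + log_a[:, None], axis=0)
    # At convergence, the regularized cost equals the dual objective.
    return float(np.dot(f, a) + np.dot(g, b))

def debiased_sinkhorn(a, b, C, eps=0.05, n_iters=300):
    # S_eps(a, b) = OT_eps(a, b) - OT_eps(a, a)/2 - OT_eps(b, b)/2;
    # assumes a and b live on the same grid, so one C matrix suffices.
    return (sinkhorn_cost(a, b, C, eps, n_iters)
            - 0.5 * sinkhorn_cost(a, a, C, eps, n_iters)
            - 0.5 * sinkhorn_cost(b, b, C, eps, n_iters))

# Example: two signed signals made nonnegative by a simple
# shift-and-rescale normalization (one illustrative choice among the
# normalizations the abstract alludes to).
x = np.linspace(0.0, 1.0, 100)
s1 = np.sin(2 * np.pi * 3 * x)          # signed signal
s2 = np.sin(2 * np.pi * 3 * (x - 0.1))  # shifted copy

def to_histogram(s):
    s = s - s.min() + 1e-6  # shift so all entries are strictly positive
    return s / s.sum()      # rescale to a probability vector

a, b = to_histogram(s1), to_histogram(s2)
C = (x[:, None] - x[None, :]) ** 2  # squared-distance ground cost
print(debiased_sinkhorn(a, b, C))   # small positive value; zero iff a == b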

Degree Name

Mathematics

Level of Degree

Master's

Department Name

Mathematics & Statistics

First Committee Member (Chair)

Mohammad Motamed

Second Committee Member

Stephen Lau

Third Committee Member

Gabriel Huerta

Language

English

Keywords

Optimal Transport, Debiased Sinkhorn Divergence, Sinkhorn's Algorithm, Sinkhorn Divergence, Wasserstein Distance, Probability Distributions, Dissimilarity Measures, Regularization, Convexity, Numerics.

Document Type

Thesis
