ETD

Cycles Improve Conditional Generators

Downloadable content

Learning information-dense, low-dimensional latent representations of high-dimensional data is the central thesis of deep learning. The inverse problem of learning latent representations is data generation, in which machines learn a mapping from information-dense latent representations to high-dimensional data spaces. Conditional generation extends data generation to account for labelled data by estimating joint distributions of samples and labels. This thesis connects the learning of meaningful latent representations through compressive and generative algorithms, and it makes three primary contributions to the improvement and use of conditional generative adversarial networks (GANs). The first is a set of three novel architectures for conditional data generation that improve on baseline generation quality for a natural image dataset. The second is a novel approach to structuring latent representations by learning a paired structured condition space and a weakly structured variation space with desirable properties. The third is a novel application of conditional data generation to a chemical sensing task, using beneficial leaking augmentations for extremely low-data regimes (n < 100), which demonstrates that conditional data generation improves the test performance of downstream supervised models.
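The abstract describes a conditional generator as a mapping from a paired latent code, a structured condition and a weakly structured variation, to a high-dimensional sample. The following is a minimal sketch of that idea in PyTorch; the class name, dimensions, and layer sizes are illustrative assumptions, not the architectures contributed by the thesis.

import torch
import torch.nn as nn

class ConditionalGenerator(nn.Module):
    """Maps a variation code z and a class label y to a data sample."""

    def __init__(self, latent_dim=64, num_classes=10, data_dim=784):
        super().__init__()
        # Embed the discrete label into a dense condition vector.
        self.label_embed = nn.Embedding(num_classes, latent_dim)
        # Map the concatenated (variation, condition) code to data space.
        self.net = nn.Sequential(
            nn.Linear(latent_dim * 2, 256),
            nn.ReLU(),
            nn.Linear(256, data_dim),
            nn.Tanh(),  # samples scaled to [-1, 1]
        )

    def forward(self, z, y):
        cond = self.label_embed(y)                # structured condition
        return self.net(torch.cat([z, cond], 1))  # joint code -> sample

# Usage: draw 16 samples conditioned on class 3.
gen = ConditionalGenerator()
z = torch.randn(16, 64)                           # weakly structured variation
y = torch.full((16,), 3, dtype=torch.long)
x_fake = gen(z, y)                                # shape (16, 784)

Embedding the label and concatenating it with the noise vector is the standard conditioning mechanism of the original conditional GAN (Mirza and Osindero, 2014); it serves here only as a baseline illustration of the condition/variation pairing the abstract refers to.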

Creator
Contributors
Degree
Unit
Publisher
Identifier
  • etd-20886
Keyword
Advisor
Defense date
Year
  • 2021
Date created
  • 2021-04-30
Resource type
Rights statement
License
Last modified
  • 2021-09-15

Relations

In Collection:

Content

Items

Permanent link to this page: https://digital.wpi.edu/show/vq27zr634