Machine Learning and Knowledge Extraction (Oct 2023)

MSSGAN: Enforcing Multiple Generators to Learn Multiple Subspaces to Avoid the Mode Collapse

  • Miguel S. Soriano-Garcia,
  • Ricardo Sevilla-Escoboza,
  • Angel Garcia-Pedrero

DOI
https://doi.org/10.3390/make5040073
Journal volume & issue
Vol. 5, no. 4
pp. 1456–1473

Abstract

Generative Adversarial Networks are powerful generative models used across many domains and applications. However, they suffer from a training problem known as mode collapse, in which the generator fails to learn the complete distribution of the data on which it is trained. To force the network to learn the entire data distribution, we introduce MSSGAN. This model uses multiple generators and partitions the training data into multiple subspaces, where each generator, guided by a classifier, is forced to learn only one of the groups. We demonstrate that our model outperforms previous models designed to avoid mode collapse on the FID and Sample Distribution metrics. Experimental results show that each generator learns different information while still producing samples of satisfactory quality.
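As a loose illustration of the partitioning idea in the abstract (not the paper's actual adversarial architecture), the sketch below splits a toy multimodal dataset into K subspaces using k-means, which stands in for the classifier, and fits one simple Gaussian "generator" per subspace, so that sampling from every generator covers every mode. The k-means and Gaussian substitutions, and all names here, are my own assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy multimodal dataset: three well-separated 2-D Gaussian modes.
modes = np.array([[0.0, 0.0], [5.0, 5.0], [-5.0, 5.0]])
data = np.concatenate([m + rng.normal(0.0, 0.5, size=(200, 2)) for m in modes])

K = 3  # number of subspaces, one generator per subspace

def kmeans_pp_init(x, k):
    """k-means++ seeding: spreads initial centers across the modes."""
    centers = [x[rng.integers(len(x))]]
    for _ in range(k - 1):
        d2 = np.min(((x[:, None] - np.array(centers)[None]) ** 2).sum(-1), axis=1)
        centers.append(x[rng.choice(len(x), p=d2 / d2.sum())])
    return np.array(centers)

def kmeans(x, k, iters=50):
    """Plain k-means: stands in for the classifier that partitions the data."""
    centers = kmeans_pp_init(x, k)
    for _ in range(iters):
        labels = np.argmin(((x[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        centers = np.array([x[labels == j].mean(axis=0) for j in range(k)])
    return labels, centers

labels, centers = kmeans(data, K)

# One "generator" per subspace: here just a Gaussian fitted to its cluster,
# standing in for a GAN generator trained only on that subspace's samples.
generators = [
    (data[labels == j].mean(axis=0), np.cov(data[labels == j].T))
    for j in range(K)
]

# Sampling draws from every generator, so all modes are represented.
samples = np.concatenate(
    [rng.multivariate_normal(mu, cov, size=100) for mu, cov in generators]
)
print(samples.shape)  # (300, 2)
```

Because each generator is responsible for a single subspace, no generator can collapse onto one mode at the expense of the others; this is the intuition the full MSSGAN enforces adversarially with a classifier.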

Keywords