Mind the (optimality) Gap: A Gap-Aware Learning Rate Scheduler for Adversarial Nets

Hussein Hazimeh, Natalia Ponomareva
Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, PMLR 206:3018-3033, 2023.

Abstract

Adversarial nets have proved to be powerful in various domains including generative modeling (GANs), transfer learning, and fairness. However, successfully training adversarial nets using first-order methods remains a major challenge. Typically, careful choices of the learning rates are needed to maintain the delicate balance between the competing networks. In this paper, we design a novel learning rate scheduler that dynamically adapts the learning rate of the adversary to maintain the right balance. The scheduler is driven by the fact that the loss of an ideal adversarial net is a constant known a priori. The scheduler is thus designed to keep the loss of the optimized adversarial net close to that of an ideal network. We run large-scale experiments to study the effectiveness of the scheduler on two popular applications: GANs for image generation and adversarial nets for domain adaptation. Our experiments indicate that adversarial nets trained with the scheduler are less likely to diverge and require significantly less tuning. For example, on CelebA, a GAN with the scheduler requires only one-tenth of the tuning budget needed without a scheduler. Moreover, the scheduler leads to statistically significant improvements in model quality, reaching up to 27% in Fréchet Inception Distance for image generation and 3% in test accuracy for domain adaptation.
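
The abstract's key observation, that the loss of an ideal adversary is a known constant, can be made concrete with a small sketch. For the standard GAN objective, an ideal discriminator at equilibrium assigns probability 1/2 to every sample, giving a cross-entropy loss of 2 log 2 ≈ 1.386, so a gap-aware scheduler can compare the running discriminator loss against this constant and scale the discriminator's learning rate up or down. The Python snippet below is only an illustrative sketch under these assumptions; the exponential scaling rule and the bounds shrink_min and grow_max are hypothetical choices, not the paper's exact update.

import math

# Illustrative sketch only (not the authors' exact algorithm): for the
# standard GAN objective, an ideal discriminator outputs 1/2 everywhere,
# so its cross-entropy loss is 2*log(2) ~= 1.386.
IDEAL_LOSS = 2.0 * math.log(2.0)

def gap_aware_lr(disc_loss, base_lr, shrink_min=0.1, grow_max=2.0):
    """Scale the discriminator learning rate by the optimality gap.

    If the current discriminator loss is below the ideal value, the
    discriminator is ahead of the balance point, so its learning rate is
    reduced; if the loss is above the ideal value, the rate is increased.
    The exponential rule and the clipping bounds are hypothetical choices.
    """
    gap = disc_loss - IDEAL_LOSS
    factor = math.exp(gap)                            # <1 if too strong, >1 if too weak
    factor = min(max(factor, shrink_min), grow_max)   # keep the adjustment bounded
    return base_lr * factor

# Example: a discriminator loss of 0.9 (well below 2*log 2) shrinks the rate,
# while a loss of 1.8 grows it.
print(gap_aware_lr(disc_loss=0.9, base_lr=2e-4))  # smaller than 2e-4
print(gap_aware_lr(disc_loss=1.8, base_lr=2e-4))  # larger than 2e-4

In practice such a factor would be recomputed every few training steps from a smoothed estimate of the adversary's loss; the paper evaluates its scheduler on GANs for image generation and on adversarial nets for domain adaptation.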

Cite this Paper


BibTeX
@InProceedings{pmlr-v206-hazimeh23a,
  title     = {Mind the (optimality) Gap: A Gap-Aware Learning Rate Scheduler for Adversarial Nets},
  author    = {Hazimeh, Hussein and Ponomareva, Natalia},
  booktitle = {Proceedings of The 26th International Conference on Artificial Intelligence and Statistics},
  pages     = {3018--3033},
  year      = {2023},
  editor    = {Ruiz, Francisco and Dy, Jennifer and van de Meent, Jan-Willem},
  volume    = {206},
  series    = {Proceedings of Machine Learning Research},
  month     = {25--27 Apr},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v206/hazimeh23a/hazimeh23a.pdf},
  url       = {https://proceedings.mlr.press/v206/hazimeh23a.html},
  abstract  = {Adversarial nets have proved to be powerful in various domains including generative modeling (GANs), transfer learning, and fairness. However, successfully training adversarial nets using first-order methods remains a major challenge. Typically, careful choices of the learning rates are needed to maintain the delicate balance between the competing networks. In this paper, we design a novel learning rate scheduler that dynamically adapts the learning rate of the adversary to maintain the right balance. The scheduler is driven by the fact that the loss of an ideal adversarial net is a constant known a priori. The scheduler is thus designed to keep the loss of the optimized adversarial net close to that of an ideal network. We run large-scale experiments to study the effectiveness of the scheduler on two popular applications: GANs for image generation and adversarial nets for domain adaptation. Our experiments indicate that adversarial nets trained with the scheduler are less likely to diverge and require significantly less tuning. For example, on CelebA, a GAN with the scheduler requires only one-tenth of the tuning budget needed without a scheduler. Moreover, the scheduler leads to statistically significant improvements in model quality, reaching up to 27\% in Frechet Inception Distance for image generation and 3\% in test accuracy for domain adaptation.}
}
Endnote
%0 Conference Paper
%T Mind the (optimality) Gap: A Gap-Aware Learning Rate Scheduler for Adversarial Nets
%A Hussein Hazimeh
%A Natalia Ponomareva
%B Proceedings of The 26th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2023
%E Francisco Ruiz
%E Jennifer Dy
%E Jan-Willem van de Meent
%F pmlr-v206-hazimeh23a
%I PMLR
%P 3018--3033
%U https://proceedings.mlr.press/v206/hazimeh23a.html
%V 206
%X Adversarial nets have proved to be powerful in various domains including generative modeling (GANs), transfer learning, and fairness. However, successfully training adversarial nets using first-order methods remains a major challenge. Typically, careful choices of the learning rates are needed to maintain the delicate balance between the competing networks. In this paper, we design a novel learning rate scheduler that dynamically adapts the learning rate of the adversary to maintain the right balance. The scheduler is driven by the fact that the loss of an ideal adversarial net is a constant known a priori. The scheduler is thus designed to keep the loss of the optimized adversarial net close to that of an ideal network. We run large-scale experiments to study the effectiveness of the scheduler on two popular applications: GANs for image generation and adversarial nets for domain adaptation. Our experiments indicate that adversarial nets trained with the scheduler are less likely to diverge and require significantly less tuning. For example, on CelebA, a GAN with the scheduler requires only one-tenth of the tuning budget needed without a scheduler. Moreover, the scheduler leads to statistically significant improvements in model quality, reaching up to 27% in Frechet Inception Distance for image generation and 3% in test accuracy for domain adaptation.
APA
Hazimeh, H., & Ponomareva, N. (2023). Mind the (optimality) Gap: A Gap-Aware Learning Rate Scheduler for Adversarial Nets. Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 206:3018-3033. Available from https://proceedings.mlr.press/v206/hazimeh23a.html.