Activation-Descent Regularization for Input Optimization of ReLU Networks

Hongzhan Yu, Sicun Gao
Proceedings of the 41st International Conference on Machine Learning, PMLR 235:57441-57458, 2024.

Abstract

We present a new approach for input optimization of ReLU networks that explicitly takes into account the effect of changes in activation patterns. We analyze local optimization steps in both the input space and the space of activation patterns to propose methods with superior local descent properties. To accomplish this, we convert the discrete space of activation patterns into differentiable representations and propose regularization terms that improve each descent step. Our experiments demonstrate the effectiveness of the proposed input-optimization methods for improving the state-of-the-art in various areas, such as adversarial learning, generative modeling, and reinforcement learning.
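To make the idea concrete, below is a minimal, hypothetical sketch of input optimization with a differentiable activation-pattern regularizer, written in PyTorch. The network, objective, sigmoid-based soft activation pattern, and the specific regularization term are illustrative assumptions for exposition only; they are not the paper's exact formulation.

import torch
import torch.nn as nn

torch.manual_seed(0)

# Small ReLU network whose input we optimize (weights stay frozen).
net = nn.Sequential(nn.Linear(2, 16), nn.ReLU(), nn.Linear(16, 1))
for p in net.parameters():
    p.requires_grad_(False)

def preactivations(x):
    """Pre-activation values of the hidden layer (before the ReLU)."""
    return net[0](x)

def soft_activation_pattern(z, temperature=10.0):
    """Differentiable surrogate for the binary ReLU activation pattern (assumed form)."""
    return torch.sigmoid(temperature * z)

x = torch.randn(1, 2, requires_grad=True)   # the input being optimized
opt = torch.optim.SGD([x], lr=1e-2)
lam = 0.1                                   # regularization strength (assumed)

for step in range(200):
    opt.zero_grad()
    objective = net(x).squeeze()            # quantity to minimize over the input
    a = soft_activation_pattern(preactivations(x))
    # Illustrative regularizer: push pre-activations away from the ReLU kinks,
    # so each descent step also accounts for (soft) changes in the activation pattern.
    act_reg = (a * (1.0 - a)).sum()
    loss = objective + lam * act_reg
    loss.backward()
    opt.step()

print("optimized input:", x.detach())

In practice one would tune the temperature and regularization strength for the task at hand (e.g., adversarial perturbation search or latent-code optimization); the sketch only illustrates the general pattern of adding an activation-aware term to each input-descent step.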

Cite this Paper


BibTeX
@InProceedings{pmlr-v235-yu24c,
  title     = {Activation-Descent Regularization for Input Optimization of {R}e{LU} Networks},
  author    = {Yu, Hongzhan and Gao, Sicun},
  booktitle = {Proceedings of the 41st International Conference on Machine Learning},
  pages     = {57441--57458},
  year      = {2024},
  editor    = {Salakhutdinov, Ruslan and Kolter, Zico and Heller, Katherine and Weller, Adrian and Oliver, Nuria and Scarlett, Jonathan and Berkenkamp, Felix},
  volume    = {235},
  series    = {Proceedings of Machine Learning Research},
  month     = {21--27 Jul},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v235/main/assets/yu24c/yu24c.pdf},
  url       = {https://proceedings.mlr.press/v235/yu24c.html},
  abstract  = {We present a new approach for input optimization of ReLU networks that explicitly takes into account the effect of changes in activation patterns. We analyze local optimization steps in both the input space and the space of activation patterns to propose methods with superior local descent properties. To accomplish this, we convert the discrete space of activation patterns into differentiable representations and propose regularization terms that improve each descent step. Our experiments demonstrate the effectiveness of the proposed input-optimization methods for improving the state-of-the-art in various areas, such as adversarial learning, generative modeling, and reinforcement learning.}
}
Endnote
%0 Conference Paper
%T Activation-Descent Regularization for Input Optimization of ReLU Networks
%A Hongzhan Yu
%A Sicun Gao
%B Proceedings of the 41st International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2024
%E Ruslan Salakhutdinov
%E Zico Kolter
%E Katherine Heller
%E Adrian Weller
%E Nuria Oliver
%E Jonathan Scarlett
%E Felix Berkenkamp
%F pmlr-v235-yu24c
%I PMLR
%P 57441--57458
%U https://proceedings.mlr.press/v235/yu24c.html
%V 235
%X We present a new approach for input optimization of ReLU networks that explicitly takes into account the effect of changes in activation patterns. We analyze local optimization steps in both the input space and the space of activation patterns to propose methods with superior local descent properties. To accomplish this, we convert the discrete space of activation patterns into differentiable representations and propose regularization terms that improve each descent step. Our experiments demonstrate the effectiveness of the proposed input-optimization methods for improving the state-of-the-art in various areas, such as adversarial learning, generative modeling, and reinforcement learning.
APA
Yu, H. & Gao, S. (2024). Activation-Descent Regularization for Input Optimization of ReLU Networks. Proceedings of the 41st International Conference on Machine Learning, in Proceedings of Machine Learning Research 235:57441-57458. Available from https://proceedings.mlr.press/v235/yu24c.html.