Functional Gradient Boosting based on Residual Network Perception

Atsushi Nitanda, Taiji Suzuki
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:3819-3828, 2018.

Abstract

Residual Networks (ResNets) have become state-of-the-art models in deep learning, and several theoretical studies have been devoted to understanding why ResNet works so well. One attractive viewpoint on ResNet is that it optimizes the risk in a functional space by building an ensemble of effective features. In this paper, we adopt this viewpoint to construct a new gradient boosting method, which is known to be very powerful in data analysis. To do so, we formalize the boosting perspective of ResNet mathematically using the notion of functional gradients and propose a new method called ResFGB for classification tasks by leveraging ResNet perception. Two types of generalization guarantees are provided from the optimization perspective: one is the margin bound and the other is the expected risk bound obtained via the sample-splitting technique. Experimental results show superior performance of the proposed method over state-of-the-art methods such as LightGBM.
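The boosting view described above, in which features are updated by residual-style blocks fit to functional gradients, can be illustrated with a small toy sketch. The snippet below is not the authors' ResFGB algorithm: it assumes plain ridge-regression weak learners, a linear top classifier, and binary 0/1 labels, and all function names (`resfgb_sketch`, `fit_linear`, etc.) are hypothetical. It only shows the characteristic update `Z <- Z + eta * Z @ G`, where `G` approximates the negative functional gradient of the loss in feature space.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_linear(X, t, reg=1e-3):
    """Ridge-regularized least squares; t may be a vector or a matrix."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + reg * np.eye(d), X.T @ t)

def resfgb_sketch(X, y, n_layers=5, eta=0.5):
    """Toy functional gradient boosting over a feature map.

    Each round fits a linear weak learner G to the negative functional
    gradient of the logistic loss in feature space, then applies the
    residual-block-style update  Z <- Z + eta * Z @ G.
    y is a 0/1 label vector.
    """
    Z = X.copy()                       # current features Phi_t(x_i)
    layers = []
    for _ in range(n_layers):
        w = fit_linear(Z, 2 * y - 1)   # refit the linear top classifier
        p = sigmoid(Z @ w)             # P(y = 1 | x) under the current model
        grad = np.outer(p - y, w)      # functional gradient d(loss)/d(Phi(x_i))
        G = fit_linear(Z, -grad)       # weak learner approximating -grad
        Z = Z + eta * (Z @ G)          # residual update of the features
        layers.append(G)
    w = fit_linear(Z, 2 * y - 1)       # final classifier on boosted features
    return layers, w

def resfgb_predict(X, layers, w, eta=0.5):
    """Replay the learned residual updates, then classify linearly."""
    Z = X.copy()
    for G in layers:
        Z = Z + eta * (Z @ G)
    return (Z @ w > 0).astype(int)
```

Each learned `G` plays the role of one residual block: the feature map is deepened greedily, one functional-gradient step at a time, rather than trained end to end.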

Cite this Paper


BibTeX
@InProceedings{pmlr-v80-nitanda18a,
  title     = {Functional Gradient Boosting based on Residual Network Perception},
  author    = {Nitanda, Atsushi and Suzuki, Taiji},
  booktitle = {Proceedings of the 35th International Conference on Machine Learning},
  pages     = {3819--3828},
  year      = {2018},
  editor    = {Dy, Jennifer and Krause, Andreas},
  volume    = {80},
  series    = {Proceedings of Machine Learning Research},
  month     = {10--15 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v80/nitanda18a/nitanda18a.pdf},
  url       = {https://proceedings.mlr.press/v80/nitanda18a.html},
  abstract  = {Residual Networks (ResNets) have become state-of-the-art models in deep learning and several theoretical studies have been devoted to understanding why ResNet works so well. One attractive viewpoint on ResNet is that it is optimizing the risk in a functional space by consisting of an ensemble of effective features. In this paper, we adopt this viewpoint to construct a new gradient boosting method, which is known to be very powerful in data analysis. To do so, we formalize the boosting perspective of ResNet mathematically using the notion of functional gradients and propose a new method called ResFGB for classification tasks by leveraging ResNet perception. Two types of generalization guarantees are provided from the optimization perspective: one is the margin bound and the other is the expected risk bound by the sample-splitting technique. Experimental results show superior performance of the proposed method over state-of-the-art methods such as LightGBM.}
}
Endnote
%0 Conference Paper
%T Functional Gradient Boosting based on Residual Network Perception
%A Atsushi Nitanda
%A Taiji Suzuki
%B Proceedings of the 35th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2018
%E Jennifer Dy
%E Andreas Krause
%F pmlr-v80-nitanda18a
%I PMLR
%P 3819--3828
%U https://proceedings.mlr.press/v80/nitanda18a.html
%V 80
%X Residual Networks (ResNets) have become state-of-the-art models in deep learning and several theoretical studies have been devoted to understanding why ResNet works so well. One attractive viewpoint on ResNet is that it is optimizing the risk in a functional space by consisting of an ensemble of effective features. In this paper, we adopt this viewpoint to construct a new gradient boosting method, which is known to be very powerful in data analysis. To do so, we formalize the boosting perspective of ResNet mathematically using the notion of functional gradients and propose a new method called ResFGB for classification tasks by leveraging ResNet perception. Two types of generalization guarantees are provided from the optimization perspective: one is the margin bound and the other is the expected risk bound by the sample-splitting technique. Experimental results show superior performance of the proposed method over state-of-the-art methods such as LightGBM.
APA
Nitanda, A. & Suzuki, T. (2018). Functional Gradient Boosting based on Residual Network Perception. Proceedings of the 35th International Conference on Machine Learning, in Proceedings of Machine Learning Research 80:3819-3828. Available from https://proceedings.mlr.press/v80/nitanda18a.html.