On Speeding Up the Training of Deep Neural Networks Using the Streaming Approach: The Base-Values Mechanism

Mateusz Wojtulewicz, Piotr Duda, Robert Nowicki, Leszek Rutkowski
Proceedings of The Workshop on Classifier Learning from Difficult Data, PMLR 263:17-24, 2024.

Abstract

Efficient and stable neural network training is crucial for advancing machine learning applications. This study explores the promising streaming approach as an alternative to traditional epoch-based methods. This paradigm shift involves transforming training data into a continuous stream, prioritizing challenging examples to enhance the learning process. Building upon this approach, we introduce an innovative Base-Values mechanism aimed at further improving the speed and stability of the streaming training process. We apply this framework to the original streaming training algorithms, yielding the Persistent Loss-Based (PLB) and Persistent Entropy-Based (PEB) algorithms. We conduct a comprehensive experimental comparison on the EMNIST dataset, analyzing traditional epoch-based methods and the streaming approach, including both the original methods and new methods employing the Base-Values mechanism. The exploration of various hyperparameters, including pre-training length, mini-batch size, and learning rate, provides valuable insights into their impact on the performance and stability of these neural network training methods. The results demonstrate the superior performance of a novel streaming algorithm incorporating the Base-Values mechanism compared to both a traditional epoch-based method and the other methods.
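The abstract describes transforming the training set into a stream that serves challenging examples first. As a minimal illustration of that idea only (the paper's actual PLB/PEB update rules and the Base-Values mechanism are not specified in the abstract, so the function name and scoring scheme below are illustrative assumptions, not the authors' method), a hardest-first ordering by a per-example loss score might be sketched as:

```python
import heapq

def stream_order(examples, loss_fn):
    """Yield examples hardest-first, as scored once by loss_fn.

    Hypothetical sketch: a max-heap keyed on negative loss, so the
    highest-loss (most challenging) example is emitted first. A real
    streaming trainer would re-score examples as the model updates.
    """
    # Tuples are (negative loss, tie-break index, example).
    heap = [(-loss_fn(x), i, x) for i, x in enumerate(examples)]
    heapq.heapify(heap)
    while heap:
        _, _, x = heapq.heappop(heap)
        yield x  # train on the currently hardest example next

# Toy demo: the "loss" is the value itself, so larger values come first.
data = [0.2, 0.9, 0.5]
print(list(stream_order(data, lambda x: x)))  # -> [0.9, 0.5, 0.2]
```

The heap makes each "next hardest example" lookup O(log n); in a real trainer the scores would be refreshed after each parameter update rather than fixed up front.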

Cite this Paper

BibTeX
@InProceedings{pmlr-v263-wojtulewicz24a,
  title     = {On Speeding Up the Training of Deep Neural Networks Using the Streaming Approach: The Base-Values Mechanism},
  author    = {Wojtulewicz, Mateusz and Duda, Piotr and Nowicki, Robert and Rutkowski, Leszek},
  booktitle = {Proceedings of The Workshop on Classifier Learning from Difficult Data},
  pages     = {17--24},
  year      = {2024},
  editor    = {Zyblewski, Pawel and Grana, Manuel and Ksieniewicz, Pawel and Minku, Leandro},
  volume    = {263},
  series    = {Proceedings of Machine Learning Research},
  month     = {19--20 Oct},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v263/main/assets/wojtulewicz24a/wojtulewicz24a.pdf},
  url       = {https://proceedings.mlr.press/v263/wojtulewicz24a.html},
  abstract  = {Efficient and stable neural network training is crucial for advancing machine learning applications. This study explores the promising streaming approach as an alternative to traditional epoch-based methods. This paradigm shift involves transforming training data into a continuous stream, prioritizing challenging examples to enhance the learning process. Building upon this approach, we introduce an innovative Base-Values mechanism aimed at further improving the speed and stability of the streaming training process. We apply this framework to the original streaming training algorithms, yielding the Persistent Loss-Based (PLB) and Persistent Entropy-Based (PEB) algorithms. We conduct a comprehensive experimental comparison on the EMNIST dataset, analyzing traditional epoch-based methods and the streaming approach, including both the original methods and new methods employing the Base-Values mechanism. The exploration of various hyperparameters, including pre-training length, mini-batch size, and learning rate, provides valuable insights into their impact on the performance and stability of these neural network training methods. The results demonstrate the superior performance of a novel streaming algorithm incorporating the Base-Values mechanism compared to both a traditional epoch-based method and the other methods.}
}
Endnote
%0 Conference Paper
%T On Speeding Up the Training of Deep Neural Networks Using the Streaming Approach: The Base-Values Mechanism
%A Mateusz Wojtulewicz
%A Piotr Duda
%A Robert Nowicki
%A Leszek Rutkowski
%B Proceedings of The Workshop on Classifier Learning from Difficult Data
%C Proceedings of Machine Learning Research
%D 2024
%E Pawel Zyblewski
%E Manuel Grana
%E Pawel Ksieniewicz
%E Leandro Minku
%F pmlr-v263-wojtulewicz24a
%I PMLR
%P 17--24
%U https://proceedings.mlr.press/v263/wojtulewicz24a.html
%V 263
%X Efficient and stable neural network training is crucial for advancing machine learning applications. This study explores the promising streaming approach as an alternative to traditional epoch-based methods. This paradigm shift involves transforming training data into a continuous stream, prioritizing challenging examples to enhance the learning process. Building upon this approach, we introduce an innovative Base-Values mechanism aimed at further improving the speed and stability of the streaming training process. We apply this framework to the original streaming training algorithms, yielding the Persistent Loss-Based (PLB) and Persistent Entropy-Based (PEB) algorithms. We conduct a comprehensive experimental comparison on the EMNIST dataset, analyzing traditional epoch-based methods and the streaming approach, including both the original methods and new methods employing the Base-Values mechanism. The exploration of various hyperparameters, including pre-training length, mini-batch size, and learning rate, provides valuable insights into their impact on the performance and stability of these neural network training methods. The results demonstrate the superior performance of a novel streaming algorithm incorporating the Base-Values mechanism compared to both a traditional epoch-based method and the other methods.
APA
Wojtulewicz, M., Duda, P., Nowicki, R. & Rutkowski, L. (2024). On Speeding Up the Training of Deep Neural Networks Using the Streaming Approach: The Base-Values Mechanism. Proceedings of The Workshop on Classifier Learning from Difficult Data, in Proceedings of Machine Learning Research 263:17-24. Available from https://proceedings.mlr.press/v263/wojtulewicz24a.html.
