Implicit Neural Representation as vectorizer for classification task applied to diverse data structures

Thibault Malherbe
Proceedings of the 1st ContinualAI Unconference, 2023, PMLR 249:62-76, 2024.

Abstract

Implicit neural representations have recently emerged as a promising tool in data science research for their ability to learn complex, high-dimensional functions without requiring explicit equations or hand-crafted features. Here we aim to use these implicit neural representation weights to represent batches of data, and to classify these batches based only on the weights, without any feature engineering on the raw data. In this study, we demonstrate that this method yields very promising results in the classification of several types of data, such as sound, images, videos, or human activities, without any prior knowledge of the relevant field.
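The pipeline the abstract describes, fitting one implicit representation per data sample and feeding the fitted weights to a classifier, can be sketched in a few lines. The sketch below is illustrative, not the paper's implementation: it replaces a full SIREN-style MLP with a closed-form linear readout over fixed random Fourier features, and the toy two-class signal dataset, the `make_signal` helper, and all hyperparameters are assumptions introduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: 1-D signals of two classes that differ in frequency.
def make_signal(freq, n=64):
    t = np.linspace(0.0, 1.0, n)
    return np.sin(2 * np.pi * freq * t) + 0.05 * rng.standard_normal(n)

# Fit an implicit representation f(t) ~= signal(t) per sample. For a compact,
# closed-form sketch we use ridge regression over fixed random Fourier
# features instead of training an MLP; the fitted weight vector then serves
# as the sample's embedding, playing the role of the INR weights.
FREQS = rng.standard_normal((1, 16)) * 30.0  # shared random feature frequencies

def inr_weights(signal, lam=1e-2):
    t = np.linspace(0.0, 1.0, len(signal))[:, None]
    phi = np.concatenate([np.sin(t @ FREQS), np.cos(t @ FREQS)], axis=1)
    return np.linalg.solve(phi.T @ phi + lam * np.eye(phi.shape[1]),
                           phi.T @ signal)

# Vectorize a small labelled dataset: 20 samples per class, no feature
# engineering on the raw signals themselves.
X = np.stack([inr_weights(make_signal(f)) for f in [3] * 20 + [7] * 20])
y = np.array([0] * 20 + [1] * 20)

# Any downstream classifier can consume the weight vectors; a
# nearest-centroid rule keeps the sketch dependency-free.
centroids = np.stack([X[y == c].mean(axis=0) for c in (0, 1)])

def predict(signal):
    w = inr_weights(signal)
    return int(np.argmin(((centroids - w) ** 2).sum(axis=1)))
```

The key property exploited here is that samples from the same class yield nearby weight vectors, so a generic classifier separates them without ever touching domain-specific features of the raw data.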

Cite this Paper


BibTeX
@InProceedings{pmlr-v249-malherbe24a,
  title     = {Implicit Neural Representation as vectorizer for classification task applied to diverse data structures},
  author    = {Malherbe, Thibault},
  booktitle = {Proceedings of the 1st ContinualAI Unconference, 2023},
  pages     = {62--76},
  year      = {2024},
  editor    = {Swaroop, Siddharth and Mundt, Martin and Aljundi, Rahaf and Khan, Mohammad Emtiyaz},
  volume    = {249},
  series    = {Proceedings of Machine Learning Research},
  month     = {09 Oct},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v249/main/assets/malherbe24a/malherbe24a.pdf},
  url       = {https://proceedings.mlr.press/v249/malherbe24a.html},
  abstract  = {Implicit neural representations have recently emerged as a promising tool in data science research for their ability to learn complex, high-dimensional functions without requiring explicit equations or hand-crafted features. Here we aim to use these implicit neural representation weights to represent batches of data, and to classify these batches based only on the weights, without any feature engineering on the raw data. In this study, we demonstrate that this method yields very promising results in the classification of several types of data, such as sound, images, videos, or human activities, without any prior knowledge of the relevant field.}
}
Endnote
%0 Conference Paper
%T Implicit Neural Representation as vectorizer for classification task applied to diverse data structures
%A Thibault Malherbe
%B Proceedings of the 1st ContinualAI Unconference, 2023
%C Proceedings of Machine Learning Research
%D 2024
%E Siddharth Swaroop
%E Martin Mundt
%E Rahaf Aljundi
%E Mohammad Emtiyaz Khan
%F pmlr-v249-malherbe24a
%I PMLR
%P 62--76
%U https://proceedings.mlr.press/v249/malherbe24a.html
%V 249
%X Implicit neural representations have recently emerged as a promising tool in data science research for their ability to learn complex, high-dimensional functions without requiring explicit equations or hand-crafted features. Here we aim to use these implicit neural representation weights to represent batches of data, and to classify these batches based only on the weights, without any feature engineering on the raw data. In this study, we demonstrate that this method yields very promising results in the classification of several types of data, such as sound, images, videos, or human activities, without any prior knowledge of the relevant field.
APA
Malherbe, T. (2024). Implicit Neural Representation as vectorizer for classification task applied to diverse data structures. Proceedings of the 1st ContinualAI Unconference, 2023, in Proceedings of Machine Learning Research 249:62-76. Available from https://proceedings.mlr.press/v249/malherbe24a.html.