Training Neural Machines with Trace-Based Supervision

Matthew Mirman, Dimitar Dimitrov, Pavle Djordjevic, Timon Gehr, Martin Vechev
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:3569-3577, 2018.

Abstract

We investigate the effectiveness of trace-based supervision methods for training existing neural abstract machines. To define the class of neural machines amenable to trace-based supervision, we introduce the concept of a differential neural computational machine (dNCM) and show that several existing architectures (NTMs, NRAMs) can be described as dNCMs. We performed a detailed experimental evaluation with NTM and NRAM machines, showing that additional supervision on the interpretable portions of these architectures leads to better convergence and generalization capabilities of the learning phase than standard training, in both noise-free and noisy scenarios.

Cite this Paper
BibTeX
@InProceedings{pmlr-v80-mirman18a,
  title     = {Training Neural Machines with Trace-Based Supervision},
  author    = {Mirman, Matthew and Dimitrov, Dimitar and Djordjevic, Pavle and Gehr, Timon and Vechev, Martin},
  booktitle = {Proceedings of the 35th International Conference on Machine Learning},
  pages     = {3569--3577},
  year      = {2018},
  editor    = {Dy, Jennifer and Krause, Andreas},
  volume    = {80},
  series    = {Proceedings of Machine Learning Research},
  month     = {10--15 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v80/mirman18a/mirman18a.pdf},
  url       = {https://proceedings.mlr.press/v80/mirman18a.html},
  abstract  = {We investigate the effectiveness of trace-based supervision methods for training existing neural abstract machines. To define the class of neural machines amenable to trace-based supervision, we introduce the concept of a differential neural computational machine (dNCM) and show that several existing architectures (NTMs, NRAMs) can be described as dNCMs. We performed a detailed experimental evaluation with NTM and NRAM machines, showing that additional supervision on the interpretable portions of these architectures leads to better convergence and generalization capabilities of the learning phase than standard training, in both noise-free and noisy scenarios.}
}
Endnote
%0 Conference Paper
%T Training Neural Machines with Trace-Based Supervision
%A Matthew Mirman
%A Dimitar Dimitrov
%A Pavle Djordjevic
%A Timon Gehr
%A Martin Vechev
%B Proceedings of the 35th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2018
%E Jennifer Dy
%E Andreas Krause
%F pmlr-v80-mirman18a
%I PMLR
%P 3569--3577
%U https://proceedings.mlr.press/v80/mirman18a.html
%V 80
%X We investigate the effectiveness of trace-based supervision methods for training existing neural abstract machines. To define the class of neural machines amenable to trace-based supervision, we introduce the concept of a differential neural computational machine (dNCM) and show that several existing architectures (NTMs, NRAMs) can be described as dNCMs. We performed a detailed experimental evaluation with NTM and NRAM machines, showing that additional supervision on the interpretable portions of these architectures leads to better convergence and generalization capabilities of the learning phase than standard training, in both noise-free and noisy scenarios.
APA
Mirman, M., Dimitrov, D., Djordjevic, P., Gehr, T., & Vechev, M. (2018). Training Neural Machines with Trace-Based Supervision. Proceedings of the 35th International Conference on Machine Learning, in Proceedings of Machine Learning Research 80:3569-3577. Available from https://proceedings.mlr.press/v80/mirman18a.html.