RobustFill: Neural Program Learning under Noisy I/O

Jacob Devlin, Jonathan Uesato, Surya Bhupatiraju, Rishabh Singh, Abdel-rahman Mohamed, Pushmeet Kohli
Proceedings of the 34th International Conference on Machine Learning, PMLR 70:990-998, 2017.

Abstract

The problem of automatically generating a computer program from a specification has been studied since the early days of AI. Recently, two competing approaches to ‘automatic program learning’ have received significant attention: (1) ‘neural program synthesis’, where a neural network is conditioned on input/output (I/O) examples and learns to generate a program, and (2) ‘neural program induction’, where a neural network generates new outputs directly using a latent program representation. Here, for the first time, we directly compare both approaches on a large-scale, real-world learning task, and we additionally contrast them with rule-based program synthesis, which uses hand-crafted semantics to guide program generation. Our neural models use a modified attention RNN to encode variable-sized sets of I/O pairs, achieving 92% accuracy on a real-world test set, compared to the 34% accuracy of the previous best neural synthesis approach. The synthesis model also outperforms a comparable induction model on this task; more importantly, we demonstrate that the strength of each approach is highly dependent on the evaluation metric and the end-user application. Finally, we show that our neural models can be trained to remain robust to the type of noise expected in real-world data (e.g., typos), while a highly engineered rule-based system fails entirely.
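To make the synthesis setup concrete, the sketch below shows one plausible reading of the architecture the abstract describes: each I/O pair is encoded by an attention RNN, a shared decoder attends to each pair's encoding, and the per-pair decoder states are pooled so the model accepts a variable-sized set of examples. This is a minimal illustration assuming PyTorch, not the authors' implementation; all module names, sizes, and the max-pooling choice are assumptions.

```python
# Minimal sketch (not the authors' code) of neural program synthesis over a
# variable-sized set of I/O pairs, as described in the abstract. Assumes
# PyTorch; module names, sizes, and the max-pooling step are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

VOCAB, PROG_VOCAB, H = 128, 256, 64  # char vocab, DSL token vocab, hidden size

class IOEncoder(nn.Module):
    """Encodes one I/O pair: an LSTM reads the input string, and a second
    LSTM reads the output string while attending back to the input states."""
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, H)
        self.in_rnn = nn.LSTM(H, H, batch_first=True)
        self.out_rnn = nn.LSTM(H, H, batch_first=True)

    def forward(self, x_ids, y_ids):
        x, _ = self.in_rnn(self.emb(x_ids))              # (B, Tx, H)
        y, _ = self.out_rnn(self.emb(y_ids))             # (B, Ty, H)
        attn = F.softmax(y @ x.transpose(1, 2), dim=-1)  # (B, Ty, Tx)
        return y + attn @ x                              # (B, Ty, H)

class ProgramDecoder(nn.Module):
    """One decoding step: a shared LSTM cell is run once per I/O pair,
    attending to that pair's encoding; the resulting hidden states are
    max-pooled across pairs before the program-token softmax, so any
    number of I/O examples can be supplied."""
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(PROG_VOCAB, H)
        self.cell = nn.LSTMCell(2 * H, H)
        self.proj = nn.Linear(H, PROG_VOCAB)

    def step(self, tok, states, memories):
        new_states, pooled = [], []
        for (h, c), mem in zip(states, memories):
            scores = (h.unsqueeze(1) @ mem.transpose(1, 2)).squeeze(1)
            ctx = (F.softmax(scores, dim=-1).unsqueeze(1) @ mem).squeeze(1)
            h, c = self.cell(torch.cat([self.emb(tok), ctx], dim=-1), (h, c))
            new_states.append((h, c))
            pooled.append(h)
        logits = self.proj(torch.stack(pooled).max(dim=0).values)
        return logits, new_states

# Usage: four random I/O pairs stand in for a user's string-transformation
# examples; one decoder step predicts the next DSL token of the program.
enc, dec = IOEncoder(), ProgramDecoder()
pairs = [(torch.randint(VOCAB, (1, 10)), torch.randint(VOCAB, (1, 8)))
         for _ in range(4)]
memories = [enc(x, y) for x, y in pairs]
states = [(torch.zeros(1, H), torch.zeros(1, H)) for _ in memories]
start = torch.zeros(1, dtype=torch.long)            # hypothetical <s> token id
logits, states = dec.step(start, states, memories)  # (1, PROG_VOCAB) logits
```

Because decoding conditions on all pairs at once, the same loop extends to any number of examples; an induction model, by contrast, would emit output characters for a new input directly instead of producing DSL tokens.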

Cite this Paper


BibTeX
@InProceedings{pmlr-v70-devlin17a,
  title     = {{R}obust{F}ill: Neural Program Learning under Noisy {I}/{O}},
  author    = {Jacob Devlin and Jonathan Uesato and Surya Bhupatiraju and Rishabh Singh and Abdel-rahman Mohamed and Pushmeet Kohli},
  booktitle = {Proceedings of the 34th International Conference on Machine Learning},
  pages     = {990--998},
  year      = {2017},
  editor    = {Precup, Doina and Teh, Yee Whye},
  volume    = {70},
  series    = {Proceedings of Machine Learning Research},
  month     = {06--11 Aug},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v70/devlin17a/devlin17a.pdf},
  url       = {https://proceedings.mlr.press/v70/devlin17a.html}
}
Endnote
%0 Conference Paper
%T RobustFill: Neural Program Learning under Noisy I/O
%A Jacob Devlin
%A Jonathan Uesato
%A Surya Bhupatiraju
%A Rishabh Singh
%A Abdel-rahman Mohamed
%A Pushmeet Kohli
%B Proceedings of the 34th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2017
%E Doina Precup
%E Yee Whye Teh
%F pmlr-v70-devlin17a
%I PMLR
%P 990--998
%U https://proceedings.mlr.press/v70/devlin17a.html
%V 70
APA
Devlin, J., Uesato, J., Bhupatiraju, S., Singh, R., Mohamed, A. & Kohli, P. (2017). RobustFill: Neural Program Learning under Noisy I/O. Proceedings of the 34th International Conference on Machine Learning, in Proceedings of Machine Learning Research 70:990-998. Available from https://proceedings.mlr.press/v70/devlin17a.html.
