Neural Program Synthesis from Diverse Demonstration Videos

Shao-Hua Sun, Hyeonwoo Noh, Sriram Somasundaram, Joseph Lim
Proceedings of the 35th International Conference on Machine Learning, PMLR 80:4790-4799, 2018.

Abstract

Interpreting decision making logic in demonstration videos is key to collaborating with and mimicking humans. To empower machines with this ability, we propose a neural program synthesizer that is able to explicitly synthesize underlying programs from behaviorally diverse and visually complicated demonstration videos. We introduce a summarizer module as part of our model to improve the network’s ability to integrate multiple demonstrations varying in behavior. We also employ a multi-task objective to encourage the model to learn meaningful intermediate representations for end-to-end training. We show that our model is able to reliably synthesize underlying programs as well as capture diverse behaviors exhibited in demonstrations. The code is available at https://shaohua0116.github.io/demo2program.
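The abstract describes the architecture only at a high level. Below is a minimal, hypothetical PyTorch sketch of that idea, assuming the pipeline is roughly: encode each demonstration separately, pool the per-demonstration encodings with a summarizer, and decode a program token sequence. All module names, dimensions, and the choice of mean pooling here are illustrative assumptions; for the authors' actual implementation, see the code link above.

# Hypothetical sketch (not the authors' code): per-demo LSTM encoder,
# order-invariant summarizer over demonstrations, and a program token decoder.
import torch
import torch.nn as nn

class DemoSummarizerSynthesizer(nn.Module):
    def __init__(self, frame_dim=64, hidden_dim=128, vocab_size=50, max_len=20):
        super().__init__()
        self.demo_encoder = nn.LSTM(frame_dim, hidden_dim, batch_first=True)
        self.summarizer = nn.Linear(hidden_dim, hidden_dim)   # relates demos before pooling (assumed)
        self.decoder = nn.LSTM(hidden_dim, hidden_dim, batch_first=True)
        self.token_head = nn.Linear(hidden_dim, vocab_size)
        self.max_len = max_len
        self.hidden_dim = hidden_dim

    def forward(self, demos):
        # demos: (batch, num_demos, num_frames, frame_dim) pre-extracted frame features
        b, k, t, d = demos.shape
        _, (h, _) = self.demo_encoder(demos.reshape(b * k, t, d))
        per_demo = h[-1].reshape(b, k, -1)                          # one vector per demonstration
        summary = torch.relu(self.summarizer(per_demo)).mean(dim=1) # order-invariant pooling over demos
        # Decode a fixed-length program token sequence conditioned on the summary.
        dec_in = summary.unsqueeze(1).expand(b, self.max_len, self.hidden_dim).contiguous()
        out, _ = self.decoder(dec_in)
        return self.token_head(out)                                 # (batch, max_len, vocab_size)

# Example: a batch of 2 tasks, each with 4 demonstrations of 10 frames.
logits = DemoSummarizerSynthesizer()(torch.randn(2, 4, 10, 64))
print(logits.shape)  # torch.Size([2, 20, 50])

In this sketch the mean pooling makes the summary invariant to the order of demonstrations, which is one simple way to integrate a variable number of behaviorally diverse demonstrations; the paper's summarizer module may differ.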

Cite this Paper


BibTeX
@InProceedings{pmlr-v80-sun18a,
  title     = {Neural Program Synthesis from Diverse Demonstration Videos},
  author    = {Sun, Shao-Hua and Noh, Hyeonwoo and Somasundaram, Sriram and Lim, Joseph},
  booktitle = {Proceedings of the 35th International Conference on Machine Learning},
  pages     = {4790--4799},
  year      = {2018},
  editor    = {Dy, Jennifer and Krause, Andreas},
  volume    = {80},
  series    = {Proceedings of Machine Learning Research},
  month     = {10--15 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v80/sun18a/sun18a.pdf},
  url       = {https://proceedings.mlr.press/v80/sun18a.html},
  abstract  = {Interpreting decision making logic in demonstration videos is key to collaborating with and mimicking humans. To empower machines with this ability, we propose a neural program synthesizer that is able to explicitly synthesize underlying programs from behaviorally diverse and visually complicated demonstration videos. We introduce a summarizer module as part of our model to improve the network’s ability to integrate multiple demonstrations varying in behavior. We also employ a multi-task objective to encourage the model to learn meaningful intermediate representations for end-to-end training. We show that our model is able to reliably synthesize underlying programs as well as capture diverse behaviors exhibited in demonstrations. The code is available at https://shaohua0116.github.io/demo2program.}
}
Endnote
%0 Conference Paper
%T Neural Program Synthesis from Diverse Demonstration Videos
%A Shao-Hua Sun
%A Hyeonwoo Noh
%A Sriram Somasundaram
%A Joseph Lim
%B Proceedings of the 35th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2018
%E Jennifer Dy
%E Andreas Krause
%F pmlr-v80-sun18a
%I PMLR
%P 4790--4799
%U https://proceedings.mlr.press/v80/sun18a.html
%V 80
%X Interpreting decision making logic in demonstration videos is key to collaborating with and mimicking humans. To empower machines with this ability, we propose a neural program synthesizer that is able to explicitly synthesize underlying programs from behaviorally diverse and visually complicated demonstration videos. We introduce a summarizer module as part of our model to improve the network’s ability to integrate multiple demonstrations varying in behavior. We also employ a multi-task objective to encourage the model to learn meaningful intermediate representations for end-to-end training. We show that our model is able to reliably synthesize underlying programs as well as capture diverse behaviors exhibited in demonstrations. The code is available at https://shaohua0116.github.io/demo2program.
APA
Sun, S., Noh, H., Somasundaram, S., & Lim, J. (2018). Neural Program Synthesis from Diverse Demonstration Videos. Proceedings of the 35th International Conference on Machine Learning, in Proceedings of Machine Learning Research 80:4790-4799. Available from https://proceedings.mlr.press/v80/sun18a.html.