IRIS: An Immersive Robot Interaction System

Xinkai Jiang, Qihao Yuan, Enes Ulas Dincer, Hongyi Zhou, Ge Li, Xueyin Li, Xiaogang Jia, Timo Schnizer, Nicolas Schreiber, Weiran Liao, Julius Haag, Kailai Li, Gerhard Neumann, Rudolf Lioutikov
Proceedings of The 9th Conference on Robot Learning, PMLR 305:2555-2582, 2025.

Abstract

This paper introduces IRIS, an Immersive Robot Interaction System leveraging Extended Reality (XR). Existing XR-based systems enable efficient data collection but are often challenging to reproduce and reuse due to their specificity to particular robots, objects, simulators, and environments. IRIS addresses these issues by supporting immersive interaction and data collection across diverse simulators and real-world scenarios. It visualizes arbitrary rigid and deformable objects, robots from simulation, and integrates real-time sensor-generated point clouds for real-world applications. Additionally, IRIS enhances collaborative capabilities by enabling multiple users to simultaneously interact within the same virtual scene. Extensive experiments demonstrate that IRIS offers efficient and intuitive data collection in both simulated and real-world settings.

Cite this Paper
BibTeX
@InProceedings{pmlr-v305-jiang25c,
  title     = {IRIS: An Immersive Robot Interaction System},
  author    = {Jiang, Xinkai and Yuan, Qihao and Dincer, Enes Ulas and Zhou, Hongyi and Li, Ge and Li, Xueyin and Jia, Xiaogang and Schnizer, Timo and Schreiber, Nicolas and Liao, Weiran and Haag, Julius and Li, Kailai and Neumann, Gerhard and Lioutikov, Rudolf},
  booktitle = {Proceedings of The 9th Conference on Robot Learning},
  pages     = {2555--2582},
  year      = {2025},
  editor    = {Lim, Joseph and Song, Shuran and Park, Hae-Won},
  volume    = {305},
  series    = {Proceedings of Machine Learning Research},
  month     = {27--30 Sep},
  publisher = {PMLR},
  pdf       = {https://raw.githubusercontent.com/mlresearch/v305/main/assets/jiang25c/jiang25c.pdf},
  url       = {https://proceedings.mlr.press/v305/jiang25c.html},
  abstract  = {This paper introduces IRIS, an Immersive Robot Interaction System leveraging Extended Reality (XR). Existing XR-based systems enable efficient data collection but are often challenging to reproduce and reuse due to their specificity to particular robots, objects, simulators, and environments. IRIS addresses these issues by supporting immersive interaction and data collection across diverse simulators and real-world scenarios. It visualizes arbitrary rigid and deformable objects, robots from simulation, and integrates real-time sensor-generated point clouds for real-world applications. Additionally, IRIS enhances collaborative capabilities by enabling multiple users to simultaneously interact within the same virtual scene. Extensive experiments demonstrate that IRIS offers efficient and intuitive data collection in both simulated and real-world settings.}
}
Endnote
%0 Conference Paper
%T IRIS: An Immersive Robot Interaction System
%A Xinkai Jiang
%A Qihao Yuan
%A Enes Ulas Dincer
%A Hongyi Zhou
%A Ge Li
%A Xueyin Li
%A Xiaogang Jia
%A Timo Schnizer
%A Nicolas Schreiber
%A Weiran Liao
%A Julius Haag
%A Kailai Li
%A Gerhard Neumann
%A Rudolf Lioutikov
%B Proceedings of The 9th Conference on Robot Learning
%C Proceedings of Machine Learning Research
%D 2025
%E Joseph Lim
%E Shuran Song
%E Hae-Won Park
%F pmlr-v305-jiang25c
%I PMLR
%P 2555--2582
%U https://proceedings.mlr.press/v305/jiang25c.html
%V 305
%X This paper introduces IRIS, an Immersive Robot Interaction System leveraging Extended Reality (XR). Existing XR-based systems enable efficient data collection but are often challenging to reproduce and reuse due to their specificity to particular robots, objects, simulators, and environments. IRIS addresses these issues by supporting immersive interaction and data collection across diverse simulators and real-world scenarios. It visualizes arbitrary rigid and deformable objects, robots from simulation, and integrates real-time sensor-generated point clouds for real-world applications. Additionally, IRIS enhances collaborative capabilities by enabling multiple users to simultaneously interact within the same virtual scene. Extensive experiments demonstrate that IRIS offers efficient and intuitive data collection in both simulated and real-world settings.
APA
Jiang, X., Yuan, Q., Dincer, E.U., Zhou, H., Li, G., Li, X., Jia, X., Schnizer, T., Schreiber, N., Liao, W., Haag, J., Li, K., Neumann, G., & Lioutikov, R. (2025). IRIS: An Immersive Robot Interaction System. Proceedings of The 9th Conference on Robot Learning, in Proceedings of Machine Learning Research 305:2555-2582. Available from https://proceedings.mlr.press/v305/jiang25c.html.