Grounding Robot Plans from Natural Language Instructions with Incomplete World Knowledge

Daniel Nyga, Subhro Roy, Rohan Paul, Daehyung Park, Mihai Pomarlan, Michael Beetz, Nicholas Roy.
Proceedings of The 2nd Conference on Robot Learning, PMLR 87:714-723, 2018.

Abstract

Our goal is to enable robots to interpret and execute high-level tasks conveyed using natural language instructions. For example, consider tasking a household robot to “prepare my breakfast”, “clear the boxes on the table”, or “make me a fruit milkshake”. Interpreting such underspecified instructions requires environmental context and background knowledge about how to accomplish complex tasks. Further, the robot’s workspace knowledge may be incomplete: the environment may only be partially observed, or background knowledge may be missing, causing a failure in plan synthesis. We introduce a probabilistic model that utilizes background knowledge to infer latent or missing plan constituents based on semantic co-associations learned from noisy textual corpora of task descriptions. The ability to infer missing plan constituents enables information-seeking actions, such as visual exploration or dialogue with the human, to acquire new knowledge and fill incomplete plans. Results indicate robust plan inference from underspecified instructions in partially known worlds.
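The core idea of inferring a missing plan constituent from semantic co-associations can be illustrated with a minimal sketch. This is not the paper's probabilistic model; it is a hypothetical toy in which co-occurrence counts over a tiny hand-written "corpus" of task descriptions stand in for learned co-associations, and a missing constituent (e.g., the tool needed for a milkshake) is ranked by how strongly candidates co-occur with the observed context.

```python
# Hypothetical sketch (not the paper's model): rank candidate fillers for a
# latent plan constituent by co-association with the observed context.
# The corpus, constituent names, and scoring are illustrative assumptions.
from collections import Counter

# Toy "corpus" of task descriptions: each entry is the set of constituents
# (actions, objects, tools) mentioned together in one description.
corpus = [
    {"make", "milkshake", "milk", "banana", "blender"},
    {"make", "milkshake", "milk", "strawberry", "blender"},
    {"make", "smoothie", "yogurt", "banana", "blender"},
    {"clear", "table", "box"},
]

def cooccurrence(candidate, context):
    """Count task descriptions containing the candidate and all context terms."""
    return sum(1 for task in corpus if candidate in task and context <= task)

def rank_fillers(context, candidates):
    """Rank candidate fillers for a missing constituent given observed context."""
    scores = Counter({c: cooccurrence(c, context) for c in candidates})
    return scores.most_common()

# "Make me a fruit milkshake" yields the observed context {make, milkshake};
# the required tool is latent and inferred from co-associations.
ranking = rank_fillers({"make", "milkshake"}, ["blender", "box", "yogurt"])
print(ranking[0][0])  # prints "blender"
```

A real system would replace the raw counts with probabilities estimated from large, noisy text corpora, and a low-confidence ranking would trigger an information-seeking action (exploration or dialogue) instead of committing to a filler.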
