Simultaneously Learning Transferable Symbols and Language Groundings from Perceptual Data for Instruction Following

Conference Publication
Gopalan, N; Rosen, E; Konidaris, G; Tellex, S
Published in: Robotics: Science and Systems
January 1, 2020

Enabling robots to learn tasks and follow instructions as easily as humans is important for many real-world robot applications. Previous approaches have applied machine learning to teach the mapping from language to low-dimensional symbolic representations constructed by hand, using demonstration trajectories paired with accompanying instructions. These symbolic methods lead to data-efficient learning. Other methods map language directly to high-dimensional control behavior, which requires less design effort but is data-intensive. We propose to first learn symbolic abstractions from demonstration data and then map language to those learned abstractions. These symbolic abstractions can be learned with significantly less data than end-to-end approaches, and they support partial behavior specification via natural language because they permit planning with traditional planners. During training, our approach requires only a small number of demonstration trajectories paired with natural language, without the use of a simulator, and it results in a representation capable of planning to fulfill natural-language instructions that specify a goal or a partial plan. We apply our approach to two domains, including a mobile manipulator, where a small number of demonstrations enable the robot to follow navigation commands like “Take left at the end of the hallway” in environments it has not encountered before.
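
To make the two-stage pipeline concrete, below is a minimal, hypothetical Python sketch, not the authors' implementation: symbol learning is reduced to grouping the terminal states of demonstration trajectories, language grounding to a bag-of-words match against those groups, and planning to a breadth-first search over a toy grid. All function names and the toy data are illustrative assumptions.

# Illustrative sketch only, NOT the method from the paper: symbols are learned
# by grouping demonstration end states, language is grounded with word counts,
# and planning is BFS to any state where the grounded symbol holds.
from collections import Counter, deque

# Each demonstration: (trajectory of grid states, accompanying instruction).
demos = [
    ([(0, 0), (0, 1), (0, 2)], "go to the red room"),
    ([(1, 0), (0, 0), (0, 2)], "move to the red room"),
    ([(0, 0), (1, 0), (2, 0)], "go to the blue room"),
    ([(0, 1), (1, 1), (2, 0)], "head to the blue room"),
]

def learn_symbols(demos):
    """Stage 1: group demonstration end states into propositional symbols."""
    symbols = {}  # symbol id -> set of states where the symbol holds
    for traj, _ in demos:
        goal = traj[-1]
        symbols.setdefault(f"sym_{goal}", set()).add(goal)
    return symbols

def ground_language(demos, symbols):
    """Stage 2: count word co-occurrence with each learned symbol."""
    counts = {s: Counter() for s in symbols}
    for traj, text in demos:
        counts[f"sym_{traj[-1]}"].update(text.lower().split())
    return counts

def grounded_goal(instruction, counts):
    """Pick the symbol whose training vocabulary best matches the command."""
    words = instruction.lower().split()
    return max(counts, key=lambda s: sum(counts[s][w] for w in words))

def plan(start, goal_states, size=3):
    """Breadth-first search to any state satisfying the grounded goal symbol."""
    frontier, seen = deque([(start, [start])]), {start}
    while frontier:
        state, path = frontier.popleft()
        if state in goal_states:
            return path
        x, y = state
        for nxt in [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]:
            if 0 <= nxt[0] < size and 0 <= nxt[1] < size and nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, path + [nxt]))
    return None

symbols = learn_symbols(demos)
counts = ground_language(demos, symbols)
goal_sym = grounded_goal("go to the blue room", counts)
print(goal_sym, plan((0, 2), symbols[goal_sym]))

Because the grounded goal is a symbol rather than a trajectory, a command can specify only a goal or a partial plan and a conventional planner fills in the rest, which is the property the abstract emphasizes.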

Published In

Robotics: Science and Systems

DOI

10.15607/RSS.2020.XVI.102

EISSN

2330-765X

ISBN

9780992374761

Publication Date

January 1, 2020
 

Citation

APA: Gopalan, N., Rosen, E., Konidaris, G., & Tellex, S. (2020). Simultaneously Learning Transferable Symbols and Language Groundings from Perceptual Data for Instruction Following. In Robotics: Science and Systems. https://doi.org/10.15607/RSS.2020.XVI.102
Chicago: Gopalan, N., E. Rosen, G. Konidaris, and S. Tellex. “Simultaneously Learning Transferable Symbols and Language Groundings from Perceptual Data for Instruction Following.” In Robotics: Science and Systems, 2020. https://doi.org/10.15607/RSS.2020.XVI.102.
ICMJE: Gopalan N, Rosen E, Konidaris G, Tellex S. Simultaneously Learning Transferable Symbols and Language Groundings from Perceptual Data for Instruction Following. In: Robotics: Science and Systems. 2020.
MLA: Gopalan, N., et al. “Simultaneously Learning Transferable Symbols and Language Groundings from Perceptual Data for Instruction Following.” Robotics: Science and Systems, 2020. Scopus, doi:10.15607/RSS.2020.XVI.102.
NLM: Gopalan N, Rosen E, Konidaris G, Tellex S. Simultaneously Learning Transferable Symbols and Language Groundings from Perceptual Data for Instruction Following. Robotics: Science and Systems. 2020.
