
Demo: Control the ARmada: LMM Coordination for Multi-Robot AR-HRC

Conference Publication
Fronk, C; Ye, H; Pajic, M; Gorlatova, M
Published in: Hotmobile 2026 Proceedings of the 2026 ACM 27th International Workshop on Mobile Computing Systems and Applications
March 2, 2026

Augmented reality (AR) has shown strong potential to enable more flexible interaction in human-robot collaboration (HRC) by conveying a robot’s state and intent [1, 3] and enabling intuitive control [1]. However, while many systems demonstrate AR’s ability to enhance robot control, they typically support only a single robotic collaborator, with interaction techniques that require direct manipulation of virtual content; in scenarios with many robots, these manipulations would become far more frequent, lowering task efficiency through repeated manual interventions. Moreover, increased focus on virtual content can reduce awareness of the real environment [2], posing safety risks when that environment contains hazards. Motivated by these challenges, we present ARmada, an AR-HRC system that leverages edge scene understanding and a large multimodal model (LMM) to enhance environmental awareness and enable AR control in multi-robot settings.

System Design: ARmada has four primary components: a Meta Quest 3 AR headset, a Unitree Go2 quadruped robot, an edge server, and a cloud LMM. Due to space constraints, the additional robot collaborators are six virtual drones; however, the system can include multiple physical robots of diverse form factors. By processing Quest depth and image data on the edge server and with the LMM, ARmada detects and virtually marks key landmarks such as environmental hazards, improving awareness for both the user and the robots. Following detection, a user can issue concise, high-level commands relative to landmarks in the environment. Using LMM-based robot control, the system can autonomously direct robots to individual destinations or coordinate them into complex formations, such as geometric shapes, without further user input.
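The abstract does not describe how formation commands are realized. As a rough illustration only, the sketch below shows one way a coordinator could turn a high-level formation request into per-robot waypoints, placing the six drones evenly on a circle around a detected landmark such as a hazard. All names here (`formation_targets`, `hazard`) are hypothetical and are not taken from the paper.

```python
import math

def formation_targets(center, n_robots, radius):
    """Place n_robots evenly on a circle of the given radius around
    center = (x, y). Returns one (x, y) waypoint per robot."""
    targets = []
    for i in range(n_robots):
        theta = 2 * math.pi * i / n_robots  # evenly spaced angles
        targets.append((center[0] + radius * math.cos(theta),
                        center[1] + radius * math.sin(theta)))
    return targets

# Example: arrange 6 drones in a hexagon of radius 2 m around a detected hazard.
hazard = (1.0, -0.5)
waypoints = formation_targets(hazard, n_robots=6, radius=2.0)
```

In a system like the one described, an LMM would presumably map the user's natural-language command and the marked landmarks to a call of this kind, after which each robot navigates to its assigned waypoint without further user input.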


Published In

Hotmobile 2026 Proceedings of the 2026 ACM 27th International Workshop on Mobile Computing Systems and Applications

DOI

10.1145/3789514.3796258

Publication Date

March 2, 2026

Start / End Page

174

Citation

Fronk, C., Ye, H., Pajic, M., & Gorlatova, M. (2026). Demo: Control the ARmada: LMM Coordination for Multi-Robot AR-HRC. In Hotmobile 2026 Proceedings of the 2026 ACM 27th International Workshop on Mobile Computing Systems and Applications (p. 174). https://doi.org/10.1145/3789514.3796258
