Special Session: Neuro-Symbolic Architecture Meets Large Language Models: A Memory-Centric Perspective

Publication, Conference
Ibrahim, M; Wan, Z; Li, H; Panda, P; Krishna, T; Kanerva, P; Chen, Y; Raychowdhury, A
Published in: Proceedings - 2024 International Conference on Hardware/Software Codesign and System Synthesis, CODES+ISSS 2024
January 1, 2024

Large language models (LLMs) have significantly transformed the landscape of artificial intelligence, demonstrating exceptional capabilities in natural language understanding and generation. Recently, the integration of LLMs with neuro-symbolic architectures has gained traction as a way to enhance contextual awareness and planning capabilities. However, this integration faces computational challenges that hinder scalability and efficiency, especially in edge computing environments. This paper provides an in-depth analysis of these challenges and explores state-of-the-art solutions, focusing on memory-centric computing principles at both the algorithmic and hardware levels. Our exploration centers on the key computational elements of the Transformer, the foundation of all LLMs, and vector-symbolic architecture, the leading neuro-symbolic model for edge applications. Additionally, we propose potential research directions for further investigation. By examining these aspects, this paper aims to bridge critical gaps on the path toward effective artificial general intelligence at the edge.
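
The abstract centers on two computational kernels: scaled dot-product attention, the core of the Transformer, and the bind/bundle operations of vector-symbolic architectures (VSA). As a rough illustration only, not code from the paper, the NumPy sketch below shows minimal versions of both; the bipolar MAP-style VSA variant, the dimensionality D, and all function names here are assumptions chosen for brevity.

```python
# Illustrative sketch only -- minimal NumPy versions of the two kernels the
# abstract names. Not code from the paper; the bipolar MAP-style VSA variant,
# the dimensionality D, and all names here are assumptions chosen for brevity.
import numpy as np

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                 # token-token similarities
    scores -= scores.max(axis=-1, keepdims=True)  # for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                            # weighted sum of values

# Vector-symbolic architecture (VSA) primitives over bipolar {-1, +1} vectors.
D = 10_000                                        # hypervector dimensionality
rng = np.random.default_rng(0)

def random_hv():
    return rng.choice([-1, 1], size=D)

def bind(a, b):
    """Binding: elementwise product; output is dissimilar to both inputs."""
    return a * b

def bundle(*vs):
    """Bundling: elementwise majority vote; output is similar to each input.
    (Ties can leave zeros when the number of inputs is even.)"""
    return np.sign(np.sum(vs, axis=0))

def similarity(a, b):
    return a @ b / D                              # normalized dot product

# Attention over a toy sequence of 4 tokens with d = 8.
x = rng.standard_normal((4, 8))
out = attention(x, x, x)                          # self-attention

# Encode a {role: filler} pair symbolically and recover the filler by
# unbinding -- bipolar binding is its own inverse (role * role = 1).
role, filler = random_hv(), random_hv()
record = bind(role, filler)
recovered = bind(record, role)
print(similarity(recovered, filler))              # 1.0: exact recovery
```

Both kernels reduce mainly to large vector and matrix products with limited data reuse, which is broadly why a memory-centric treatment applies to each.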

Published In

Proceedings - 2024 International Conference on Hardware/Software Codesign and System Synthesis, CODES+ISSS 2024

DOI

10.1109/CODES-ISSS60120.2024.00012

Publication Date

January 1, 2024

Start / End Page

11 / 20

Citation

APA: Ibrahim, M., Wan, Z., Li, H., Panda, P., Krishna, T., Kanerva, P., … Raychowdhury, A. (2024). Special Session: Neuro-Symbolic Architecture Meets Large Language Models: A Memory-Centric Perspective. In Proceedings - 2024 International Conference on Hardware/Software Codesign and System Synthesis, CODES+ISSS 2024 (pp. 11–20). https://doi.org/10.1109/CODES-ISSS60120.2024.00012

Chicago: Ibrahim, M., Z. Wan, H. Li, P. Panda, T. Krishna, P. Kanerva, Y. Chen, and A. Raychowdhury. “Special Session: Neuro-Symbolic Architecture Meets Large Language Models: A Memory-Centric Perspective.” In Proceedings - 2024 International Conference on Hardware/Software Codesign and System Synthesis, CODES+ISSS 2024, 11–20, 2024. https://doi.org/10.1109/CODES-ISSS60120.2024.00012.

ICMJE: Ibrahim M, Wan Z, Li H, Panda P, Krishna T, Kanerva P, et al. Special Session: Neuro-Symbolic Architecture Meets Large Language Models: A Memory-Centric Perspective. In: Proceedings - 2024 International Conference on Hardware/Software Codesign and System Synthesis, CODES+ISSS 2024. 2024. p. 11–20.

MLA: Ibrahim, M., et al. “Special Session: Neuro-Symbolic Architecture Meets Large Language Models: A Memory-Centric Perspective.” Proceedings - 2024 International Conference on Hardware/Software Codesign and System Synthesis, CODES+ISSS 2024, 2024, pp. 11–20. Scopus, doi:10.1109/CODES-ISSS60120.2024.00012.

NLM: Ibrahim M, Wan Z, Li H, Panda P, Krishna T, Kanerva P, Chen Y, Raychowdhury A. Special Session: Neuro-Symbolic Architecture Meets Large Language Models: A Memory-Centric Perspective. Proceedings - 2024 International Conference on Hardware/Software Codesign and System Synthesis, CODES+ISSS 2024. 2024. p. 11–20.
