Recent advances in Large Language Models (LLMs) have demonstrated remarkable capabilities in natural language understanding and generation. To further enhance their factual grounding and reasoning fidelity, integrating LLMs with Knowledge Graphs (KGs) has emerged as a promising direction. Significant progress has been made in leveraging KGs to augment LLM reasoning through methods such as Retrieval-Augmented Generation. However, effectively harnessing the synergy between LLMs and KGs for robust and reliable reasoning still presents critical challenges. Specifically: (1) LLMs struggle to interpret and utilize the structured nature of KGs, owing to the discrepancy between their text-based training and the symbolic representations of KGs; and (2) querying and reasoning over structured knowledge in KGs remains inefficient for LLMs, hindering complex inference. To address these limitations, we introduce the Meta-Knowledge enhanced Knowledge Graph (MKG) framework, which empowers LLMs to effectively leverage structured knowledge from KGs. MKG employs meta-knowledge, stored in a multi-store memory with a self-correcting mechanism, to guide LLMs in KG retrieval and reasoning. Experimental evaluations on complex question answering benchmarks demonstrate that MKG achieves significant performance gains, outperforming the original LLM, Retrieval-Augmented Generation (RAG), ReAct, GraphRAG, and ToG baselines by 25%, 17%, 11%, 3.3%, and 2.6%, respectively.
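The loop described in the abstract (meta-knowledge from a multi-store memory guiding KG retrieval, with a self-correcting step on failure) can be illustrated with a toy sketch. All names, data structures, and the correction strategy below are illustrative assumptions, not the paper's actual implementation:

```python
# Illustrative sketch (NOT the paper's implementation) of the MKG idea:
# meta-knowledge hints stored in a memory guide retrieval over a KG, and a
# self-correcting step revises the retrieval plan when it returns nothing.

# Toy KG: subject -> relation -> object
KG = {"Hangzhou": {"located_in": "Zhejiang"}, "Zhejiang": {"located_in": "China"}}

# Toy multi-store memory: maps a question type to relation hints (meta-knowledge)
MEMORY = {"where": ["located_in"]}

def retrieve(entity, relations):
    """Follow the hinted relation chain through the KG; None signals failure."""
    facts = []
    for rel in relations:
        obj = KG.get(entity, {}).get(rel)
        if obj is None:
            return None  # retrieval failed -> caller triggers self-correction
        facts.append((entity, rel, obj))
        entity = obj
    return facts

def answer(entity, question_type, max_retries=2):
    """Retrieve facts using meta-knowledge; shorten the plan on failure."""
    relations = MEMORY.get(question_type, [])
    for _ in range(max_retries):
        facts = retrieve(entity, relations)
        if facts:
            return facts
        relations = relations[:-1]  # naive self-correction: drop the last hop
    return []

print(answer("Hangzhou", "where"))  # [('Hangzhou', 'located_in', 'Zhejiang')]
```

In the full framework an LLM would draft and revise the retrieval plan and successful plans would be written back to memory; the fixed dictionaries here only stand in for those components.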
DOI: https://doi.org/10.1145/3797906
Wei Zhang, Guojun Dai, Deli Luo
Hangzhou Dianzi University
ACM Transactions on Intelligent Systems and Technology