LLM Memory & RAG relevance: 6/10

Towards Personalized Bangla Book Recommendation: A Large-Scale Multi-Entity Book Graph Dataset

Rahin Arefin Ahmed, Md. Anik Chowdhury, Sakil Ahmed Sheikh Reza, Devnil Bhattacharjee, Muhammad Abdullah Adnan, Nafis Sadeq
arXiv: 2602.12129v1 Published: 2026-02-12 Updated: 2026-02-12

AI Summary

Constructs a large-scale Bangla book knowledge graph dataset and benchmarks a range of recommendation models on it.

Key Contributions

  • Constructs RokomariBG, a large-scale Bangla book knowledge graph dataset
  • Provides a benchmark for the Top-N recommendation task
  • Evaluates a diverse set of recommendation models, including GNNs

Methodology

Builds a knowledge graph containing multiple entity types (users, books, authors, and others) connected by multiple relation types, then evaluates a range of recommendation models on it and compares their performance.
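A multi-entity graph of this kind can be pictured as typed triples. The sketch below is illustrative only: the entity and relation names are hypothetical and are not the dataset's actual schema, which defines eight relation types.

```python
from collections import defaultdict

# Hypothetical triples mirroring RokomariBG's entity types (users,
# books, authors, categories, publishers); relation names are
# illustrative, not the dataset's actual schema.
triples = [
    ("user:42", "rated", "book:7"),
    ("user:42", "reviewed", "book:19"),
    ("book:7", "written_by", "author:3"),
    ("book:19", "written_by", "author:3"),
    ("book:7", "belongs_to", "category:fiction"),
    ("book:19", "published_by", "publisher:11"),
]

# Index outgoing edges by (head, relation) for simple graph walks.
adj = defaultdict(list)
for head, rel, tail in triples:
    adj[(head, rel)].append(tail)

# A naive multi-hop path: recommend other books by the authors of
# books the user has already rated.
seeds = adj[("user:42", "rated")]
authors = {a for b in seeds for a in adj[(b, "written_by")]}

# Invert written_by to go from authors back to their books.
books_by_author = defaultdict(list)
for head, rel, tail in triples:
    if rel == "written_by":
        books_by_author[tail].append(head)

candidates = {b for a in authors for b in books_by_author[a]} - set(seeds)
print(candidates)  # books by the same authors, minus already-rated ones
```

Graph-based recommenders such as the GNNs benchmarked in the paper learn over exactly this kind of multi-relational structure rather than hand-coding one path.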

Original Abstract

Personalized book recommendation in Bangla literature has been constrained by the lack of structured, large-scale, and publicly available datasets. This work introduces RokomariBG, a large-scale, multi-entity heterogeneous book graph dataset designed to support research on personalized recommendation in a low-resource language setting. The dataset comprises 127,302 books, 63,723 users, 16,601 authors, 1,515 categories, 2,757 publishers, and 209,602 reviews, connected through eight relation types and organized as a comprehensive knowledge graph. To demonstrate the utility of the dataset, we provide a systematic benchmarking study on the Top-N recommendation task, evaluating a diverse set of representative recommendation models, including classical collaborative filtering methods, matrix factorization models, content-based approaches, graph neural networks, a hybrid matrix factorization model with side information, and a neural two-tower retrieval architecture. The benchmarking results highlight the importance of leveraging multi-relational structure and textual side information, with neural retrieval models achieving the strongest performance (NDCG@10 = 0.204). Overall, this work establishes a foundational benchmark and a publicly available resource for Bangla book recommendation research, enabling reproducible evaluation and future studies on recommendation in low-resource cultural domains. The dataset and code are publicly available at https://github.com/backlashblitz/Bangla-Book-Recommendation-Dataset
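The headline number, NDCG@10 = 0.204, is normalized discounted cumulative gain truncated at rank 10. A minimal binary-relevance sketch of the metric (the item IDs are made up for illustration):

```python
import math

def ndcg_at_k(ranked_items, relevant, k=10):
    """Binary-relevance NDCG@k: DCG of the top-k ranked list divided
    by the ideal DCG (all relevant items placed at the top)."""
    dcg = sum(1.0 / math.log2(i + 2)
              for i, item in enumerate(ranked_items[:k])
              if item in relevant)
    ideal = sum(1.0 / math.log2(i + 2)
                for i in range(min(len(relevant), k)))
    return dcg / ideal if ideal > 0 else 0.0

# Toy example: 2 of the user's held-out books appear in the top 10,
# at ranks 2 and 10.
recs = ["b7", "b3", "b9", "b1", "b5", "b2", "b8", "b4", "b6", "b0"]
held_out = {"b3", "b0"}
print(round(ndcg_at_k(recs, held_out), 3))  # 0.564
```

A perfect ranking scores 1.0, so the reported 0.204 averaged over users reflects how hard top-N retrieval is over a catalogue of 127k books.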

Tags

Recommender Systems, Knowledge Graph, Bangla, Dataset

arXiv Categories

cs.IR cs.LG