RAG-Based Query Engine using LLM and Vector DB for College Details
Abstract
Large Language Models (LLMs) and other advances in generative AI have broadened the application of AI across many fields in recent years. This paper proposes a multipronged strategy that leverages the strengths of LLMs to mitigate data insufficiency through targeted solutions, including document integration and fine-tuning. The primary goal is the design and implementation of a Retrieval-Augmented Generation (RAG) model: a chat application intended to improve information retrieval and storage procedures and to support the production of high-quality material. The work offers a thorough examination of the retrieval stages improved by integrating a RAG model, highlighting their role in overcoming data constraints. The paper presents the development of an efficient chatbot built on LLMs such as LLaMA and Gemini, integrated with the RAG framework to enhance response accuracy and contextual understanding, and served through Groq AI to reduce response time. The chatbot is backed by Facebook AI Similarity Search (FAISS), which provides efficient similarity search and clustering of dense vectors, enabling fast and scalable retrieval. Using Beautiful Soup, data was gathered by web scraping the college website to extract key information such as course offerings, faculty biographies, and admissions details. The paper demonstrates how effectively LLMs, vectorized search, and web scraping can be combined within the RAG framework to create an intelligent, domain-specific chatbot. The results show how such a system can improve user engagement in academic settings, automate institutional communication, and reduce manual effort.
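The scrape-embed-retrieve pipeline summarized above can be sketched in miniature. The HTML snippet, the token-hash "embedding", and the brute-force cosine search below are illustrative stand-ins, not the paper's implementation: a real run would fetch the live college site, encode passages with a sentence-embedding model, and delegate the nearest-neighbor search to a FAISS index.

```python
# Miniature sketch of the pipeline described in the abstract:
# scrape passages -> embed as dense vectors -> retrieve by similarity.
# All data and the embedding scheme here are illustrative stand-ins.
import hashlib
import numpy as np
from bs4 import BeautifulSoup

# Stand-in for a scraped college page (a real run fetches the live site).
HTML = """
<div class="course">B.Tech Computer Science</div>
<div class="course">B.Tech Information Technology</div>
<div class="faculty">Dr. A. Rao - Professor of AI</div>
"""

def scrape_passages(html: str) -> list[str]:
    """Extract text passages from page markup, as Beautiful Soup would."""
    soup = BeautifulSoup(html, "html.parser")
    return [div.get_text(strip=True) for div in soup.find_all("div")]

def _token_vec(token: str, dim: int = 64) -> np.ndarray:
    """Deterministic pseudo-random vector per token (hash-seeded)."""
    seed = int.from_bytes(hashlib.sha256(token.encode()).digest()[:4], "big")
    return np.random.default_rng(seed).standard_normal(dim)

def embed(text: str, dim: int = 64) -> np.ndarray:
    """Toy bag-of-tokens embedding; a real system would use a
    sentence-embedding model before indexing the vectors with FAISS."""
    v = np.sum([_token_vec(t, dim) for t in text.lower().split()], axis=0)
    return v / np.linalg.norm(v)

def retrieve(query: str, passages: list[str], k: int = 1) -> list[str]:
    """Brute-force cosine similarity; FAISS replaces this step at scale."""
    index = np.stack([embed(p) for p in passages])
    scores = index @ embed(query)
    return [passages[i] for i in np.argsort(scores)[::-1][:k]]
```

In the full system, the top-k passages returned by this retrieval step are placed into the LLM prompt as grounding context, which is the RAG pattern the paper builds on.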
How to Cite This Article
Arshia Tahseen, Joanna Sumathi, Shriya Reddy, Dr. D. Shravani (2025). RAG-Based Query Engine using LLM and Vector DB for College Details. Journal of Frontiers in Multidisciplinary Research (JFMR), 6(1), 86-91. DOI: https://doi.org/10.54660/.IJFMR.2025.6.1.86-91