Shaurya // Lab

JIIT AI Assistant

A localized, context-aware AI assistant application built for university environments.

Private Source

Problem

University ecosystems are sprawling and complex. Students and faculty constantly struggle to find specific, localized information buried in PDFs and unorganized department websites.

Why it matters

Generic LLMs hallucinate when asked about specific university policies. An assistant must be grounded in verified, local truth to be useful rather than dangerous.

Insight

By building a localized, Retrieval-Augmented Generation (RAG) assistant, we can ground a large language model strictly in the university's actual, verified documentation, so answers come from retrieved institutional text rather than the model's guesses, sharply reducing hallucination.

Solution

University-specific documents are aggregated into a vector index. At query time, a retrieval pipeline pulls the most relevant institutional passages and injects them into the LLM's prompt before generation, with the whole flow served via a lightweight Flask backend.
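The retrieve-then-inject shape of the pipeline can be sketched in plain Python. This is a minimal illustration, not the project's code: the real system uses local LLM embeddings and a proper vector store, while here a toy bag-of-words vector and cosine similarity stand in for both, and the sample documents are invented.

```python
import math
from collections import Counter

# Toy stand-in for a real embedding model: a bag-of-words count vector.
def embed(text: str) -> Counter:
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical institutional documents (in practice, chunks in a vector index).
DOCS = [
    "Library hours are 8 am to 11 pm on weekdays.",
    "Hostel curfew is 10 pm for all residents.",
    "The academic calendar lists exam weeks in May and December.",
]
INDEX = [(doc, embed(doc)) for doc in DOCS]

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k passages most similar to the query."""
    q = embed(query)
    ranked = sorted(INDEX, key=lambda item: cosine(q, item[1]), reverse=True)
    return [doc for doc, _ in ranked[:k]]

def build_prompt(query: str) -> str:
    """Inject retrieved context into the prompt sent to the LLM."""
    context = "\n".join(retrieve(query))
    return (
        "Answer ONLY from the context below. If the answer is not in the "
        f"context, say you don't know.\n\nContext:\n{context}\n\n"
        f"Question: {query}"
    )

print(build_prompt("library hours on weekdays"))
```

The strict "answer only from the context" instruction is what keeps the model grounded: when retrieval returns nothing relevant, the model is told to refuse rather than improvise.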

Technology used

Python, Flask for the API layer, LangChain/LlamaIndex for the RAG pipeline, and local LLM embeddings for secure data processing.

Impact / Result

Created an accurate, institution-specific digital aide that answers nuanced queries about university operations without hallucinating, running entirely within a local development environment.