# llm-rag

Here are 13 public repositories matching this topic...

Chat With Documents is a Streamlit application for interactive, context-aware conversations with large language models (LLMs) using Retrieval-Augmented Generation (RAG). Users can upload documents or provide URLs; the app indexes the content in a Chroma vector store and retrieves relevant passages as context during chats (see the sketch after this entry).

  • Updated Feb 18, 2025
  • Python
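
The retrieval loop described above can be summarised in a few lines. The following is a minimal sketch, assuming the `chromadb` package with its default embedding function; the collection name, example passages, and the final hand-off to an LLM are illustrative placeholders, not code from the repository.

```python
import chromadb

client = chromadb.Client()                            # in-memory Chroma instance
collection = client.get_or_create_collection("docs")  # placeholder collection name

# Index a few passages; Chroma embeds them with its default embedding model.
collection.add(
    ids=["doc-1", "doc-2"],
    documents=[
        "Retrieval-Augmented Generation grounds LLM answers in retrieved text.",
        "Chroma is an open-source vector store for embeddings.",
    ],
)

question = "How does RAG ground an LLM's answer?"
results = collection.query(query_texts=[question], n_results=2)
context = "\n".join(results["documents"][0])          # top-k passages for this query

# This prompt would then be sent to whichever LLM backend the app is configured with.
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
print(prompt)
```

In an app of this kind, a loop of this shape typically runs for each user message: the question is embedded, the nearest chunks are retrieved from Chroma, and the assembled prompt is passed to the chosen LLM.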

FileChat-RAG is a simple Retrieval-Augmented Generation (RAG) system that lets users ask questions about the contents of various file formats. It extracts text from PDFs, JSON, text and document files (.txt, .docx, .odt, .md), and code files, then enables interactive conversations with an LLM served by Ollama (see the sketch after this entry).

  • Updated Mar 18, 2025
  • Python
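
The question-answering flow here is similarly compact. Below is a minimal sketch, assuming the `ollama` Python client and a locally running Ollama server; the model name and file path are placeholders, and only plain-text-style formats are read directly (PDF, .docx, and .odt extraction would require additional libraries such as `pypdf`).

```python
from pathlib import Path

import ollama


def load_text(path: str) -> str:
    """Read a plain-text style file (.txt, .md, .json, code) as a string."""
    return Path(path).read_text(encoding="utf-8", errors="ignore")


document = load_text("notes.md")                      # placeholder file name
question = "Summarise the key points of this file."

# Send the extracted text plus the question to a locally pulled Ollama model.
response = ollama.chat(
    model="llama3",                                   # any model pulled via `ollama pull`
    messages=[{
        "role": "user",
        "content": f"Document:\n{document}\n\nQuestion: {question}",
    }],
)
print(response["message"]["content"])
```

For long files, a full RAG setup would chunk the extracted text and retrieve only the relevant pieces rather than sending the whole document in a single prompt.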
