Explore Projects

Discover 11 open source projects

Active filters (1): Search: hallucinations

Showing 1-11 of 11 projects

idosal/git-mcp

A free, open-source MCP server for any GitHub project that helps developers with AI-powered code completion and exploration.

7.7K
Stable
TypeScript
MCP Servers
AI Coding Agents
TypeScript
#ai#llm#mcp

uptrain-ai/uptrain

An open-source platform for evaluating and improving Generative AI applications with 20+ preconfigured checks and root cause analysis.

2.3K
Archived
Python
LLM Frameworks
Testing
Python
#llm-eval#prompt-engineering#root-cause-analysis

onestardao/WFGY

Semantic Reasoning Engine for Large Language Models (LLMs) with AI interpretability and alignment features.

1.6K
Active
Python
Prompt Engineering
React
#semantic-reasoning#ai-interpretability#alignment

stanford-oval/WikiChat

An improved Retrieval Augmented Generation (RAG) pipeline that reduces language-model hallucination by grounding responses in data retrieved from a corpus.

1.6K
Active
Python
RAG & Vector
LLM Frameworks
Python
#chatbot#factuality#emnlp2023

emonney/QuickApp

A secure, production-ready Angular/ASP.NET Core template for building AI-powered web applications.

1.3K
Active
TypeScript
Agents & Orchestration
Authentication
Angular
#ai-ready#authentication#authorization

PleasePrompto/notebooklm-mcp

MCP server enabling AI agents to research NotebookLM with citation-backed answers via the Gemini API.

1.2K
Stable
TypeScript
MCP Servers
AI Coding Agents
Claude
#mcp-server#notebooklm#gemini

Libr-AI/OpenFactVerification

An open-source solution to automate the process of verifying factuality, designed for AI-focused developers.

1.1K
Archived
Python
LLM Frameworks
AI SDKs & Wrappers
Python
#ai#factuality#hallucination

cvs-health/uqlm

A Python package for uncertainty quantification and hallucination detection in large language models (LLMs).

1.1K
Active
Python
LLM Frameworks
LLM Wrappers & SDKs
Python
#ai-safety#confidence-estimation#hallucination-detection

HillZhang1999/llm-hallucination-survey

A reading list and survey paper on hallucination in large language models (LLMs) for AI-focused developers.

1.1K
Stable
LLM Frameworks
Tutorials & Courses
#large-language-models#hallucination#survey

darrencxl0301/StageRAG

A blueprint for building production-ready RAG systems that minimize hallucination, with switchable pipelines.

1.1K
Stable
Python
RAG & Vector
API Frameworks
Python
#rag#vector#hallucination-reduction

EdinburghNLP/awesome-hallucination-detection

A curated list of papers on detecting hallucinations in large language models (LLMs).

1.1K
Active
LLM Frameworks
Agents & Orchestration
#hallucinations#llms#nlp
