State-of-the-art Machine Learning for JAX, PyTorch, and TensorFlow. The go-to library for pretrained models.

Transformers provides APIs and tools to easily download and train state-of-the-art pretrained models. It supports PyTorch, TensorFlow, and JAX frameworks, offering thousands of pretrained models for various tasks including text classification, named entity recognition, question answering, translation, and more. The library has become the foundation of the modern NLP ecosystem.
🤖 NLP 🔄 PyTorch 📚 Pre-trained
Python 🔄 Active 📅 Updated today
Visit Repository →

Stable Diffusion web UI. The most popular and feature-rich stable diffusion web interface.

This is the most widely used web interface for Stable Diffusion. It offers a polished UI and an extensive feature set, including img2img, inpainting, outpainting, negative prompts, prompt editing, interleaved rendering, and more. It supports various Stable Diffusion versions and checkpoints. The project has spawned an entire ecosystem of extensions and customizations.
🖼️ Image Gen 🎨 Art 🌐 Web UI
Python 🔄 Active 📅 Updated today
Visit Repository →

Build applications using LLMs through composable components. The standard for LLM app development.

LangChain is a framework for developing applications powered by large language models. It provides modular components for working with LLMs, including prompt templates, memory systems, chain abstractions, and agents. The library connects LLMs to other data sources and enables building complex workflows. It supports integration with dozens of LLM providers and data sources.
⛓️ LLM Apps 🧩 Components 🔌 Integrations
Python 🔄 Active 📅 Updated today
Visit Repository →
⭐ 95k
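The "composable components" idea can be pictured as plain function composition. This stdlib-only sketch is illustrative, not LangChain's actual API; `prompt_template`, `fake_llm`, and `chain` are hypothetical stand-ins:

```python
# Illustrative sketch of the composable-chain idea (not LangChain's real API).
# A chain pipes each step's output into the next: template -> model -> output.

def prompt_template(template):
    """Return a step that fills a prompt template from a dict of variables."""
    return lambda variables: template.format(**variables)

def fake_llm(prompt):
    """Stand-in for a real LLM call; echoes the prompt so the flow is visible."""
    return f"MODEL OUTPUT for: {prompt}"

def chain(*steps):
    """Compose steps left to right, feeding each output into the next step."""
    def run(value):
        for step in steps:
            value = step(value)
        return value
    return run

translate = chain(prompt_template("Translate to French: {text}"), fake_llm)
print(translate({"text": "hello"}))
# → MODEL OUTPUT for: Translate to French: hello
```

In the real library the same shape appears as prompt, model, and parser objects piped together, with memory and agents as further steps.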

Get up and running with Llama 3, Mistral, Gemma, and other large language models locally.

Ollama makes it incredibly easy to run large language models locally on your Mac or Linux machine. It bundles model weights, configuration, and data into a single package and provides a simple API for running and managing models. Supports Llama 3, Mistral, Gemma, and many other models. Perfect for local development and privacy-conscious applications.
🏠 Local LLM 🍎 macOS 🔒 Privacy
Go 🔄 Active 📅 Updated today
Visit Repository →
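Ollama's "simple API" is a local HTTP server (port 11434 by default). A minimal sketch of a completion request, assuming a running `ollama serve` and a pulled `llama3` model; the request is built but the network call is left commented out:

```python
import json
import urllib.request

# Ollama exposes a local HTTP API (default port 11434). This builds a
# completion request; actually sending it requires Ollama running locally.
def build_generate_request(model, prompt):
    body = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    return urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )

req = build_generate_request("llama3", "Why is the sky blue?")
# resp = urllib.request.urlopen(req)          # needs a running Ollama server
# print(json.loads(resp.read())["response"])  # the generated text
```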

Data framework for LLM applications to ingest, structure, and retrieve private data.

LlamaIndex (formerly GPT Index) is a data framework specifically designed for building LLM applications over private data. It provides tools for loading data from various sources, parsing documents, creating vector indices, and building efficient retrieval systems. Essential for building RAG (Retrieval Augmented Generation) applications with your own data.
📊 Data 🔍 Retrieval 📁 RAG
Python 🔄 Active 📅 Updated today
Visit Repository →
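The retrieval step that LlamaIndex automates can be sketched in a few lines. This is a toy illustration, not LlamaIndex's API: the keyword-count `embed` function stands in for a real embedding model, and the list `index` stands in for a real vector store:

```python
import math

# Toy sketch of RAG retrieval: embed chunks, index them, rank by similarity.
def embed(text):
    """Toy 'embedding': counts of a few keywords. Real systems use model embeddings."""
    vocab = ["cat", "dog", "stock", "market"]
    words = text.lower().split()
    return [words.count(w) for w in vocab]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

docs = ["the cat sat with the dog", "the stock market rallied today"]
index = [(d, embed(d)) for d in docs]  # build the vector index once

def retrieve(query, k=1):
    qv = embed(query)
    ranked = sorted(index, key=lambda pair: cosine(qv, pair[1]), reverse=True)
    return [d for d, _ in ranked][:k]

print(retrieve("how did the market do?"))
# → ['the stock market rallied today']
```

The retrieved chunks are then placed into the LLM's prompt, which is the "Augmented" part of Retrieval Augmented Generation.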

Let large language models run code on your computer. A local-first alternative to ChatGPT's Code Interpreter.

Open Interpreter gives LLMs the ability to execute code locally on your machine. Unlike cloud-based alternatives, your data stays on your computer. It can write files, run commands, browse the web, and more. Supports multiple LLM providers and can be extended with custom tools. Perfect for AI-powered development and automation workflows.
💻 Code 🔄 Local ⚡ Automation
Python 🔄 Active 📅 Updated 2 days ago
Visit Repository →
⭐ 48k
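The core loop behind local code execution can be sketched with the standard library. This is illustrative, not Open Interpreter's internals: model-generated code is run in a subprocess with a timeout, and the captured output is what would be fed back into the conversation:

```python
import subprocess
import sys

# Sketch of local code execution: run LLM-produced code in a subprocess
# with a timeout, capturing output to feed back to the model.
def run_locally(code, timeout=10):
    result = subprocess.run(
        [sys.executable, "-c", code],
        capture_output=True, text=True, timeout=timeout,
    )
    return result.stdout if result.returncode == 0 else result.stderr

llm_generated = "print(sum(range(10)))"  # pretend this came from the model
print(run_locally(llm_generated))  # → 45
```

A real implementation adds user confirmation before execution and persists state between snippets; this sketch shows only the execute-and-capture step.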

Jan is an open-source alternative to ChatGPT that runs locally. 100% offline, privacy-first AI.

Jan runs entirely offline, ensuring your conversations never leave your device. It's a self-hosted AI chatbot that supports various models through Ollama or custom backends. The desktop app provides a native experience similar to ChatGPT. Jan is fully open-source with no telemetry or data collection, making it ideal for privacy-conscious users and enterprises.
🤖 Chatbot 🔒 Privacy 🏠 Self-hosted
TypeScript 🔄 Active 📅 Updated today
Visit Repository →
⭐ 52k

AI-powered SQL assistant. Ask questions about your database in natural language.

Vanna is a Python library that enables non-technical users to query databases using natural language. It works by training a model on your database schema and documentation, then generates SQL queries based on user questions. Supports Snowflake, BigQuery, PostgreSQL, and other databases. Perfect for building AI-powered analytics tools.
🗄️ SQL 📊 Database 💬 NLP
Python 🔄 Active 📅 Updated yesterday
Visit Repository →
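The schema-grounded prompting that Vanna-style tools rely on can be sketched simply. This is illustrative, not Vanna's API; the `orders` table and the prompt wording are made up for the example:

```python
# Sketch of NL-to-SQL prompting: pack the database schema and the user's
# question into one prompt and ask the LLM to return only SQL.
SCHEMA = "CREATE TABLE orders (id INT, customer TEXT, total REAL, placed_at DATE);"

def build_sql_prompt(question):
    return (
        "You translate questions into SQL.\n"
        f"Schema:\n{SCHEMA}\n"
        f"Question: {question}\n"
        "Answer with a single SQL query and nothing else."
    )

prompt = build_sql_prompt("What was total revenue last month?")
print(prompt)
# A real system sends `prompt` to an LLM, then runs the returned SQL read-only.
```

Vanna's training step effectively enriches this context with documentation and past question-SQL pairs so the model sees more than the bare schema.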

The open-source autopilot for your IDE. AI code completion, chat, and edit at your fingertips.

Continue brings AI-powered development to VS Code and JetBrains IDEs. It provides intelligent code completion, natural language code editing, chat assistance, and custom AI workflows. The open-source nature allows full customization and self-hosting. Integrates with Ollama, GPT-4, Claude, and other models for local or cloud inference.
💻 IDE ✨ Autocomplete 💬 Chat
TypeScript 🔄 Active 📅 Updated yesterday
Visit Repository →

Build & share delightful machine learning web apps entirely in Python.

Gradio is the fastest way to demo machine learning models. Create beautiful web interfaces for your AI models in pure Python, with no HTML or CSS knowledge required. It's become the standard for sharing AI demos and has been used by OpenAI, Google, and Meta for their public demos. Supports text, image, audio, video inputs and outputs.
🌐 Web App 🐍 Python 🎯 Demo
Python 🔄 Active 📅 Updated today
Visit Repository →

AI agents framework for building autonomous agents with tool integrations. Connect AI to 200+ apps.

Composio provides a powerful framework for building AI agents that can interact with external tools and services. It offers integrations with over 200 applications including GitHub, Slack, Gmail, Salesforce, and more. The framework handles authentication, rate limiting, and tool execution, allowing developers to focus on agent logic. Supports major LLM providers and enables complex multi-step workflows.
🤖 Agents 🔌 Integrations ⚡ Automation
Python 🔄 Active 📅 Updated today
Visit Repository →
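The tool-calling pattern such frameworks implement can be sketched with a registry and a dispatcher. This is illustrative, not Composio's API; the `github.create_issue` tool is a hypothetical stand-in for a real integration:

```python
import json

# Sketch of agent tool dispatch: tools register under a name, the model
# emits a JSON tool call, and the framework routes it to the right function.
TOOLS = {}

def tool(name):
    def register(fn):
        TOOLS[name] = fn
        return fn
    return register

@tool("github.create_issue")  # hypothetical tool; a real one calls the GitHub API
def create_issue(repo, title):
    return f"opened issue '{title}' on {repo}"

def dispatch(tool_call_json):
    call = json.loads(tool_call_json)  # e.g. emitted by the LLM
    return TOOLS[call["name"]](**call["arguments"])

print(dispatch('{"name": "github.create_issue", '
               '"arguments": {"repo": "octo/demo", "title": "Bug"}}'))
# → opened issue 'Bug' on octo/demo
```

What a production framework adds on top of this shape is exactly what the blurb lists: authentication, rate limiting, and safe execution for each registered tool.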

Claude Code is the CLI for interacting with Anthropic's Claude and the Model Context Protocol.

Claude Code provides a command-line interface for interacting with Claude and implementing the Model Context Protocol (MCP). MCP enables AI models to connect with development tools and data sources securely. The CLI supports code editing, command execution, and integration with popular development environments. It's the foundation for building AI-powered development workflows with Claude.
⌨️ CLI 🔧 Dev Tools 📡 MCP
TypeScript 🔄 Active 📅 Updated today
Visit Repository →
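MCP messages are JSON-RPC 2.0. As a simplified sketch of the requests a client sends to an MCP server (the method names follow the MCP spec; the helper itself is hypothetical):

```python
import itertools
import json

# Sketch of MCP's wire format: JSON-RPC 2.0 requests with incrementing ids.
_ids = itertools.count(1)

def mcp_request(method, params=None):
    msg = {"jsonrpc": "2.0", "id": next(_ids), "method": method}
    if params is not None:
        msg["params"] = params
    return json.dumps(msg)

print(mcp_request("tools/list"))
print(mcp_request("tools/call",
                  {"name": "read_file", "arguments": {"path": "README.md"}}))
```

A server answers each request with a JSON-RPC response carrying the same `id`, which is how tools and data sources plug into the model securely.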

Memory layer for AI agents. Store and retrieve context across conversations and sessions.

Mem0 provides intelligent memory management for AI agents, enabling them to remember past conversations and learn from interactions. It supports semantic memory (facts), episodic memory (events), and procedural memory (patterns). The system uses vector storage for efficient retrieval and can integrate with various LLM providers. Essential for building persistent, context-aware AI applications.
🧠 Memory 💾 Storage 🔄 Context
Python 🔄 Active 📅 Updated 3 days ago
Visit Repository →
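The store-and-retrieve cycle of a memory layer can be sketched in miniature. This is illustrative, not Mem0's API: word overlap stands in for vector similarity, and a plain list stands in for the vector store:

```python
# Sketch of an agent memory layer: store memories with a kind tag,
# retrieve the most relevant ones for the current query.
class Memory:
    def __init__(self):
        self.items = []  # a real system would use a vector database

    def add(self, text, kind="semantic"):
        self.items.append(
            {"text": text, "kind": kind, "words": set(text.lower().split())}
        )

    def search(self, query, k=1):
        qwords = set(query.lower().split())
        ranked = sorted(self.items,
                        key=lambda m: len(m["words"] & qwords), reverse=True)
        return [m["text"] for m in ranked[:k]]

mem = Memory()
mem.add("user prefers dark mode", kind="semantic")
mem.add("user asked about pricing on 2024-05-01", kind="episodic")
print(mem.search("user dark mode settings"))
# → ['user prefers dark mode']
```

Retrieved memories are injected into the prompt on each turn, which is what makes the agent appear to remember across sessions.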

AI-powered GitHub pull request automation. Auto-merge, review, and manage PRs intelligently.

Auto Merge uses AI to automatically review, test, and merge pull requests. It analyzes code changes, runs appropriate tests, checks for conflicts, and handles the entire PR lifecycle automatically. Supports custom merge strategies, required checks, and team-specific workflows. Reduces manual overhead and accelerates development cycles.
🔀 Git 🤖 Automation 📋 PR Review
Python 🔄 Active 📅 Updated yesterday
Visit Repository →
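The merge-decision step such a bot automates reduces to a gate over checks, reviews, and conflicts. This is an illustrative sketch of the logic, not the project's actual code; the PR dict shape and check names are made up:

```python
# Sketch of an auto-merge gate: merge only when required checks pass,
# at least one review approves, and the branch has no conflicts.
def should_merge(pr, required_checks=("ci", "lint")):
    checks_ok = all(pr["checks"].get(c) == "passed" for c in required_checks)
    approved = pr["approvals"] >= 1
    return checks_ok and approved and not pr["has_conflicts"]

pr = {"checks": {"ci": "passed", "lint": "passed"},
      "approvals": 2, "has_conflicts": False}
print(should_merge(pr))  # → True
```

Custom merge strategies and team workflows amount to swapping in different rules at this decision point.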

Your personal AI portfolio manager. Track investments, analyze performance, and get AI-powered insights.

LlamaFolio is an AI-powered investment portfolio management tool that helps you track and analyze your investments across multiple asset classes. It integrates with major exchanges and wallets to provide real-time portfolio tracking, performance analysis, and AI-generated insights. Features include automated rebalancing suggestions, tax optimization, and customizable alerts.
💰 Finance 📈 Portfolio 🤖 AI Insights
Python 🔄 Active 📅 Updated today
Visit Repository →
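The drift check behind rebalancing suggestions is straightforward to sketch. This is illustrative, not LlamaFolio's code; the asset names, targets, and tolerance are invented for the example:

```python
# Sketch of a rebalancing check: compare current weights to target weights
# and flag any asset that drifts outside a tolerance band.
def rebalance_suggestions(holdings, targets, tolerance=0.05):
    total = sum(holdings.values())
    suggestions = {}
    for asset, target in targets.items():
        weight = holdings.get(asset, 0) / total
        drift = weight - target
        if abs(drift) > tolerance:
            suggestions[asset] = "sell" if drift > 0 else "buy"
    return suggestions

holdings = {"stocks": 7000, "bonds": 2000, "cash": 1000}  # current values
targets = {"stocks": 0.60, "bonds": 0.30, "cash": 0.10}   # target weights
print(rebalance_suggestions(holdings, targets))
# → {'stocks': 'sell', 'bonds': 'buy'}
```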

Multi-model AI orchestration. Route requests across GPT, Claude, Gemini, and local models intelligently.

Switchboard is an intelligent routing layer for AI applications that automatically selects the best model for each request. It considers cost, latency, capability, and context to optimize every call. Supports fallback chains, A/B testing, and custom routing rules. Reduces AI costs by up to 70% while improving response quality through intelligent model selection.
🔀 Routing 💵 Cost Savings ⚡ Performance
Python 🔄 Active 📅 Updated today
Visit Repository →
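Capability- and cost-aware routing can be sketched as picking the cheapest model that satisfies a request's requirements. This is illustrative, not Switchboard's API; the model table, costs, and flags are invented for the example:

```python
# Sketch of multi-model routing: filter to models that can serve the request,
# then pick the cheapest; raise if nothing qualifies (a fallback point).
MODELS = [
    {"name": "local-llama", "cost": 0.0, "max_context": 8000,   "reasoning": False},
    {"name": "gpt-4o",      "cost": 5.0, "max_context": 128000, "reasoning": True},
    {"name": "claude",      "cost": 3.0, "max_context": 200000, "reasoning": True},
]

def route(prompt_tokens, needs_reasoning):
    candidates = [
        m for m in MODELS
        if m["max_context"] >= prompt_tokens
        and (m["reasoning"] or not needs_reasoning)
    ]
    if not candidates:
        raise ValueError("no model can serve this request")
    return min(candidates, key=lambda m: m["cost"])["name"]

print(route(2000, needs_reasoning=False))    # → local-llama
print(route(150000, needs_reasoning=True))   # → claude
```

Fallback chains and A/B testing extend this same shape: try the chosen model, and on failure re-route among the remaining candidates.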