Tools & Stack
A curated, descriptive list of open-source AI/ML tools and projects!
LangGraph
What it is
LangGraph is a library from the LangChain team that lets you build agents and workflows using a stateful graph abstraction. Each node in the graph can be an LLM call, retriever, or any LangChain component.
Key features
- Graph-based logic: Create multi-step workflows with branches, memory, and loops.
- Built on LangChain: Seamlessly integrates with LangChain components.
- Concurrency support: Run parts of the graph in parallel.
- Fine control: Handle state transitions and failures easily.
- Perfect for agents: Design complex tool-using agents with long-term memory.
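A minimal sketch of the graph abstraction: a typed state, two nodes, and edges ending at the END marker. The node bodies below are placeholders standing in for real retriever/LLM calls.

```python
from typing import TypedDict
from langgraph.graph import StateGraph, END

class State(TypedDict):
    question: str
    context: str
    answer: str

def retrieve(state: State) -> dict:
    # Placeholder: swap in a real retriever or LangChain component here.
    return {"context": f"docs relevant to: {state['question']}"}

def answer(state: State) -> dict:
    # Placeholder: swap in an LLM call that uses state["context"].
    return {"answer": f"answer based on {state['context']}"}

graph = StateGraph(State)
graph.add_node("retrieve", retrieve)
graph.add_node("answer", answer)
graph.set_entry_point("retrieve")
graph.add_edge("retrieve", "answer")
graph.add_edge("answer", END)

app = graph.compile()
print(app.invoke({"question": "What is LangGraph?"}))
```

Branching and loops are added the same way (e.g., with add_conditional_edges), and the compiled app can be streamed or checkpointed for long-running agents.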
DSPy
What it is
DSPy is an open-source framework from Stanford for programming, rather than prompting, language-model workflows. You define small, natural-language Python modules, and DSPy compiles them into pipelines whose prompts (and even weights) are tuned automatically.
Key features
- Declarative modules: Write tasks as readable Python functions with natural-language “signatures”; no handcrafted prompt fiddling.
- Auto-optimization: Built-in Teleprompter algorithms (e.g., BootstrapFewShot) learn optimal prompts/weights from data.
- Composable pipelines: Chain modules to build RAG flows, agent loops, or evaluators out-of-the-box.
- Model-agnostic: Swap backends (OpenAI, Anthropic, local Llama-family models) without code changes.
- Production-ready: MIT-licensed, lightweight (`pip install dspy-ai`); latest v2.6.27 (released Jun 3, 2025).
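A minimal sketch of the declarative style, assuming an OpenAI-backed LM (the model id is illustrative; any backend supported by dspy.LM works):

```python
import dspy

# Configure a backend; the model id here is just an example.
lm = dspy.LM("openai/gpt-4o-mini")
dspy.configure(lm=lm)

# A natural-language "signature": inputs -> outputs. DSPy generates and tunes the prompt.
qa = dspy.ChainOfThought("question -> answer")

result = qa(question="Why compile prompts instead of hand-writing them?")
print(result.answer)
```

Optimizers such as BootstrapFewShot then take a module like `qa` plus a small training set and search for better prompts and demonstrations automatically.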
Guardrails
What it is
Guardrails is a Python library that validates and structures LLM outputs to ensure they meet schema, quality, and safety requirements. Great for production pipelines.
Key features
- Pydantic-style schemas: Define expected output formats easily.
- Re-asks: Automatically re-prompt the LLM when output fails validation, with a configurable number of retries.
- Custom validators: Add checks like regex, length, or semantic constraints.
- Streaming & multi-modal support: Works with audio, image, and streaming data.
- Observability: Track, log, and debug generations in real-time.
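A small sketch of the Pydantic-style flow, validating a raw LLM response against a schema (method names follow the Guard interface from recent releases and may differ slightly between Guardrails versions):

```python
from pydantic import BaseModel, Field
from guardrails import Guard

class Ticket(BaseModel):
    summary: str = Field(description="One-line summary of the issue")
    priority: str = Field(description="low, medium, or high")

guard = Guard.for_pydantic(output_class=Ticket)

# Parse/validate a raw LLM output string; invalid output can trigger a re-ask.
outcome = guard.parse('{"summary": "App crashes on login", "priority": "high"}')
print(outcome.validated_output)
```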
LangChain
What it is
LangChain is a powerful open-source Python framework that helps developers build applications with Large Language Models (LLMs) by combining prompts, tools, memory, and control logic into modular workflows.
Key features
- Prompt chaining: Chain prompts and models together to build multi-step reasoning flows.
- Tool integration: Plug in APIs, databases, search engines, and more using prebuilt tool wrappers.
- RAG support: Retrieval-augmented generation pipelines with vector store support (FAISS, Qdrant, etc.).
- Agent framework: Dynamic decision-making agents that can choose tools and actions at runtime.
- Memory modules: Enable stateful conversations with long-term memory and context handling.
- Observability: Debug, visualize, and trace your LLM chains using LangSmith.
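A minimal prompt-chaining sketch using the runnable (LCEL) composition style; it assumes the langchain-openai package and an API key, but any chat model wrapper can be dropped in:

```python
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser
from langchain_openai import ChatOpenAI  # assumption: langchain-openai installed, OPENAI_API_KEY set

prompt = ChatPromptTemplate.from_template("Summarize in one sentence:\n\n{text}")
llm = ChatOpenAI(model="gpt-4o-mini")  # model name is illustrative

# Compose prompt -> model -> parser into a single runnable chain.
chain = prompt | llm | StrOutputParser()

print(chain.invoke({"text": "LangChain composes prompts, tools, memory, and models into workflows."}))
```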
Unsloth
What it is
Unsloth is an open-source library designed to make QLoRA fine-tuning of LLaMA models up to 2x faster and more memory-efficient. Ideal for running on Colab, Kaggle, or low-VRAM machines.
Key features
- Up to 2x speedup: Optimized CUDA kernels and attention rewrites.
- Low RAM/GPU support: Fine-tune LLaMA 2/3 on 8GB or even 4GB GPUs.
- QLoRA optimized: Flash-attn, 4-bit quantization, paged attention built-in.
- Trainer integration: Works with Hugging Face `transformers` and `trl`.
- Colab/Kaggle-ready: Official notebooks and support for budget training.
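A condensed sketch of the QLoRA setup (the checkpoint name and LoRA hyperparameters are illustrative; Unsloth's official notebooks carry tested configurations):

```python
from unsloth import FastLanguageModel

# Load a pre-quantized 4-bit base model; the checkpoint name is an example.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/llama-3-8b-bnb-4bit",
    max_seq_length=2048,
    load_in_4bit=True,
)

# Attach LoRA adapters; only these low-rank matrices are trained.
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    lora_alpha=16,
    lora_dropout=0.0,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
)

# model and tokenizer now plug into trl's SFTTrainer as with any Hugging Face model.
```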