A Practical Guide to uv
uv is a blazingly fast Python package & project manager written in Rust, created by Astral (the folks behind ruff).
A package manager is basically the app store for your code. Instead of manually hunting libraries, downloading zips, and praying versions match, you just say:
```
uv add openai
```
…and boom: dependency installed, version resolved, ready to use.
What’s a package manager, and why do we need one?
A package manager:
Downloads libraries (e.g. openai, httpx, langchain)
Resolves dependencies and versions
Keeps different projects’ dependencies isolated (so your old project doesn’t break when you upgrade something for a new one)
Without a package manager, you’d be:
Manually downloading packages from the web
Managing version conflicts yourself
Copy-pasting libraries between projects (which is… chaos)
For AI & agentic projects, this gets even more important because:
You quickly pile up dependencies: openai, anthropic, pydantic, fastapi, httpx, langchain, playwright, vector DB clients, etc.
You often need different versions of the same packages across projects.
You care a LOT about reproducibility: if you’re shipping an agent to production, “it worked on my machine” is not good enough.
What is uv?
uv is designed as:
A drop-in replacement for pip and pip-tools
A combined tool for:
Managing dependencies
Creating virtual environments
Managing Python versions
Initializing and building projects
Benchmarks show uv can be 10-100x faster than pip in many workflows, especially with caching.
Installing uv
There are multiple ways to install uv on your machine. Refer to the latest documentation for the most relevant option for your setup.
uv basics: the mental model
Here’s the core workflow you’ll use 90% of the time:
Create a project
Install / add dependencies
Run your code in that environment
Lock / sync dependencies for reproducibility
We’ll walk through each, then we’ll plug this into real AI/agent examples.
Creating a new project with uv init
uv can scaffold a project for you.
```
uv init ai-agent-demo
cd ai-agent-demo
```
This creates something like:
```
ai-agent-demo/
├── .gitignore
├── .python-version
├── README.md
├── main.py
└── pyproject.toml
```
You already have a main.py and a pyproject.toml ready.
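For reference, the generated pyproject.toml starts out roughly like this (the exact fields and the requires-python value depend on your uv release; the name simply mirrors the example project):

```toml
[project]
name = "ai-agent-demo"
version = "0.1.0"
description = "Add your description here"
readme = "README.md"
requires-python = ">=3.12"
dependencies = []
```

As you add packages, uv fills in the dependencies list for you.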
Run the starter script:
```
uv run main.py
```
uv run makes sure:
The environment exists
Dependencies match the lockfile (if any)
Your script runs inside that environment
Installing Python with uv
You don’t even need Python installed beforehand: uv can manage Python versions for you.
Install the latest Python:
```
uv python install
```
Or a specific version (useful for AI libraries that want 3.10+ etc.):
```
uv python install 3.12
```
Then uv uses that in your projects automatically.
Managing dependencies with uv add
Inside your project folder:
```
uv add httpx
```
This will:
Add httpx to pyproject.toml
Create a virtual environment (if needed)
Create/update uv.lock
Install httpx into that environment
You can also install multiple in one go:
```
uv add openai python-dotenv httpx
```
Example: a simple AI script calling an LLM API
Let’s say we’re building a tiny script that calls the OpenAI API (or any LLM API with a Python client).
Set up the project
```
uv init ai-chat
cd ai-chat
uv add openai python-dotenv
```
This gives you an isolated environment with:
openai – for the API
python-dotenv – for .env-based secrets
Create .env
```
OPENAI_API_KEY=sk-your-key-here
```
Write main.py
```python
import os

from dotenv import load_dotenv
from openai import OpenAI

load_dotenv()
client = OpenAI(api_key=os.getenv("OPENAI_API_KEY"))

def ask_llm(prompt: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4.1-mini",
        messages=[{"role": "user", "content": prompt}],
        temperature=0.2,
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    answer = ask_llm("Explain uv package manager in one sentence.")
    print(answer)
```
Run it with uv
```
uv run main.py
```
You now have:
A reproducible environment (thanks to pyproject.toml + uv.lock)
A clean way to call an external API
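One small quality-of-life addition: failing fast when the API key is missing beats a cryptic auth error from the client. require_env below is a hypothetical helper, not part of any library:

```python
import os

def require_env(name: str) -> str:
    """Return the value of an environment variable, or fail with a clear message."""
    value = os.getenv(name)
    if not value:
        raise RuntimeError(f"Missing required environment variable: {name}")
    return value

# Example: after load_dotenv(), this either returns the key or tells you what's wrong.
os.environ["OPENAI_API_KEY"] = "sk-test"  # simulating a loaded .env for the demo
print(require_env("OPENAI_API_KEY"))
```

Call it right after load_dotenv() so a missing .env entry surfaces before any network call.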
Locking, syncing, and reproducibility
For real-world AI systems (especially agentic ones that call many tools), you really want reproducible environments.
Key uv concepts:
pyproject.toml – what you want to install
uv.lock – the exact versions you actually installed
.venv/ – the environment itself
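To make the distinction concrete: you edit pyproject.toml, while uv writes uv.lock. An abridged, illustrative uv.lock entry looks something like this (the real file is machine-generated and much longer; the version shown here is made up):

```toml
version = 1
requires-python = ">=3.12"

[[package]]
name = "httpx"
version = "0.27.0"
# ...pinned source, hashes, and transitive dependencies follow
```

You never edit this file by hand; uv lock and uv add keep it up to date.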
Sync/recreate env from lockfile
On CI, a new machine, or when onboarding teammates:
```
uv sync
```
uv sync uses uv.lock to recreate the environment exactly, avoiding the “it works on my machine” drama.
Updating dependencies
If you change versions in pyproject.toml or add new ones:
```
uv lock   # updates uv.lock
uv sync   # syncs the env to match the lock
```
This workflow is especially good for:
Agent frameworks (LangChain, LlamaIndex, etc.)
Vector DB clients (e.g. qdrant-client, weaviate-client)
Browser automation tools (playwright, selenium)
where version mismatches can cause subtle runtime bugs.
Managing multiple Python versions
You might have:
One agent project requiring Python 3.10+
Another legacy ML project stuck on 3.9
uv can juggle this with ease.
```
# Install multiple Python versions
uv python install 3.10 3.12

# Pin a specific version for this project
echo "3.12" > .python-version

# Create a venv using that version
uv venv
```
Whenever you run your script with uv run, uv will use the Python version pinned in .python-version.
Using uv in Docker
Common deployment pattern for AI backends:
Use uv to install and lock dependencies.
Copy pyproject.toml + uv.lock into an image.
Run uv sync in the container.
Super-rough Dockerfile sketch (just conceptual):
```dockerfile
FROM python:3.12-slim

# Install uv (one way)
RUN pip install uv

WORKDIR /app
COPY pyproject.toml uv.lock ./
RUN uv sync --frozen   # install deps from lockfile

COPY . .
CMD ["uv", "run", "server.py"]
```
This makes your container builds:
Fast (thanks to uv caching & fast resolver)
Reproducible (same uv.lock everywhere)
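The Dockerfile above assumes a server.py entry point. As a stand-in, here is a minimal stdlib health-check server (hypothetical; a real AI backend would more likely be FastAPI behind uvicorn, as in the use cases below):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class HealthHandler(BaseHTTPRequestHandler):
    """Answers every GET with a plain-text "ok" – a placeholder endpoint."""

    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"ok")

    def log_message(self, *args):
        # Keep the sketch quiet; a real service would use structured logging.
        pass

def make_server(host: str = "0.0.0.0", port: int = 8000) -> HTTPServer:
    # Bind on all interfaces so the container's published port reaches it.
    return HTTPServer((host, port), HealthHandler)

# server.py proper would end with: make_server().serve_forever()
```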
Common use cases
LLM-powered backend with tools: Chat endpoint that can call external APIs (tools), then compose responses using LLMs:
```
uv add openai httpx pydantic fastapi uvicorn python-dotenv
```
Retrieval-augmented generation (RAG) app: Document Q&A app where:
langchain-community or similar handles chains & tools
chromadb or another vector client is your store
```
uv add openai langchain-community chromadb tiktoken python-dotenv
```
Browser-based agent: Agent that can browse the web, click buttons, extract data, then summarize via LLM.
```
uv add playwright openai python-dotenv pydantic
```
Multi-agent orchestration: Multiple agents pulling jobs from a queue (Redis), sharing tools, and using LLMs to decide next actions.
```
uv add fastapi openai httpx pydantic redis rq python-dotenv
```
Where to learn more
If you want to go deeper, the official uv documentation (docs.astral.sh/uv) covers the full command reference and workflows, and Astral’s blog goes into the resolver design and benchmarks.