
Optimizing uv in GitHub Actions: One Global Cache to Rule Them All

11 min read · Aug 12, 2025
Photo by h heyerlein on Unsplash

Introduction

uv is a blazing-fast Python package manager by Astral that’s been gaining popularity as a modern alternative to pip, pip-tools, or poetry. It’s written in Rust, installs dependencies in parallel, and has smart features like lockfile management (uv.lock).

For CI/CD pipelines, uv can be a huge time saver… if you use it correctly.
Unfortunately, its default caching setup in GitHub Actions can lead to fragmented caches and unnecessary repeated installs, which slows everything down.

This post will show you:

  • How uv caching normally works in GitHub Actions
  • Why it causes fragmentation in multi-PR workflows
  • A step-by-step YAML setup to fix it
  • How to skip redundant installs when dependencies haven’t changed

The Problem

When you try to use the setup-uv action provided by Astral

Official docs: https://docs.astral.sh/uv/guides/integration/github/

the documentation shows you this:

- name: Enable caching
  uses: astral-sh/setup-uv@v6
  with:
    enable-cache: true

This seems reasonable: you follow the documentation and expect it to deliver the result you’re hoping for.

In practice, though, the uv cache ends up fragmented: even with the same hash key, GitHub Actions stores a separate cache for each pull request.

  • Same dependencies, different PRs → different caches
  • Even if uv.lock hasn’t changed, each PR stores its own copy of .venv and ~/.cache/uv
  • GitHub Actions storage fills up faster, and every small PR has to re-install dependencies

Each small PR takes up extra cache space. That alone is not a big deal, but it is far from ideal that uv sync has to reinstall all the dependencies again and again for every PR. In large projects with hundreds of MB of dependencies, this can waste 30–90 seconds per job, multiplied across multiple CI jobs and contributors.


As you can see from the screenshot, there are two caches with the same hash key, stored twice: one scoped to main and another scoped to the PR branch.

The Ideal Solution

In most Python projects:

  • Code changes happen all the time
  • Dependency changes are rare (maybe once a month), or stop happening entirely once the project is mature enough

If uv.lock hasn’t changed, your environment is already correct. There’s no need to re-install everything.

The trick is to:

  1. Build and cache the environment once when dependencies change.
  2. Share that cache across all PRs instead of making a new one for each.
  3. Let uv sync quickly verify the environment instead of re-downloading packages.

This gives you one global cache that all jobs pull from, avoiding fragmentation and repeated installs.
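In GitHub Actions terms, this means a manual actions/cache restore/save pair keyed on the hash of uv.lock. The full workflows are shown step by step below; as a minimal sketch of the restore side, using the uv-main- key prefix adopted throughout this post:

- name: Restore global uv cache
  uses: actions/cache/restore@v4
  with:
    path: |
      ~/.cache/uv
      ~/.local/share/uv
      .venv
    key: uv-main-${{ hashFiles('uv.lock') }}
    restore-keys: |
      uv-main-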

The Practical Setup


The official documentation notes that a cache cannot be reused across feature branches today. However, every branch can read caches created on the default branch, so we can still designate one canonical branch, main, and let its cache serve as the global cache.

Instead of letting each PR build and cache its own environment, we:

  1. Build a single, global uv cache only when uv.lock or pyproject.toml changes.
  2. Share that cache across all PRs so they never have to reinstall dependencies unless they’ve actually changed.

This approach removes cache fragmentation and keeps CI runs consistent.

Step 1 — Pre-build the Cache on main

Create a workflow that runs only when dependencies change:

name: Build uv cache

on:
  push:
    branches:
      - main
    paths:
      - "uv.lock"
      - "pyproject.toml"
  workflow_dispatch:

jobs:
  build-cache:
    runs-on: ubuntu-latest

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Install uv
        uses: astral-sh/setup-uv@v6
        with:
          version: "0.8.4"
          python-version: "3.13"
          enable-cache: false

      - name: Install dependencies and populate cache
        run: |
          echo "Building global UV cache..."
          uv sync --group dev
          echo "Cache populated successfully"

      - name: Save uv caches
        uses: actions/cache/save@v4
        with:
          path: |
            ~/.cache/uv
            ~/.local/share/uv
            .venv
          key: uv-main-${{ hashFiles('uv.lock') }}

This ensures there is one canonical cache, keyed on the hash of uv.lock and rebuilt whenever uv.lock or pyproject.toml changes.
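Note that the save key only hashes uv.lock, while the workflow also triggers on pyproject.toml changes. If you want the key itself to change when either file changes, hashFiles accepts multiple patterns; a small optional variation (if you use it, use the same expression in the restore steps later):

key: uv-main-${{ hashFiles('uv.lock', 'pyproject.toml') }}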

Any push to main that changes uv.lock or pyproject.toml will trigger this build_uv_cache.yaml workflow.


The cache now appears in the caches table of your GitHub repository.

Step 2 — Running the CI Action

Mypy Type Check

Here is mypy_type_check.yaml, the GitHub workflow for Mypy.

This workflow runs Mypy on every pull request or push to ensure that the codebase adheres to static type hints. It helps catch type-related bugs early before they make it into production and acts as a guardrail for maintainable Python code.

name: Mypy Type Check

on:
  pull_request:
    branches:
      - "*"

jobs:
  mypy-type-check:
    runs-on: ubuntu-latest
    permissions:
      contents: read

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Restore global uv cache
        id: cache-restore
        uses: actions/cache/restore@v4
        with:
          path: |
            ~/.cache/uv
            ~/.local/share/uv
            .venv
          key: uv-main-${{ hashFiles('uv.lock') }}
          restore-keys: |
            uv-main-

      - name: Install uv
        uses: astral-sh/setup-uv@v6
        with:
          version: "0.8.4"
          python-version: "3.13"
          enable-cache: false

      - name: Install dependencies
        run: uv sync --group dev

      - name: Run mypy type checking
        run: uv run -- mypy --install-types --non-interactive --strict dataweaver tests

      - name: Save uv caches
        if: steps.cache-restore.outputs.cache-hit != 'true'
        uses: actions/cache/save@v4
        with:
          path: |
            ~/.cache/uv
            ~/.local/share/uv
            .venv
          key: uv-main-${{ hashFiles('uv.lock') }}

The workflow then runs on GitHub for every PR.


As you can see, it successfully restored the cache, thus saving time in installing the dependencies.

Unit Test & Coverage

Now let’s add another workflow, unit_test_coveraging.yaml.

This workflow automatically runs the project’s unit tests and measures code coverage whenever new code is pushed or a pull request is opened.
It ensures that:

  • Existing functionality isn’t broken by new changes.
  • The codebase maintains a healthy level of test coverage.
  • Contributors get instant feedback if tests fail.

Note that you still have to write the pytest tests yourself.

name: Unit Test & Coverage

on:
  pull_request:
    branches:
      - "*"

jobs:
  unit-test-coverage:
    runs-on: ubuntu-latest
    permissions:
      contents: read
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Restore uv caches
        id: cache-restore
        uses: actions/cache/restore@v4
        with:
          path: |
            ~/.cache/uv
            ~/.local/share/uv
            .venv
          key: uv-main-${{ hashFiles('uv.lock') }}
          restore-keys: |
            uv-main-

      - name: Install uv
        uses: astral-sh/setup-uv@v6
        with:
          version: "0.8.4"
          python-version: "3.13"
          enable-cache: false # We handle caching ourselves

      - name: Install dependencies
        run: uv sync --group dev

      - name: Run Unit Tests with Coverage
        run: |
          uv run -- coverage run -m pytest
          uv run -- coverage report --show-missing

      - name: Save uv caches
        if: steps.cache-restore.outputs.cache-hit != 'true'
        uses: actions/cache/save@v4
        with:
          path: |
            ~/.cache/uv
            ~/.local/share/uv
            .venv
          key: uv-main-${{ hashFiles('uv.lock') }}

Again, the workflow runs on GitHub for every PR.


It successfully restored the cache as well, hence saving time in installing the dependencies.
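If you also want CI to enforce the “healthy level of test coverage” goal listed above, coverage report can fail the job when coverage drops below a threshold. A possible tweak to the coverage step (the 80 here is just an example value, not a project setting):

- name: Run Unit Tests with Coverage
  run: |
    uv run -- coverage run -m pytest
    uv run -- coverage report --show-missing --fail-under=80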

Edge Cases

Cache in Main Expired + New PR Created

Now let’s delete the cache built from main to simulate an expired cache on main.


Now the cache is cleared!

Let’s create a Makefile to simplify the job: it bundles several commands into a single custom terminal command.

Makefile

create-random-pr:
	echo "Creating random PR" >> random_pr.txt
	git add .
	git commit -m "Create random PR" || true
	git push origin HEAD
	gh pr create --title "Random PR" --body "This is a random PR"

Let’s rename the current branch and create a PR:

>> git branch -m testing-branch-1
>> make create-random-pr

Let’s do the same again on another branch:

>> git branch -m testing-branch-2
>> make create-random-pr

Now you can see that two different caches have been built in the caches table. This is not ideal and not what we want. It happens because the main cache was deleted: each PR’s restore step misses, so the save step (guarded by cache-hit != 'true') runs on the PR branch and stores its own branch-scoped copy.


Next, let’s run the build_uv_cache.yaml workflow manually from GitHub Actions.


Now there are three different caches.

OK, never mind, let’s try again:

>> git branch -m testing-branch-3
>> make create-random-pr

You can see that the cache hit comes from main. This proves that the cache acts as a global cache when it is built from main.

Enhancement

GitHub evicts caches that have not been accessed for 7 days, so if the repository sees no activity for a while, the cache from main that serves as our global cache will disappear.

View more about GitHub Caching Policy

We can add a cron schedule to build_uv_cache.yaml to keep the cache on main fresh.

Read more about GitHub Action — Schedule

name: Build uv cache

on:
  push:
    branches:
      - main
    paths:
      - "uv.lock"
      - "pyproject.toml"
  workflow_dispatch:
  schedule:
    - cron: "0 0 */5 * *" # Every 5 days, before cache expiry

jobs:
  build-cache:
    runs-on: ubuntu-latest

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Install uv
        uses: astral-sh/setup-uv@v6
        with:
          version: "0.8.4"
          python-version: "3.13"
          enable-cache: false

      - name: Install dependencies and populate cache
        run: |
          echo "Building global UV cache..."
          uv sync --group dev
          echo "Cache populated successfully"

      - name: Save uv caches
        uses: actions/cache/save@v4
        with:
          path: |
            ~/.cache/uv
            ~/.local/share/uv
            .venv
          key: uv-main-${{ hashFiles('uv.lock') }}

This schedules a cron job that rebuilds the cache on main every 5 days, safely inside the 7-day expiry window defined by GitHub’s cache usage policy.

What Each Cache Stores

When configuring uv caching in GitHub Actions, it helps to understand what’s inside each directory because skipping the wrong one can cost you minutes of install time.

1. ~/.cache/uv

This is uv’s HTTP download cache.

  • Stores raw wheel and source tarballs downloaded from PyPI (or your private registry).
  • If a package is already in here, uv won’t download it again. It will just copy it into the environment.
  • Without this cache, uv will re-download dependencies every time, even if they’re already installed in .venv.
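One optional refinement, not shown in the workflows above: this download cache can grow large over time, and uv provides uv cache prune --ci specifically for CI, which removes pre-built wheels (cheap to re-download) while keeping wheels that were built from source. You could run it right before the save step in build_uv_cache.yaml:

- name: Prune uv cache for CI
  run: uv cache prune --ci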

2. ~/.local/share/uv

This is uv’s data directory.

  • Stores the Python installations that uv manages, including the interpreter that setup-uv provisions when you pass python-version, plus anything installed with uv tool.
  • The project’s .venv created by uv sync is built on top of one of these managed interpreters.
  • Without this directory, the interpreter behind .venv is missing, so uv will rebuild the environment even though .venv exists (exactly what the experiment below shows).

3. .venv

This is your actual project-specific virtual environment.

  • Contains installed packages for your specific project and Python version.
  • If .venv is present and already matches uv.lock (and its interpreter from ~/.local/share/uv is available), uv sync only verifies the environment and skips dependency installation entirely.
  • Without this, your environment will always be rebuilt from the caches above.

What If We Don’t Cache ~/.local/share/uv?

Let’s delete every cache and try this


Alright, now the cache table is clear and empty.


Let’s run the build_uv_cache.yaml to build a fresh cache


Now it is on the cache table


Ok, let’s create a PR again

>> git branch -m testing-branch-4
>> make create-random-pr

uv then removes .venv/ and recreates it, reinstalling all the dependencies. This defeats our goal of reducing the CI time spent downloading dependencies: every PR or push has to tear down .venv/ and reinstall everything, which slows CI significantly.

In other words, if you want lightning-fast restores, you must cache all three:

~/.cache/uv
~/.local/share/uv
.venv

That combination ensures uv recognizes the environment as valid and skips unnecessary reinstalls.
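One related safeguard, not used in the workflows above: if pyproject.toml has drifted from uv.lock, a plain uv sync will re-resolve and update the lockfile on the runner, so CI quietly tests an environment that no longer matches the committed uv.lock. Passing --locked makes the job fail instead, forcing dependency changes to go through an explicit lockfile update (and hence a fresh cache build on main). A possible variation of the install step:

- name: Install dependencies
  run: uv sync --locked --group dev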

Why the Global Cache Approach Wins

Before

Installing dependencies took 11s

After

Installing dependencies becomes merely syncing the existing environment for verification: near 0 ms.

With the global cache approach, PR workflows benefit from:

  • Zero reinstall time if dependencies haven’t changed.
  • No cache fragmentation: only one main cache exists per lockfile hash.
  • Predictable CI runtimes: each PR runs against the same prebuilt environment from main.

Even better, GitHub’s own caching system works perfectly here if you leave enable-cache off in setup-uv and manage caching yourself with actions/cache.

Conclusion

By combining uv caching with GitHub Actions, we’ve significantly reduced dependency installation times and kept our CI/CD pipelines lean.
The key takeaways from this setup:

  • Centralized cache on main → Ensures all branches can benefit from pre-built dependencies.
  • Workflow separation → A dedicated cache builder plus focused type checking and test workflows keeps pipelines clean and maintainable.
  • Cache persistence → Storing ~/.cache/uv, ~/.local/share/uv, and .venv ensures both the package index and the virtual environment are reused efficiently.
  • Fast feedback → Tests, coverage, and type checks run automatically on PRs, giving developers confidence in their changes.

This approach not only saves build minutes but also makes the developer experience smoother. Faster CI/CD means more time for actual coding and less time waiting for installs to finish.

As I continue my own coding journey, I’ll be sharing more insights, tutorials, and personal experiences. If you found this guide helpful, I’d truly appreciate your support!

Buy me a coffee on Ko-fi

Stay Connected!

🔔 Follow me on Medium for more updates on my coding journey and in-depth technical blogs.
💻 Check out my projects on GitHub: github.com/szeyu
🔗 Connect with me on LinkedIn: linkedin.com/in/szeyusim

Thanks for reading, and happy coding!
