I need help with automating deployments to my home server setup. I have a Raspberry Pi 5 running Ubuntu 22.04 with Docker containers for various web applications. Currently I’m using Cloudflare Zero Trust tunnels to access these apps from outside my network. Each app has its own subdomain, like project1.example.com, and I’ve set up Access policies that require email verification plus OTP.
Right now I have to manually SSH into the Pi every time I want to update code. I git pull the changes and restart containers with docker-compose. This gets tedious when I’m making frequent updates.
I want to set up GitHub Actions to automatically deploy when I push code. My idea was to use rsync or SSH from the CI pipeline to copy files and restart services. I tried setting up service tokens for Cloudflare Access, but the GitHub runner still gets stuck waiting for manual authentication - it prints a Cloudflare login URL that expects a human in a browser.
What’s the best approach for this kind of setup? Should I be using cloudflared daemon in the GitHub Actions workflow? How do I properly configure service tokens so the pipeline runs without any manual steps? I’m also wondering if there’s a more reliable method than SSH for this type of home server deployment.
Skip Cloudflare Access for deployments - it’s a pain. Set up SSH key auth directly to your Pi and move SSH off port 22. Way easier than dealing with service tokens that break constantly. I just use port forwarding on my router for SSH access, and GitHub Actions deploys straight through. No Cloudflare headaches.
I had the same frustration with my home lab setup. A self-hosted runner works way better than trying to punch through Cloudflare Access from GitHub’s hosted runners. Just install the GitHub Actions runner directly on your Pi or another machine on your network - this kills the external auth problem completely, because the runner makes an outbound connection to GitHub to pick up jobs, so nothing ever has to come in through the tunnel. Your workflow then has local network access to the Pi.
For service tokens, make sure you create the token under Service Auth in the Zero Trust dashboard and - this is the part that trips people up - add a Service Auth policy to the Access application that explicitly allows that token. If the application only has your email + OTP policy, the CF-Access-Client-Id and CF-Access-Client-Secret headers get ignored and you’re bounced to the interactive login page, which sounds like exactly what your runner is hitting. But honestly, I found this method unreliable for automated deployments.
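To make the header part concrete, here’s roughly what a token-authenticated request through Access looks like - a minimal sketch in Python, where deploy.example.com and the env var names are placeholders for your own setup:

```python
import os

import requests  # pip install requests

# Hypothetical Access-protected endpoint; substitute your own hostname.
URL = "https://deploy.example.com/health"

resp = requests.get(
    URL,
    headers={
        # Service token credentials - store these as CI secrets, not in code.
        "CF-Access-Client-Id": os.environ["CF_ACCESS_CLIENT_ID"],
        "CF-Access-Client-Secret": os.environ["CF_ACCESS_CLIENT_SECRET"],
    },
    timeout=10,
)
resp.raise_for_status()  # fail fast on 4xx/5xx instead of deploying blind
print(resp.status_code)
```

For SSH specifically, if I remember right, cloudflared can pick up the same credentials from the TUNNEL_SERVICE_TOKEN_ID and TUNNEL_SERVICE_TOKEN_SECRET environment variables, so you don’t set the headers yourself.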
Another option: expose a webhook endpoint on your Pi that GitHub can call directly through the tunnel. Create a simple API that validates the webhook signature and triggers your deployment script locally. This keeps the heavy lifting on your network instead of managing remote access from CI runners.
Try a deployment agent instead of external SSH. I run a lightweight service on my Pi that either polls a deployment endpoint or listens for GitHub webhooks. The agent runs locally and handles git pulls, docker builds, and container restarts - no external auth headaches.
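A minimal sketch of the polling variant - the repo path, branch name, and poll interval are all assumptions, and it expects Docker Compose v2 (swap in docker-compose for v1):

```python
import subprocess
import time
from pathlib import Path

# Hypothetical paths; adjust to your compose project layout.
REPO = Path("/home/pi/apps/project1")
POLL_SECONDS = 60


def local_rev() -> str:
    """SHA of the currently checked-out commit."""
    return subprocess.run(
        ["git", "rev-parse", "HEAD"],
        cwd=REPO, capture_output=True, text=True, check=True,
    ).stdout.strip()


def remote_rev() -> str:
    """SHA of the latest commit on origin/main after a fetch."""
    subprocess.run(["git", "fetch", "origin"], cwd=REPO, check=True)
    return subprocess.run(
        ["git", "rev-parse", "origin/main"],
        cwd=REPO, capture_output=True, text=True, check=True,
    ).stdout.strip()


while True:
    if local_rev() != remote_rev():
        # New commits landed: update the tree and recreate the containers.
        subprocess.run(["git", "pull", "--ff-only"], cwd=REPO, check=True)
        subprocess.run(
            ["docker", "compose", "up", "-d", "--build"],
            cwd=REPO, check=True,
        )
    time.sleep(POLL_SECONDS)
```

Run it under systemd so it restarts with the Pi and you get journal logs for free.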
You could also use GitHub’s deployment API. Your Pi checks periodically for new deployments and pulls them down. This flips the connection - instead of GitHub pushing to your Pi, the Pi pulls from GitHub when there are updates, so nothing inbound is needed at all.
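Something like this, assuming a token with read access in a GITHUB_TOKEN env var and placeholder owner/repo names - your agent would compare the latest deployment’s SHA against whatever it last deployed:

```python
import os

import requests  # pip install requests

# Hypothetical repository coordinates.
OWNER, REPO = "youruser", "project1"

resp = requests.get(
    f"https://api.github.com/repos/{OWNER}/{REPO}/deployments",
    headers={
        "Authorization": f"Bearer {os.environ['GITHUB_TOKEN']}",
        "Accept": "application/vnd.github+json",
    },
    params={"environment": "production", "per_page": 1},
    timeout=10,
)
resp.raise_for_status()

deployments = resp.json()
if deployments:
    latest = deployments[0]
    # Compare latest["sha"] against the SHA you last deployed;
    # if it differs, run your git pull + docker compose routine.
    print(latest["id"], latest["sha"])
```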
For webhooks, create a simple Flask app that validates GitHub webhook signatures and triggers your deployment scripts. Expose it through your Cloudflare tunnel on a dedicated subdomain like deploy.example.com with its own Access policy - in practice that means a Bypass rule for that hostname, since GitHub’s webhook sender can’t complete an email/OTP login; the HMAC signature check is what actually gates it. This keeps your main SSH access locked down while giving you clean automation that doesn’t need service tokens or external SSH keys.
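Here’s a bare-bones version of that Flask app - the secret env var and deploy script path are placeholders; the important part is verifying the X-Hub-Signature-256 header against the raw request body before running anything:

```python
import hashlib
import hmac
import os
import subprocess

from flask import Flask, abort, request  # pip install flask

app = Flask(__name__)

# Shared secret you configure on the GitHub webhook settings page.
WEBHOOK_SECRET = os.environ["GITHUB_WEBHOOK_SECRET"].encode()


@app.route("/deploy", methods=["POST"])
def deploy():
    # GitHub signs the raw body with HMAC-SHA256 and sends it in this header.
    signature = request.headers.get("X-Hub-Signature-256", "")
    expected = "sha256=" + hmac.new(
        WEBHOOK_SECRET, request.get_data(), hashlib.sha256
    ).hexdigest()
    if not hmac.compare_digest(signature, expected):
        abort(401)

    if request.headers.get("X-GitHub-Event") == "push":
        # Hypothetical deploy script: git pull + docker compose restart.
        subprocess.Popen(["/home/pi/bin/deploy.sh"])
    return "", 204


if __name__ == "__main__":
    # Bind to localhost only; cloudflared on the Pi proxies
    # deploy.example.com to this port through the tunnel.
    app.run(host="127.0.0.1", port=5000)
```

Returning 204 immediately and running the script in the background keeps GitHub’s webhook delivery from timing out on a slow docker build.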