Vite+ and TanStack Start on Cloudflare: Fast, but Not Vercel Fast
I tested a new stack from scratch and compared it to my Next.js and Vercel baseline. It works, but I had to build parts of the deployment DX myself.
Vercel and Next.js still seem to have the best experience for getting from zero to production. I keep hearing their prices are high and that Cloudflare is a great alternative. Next.js isn't available on Cloudflare yet, though, unless I go with OpenNext, and I've heard that's nothing but pain. So I opted for TanStack Start! With all the AI tools available, nobody gets excited about this stuff anymore, but I do! I couldn't wait to learn something new along the way.
Here’s what I’m going with:
- Vite+
- TanStack Start
- Cloudflare Workers
Short version: I got it working, it is genuinely fast, and I trust it enough to reuse. But it is not as instant as Vercel out of the box.
What I actually wanted to prove
I wanted a practical answer: can I go from zero to a production deploy plus PR previews without hand-holding every release? This is what Vercel excels at: you just point it at your GitHub repository and get instant deployments for every branch.
What worked well quickly
Cloudflare deploy speed is excellent.
TanStack Start on Workers feels great once build output and runtime are aligned.
Vite+ was the biggest practical win for day-to-day flow. I standardized everything on vp locally and in CI:
```sh
vp dev
vp check
vp test
vp build
```
That one decision removed a lot of “works on my machine” nonsense. 5 stars for Vite+!
What I had to build manually
To get a Vercel-style preview flow, I had to build the whole lifecycle myself in GitHub Actions.
I had to implement and tune all of this:
- Separate workflows for production deploy, preview deploy, and preview cleanup
- Deterministic worker naming from PR number plus branch slug
- Worker name sanitization and length limits for Cloudflare constraints
- An extra fix for trailing hyphens in generated names
- Concurrency control to avoid race conditions in preview deploys
- Parsing Wrangler output to capture preview URLs
- PR comment upsert logic so reviewers always see one current preview link
- Cleanup on PR close to delete preview workers and avoid leftovers
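To make the naming part concrete, here is a minimal sketch of the kind of sanitization I mean. The `preview-` prefix, the sample branch name, and the 50-character cap are my own illustration, not Cloudflare's exact documented limit, so check the real constraints before reusing this.

```shell
# Sketch: derive a deterministic preview worker name from PR number + branch.
# The 50-char cap is illustrative; verify Cloudflare's actual name limits.
PR_NUMBER=42
BRANCH="Feature/Add-Login--"

# Lowercase, then collapse any run of non-alphanumerics into a single hyphen.
SLUG=$(printf '%s' "$BRANCH" \
  | tr '[:upper:]' '[:lower:]' \
  | sed -E 's/[^a-z0-9]+/-/g')

NAME="preview-${PR_NUMBER}-${SLUG}"
NAME=$(printf '%s' "$NAME" | cut -c1-50)       # enforce a length limit
NAME=$(printf '%s' "$NAME" | sed -E 's/-+$//') # trim trailing hyphens
echo "$NAME"
```

Because the name is derived only from the PR number and branch, every deploy of the same PR targets the same worker, which is what makes cleanup on close reliable.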
I also switched cleanup to `vp dlx wrangler delete ... --force`, so teardown did not depend on full dependency install state.
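The URL capture is just pattern-matching the deploy log. The log text below is a fabricated stand-in for the general shape of what Wrangler prints, not its exact output format, so treat the regex as a sketch to adapt.

```shell
# Sketch: pull the preview URL out of a captured deploy log.
# DEPLOY_LOG is a made-up example, not Wrangler's exact output.
DEPLOY_LOG='Uploaded preview-42-feature-add-login (1.32 sec)
Deployed preview-42-feature-add-login triggers (0.21 sec)
  https://preview-42-feature-add-login.example.workers.dev'

PREVIEW_URL=$(printf '%s\n' "$DEPLOY_LOG" \
  | grep -oE 'https://[A-Za-z0-9.-]+\.workers\.dev' \
  | head -n 1)
echo "$PREVIEW_URL"
```

In the workflow, the captured URL is what gets written into the upserted PR comment, so reviewers always see the current link.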
None of this is hard in isolation. Together, it is still real platform plumbing that Vercel gives you by default.
Vite+ and Cloudflare vs Next.js and Vercel
My honest take right now:
- Next.js and Vercel still win on instant deployment UX
- Vite+ plus TanStack plus Cloudflare is very workable today, but more DIY
- If you want control over infra behavior, this stack is rewarding
- If you want almost no setup for previews and deploys, Vercel is still ahead
That does not make the Cloudflare route worse. It just moves responsibility back to you, especially for preview lifecycle and workflow reliability.
Where Void fits in
I am paying close attention to Void. An opinionated Cloudflare-first framework on top of Vite+ is exactly what could close this DX gap. Right now it is not ready, so I treated this as a ground-up build.
If Void bakes in the patterns I had to wire by hand, this setup gets a lot closer to the “just ship it” experience people expect from Vercel.
Final thoughts
This experiment did exactly what I needed, and now I have a template ready for any project: reliable production deploys, automatic previews, and cleanup that does not leak resources.
I am not replacing Next.js and Vercel everywhere tomorrow. But I now have a Cloudflare-native starter I can trust, and that is a solid result for a first serious run at this stack.