Every millisecond matters when you're in the critical path of API authentication. After two years of fighting serverless limitations, we rebuilt our entire API stack and slashed the end-to-end latency.
When we launched our API on Cloudflare Workers, it seemed like the perfect choice for an API authentication service. Global edge deployment, automatic scaling, and pay-per-use pricing. What's not to love?
Fast forward, and we've completely rebuilt it using stateful Go servers. The result is a 6x performance improvement and a dramatically simplified architecture that enabled self-hosting and platform independence.
TL;DR
- Moved from Cloudflare Workers to Go servers
- Lowered latency by 6x
- Eliminated complex caching workarounds and data pipeline overhead (see the sketch after this list)
- Simplified architecture from distributed system to straightforward application
- Enabled self-hosting and platform independence
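To make the "stateful Go servers" point concrete, here is a minimal sketch of the kind of in-process caching a long-lived server allows, which a per-request serverless runtime can't rely on. Everything in it (the route, the cache policy, the `lookupKey` placeholder) is illustrative and assumed, not Unkey's actual implementation.

```go
package main

// Sketch only: a long-lived Go process can keep verified keys in memory,
// so repeat verifications skip the expensive lookup entirely. The handler
// path, TTL, and lookupKey below are hypothetical, not Unkey's real code.

import (
	"fmt"
	"net/http"
	"sync"
	"time"
)

type cachedKey struct {
	valid     bool
	expiresAt time.Time
}

var (
	mu    sync.RWMutex
	cache = map[string]cachedKey{}
)

// lookupKey stands in for the real database or control-plane lookup.
func lookupKey(key string) bool {
	return key == "demo_key" // placeholder check
}

func verifyHandler(w http.ResponseWriter, r *http.Request) {
	key := r.Header.Get("Authorization")

	mu.RLock()
	entry, ok := cache[key]
	mu.RUnlock()

	if !ok || time.Now().After(entry.expiresAt) {
		// Cache miss: do the expensive lookup once, then keep the
		// result in process memory for subsequent requests.
		entry = cachedKey{valid: lookupKey(key), expiresAt: time.Now().Add(time.Minute)}
		mu.Lock()
		cache[key] = entry
		mu.Unlock()
	}

	if entry.valid {
		fmt.Fprintln(w, "ok")
		return
	}
	http.Error(w, "unauthorized", http.StatusUnauthorized)
}

func main() {
	http.HandleFunc("/v1/keys/verify", verifyHandler)
	http.ListenAndServe(":8080", nil)
}
```

Because the process stays alive between requests, the in-memory map survives across calls; on a stateless serverless platform you would need an external cache or data pipeline to get a similar effect, which is exactly the workaround the rewrite eliminated.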
Here's the story of why we made this move, the problems that forced our hand, and what we learned along the way.
Continue reading on www.unkey.com.
If this post was enjoyable or useful for you, please share it! If you have comments, questions, or feedback, you can send them to my personal email. To get new posts, subscribe via the RSS feed.