When you self-host Supabase, you trade managed infrastructure for complete control. But that control comes with a catch: your single-origin server becomes the bottleneck for users worldwide. A user in Tokyo requesting assets from your Frankfurt server faces 200+ milliseconds of latency on every request.
Content Delivery Networks (CDNs) solve this by caching your content at edge locations globally. Supabase Cloud includes Cloudflare CDN automatically, but self-hosted deployments need manual configuration. This guide covers everything from basic CDN setup to advanced caching strategies that can reduce your origin server load by 90%.
Why CDN Matters for Self-Hosted Supabase
Self-hosted Supabase typically runs on a single VPS in one geographic region. Every request—whether it's an API call, image fetch, or static asset—travels to that one location. This creates three problems:
Latency compounds quickly. A user in Sydney fetching assets from your US-based server adds 150-200ms per request. Load an image gallery with 20 thumbnails, and those round-trips alone can add seconds of waiting.
Origin server overload. Without caching, every identical request hits your server. Ten thousand users requesting the same product image means ten thousand database queries and file reads. Your VPS runs hot while serving the same bytes repeatedly.
Bandwidth costs spiral. VPS providers charge for egress bandwidth. Serving a 500KB image to 10,000 users costs you 5GB of bandwidth—per image. Multiply that across your entire storage bucket, and costs add up fast.
CDN caching addresses all three issues by storing content at edge nodes worldwide. That Sydney user gets the image from an Australian edge node in 20ms instead of 200ms. Your origin server handles the first request; the CDN handles the next 9,999.
Choosing a CDN for Self-Hosted Supabase
Several CDN providers work well with self-hosted Supabase. Your choice depends on budget, geographic coverage, and technical requirements.
Cloudflare (Free Tier Available)
Cloudflare offers the best free tier for self-hosted Supabase. The free plan includes unlimited bandwidth, global edge network, and DDoS protection. You're limited to 3 page rules on free plans, but that's usually enough for basic caching configuration.
Pros: Free tier, excellent performance, easy setup, built-in security features
Cons: Free tier has limited page rules; aggressive caching can cause issues with dynamic content
Bunny CDN (Pay-as-you-go)
Bunny CDN charges per GB transferred—typically $0.01-0.06 per GB depending on region. This pay-as-you-go model works well for smaller projects where Cloudflare's pro features aren't needed but you want predictable pricing.
Pros: Simple pricing, good performance, pull zone configuration
Cons: No free tier, requires more manual configuration
AWS CloudFront
If you're already using AWS services or S3-compatible storage with Supabase, CloudFront integrates naturally. It's more complex to configure but offers fine-grained control over caching behavior.
Pros: Deep AWS integration, detailed analytics, sophisticated cache policies
Cons: Complex pricing, steeper learning curve
For most self-hosted Supabase deployments, Cloudflare offers the best balance of features, performance, and cost. The rest of this guide focuses on Cloudflare, though the caching principles apply to any CDN.
Setting Up Cloudflare for Self-Hosted Supabase
Step 1: Add Your Domain to Cloudflare
If you've already configured custom domains for Supabase, you may already have Cloudflare in front of your setup. If not, start by adding your domain to Cloudflare:
- Sign up at cloudflare.com
- Add your domain and let Cloudflare scan existing DNS records
- Update your domain's nameservers to Cloudflare's (provided during setup)
- Wait for DNS propagation (typically 15-30 minutes)
Step 2: Configure DNS Records
Your Supabase services need DNS records pointing to your VPS. With Cloudflare, you'll see an orange cloud (proxied) or gray cloud (DNS only) next to each record:
| Type  | Name     | Content              | Proxy            |
|-------|----------|----------------------|------------------|
| A     | supabase | 203.0.113.50         | Proxied (orange) |
| A     | api      | 203.0.113.50         | Proxied (orange) |
| A     | storage  | 203.0.113.50         | Proxied (orange) |
| CNAME | studio   | supabase.example.com | DNS only (gray)  |
Important: Proxy your public-facing services (API, Storage) but consider keeping Studio as DNS-only. Proxying Studio can cause WebSocket issues with the real-time dashboard features.
Step 3: SSL Configuration
Under SSL/TLS settings, set encryption mode to "Full (strict)". This ensures traffic is encrypted both between users and Cloudflare, and between Cloudflare and your origin server.
If you're using Let's Encrypt certificates on your origin, Full (strict) mode validates the certificate chain completely. Never use "Flexible" mode—it leaves the connection between Cloudflare and your origin unencrypted.
Caching Strategies for Supabase Storage
Supabase Storage serves files through the Storage API at /storage/v1/object/public/ (public buckets) or /storage/v1/object/sign/ (signed URLs for private buckets). Each requires different caching strategies.
Public Bucket Caching
Public buckets are straightforward to cache. Files are accessible without authentication, making them perfect CDN candidates.
Create a Cloudflare Page Rule (or Cache Rule in the newer interface):
URL: *example.com/storage/v1/object/public/*
Setting: Cache Level - Cache Everything
Edge Cache TTL: 1 month
Browser Cache TTL: 1 week
This tells Cloudflare to cache all public storage objects for one month at the edge and instruct browsers to cache for one week.
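Public object URLs follow a predictable shape, which is what makes the wildcard rule above safe to apply broadly. A quick sketch of that shape (the project domain is a placeholder; supabase-js exposes the same URL via `storage.from(bucket).getPublicUrl(path)`):

```typescript
// Build the URL the Storage API serves for an object in a public bucket.
// Everything under /storage/v1/object/public/ is covered by the cache rule.
function publicObjectUrl(projectUrl: string, bucket: string, path: string): string {
  return `${projectUrl}/storage/v1/object/public/${bucket}/${path}`;
}

publicObjectUrl("https://storage.example.com", "images", "logo.png");
// "https://storage.example.com/storage/v1/object/public/images/logo.png"
```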
For more granular control, use Transform Rules to set cache headers based on file type:
When: URI Path contains "/storage/v1/object/public/" AND
URI Path matches ".*\.(jpg|jpeg|png|gif|webp|svg)$"
Then: Set Cache-Control header to "public, max-age=31536000, immutable"
The immutable directive tells browsers the file will never change—perfect for content-addressed assets like user uploads with unique filenames.
Private Bucket Caching Challenges
Private buckets use signed URLs with expiration timestamps. This creates a caching problem: each generated URL is unique, causing CDN cache misses even for identical files.
Strategy 1: URL normalization
If your signed URLs include query parameters for authentication, configure Cloudflare to ignore those parameters when generating cache keys:
Under Caching > Configuration > Cache Key, add a rule to ignore specific query parameters:
Ignore query string: token
This caches based on the base URL, not the signed token. Be careful—this only works if your signed URLs share the same base path.
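To see what ignoring the parameter does, here is a rough model of the cache key in TypeScript. The URL and `token` parameter name mirror the rule above; this is an illustration of the behavior, not Cloudflare's actual implementation:

```typescript
// Model of a cache key that ignores the "token" query parameter: two signed
// URLs for the same object collapse to a single key, so the second request
// can be served as a cache HIT.
function cacheKeyIgnoringToken(url: string): string {
  const u = new URL(url);
  u.searchParams.delete("token");
  u.searchParams.sort(); // canonical order for any remaining parameters
  return u.toString();
}

const a = cacheKeyIgnoringToken(
  "https://storage.example.com/storage/v1/object/sign/avatars/me.png?token=aaa"
);
const b = cacheKeyIgnoringToken(
  "https://storage.example.com/storage/v1/object/sign/avatars/me.png?token=bbb"
);
// a === b: both requests map to the same cache entry
```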
Strategy 2: Application-level caching
Cache signed URLs in your application rather than generating new ones for each request:
// Example: Cache signed URLs in Redis
// Assumes module-level `supabase` and `redis` (ioredis) clients.
async function getSignedUrl(bucket: string, path: string): Promise<string> {
  const cacheKey = `signed-url:${bucket}:${path}`;

  // Check cache first
  const cached = await redis.get(cacheKey);
  if (cached) return cached;

  // Generate a new signed URL (valid for 1 hour)
  const { data, error } = await supabase.storage
    .from(bucket)
    .createSignedUrl(path, 3600);
  if (error || !data) throw error ?? new Error('Failed to create signed URL');

  // Cache for 50 minutes (safety margin before the 1-hour expiry)
  await redis.setex(cacheKey, 3000, data.signedUrl);
  return data.signedUrl;
}
This approach reuses the same signed URL across multiple requests, improving CDN hit rates while maintaining access control.
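The 3,000-second TTL is just the signed-URL lifetime minus a safety margin. If you use different expiry times for different buckets, it helps to derive the TTL rather than hard-code it. A small sketch; the 600-second margin is an arbitrary choice, not a Supabase requirement:

```typescript
// Redis TTL derived from the signed-URL lifetime minus a fixed safety margin,
// so a cached URL is always evicted before its signature expires.
function redisTtlFor(signedUrlExpirySeconds: number, marginSeconds = 600): number {
  return Math.max(0, signedUrlExpirySeconds - marginSeconds);
}

redisTtlFor(3600); // 3000 seconds, matching the setex call above
```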
Caching Supabase API Responses
Caching REST API responses requires more caution than static assets. You're dealing with dynamic data that might change between requests.
Safe API Caching Patterns
Some API endpoints are safe to cache:
Public read endpoints: If you're fetching public data that rarely changes (product listings, blog posts, configuration), cache at the edge:
URL: *api.example.com/rest/v1/products*
Cache Level: Standard
Edge Cache TTL: 5 minutes
User-specific data: Never cache at the edge. Use Cache-Control: private, no-store headers to ensure user data only caches in the user's browser.
Cache-Control Headers from PostgREST
PostgREST (Supabase's REST API layer) doesn't set cache headers by default. You can add them through your reverse proxy configuration.
In your Nginx configuration (if using reverse proxy setup):
location /rest/v1/products {
proxy_pass http://kong:8000;
# Add caching headers for product listings
add_header Cache-Control "public, max-age=300, stale-while-revalidate=60";
add_header Vary "Accept, Accept-Encoding";
}
location /rest/v1/users {
proxy_pass http://kong:8000;
# Prevent caching of user data
add_header Cache-Control "private, no-store, no-cache, must-revalidate";
}
Stale-While-Revalidate Strategy
The stale-while-revalidate directive is powerful for API caching. It tells clients: "Use the cached version immediately, but fetch a fresh copy in the background."
Cache-Control: public, max-age=60, stale-while-revalidate=300
This means:
- For the first 60 seconds, serve from cache
- From 60-360 seconds, serve stale content while fetching fresh data
- After 360 seconds, wait for fresh data
Users get instant responses while data stays reasonably fresh.
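The timeline above can be sketched as a small state function. The names and thresholds simply restate the `max-age=60, stale-while-revalidate=300` example:

```typescript
// Freshness timeline for "max-age=60, stale-while-revalidate=300".
// ageSeconds is the time since the response was cached.
type CacheState = "fresh" | "stale-but-usable" | "must-revalidate";

function cacheState(ageSeconds: number, maxAge = 60, swr = 300): CacheState {
  if (ageSeconds < maxAge) return "fresh";                  // serve from cache
  if (ageSeconds < maxAge + swr) return "stale-but-usable"; // serve stale, refresh in background
  return "must-revalidate";                                 // wait for fresh data
}

cacheState(30);  // "fresh"
cacheState(120); // "stale-but-usable"
cacheState(400); // "must-revalidate"
```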
Monitoring Cache Performance
Setting up caching is only half the battle. You need to monitor whether it's actually working.
Cloudflare Analytics
In your Cloudflare dashboard, check:
- Cache Analytics: Shows hit rate, bandwidth saved, and requests by cache status
- Traffic Analytics: Reveals geographic distribution and peak usage times
A healthy CDN setup should show 70-90% cache hit rates for static assets. Lower rates indicate misconfigured cache rules or highly dynamic content.
Response Header Inspection
Check cache status directly in browser DevTools. Look for these headers:
cf-cache-status: HIT      # Served from edge cache
cf-cache-status: MISS     # Fetched from origin
cf-cache-status: EXPIRED  # Cache entry expired, refetched
cf-cache-status: BYPASS   # Not cacheable (dynamic content)
If you're seeing lots of BYPASS responses for content that should be cached, review your cache rules and origin headers.
Origin Server Monitoring
With effective caching, your origin server should handle significantly fewer requests. Monitor your Supabase observability stack for:
- Requests per second to storage endpoints
- Database queries for public data
- Bandwidth egress from your VPS
A successful CDN deployment might show 80% reduction in origin requests for cached endpoints.
Cache Invalidation Strategies
The hardest problem in caching is knowing when to invalidate. Here's how to handle it for self-hosted Supabase:
Manual Purging
Cloudflare provides purge APIs you can call when content changes:
curl -X POST "https://api.cloudflare.com/client/v4/zones/{zone_id}/purge_cache" \
-H "Authorization: Bearer {api_token}" \
-H "Content-Type: application/json" \
--data '{"files":["https://storage.example.com/storage/v1/object/public/images/logo.png"]}'
Integrate this into your application when files are updated:
// After updating a file in Supabase Storage
await supabase.storage.from('images').upload('logo.png', file, { upsert: true });

// Purge the CDN cache for the updated object
const purge = await fetch('https://api.cloudflare.com/client/v4/zones/xxx/purge_cache', {
  method: 'POST',
  headers: {
    'Authorization': `Bearer ${process.env.CLOUDFLARE_TOKEN}`,
    'Content-Type': 'application/json'
  },
  body: JSON.stringify({
    files: ['https://storage.example.com/storage/v1/object/public/images/logo.png']
  })
});
if (!purge.ok) console.error('Cache purge failed:', await purge.text());
Cache Tags (Enterprise)
Cloudflare Enterprise offers cache tags—a way to purge groups of related content. Tag all product images with product-images, then purge the tag when any product updates. This feature isn't available on free or pro plans.
Content-Addressed URLs
The simplest invalidation strategy is avoiding it entirely. Use content-addressed URLs that include a hash or version:
/storage/v1/object/public/images/logo-v2.png
/storage/v1/object/public/images/logo-abc123.png
When content changes, the URL changes. Old URLs remain cached (harmless), and new URLs cache fresh content automatically. This is why Supabase Cloud generates unique URLs for transformed images.
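Generating such a name at upload time can be as simple as hashing the file's bytes. A minimal sketch; the `images/` prefix and the 12-character digest truncation are illustrative choices, not anything Supabase requires:

```typescript
import { createHash } from "node:crypto";

// Derive an object path from the file's contents: any change to the bytes
// yields a new filename, so stale CDN entries are simply never requested again.
function contentAddressedPath(filename: string, contents: Buffer): string {
  const digest = createHash("sha256").update(contents).digest("hex").slice(0, 12);
  const dot = filename.lastIndexOf(".");
  const base = dot === -1 ? filename : filename.slice(0, dot);
  const ext = dot === -1 ? "" : filename.slice(dot);
  return `images/${base}-${digest}${ext}`;
}
```

Upload to the returned path, store that path in your database, and the CDN handles the rest: old URLs age out naturally while new content caches under a fresh URL.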
Performance Gains and Cost Savings
After implementing CDN caching for self-hosted Supabase, expect these improvements:
Latency reduction: Global users experience 50-200ms faster load times. Assets load from nearby edge nodes instead of your origin server.
Bandwidth savings: With 80% cache hit rates, your VPS egress drops proportionally. If you were serving 100GB monthly, that drops to 20GB from origin—significant savings on metered VPS plans.
Origin server headroom: Your VPS handles fewer requests, leaving more CPU and memory for database operations and API processing. This can delay or eliminate the need for scaling to larger instances.
Resilience: CDN edge nodes can serve cached content even if your origin experiences brief outages. Users might see slightly stale data, but the application remains functional.
Common Pitfalls to Avoid
Over-caching authenticated content. Never cache responses that contain user-specific data at the edge. Check your RLS policies—just because a row is returned doesn't mean it should be cached publicly.
Forgetting Vary headers. If your API returns different content based on Accept-Encoding or Authorization headers, include Vary headers so the CDN maintains separate cache entries.
Ignoring cache warming. After a cache purge or CDN configuration change, the first request to each asset is a cache miss. For critical content, implement cache warming by proactively requesting assets after deployment.
WebSocket interference. Supabase Realtime uses WebSockets. Some CDN configurations break long-lived connections. Test real-time features thoroughly after enabling CDN proxying.
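The cache-warming point above can be sketched as a small post-deploy script. The URL list, concurrency limit, and use of HEAD requests are all illustrative choices; the request function is injectable so the sketch stays testable:

```typescript
// Request each critical asset once so the first real user gets a cache HIT.
// `request` defaults to a HEAD fetch against the live CDN.
async function warmCache(
  urls: string[],
  request: (url: string) => Promise<unknown> = (u) => fetch(u, { method: "HEAD" }),
  concurrency = 5,
): Promise<number> {
  let warmed = 0;
  for (let i = 0; i < urls.length; i += concurrency) {
    const batch = urls.slice(i, i + concurrency);
    const results = await Promise.allSettled(batch.map((u) => request(u)));
    warmed += results.filter((r) => r.status === "fulfilled").length;
  }
  return warmed; // number of URLs successfully requested
}
```

Run it against your most-requested public storage URLs after each purge or configuration change.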
Conclusion
CDN caching transforms self-hosted Supabase from a single-server deployment into a globally distributed platform. The configuration takes an afternoon, but the performance and cost benefits compound over the lifetime of your application.
Start with Cloudflare's free tier—it handles most use cases without cost. Configure aggressive caching for public storage, conservative caching for APIs, and no caching for authenticated endpoints. Monitor your cache hit rates and adjust TTLs based on your content update frequency.
For production deployments, Supascale simplifies self-hosted Supabase management, letting you focus on CDN optimization rather than infrastructure maintenance. Combined with proper caching, you get managed-service performance at self-hosted prices.
