Chapter 10: Cost Analysis and Optimization

Understanding and optimizing costs is crucial for sustainable self-hosting. This chapter provides detailed cost breakdowns and optimization strategies.

Cost Components

Infrastructure Costs

Component         Purpose           Typical Range
---------         -------           -------------
Compute           API servers       $0-50/month
Database          Data storage      $0-25/month
Cache             Redis/memory      $0-10/month
CDN               Asset delivery    $0-5/month
DNS               Domain routing    $0-5/month
SSL certificates  HTTPS             Free (Let's Encrypt)

Third-Party Services

Service         Purpose/Provider    Typical Range
-------         ----------------    -------------
Email sending   Notifications       $0-20/month
Spam filtering  Akismet/CleanTalk   $0-10/month
Error tracking  Sentry              $0-26/month
Monitoring      Uptime checks       $0-10/month

Cost by Traffic Level

Low Traffic (< 1K comments/month)

Typical: Personal blog, small portfolio

Free Stack:

Monthly Cost: $0

┌────────────────────────────────────────────────────────────┐
│           LOW TRAFFIC COST BREAKDOWN ($0/month)            │
├────────────────────────────────────────────────────────────┤
│                                                             │
│  Cloudflare Workers              100K req/day free   $0.00 │
│  Turso (SQLite)                  8GB storage free    $0.00 │
│  Cloudflare DNS                  Always free         $0.00 │
│                                                             │
│  Estimated usage:                                           │
│  - 5K API calls/month           (within free tier)         │
│  - 50MB database                (within free tier)         │
│                                                             │
│  ────────────────────────────────────────────────────────  │
│  TOTAL:                                              $0.00 │
│                                                             │
└────────────────────────────────────────────────────────────┘
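The free-tier math above is easy to sanity-check in a few lines. A sketch, using the limits quoted in the breakdown (100K Workers requests/day, 8GB Turso storage) and the same usage estimates; adjust the numbers for your own stack:

```python
# Free-tier limits as quoted in the breakdown box above
FREE_TIER = {
    "workers_requests_per_day": 100_000,
    "turso_storage_gb": 8.0,
}

def fits_free_tier(api_calls_per_month: int, db_size_gb: float) -> bool:
    """Return True if estimated usage stays inside the free tier."""
    requests_per_day = api_calls_per_month / 30
    return (requests_per_day <= FREE_TIER["workers_requests_per_day"]
            and db_size_gb <= FREE_TIER["turso_storage_gb"])

# Low-traffic estimate: 5K API calls/month, 50MB database
print(fits_free_tier(5_000, 0.05))  # True -> $0/month
```

At 5K calls/month you are using well under 1% of the Workers free tier, which is why the low-traffic stack stays at $0 with enormous headroom.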

Medium Traffic (1K-10K comments/month)

Typical: Active blog, small community

Recommended Stack:

Monthly Cost: $0-25

┌────────────────────────────────────────────────────────────┐
│         MEDIUM TRAFFIC COST BREAKDOWN ($0-25/month)        │
├────────────────────────────────────────────────────────────┤
│                                                             │
│  Vercel (Hobby)                  100K func/month     $0.00 │
│  Supabase (Free)                 500MB database      $0.00 │
│  Upstash Redis (Free)            10K commands/day    $0.00 │
│                                                             │
│  Optional upgrades:                                         │
│  - Vercel Pro (if needed)                           $20.00 │
│  - Akismet (anti-spam)                               $5.00 │
│                                                             │
│  ────────────────────────────────────────────────────────  │
│  TOTAL:                                          $0-25/mo │
│                                                             │
└────────────────────────────────────────────────────────────┘

High Traffic (10K-100K comments/month)

Typical: Popular blog, active forum

Recommended Stack:

Monthly Cost: $25-100

┌────────────────────────────────────────────────────────────┐
│         HIGH TRAFFIC COST BREAKDOWN ($25-100/month)        │
├────────────────────────────────────────────────────────────┤
│                                                             │
│  Fly.io (2 VMs, 256MB)           ~$5 × 2            $10.00 │
│  Supabase Pro                    8GB database       $25.00 │
│  Upstash Redis                   Pay-as-you-go       $5.00 │
│  Akismet                         Plus plan          $10.00 │
│                                                             │
│  Bandwidth (estimated):                                     │
│  - 100GB egress                  (usually included)  $0.00 │
│                                                             │
│  ────────────────────────────────────────────────────────  │
│  TOTAL:                                             $50/mo │
│                                                             │
└────────────────────────────────────────────────────────────┘

Very High Traffic (100K+ comments/month)

Typical: News site, large community

Recommended Stack:

Monthly Cost: $100-500

Comparison with Commercial Solutions

Disqus

Plan   Price    Comments   Ads
----   -----    --------   ---
Basic  Free     Unlimited  Yes
Plus   $12/mo   Unlimited  No
Pro    $115/mo  Unlimited  No, plus analytics

Commento

Plan         Price    Page Views
----         -----    ----------
Cloud        $10/mo   50K
Cloud        $20/mo   500K
Self-hosted  Free     Unlimited

Your Self-Hosted Solution

┌────────────────────────────────────────────────────────────┐
│              COST COMPARISON (10K comments/month)           │
├────────────────────────────────────────────────────────────┤
│                                                             │
│  Disqus Plus                                        $12/mo │
│  Commento Cloud                                     $10/mo │
│  Your Solution (optimized)                           $0/mo │
│  Your Solution (with extras)                        $25/mo │
│                                                             │
│  Annual savings vs Disqus Plus:             up to $144/yr  │
│                                                             │
└────────────────────────────────────────────────────────────┘
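The arithmetic behind that comparison is worth making explicit, because the sign can flip: with $25/mo of extras, self-hosting actually costs more than Disqus Plus. A quick sketch:

```python
def annual_savings_vs(commercial_monthly: float, self_hosted_monthly: float) -> float:
    """Annual savings from self-hosting vs a commercial plan (negative = costs more)."""
    return (commercial_monthly - self_hosted_monthly) * 12

# Optimized free stack vs Disqus Plus ($12/mo)
print(annual_savings_vs(12, 0))   # 144.0
# With $25/mo of optional extras you pay more than Disqus Plus
print(annual_savings_vs(12, 25))  # -156.0
```

The real savings case at this traffic level is the optimized $0 stack; the "with extras" configuration is a convenience trade, not a cost win.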

Cost Optimization Strategies

Database Optimization

1. Use connection pooling:

# Supabase connection pooler (port 6543 routes through pgBouncer)
DATABASE_URL="postgresql://postgres:[password]@db.[project].supabase.co:6543/postgres?pgbouncer=true"

2. Implement query caching:

import json

# `redis` is assumed to be an async client instance (e.g. redis.asyncio.Redis)

async def get_comments_cached(page_id: str) -> list:
    cache_key = f"comments:{page_id}"

    # Check the cache first
    cached = await redis.get(cache_key)
    if cached:
        return json.loads(cached)

    # Cache miss: query the database
    comments = await get_comments_from_db(page_id)

    # Cache the serialized result for 5 minutes
    await redis.setex(cache_key, 300, json.dumps(comments))

    return comments
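A 5-minute TTL means readers can see a stale list for up to 5 minutes after someone posts. If that matters, delete the key on every write so the next read rebuilds it. A minimal sketch, assuming the same key scheme as above and taking the async Redis client as a parameter:

```python
async def invalidate_comments_cache(redis, page_id: str) -> None:
    """Drop the cached comment list so the next read rebuilds it from the database."""
    # Same "comments:{page_id}" key scheme as get_comments_cached (assumption)
    await redis.delete(f"comments:{page_id}")
```

Call this from the comment-creation handler right after the INSERT succeeds.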

3. Archive old comments:

from datetime import datetime, timedelta

async def archive_old_comments():
    """Move comments older than 1 year to an archive table"""
    cutoff = datetime.utcnow() - timedelta(days=365)

    # Run both statements in one transaction so a failure between
    # the INSERT and the DELETE cannot lose or duplicate rows
    async with db.transaction():
        await db.execute("""
            INSERT INTO comments_archive
            SELECT * FROM comments WHERE created_at < :cutoff
        """, {"cutoff": cutoff})

        await db.execute("""
            DELETE FROM comments WHERE created_at < :cutoff
        """, {"cutoff": cutoff})

Compute Optimization

1. Use edge caching:

# Cache headers for GET requests so the CDN can serve repeat reads
import json

from fastapi import Response

@app.get("/api/comments")
async def get_comments(page_id: str):
    comments = await get_comments_cached(page_id)

    return Response(
        content=json.dumps(comments),
        media_type="application/json",
        headers={
            "Cache-Control": "public, max-age=60, stale-while-revalidate=300",
            "CDN-Cache-Control": "max-age=300"
        }
    )

2. Auto-scaling configuration:

# fly.toml - Scale to zero when idle
[http_service]
  auto_stop_machines = true
  auto_start_machines = true
  min_machines_running = 0

3. Regional deployment:

Deploy only where your users are:

# Single region for local audience
fly scale count 1 --region cdg

# Multiple regions for global audience
fly scale count 2 --region cdg,iad

Bandwidth Optimization

1. Compress responses:

from fastapi.middleware.gzip import GZipMiddleware

app.add_middleware(GZipMiddleware, minimum_size=500)

2. Use pagination:

from sqlalchemy import select

@app.get("/api/comments")
async def get_comments(
    page_id: str,
    limit: int = 20,
    cursor: str | None = None
):
    # Cursor-based pagination keeps responses small and avoids
    # the deep-OFFSET cost of page-number pagination
    query = select(Comment).where(Comment.page_id == page_id)

    if cursor:
        query = query.where(Comment.id > cursor)

    query = query.limit(limit + 1)  # Fetch one extra row to detect more pages

    results = await db.execute(query)
    comments = results.scalars().all()

    has_more = len(comments) > limit
    if has_more:
        comments = comments[:-1]  # Drop the sentinel row

    return {
        "data": comments,
        "next_cursor": comments[-1].id if has_more else None
    }
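The limit-plus-one trick is easy to get wrong by one, so it helps to exercise the cursor logic in isolation against a plain sorted list. A sketch, independent of the database layer, over hypothetical `(id, text)` rows:

```python
def paginate(items: list, limit: int, cursor=None) -> dict:
    """Cursor pagination over a sorted list of (id, text) rows."""
    if cursor is not None:
        items = [row for row in items if row[0] > cursor]
    page = items[:limit + 1]       # fetch one extra to detect more pages
    has_more = len(page) > limit
    if has_more:
        page = page[:-1]           # drop the sentinel row
    return {"data": page, "next_cursor": page[-1][0] if has_more else None}

rows = [(i, f"comment {i}") for i in range(1, 6)]  # ids 1..5
first = paginate(rows, limit=2)                    # ids 1, 2; cursor 2
second = paginate(rows, limit=2, cursor=first["next_cursor"])  # ids 3, 4
```

The final page returns `next_cursor = None`, which is the client's signal to stop fetching.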

3. Lazy load images:

<!-- loading="lazy" defers the request until the image nears the
     viewport, so no data-src/JavaScript loader is required -->
<img
    src="${avatarUrl}"
    alt=""
    loading="lazy"
    class="lazy-avatar"
>

Storage Optimization

1. Compress content:

import zlib

def compress_content(content: str) -> bytes:
    return zlib.compress(content.encode(), level=6)

def decompress_content(data: bytes) -> str:
    return zlib.decompress(data).decode()
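The helpers above are lossless, and repetitive text (prose, HTML markup) compresses well. A quick self-check of the round trip, using a hypothetical sample comment:

```python
import zlib

def compress_content(content: str) -> bytes:
    return zlib.compress(content.encode(), level=6)

def decompress_content(data: bytes) -> str:
    return zlib.decompress(data).decode()

# Repetitive HTML-ish text, as a stand-in for stored comment bodies
sample = ("<p>Great post! I especially liked the section on caching. "
          "Caching really is the key to keeping costs down.</p>") * 20

compressed = compress_content(sample)
assert decompress_content(compressed) == sample  # lossless round trip
print(f"{len(sample.encode())} -> {len(compressed)} bytes")
```

Measure the ratio on your own data before committing to compression; tiny comments can even grow slightly due to the zlib header.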

2. Deduplicate avatars:

def get_avatar_url(email: str, size: int = 80) -> str:
    # Use Gravatar - no storage needed
    hash = hashlib.md5(email.lower().encode()).hexdigest()
    return f"https://www.gravatar.com/avatar/{hash}?s={size}&d=mp"

3. Clean up spam regularly:

from datetime import datetime, timedelta

async def cleanup_spam():
    """Delete spam older than 7 days"""
    cutoff = datetime.utcnow() - timedelta(days=7)

    await db.execute("""
        DELETE FROM comments
        WHERE status = 'spam' AND created_at < :cutoff
    """, {"cutoff": cutoff})

Cost Monitoring

Track Usage Metrics

from fastapi import Request
from prometheus_client import Counter, Histogram

# Metrics
api_requests = Counter('api_requests_total', 'Total API requests', ['endpoint', 'method'])
db_queries = Counter('db_queries_total', 'Total database queries')
cache_hits = Counter('cache_hits_total', 'Cache hit count')
cache_misses = Counter('cache_misses_total', 'Cache miss count')

# Track in code
@app.middleware("http")
async def metrics_middleware(request: Request, call_next):
    api_requests.labels(
        endpoint=request.url.path,
        method=request.method
    ).inc()
    return await call_next(request)
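The cache_hits and cache_misses counters are declared above but never incremented; the increment belongs inside the cached getter. A self-contained sketch, with a plain dict standing in for the Prometheus counters and the Redis/database dependencies passed as parameters:

```python
import json

metrics = {"cache_hits": 0, "cache_misses": 0}  # stand-in for the Counter objects

async def get_comments_cached(redis, db_lookup, page_id: str) -> list:
    cached = await redis.get(f"comments:{page_id}")
    if cached:
        metrics["cache_hits"] += 1       # cache_hits.inc() with prometheus_client
        return json.loads(cached)
    metrics["cache_misses"] += 1         # cache_misses.inc()
    comments = await db_lookup(page_id)
    await redis.setex(f"comments:{page_id}", 300, json.dumps(comments))
    return comments

def cache_hit_rate() -> float:
    total = metrics["cache_hits"] + metrics["cache_misses"]
    return metrics["cache_hits"] / total if total else 0.0
```

The hit rate is the number to watch for cost purposes: every hit is a database query (and its billing) avoided.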

Set Up Alerts

# Example: alert when usage approaches a paid tier
alerts:
  - name: high_database_usage
    condition: database_storage_bytes > 400000000  # 400MB, 80% of the 500MB free tier
    action: send_email

  - name: high_api_usage
    condition: api_requests_daily > 80000  # 80% of the 100K/day free tier
    action: send_slack
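The config above is pseudo-configuration; evaluating it is a few lines of Python. A sketch with hypothetical rule and metric names mirroring the example:

```python
def check_alerts(usage: dict, rules: list) -> list:
    """Return the names of alert rules whose thresholds are exceeded."""
    return [rule["name"] for rule in rules
            if usage.get(rule["metric"], 0) > rule["threshold"]]

RULES = [
    {"name": "high_database_usage", "metric": "database_storage_bytes",
     "threshold": 400_000_000},                 # 80% of the 500MB free tier
    {"name": "high_api_usage", "metric": "api_requests_daily",
     "threshold": 80_000},                      # 80% of the 100K/day free tier
]

print(check_alerts({"database_storage_bytes": 450_000_000,
                    "api_requests_daily": 1_000}, RULES))
# -> ['high_database_usage']
```

Run this daily against your provider's usage API and route the result to email or Slack; alerting at 80% leaves time to optimize before an upgrade is forced.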

Free Tier Maximization

Cloudflare Workers

  - 100K requests/day on the free plan (the API layer in the low-traffic stack)
  - Pair with Turso or Workers KV for storage to stay at $0

Vercel

  - 100K serverless function invocations/month on the Hobby plan
  - 100GB bandwidth/month included

Supabase

  - 500MB database plus auth on the free plan
  - Free projects pause after about a week of inactivity; a scheduled health check keeps them awake

Combined Free Stack

Cloudflare (DNS + CDN) → Vercel (API) → Supabase (DB) → Upstash (Redis)
        Free                Free           Free           Free

Chapter Summary

Traffic Level   Monthly Cost   Stack
-------------   ------------   -----
< 1K/mo         $0             CF Workers + Turso
1K-10K/mo       $0-25          Vercel + Supabase
10K-100K/mo     $25-100        Fly.io + Supabase Pro
100K+/mo        $100-500       Dedicated infra
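The summary table reduces to a simple lookup, handy when projecting costs as a site grows. A sketch using the table's own tiers:

```python
def estimate_tier(comments_per_month: int) -> tuple:
    """Map monthly comment volume to the stack and cost tiers above."""
    if comments_per_month < 1_000:
        return ("CF Workers + Turso", "$0")
    if comments_per_month < 10_000:
        return ("Vercel + Supabase", "$0-25")
    if comments_per_month < 100_000:
        return ("Fly.io + Supabase Pro", "$25-100")
    return ("Dedicated infra", "$100-500")

print(estimate_tier(5_000))  # ('Vercel + Supabase', '$0-25')
```

These boundaries are rough; re-check actual usage against provider limits before migrating between stacks.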

Key optimizations:

  1. Use caching aggressively
  2. Implement pagination
  3. Archive old data
  4. Monitor usage closely
