Rate Limiting

In-memory rate limiting for RPC and auth endpoints.

Raypx includes an in-memory rate limiter to protect API endpoints from abuse. It uses a fixed-window approach with automatic cleanup of stale entries.

Implementation

The rate limiter is defined in src/lib/rate-limit.ts:

src/lib/rate-limit.ts
interface RateLimitEntry {
  count: number
  resetAt: number
}

const store = new Map<string, RateLimitEntry>()

// Cleanup stale entries every 60 seconds
setInterval(() => {
  const now = Date.now()
  for (const [key, entry] of store) {
    if (entry.resetAt <= now) store.delete(key)
  }
}, 60_000)

export interface RateLimitOptions {
  /** Maximum number of requests in the window */
  limit: number
  /** Window duration in seconds */
  window: number
}

export interface RateLimitResult {
  /** Whether the request is within the limit */
  success: boolean
  /** Requests remaining in the current window */
  remaining: number
  /** Timestamp (ms since epoch) when the window resets */
  resetAt: number
}

export function rateLimit(
  key: string,
  options: RateLimitOptions = { limit: 100, window: 60 },
): RateLimitResult {
  const now = Date.now()
  const windowMs = options.window * 1000
  const entry = store.get(key)

  if (!entry || entry.resetAt <= now) {
    const resetAt = now + windowMs
    store.set(key, { count: 1, resetAt })
    return { success: true, remaining: options.limit - 1, resetAt }
  }

  if (entry.count >= options.limit) {
    return { success: false, remaining: 0, resetAt: entry.resetAt }
  }

  entry.count++
  return { success: true, remaining: options.limit - entry.count, resetAt: entry.resetAt }
}
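As a usage sketch, the limiter can be exercised directly. The snippet below inlines a minimal copy of the fixed-window logic above so it runs standalone; the key and limit values are illustrative:

```typescript
interface RateLimitResult {
  success: boolean
  remaining: number
  resetAt: number
}

const store = new Map<string, { count: number; resetAt: number }>()

function rateLimit(
  key: string,
  options = { limit: 100, window: 60 },
): RateLimitResult {
  const now = Date.now()
  const entry = store.get(key)
  if (!entry || entry.resetAt <= now) {
    // First request in a fresh window: start counting
    const resetAt = now + options.window * 1000
    store.set(key, { count: 1, resetAt })
    return { success: true, remaining: options.limit - 1, resetAt }
  }
  if (entry.count >= options.limit) {
    // Window still open and budget exhausted: reject
    return { success: false, remaining: 0, resetAt: entry.resetAt }
  }
  entry.count++
  return { success: true, remaining: options.limit - entry.count, resetAt: entry.resetAt }
}

// Three requests against a limit of 2: the third is rejected
const a = rateLimit("10.0.0.1:/api/rpc", { limit: 2, window: 60 })
const b = rateLimit("10.0.0.1:/api/rpc", { limit: 2, window: 60 })
const c = rateLimit("10.0.0.1:/api/rpc", { limit: 2, window: 60 })
console.log(a.success, a.remaining) // true 1
console.log(b.success, b.remaining) // true 0
console.log(c.success, c.remaining) // false 0
```

Note that the limiter never sleeps or throws; callers decide how to respond when `success` is false.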

Middleware Helper

The createRateLimitMiddleware function creates a request-level middleware that only rate-limits mutating methods:

src/lib/rate-limit.ts
export function createRateLimitMiddleware(options: RateLimitOptions) {
  return (request: Request): RateLimitResult | null => {
    // Only rate limit mutating methods
    if (request.method === "GET" || request.method === "HEAD") return null

    const ip = request.headers.get("x-forwarded-for")?.split(",")[0]?.trim() ?? "unknown"
    const key = `${ip}:${new URL(request.url).pathname}`
    return rateLimit(key, options)
  }
}
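To illustrate how the key is derived, the sketch below applies the same extraction to a locally constructed Request (standard `Request` and `URL` globals; the header and URL values are made up for the example):

```typescript
// Sketch: derive the rate-limit key the same way the middleware does
function keyFor(request: Request): string {
  const ip = request.headers.get("x-forwarded-for")?.split(",")[0]?.trim() ?? "unknown"
  return `${ip}:${new URL(request.url).pathname}`
}

// X-Forwarded-For may carry a chain of proxies; only the first entry is used
const req = new Request("https://example.com/api/rpc", {
  method: "POST",
  headers: { "x-forwarded-for": "192.168.1.1, 10.0.0.2" },
})
console.log(keyFor(req)) // "192.168.1.1:/api/rpc"

// Without the header, the key falls back to "unknown"
const anon = new Request("https://example.com/api/rpc", { method: "POST" })
console.log(keyFor(anon)) // "unknown:/api/rpc"
```

Because the pathname is part of the key, each endpoint gets its own budget per IP rather than sharing one global counter.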

Key behaviors:

  • GET and HEAD requests are not rate-limited -- Read operations pass through freely.
  • Rate limit key format -- {IP}:{pathname}, for example 192.168.1.1:/api/rpc.
  • IP extraction -- Uses the first entry in the X-Forwarded-For header (set by your reverse proxy).
  • Fallback -- If no IP is found, the key uses "unknown".

Current Configuration

Rate limiting is applied to two endpoint groups:

RPC Endpoints (/api/rpc)

src/routes/api/rpc/$.ts
const checkRateLimit = createRateLimitMiddleware({ limit: 100, window: 60 })
  • Limit: 100 requests per 60-second window per IP.
  • Applies to: POST, PUT, PATCH, DELETE requests only.

Auth Endpoints (/api/auth)

src/routes/api/auth/$.ts
const checkRateLimit = createRateLimitMiddleware({ limit: 30, window: 60 })
  • Limit: 30 requests per 60-second window per IP.
  • Applies to: All methods (both GET and POST).

The auth endpoint rate-limits both GET and POST because both handlers call checkRateLimit without filtering. This is intentional -- sign-in attempts (both email and OAuth) should be rate-limited.

Response on Limit Exceeded

When a request exceeds the rate limit, a 429 Too Many Requests response is returned:

if (result && !result.success) {
  return new Response("Too Many Requests", {
    status: 429,
    headers: {
      "Retry-After": String(Math.ceil((result.resetAt - Date.now()) / 1000)),
      "X-RateLimit-Remaining": "0",
    },
  })
}

Headers included:

  • Retry-After -- Seconds until the rate limit window resets.
  • X-RateLimit-Remaining -- Remaining requests in the current window (0 when limited).
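
A client can use these headers to back off before retrying. A sketch, with the 429 response constructed locally to mirror what the server sends:

```typescript
// A 429 response built locally, carrying the same headers the server returns
const limited = new Response("Too Many Requests", {
  status: 429,
  headers: { "Retry-After": "42", "X-RateLimit-Remaining": "0" },
})

let retryAfterSec = 0
if (limited.status === 429) {
  // Seconds the client should wait before retrying (default to 1 if absent)
  retryAfterSec = Number(limited.headers.get("Retry-After") ?? "1")
}
console.log(retryAfterSec) // 42
```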

Auto Cleanup

Stale entries are automatically cleaned up every 60 seconds:

setInterval(() => {
  const now = Date.now()
  for (const [key, entry] of store) {
    if (entry.resetAt <= now) store.delete(key)
  }
}, 60_000)

This prevents memory leaks from accumulating expired rate limit entries.
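
The sweep itself is straightforward; the sketch below runs one pass of the same logic over a toy store (entry names and timestamps are illustrative):

```typescript
interface Entry { count: number; resetAt: number }
const entries = new Map<string, Entry>()

const now = Date.now()
entries.set("stale", { count: 5, resetAt: now - 1_000 })  // window already expired
entries.set("fresh", { count: 2, resetAt: now + 60_000 }) // still active

// One cleanup pass: the same loop the interval runs every 60 seconds
for (const [key, entry] of entries) {
  if (entry.resetAt <= now) entries.delete(key)
}

console.log([...entries.keys()]) // ["fresh"]
```

Deleting while iterating is safe here: Map iteration tolerates removal of the entry currently being visited.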

Adjusting Limits

To change the rate limits, modify the options passed to createRateLimitMiddleware:

// Stricter: 50 requests per minute
const checkRateLimit = createRateLimitMiddleware({ limit: 50, window: 60 })

// More permissive: 200 requests per 2 minutes
const checkRateLimit = createRateLimitMiddleware({ limit: 200, window: 120 })

The rate limiter uses an in-memory Map, which means it is per-process. In a multi-instance deployment, each instance maintains its own rate limit store. For distributed rate limiting, consider migrating to Redis (see the project roadmap).
