Test caching large files from R2 using `body.tee()` instead of `response.clone()`.

`response.clone()` buffers the entire body into memory, and Workers have a 128 MB memory limit, so for large files (like videos) `clone()` fails with a memory-exceeded error. `body.tee()` instead creates two independent streams from one body without buffering it in full:
```ts
const [stream1, stream2] = response.body.tee();
// stream1 → cache.put()
// stream2 → return to client
```
Caveat: `tee()` buffers internally, so if one branch is consumed faster than the other, the slower branch's unread chunks queue up in memory.
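A minimal, runnable sketch of the `tee()` behavior described above, using only standard web streams (no Workers APIs), so it also runs in Node 18+ or a browser:

```typescript
// Demonstrates that tee() yields two independent branches of one stream:
// each branch receives every chunk, with no up-front buffering of the body.
async function readAll(stream: ReadableStream<string>): Promise<string> {
  const reader = stream.getReader();
  let out = "";
  for (;;) {
    const { done, value } = await reader.read();
    if (done) return out;
    out += value;
  }
}

function makeSource(chunks: string[]): ReadableStream<string> {
  return new ReadableStream<string>({
    start(controller) {
      for (const chunk of chunks) controller.enqueue(chunk);
      controller.close();
    },
  });
}

(async () => {
  const [branchA, branchB] = makeSource(["large ", "file ", "body"]).tee();
  // Both branches can be consumed independently and see the same data.
  const [a, b] = await Promise.all([readAll(branchA), readAll(branchB)]);
  console.log(a === b, a); // prints: true large file body
})();
```

Note that `readAll` defeats the point for genuinely large bodies (it buffers into a string); it is only here to show both branches observe identical data.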
| Method | Description | Expected |
|---|---|---|
| `tee` | Original: tee + waitUntil | ? Testing |
| `tee-await` | tee + TransformStream + waitUntil(Promise.all) | ? Testing |
| `tee-fixed` | tee + FixedLengthStream wrapper | ? Testing |
| `pipe-through` | tee + TransformStream pipes | ? Testing |
| `manual-pump` | Manual read/write to both destinations | ? Testing |
| `no-wait` | tee without waitUntil (fire & forget) | ? Testing |
| `readable-from` | Buffer chunks + ReadableStream.from() | ✗ Memory |
| `cache-after` | Buffer entire file, then cache | ✗ Memory |
| `clone()` | Baseline: response.clone() | ✗ Memory |
| `stream` | No caching, direct R2 stream | ✓ Works |
Test sequence for each method:

```sh
# Clear any cached copy
curl 'https://cache-edge.erfianugrah.com/r2-tee/clear?key=erfi-135kg.mp4'
# Fetch via the method under test (populates the cache)
curl 'https://cache-edge.erfianugrah.com/r2-tee/METHOD?key=erfi-135kg.mp4' -o /dev/null
# Verify whether the file was cached
curl 'https://cache-edge.erfianugrah.com/r2-tee/check?key=erfi-135kg.mp4'
```

Replace `METHOD` with one of: `tee`, `tee-await`, `tee-fixed`, `pipe-through`, `manual-pump`, `no-wait`.
| Header | Meaning |
|---|---|
| `X-Cache-Status` | `HIT` = served from Cache API, `MISS` = fresh from R2 |
| `X-Cache-Method` | Which method was used (`tee` or `clone`) |
| `X-File-Size-MB` | File size in MB |
| `CF-Cache-Status` | `HIT` when served from a Cache API `match()` |
```ts
// From caching.mdx - handling large responses
async function fetchLargeWithCache(
  originUrl: string,
  ctx: ExecutionContext,
): Promise<Response> {
  const cache = caches.default;
  const cacheKey = new Request(originUrl, { method: "GET" });

  const cached = await cache.match(cacheKey);
  if (cached) return cached;

  const originResp = await fetch(originUrl);
  if (!originResp.ok || !originResp.body) return originResp;

  // tee() creates two streams from one - avoids buffering the entire body
  const [stream1, stream2] = originResp.body.tee();

  const headers = new Headers(originResp.headers);
  headers.delete("Set-Cookie");
  if (!headers.has("Cache-Control")) {
    headers.set("Cache-Control", "public, max-age=3600");
  }

  const responseToCache = new Response(stream1, {
    status: originResp.status,
    headers,
  });
  ctx.waitUntil(cache.put(cacheKey, responseToCache));

  return new Response(stream2, { status: originResp.status, headers });
}
```
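The `manual-pump` row in the table avoids `tee()` (and its backpressure caveat) by reading the source once and writing each chunk to both destinations in lockstep. A minimal sketch with standard web streams; `pumpToBoth` is an illustrative name, not part of the Workers API:

```typescript
// Hypothetical helper for the "manual-pump" approach: one reader feeds two
// writers, so the slower destination throttles the read and neither branch
// can run ahead and accumulate chunks in memory.
async function pumpToBoth(
  source: ReadableStream<Uint8Array>,
  destA: WritableStream<Uint8Array>,
  destB: WritableStream<Uint8Array>,
): Promise<void> {
  const reader = source.getReader();
  const a = destA.getWriter();
  const b = destB.getWriter();
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    // Awaiting both writes applies backpressure from the slower consumer.
    await Promise.all([a.write(value), b.write(value)]);
  }
  await Promise.all([a.close(), b.close()]);
}
```

In a Worker, the two destinations could be identity `TransformStream`s: one `readable` side handed to `cache.put()`, the other returned to the client in a `Response`.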