Is this page static or dynamic with cache components?


With cache components enabled, would this page be static or dynamic?

// src/app/page.tsx

import {defineQuery} from 'next-sanity'
import {getDynamicFetchOptions, sanityFetch, type DynamicFetchOptions} from '@/sanity/lib/live'
import {Suspense} from 'react'

const PRODUCTS_QUERY = defineQuery(
  `*[_type == "product" && defined(slug.current)][0...$limit]{_id,slug,title}`,
)

export default function Page() {
  return (
    <Suspense fallback={<p>Loading…</p>}>
      <DynamicProductsList />
    </Suspense>
  )
}

async function DynamicProductsList() {
  const {perspective, stega} = await getDynamicFetchOptions()
  return <CachedProductsList perspective={perspective} stega={stega} />
}

async function CachedProductsList({perspective, stega}: DynamicFetchOptions) {
  'use cache'

  const {data: products} = await sanityFetch({
    query: PRODUCTS_QUERY,
    params: {limit: 10},
    perspective,
    stega,
  })

  return (
    <ul>
      {products.map((product) => (
        <li key={product._id}>
          <a href={`/product/${product.slug}`}>{product.title}</a>
        </li>
      ))}
    </ul>
  )
}

**Claude Opus 4.6** (anthropic/claude-opus-4.6) commented:

Now I have the full picture. Here's the analysis:


## This page would be static (with a dynamic hole) — i.e., a Partial Prerender (PPR)

Here's the reasoning, traced through the Next.js source:

### The Rendering Flow

  1. Page() renders a <Suspense> boundary wrapping <DynamicProductsList />.

  2. DynamicProductsList calls await getDynamicFetchOptions() — this is a user-land function (from next-sanity) that almost certainly reads a dynamic API like draftMode(), cookies(), or headers() to determine the current perspective (e.g., published vs. draft) and stega settings.

  3. During the prerender (build), the work unit store type is 'prerender'. When a dynamic API like cookies() or draftMode() is called in this context, it returns a hanging promise — a promise that never resolves:

    From dynamic-rendering-utils.ts:39:

    // This function constructs a promise that will never resolve. This is primarily
    // useful for cacheComponents where we use promise resolution timing to determine which
    // parts of a render can be included in a prerender.
    export function makeHangingPromise<T>(signal, route, expression): Promise<T> { ... }
    

    For example, cookies() in a 'prerender' store (cookies.ts:76-77):

    case 'prerender':
      return makeHangingCookies(workStore, workUnitStore)
    
  4. Because getDynamicFetchOptions() awaits this hanging promise, DynamicProductsList never completes during prerender. This means React can't resolve the component within a single task.

  5. The prerenderAndAbortInSequentialTasks mechanism (app-render-prerender-utils.ts:12) schedules the render in one setTimeout task, then aborts in the next task. Since DynamicProductsList is still pending (blocked by the hanging promise), the abort callback fires and sets serverIsDynamic = true:

    From app-render.tsx:~5068-5074:

    if (prerenderIsPending) {
      // If prerenderIsPending then we have blocked for longer than a Task
      // and we assume there is something unfinished.
      serverIsDynamic = true
    }
    
  6. But crucially, the <Suspense> boundary around <DynamicProductsList /> catches this. The page shell (including the <Suspense fallback>) does complete within the prerender task, so the static shell is captured. The dynamic hole (the Suspense boundary's children) is marked for runtime rendering.

  7. The CachedProductsList component with "use cache" never even executes during the prerender — it's inside the dynamic subtree that was cut off. At runtime, when the dynamic hole is filled:

    • getDynamicFetchOptions() resolves with actual request data
    • CachedProductsList runs with "use cache", so its result is cached in the cache handler
    • On subsequent requests, the cached result is served without re-executing the Sanity query
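The task-based detection in steps 3–5 can be sketched in miniature. This is a simplified model of the mechanism, not the actual Next.js implementation:

```typescript
// Simplified model of prerenderAndAbortInSequentialTasks: run the render in
// one macrotask; if it has not settled by the next macrotask, treat it as
// blocked on a hanging promise, i.e. dynamic.
function prerenderAndAbort<T>(
  render: () => Promise<T>
): Promise<{ dynamic: boolean; value?: T }> {
  return new Promise((resolve) => {
    setTimeout(() => {
      let pending = true
      render().then((value) => {
        pending = false
        resolve({ dynamic: false, value })
      })
      // Next task: anything still pending did not finish "within a single
      // task" and becomes a dynamic hole.
      setTimeout(() => {
        if (pending) resolve({ dynamic: true })
      }, 0)
    }, 0)
  })
}

// Stand-in for the hanging promise that cookies()/draftMode() return
// during a prerender.
const hanging = new Promise<string>(() => {})

async function demo() {
  const staticCase = await prerenderAndAbort(async () => 'shell html')
  console.log(staticCase.dynamic) // false (resolved within one task)
  const dynamicCase = await prerenderAndAbort(() => hanging)
  console.log(dynamicCase.dynamic) // true (still pending when the abort fires)
}
demo()
```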

### The Result

| Part | Rendering |
| --- | --- |
| Page shell (everything outside Suspense) | Static — prerendered at build time |
| <Suspense fallback> (Loading…) | ✅ Included in static shell |
| <DynamicProductsList /> | Dynamic hole — rendered at request time |
| <CachedProductsList /> | 📦 Cached — rendered once at runtime, then served from cache |

So the page is rendered as a Partial Prerender: a static shell with a dynamic hole inside the <Suspense> boundary. The "use cache" on CachedProductsList ensures that once the dynamic data (perspective, stega) is determined at request time, the Sanity query result is cached and reused for subsequent requests with the same arguments.

### Key Insight

The "use cache" directive on CachedProductsList doesn't make the whole page static — it caches the output of that specific function keyed by its arguments (perspective, stega). The page is still partially dynamic because getDynamicFetchOptions() reads request-specific data (likely draftMode() or cookies()), which produces a hanging promise during prerendering, creating the dynamic boundary.
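That keying behavior can be illustrated with a plain map standing in for the cache handler (a toy model; the real key also includes the build ID and function ID):

```typescript
// Toy model of "use cache": results are memoized per serialized argument
// list, never per request, so cookies can't influence a cached entry.
type FetchOptions = { perspective: string; stega: boolean }

const cacheHandler = new Map<string, string>()
let renders = 0

function cachedProductsList(opts: FetchOptions): string {
  const key = JSON.stringify(['CachedProductsList', opts])
  const hit = cacheHandler.get(key)
  if (hit !== undefined) return hit // same args: served from cache
  renders++
  const result = `products for ${opts.perspective} (stega=${opts.stega})`
  cacheHandler.set(key, result)
  return result
}

cachedProductsList({ perspective: 'published', stega: false })
cachedProductsList({ perspective: 'published', stega: false }) // hit
cachedProductsList({ perspective: 'drafts', stega: true }) // new key
console.log(renders) // 2
```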

**Santiago Galán** (sgalanb) commented:

Here is getDynamicFetchOptions:

// src/sanity/lib/live.ts

import {createClient} from 'next-sanity'
import {defineLive, resolvePerspectiveFromCookies, type LivePerspective} from 'next-sanity/live'
import {cookies, draftMode} from 'next/headers'

const client = createClient({
  projectId: process.env.NEXT_PUBLIC_SANITY_PROJECT_ID,
  dataset: process.env.NEXT_PUBLIC_SANITY_DATASET,
  useCdn: true,
  apiVersion: 'v2025-03-04',
  stega: {studioUrl: '/studio'},
})

const token = process.env.SANITY_API_READ_TOKEN
if (!token) {
  throw new Error('Missing SANITY_API_READ_TOKEN')
}

export const {sanityFetch, SanityLive} = defineLive({
  client,
  serverToken: token,
  browserToken: token,
})

// Resolves perspective and stega outside 'use cache' boundaries
export interface DynamicFetchOptions {
  perspective: LivePerspective
  stega: boolean
}

export async function getDynamicFetchOptions(): Promise<DynamicFetchOptions> {
  const {isEnabled: isDraftMode} = await draftMode()
  if (!isDraftMode) {
    return {perspective: 'published', stega: false}
  }

  const jar = await cookies()
  const perspective = await resolvePerspectiveFromCookies({cookies: jar})
  return {perspective: perspective ?? 'drafts', stega: true}
}

**Claude Opus 4.6** (anthropic/claude-opus-4.6) commented:

The "use cache" function body runs in a UseCacheStore (type 'cache'), not a PrerenderStore. Inside this clean scope, sanityFetch presumably uses the Sanity client which calls fetch(). The patched fetch inside "use cache" would behave as a normal cached fetch — it resolves normally and the cache entry is populated.

So during the prerender:

  1. Page() renders → <Suspense> → DynamicProductsList()
  2. getDynamicFetchOptions() → await draftMode() → resolves immediately with isEnabled: false → returns {perspective: 'published', stega: false}
  3. CachedProductsList({perspective: 'published', stega: false}) → "use cache" runs the function body in a UseCacheStore → sanityFetch() does a network fetch → resolves → cache entry is stored
  4. The component tree returns HTML

Everything resolves within a single task. The prerenderAndAbortInSequentialTasks abort callback fires in the next task, finds prerenderIsPending === false, and serverIsDynamic stays false.

## Revised answer: This page would be fully static

Here's why:

| Step | What happens during prerender |
| --- | --- |
| draftMode() | Resolves immediately with isEnabled: false in 'prerender' store — no dynamic signal |
| .isEnabled access | Plain getter, returns false — no dynamic tracking |
| Early return | cookies() is never reached — the dynamic API that would hang is skipped |
| CachedProductsList "use cache" | Runs in a clean UseCacheStore, sanityFetch resolves normally, cache entry is created |
| Render completes | Within a single task → serverIsDynamic = false |

The entire component tree resolves within a single task during prerender. The static HTML is captured, and the "use cache" entry for CachedProductsList is seeded into the prerender resume data cache.

### When would this become dynamic?

Only when an actual request arrives with draft mode enabled (e.g., via the Presentation Tool). In that case:

  • draftMode() resolves with isEnabled: true
  • cookies() is called → this is a request context so it resolves normally
  • resolvePerspectiveFromCookies reads the perspective from cookies
  • CachedProductsList gets different args (perspective: 'drafts', stega: true) → different cache key → fresh cache entry

But for normal production traffic, the page is fully static with the cached products list baked in at build time.

**Santiago Galán** (sgalanb) commented:

So pages can be fully static even when they include a Suspense? If the page is fully static and deployed, how would it detect when draft mode is enabled? Is that detection client-side or server-side? If it is server-side, wouldn't that mean that even fully static pages trigger a function every time they are visited?

**Claude Opus 4.6** (anthropic/claude-opus-4.6) commented:

Now I have the complete picture. Here's the comprehensive answer:


## 1. Yes, pages can be fully static even with <Suspense>

<Suspense> doesn't make a page dynamic. During prerender, it only becomes a dynamic hole if something inside it triggers a dynamic signal (hanging promise, postpone, etc.). If everything resolves within a single task, the Suspense boundary renders its children normally — the fallback is discarded and the full content is baked into static HTML.

From the source — the static branch in app-render.tsx:~5261:

} else {
  // Static case
  // We will not perform resumption per request. The result can be
  // served statically to the requestor and if there was anything
  // dynamic it will only be rendered in the browser.

Even postponed != null (React internally used Suspense) doesn't prevent the static path — it just means the boundaries get collapsed:

if (postponed != null) {
  // We postponed but nothing dynamic was used. We resume the render now
  // and immediately abort it so we can set all the postponed boundaries
  // to client render mode before we store the HTML response

## 2. Draft mode detection is server-side, at the CDN/infrastructure layer

This is the key insight: it happens before your page code runs at all.

When Next.js builds a static page, the build output includes a bypassToken in the prerender manifest — this is the previewModeId, a secret generated at build time. From build-complete.ts:1426:

bypassToken: prerenderManifest.preview.previewModeId,

This token is also the value of the __prerender_bypass cookie that gets set when draftMode().enable() is called (in the /api/draft-mode/enable route). From draft-mode-provider.ts:68-73:

this._mutableCookies.set({
  name: COOKIE_NAME_PRERENDER_BYPASS,  // "__prerender_bypass"
  value: this._previewModeId,
  httpOnly: true,
  ...
})

On deployment platforms like Vercel, the CDN infrastructure uses this bypassToken to make a routing decision at the edge, before hitting any function:

  • No __prerender_bypass cookie (or wrong value) → serve the static HTML directly from the CDN. No function invocation at all.
  • Valid __prerender_bypass cookie → bypass the cache, route the request to the server function, which then renders dynamically.
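A toy model of that edge decision (hypothetical; Vercel's actual routing logic is not public):

```typescript
// Hypothetical model of the CDN routing check for a prerendered route:
// only an exact match on the build-time secret bypasses the static cache.
interface EdgeRequest {
  cookies: Record<string, string>
}

// Stand-in for prerenderManifest.preview.previewModeId from the build output.
const BYPASS_TOKEN = 'build-secret-preview-mode-id'

function routeRequest(req: EdgeRequest): 'static-html-from-cdn' | 'invoke-function' {
  const bypass = req.cookies['__prerender_bypass']
  return bypass === BYPASS_TOKEN ? 'invoke-function' : 'static-html-from-cdn'
}

console.log(routeRequest({ cookies: {} })) // static-html-from-cdn
console.log(routeRequest({ cookies: { __prerender_bypass: 'wrong' } })) // static-html-from-cdn
console.log(routeRequest({ cookies: { __prerender_bypass: BYPASS_TOKEN } })) // invoke-function
```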

## 3. So no, fully static pages do NOT trigger a function on every visit

For normal production traffic (no draft mode cookie), the request never reaches your Next.js server code. The CDN serves the pre-built HTML file directly. The draftMode() call in your getDynamicFetchOptions function never runs — the entire page component tree is never executed.

The flow for a normal visitor:

Browser → CDN → checks for __prerender_bypass cookie → not found → serves static HTML

The flow for a Sanity editor with draft mode enabled:

Browser → CDN → checks __prerender_bypass cookie → matches bypassToken → invokes function
  → routeModule.prepare() → tryGetPreviewData() → isDraftMode = true
  → ssgCacheKey = null (skip cache)
  → renders dynamically
  → getDynamicFetchOptions() → draftMode().isEnabled = true → cookies() → reads perspective
  → CachedProductsList("use cache") → different cache key (drafts/stega)
  → res.setHeader('Cache-Control', 'private, no-cache, no-store, ...')

The ssgCacheKey = null when isDraftMode is true is the mechanism inside Next.js that ensures draft mode requests are never served from the ISR cache. From app-page.ts:486-494:

let ssgCacheKey: string | null = null
if (
  !isDraftMode &&  // ← this is false in draft mode, so no cache key
  isSSG && ...
) {
  ssgCacheKey = resolvedPathname
}
**Santiago Galán** (sgalanb) commented:

Wow, didn’t know about that CDN routing option. And since it isn’t a normal serverless function, is it billed by platforms like Vercel, or is the compute usage so small they don’t even care?

**Claude Opus 4.6** (anthropic/claude-opus-4.6) commented:

Now I have the complete picture on billing. Here's the answer:


## It's not a serverless function — but it's not free either

When a static page is served on Vercel, no serverless function is invoked. The CDN serves the pre-built HTML directly. But the request still passes through Vercel's edge network, and that incurs two types of charges:

### What you pay for on static pages

| Resource | What it is | Cost (Pro plan, US) |
| --- | --- | --- |
| Edge Requests | Every HTTP request to your deployment, static or dynamic | $2.00 per 1M requests (first 10M included) |
| Fast Data Transfer | Bytes transferred from CDN → user's browser | $0.15 per GB (first 1TB included) |

These apply to all requests — static HTML, JS bundles, images, everything. From Vercel's docs:

"Static assets and functions all incur Edge Requests."

"When a user visits your site, the data transfer between Vercel's CDN and the user's device gets measured as Fast Data Transfer."

### What you do NOT pay for on static pages

| Resource | Why not |
| --- | --- |
| Function Invocations ($0.60/1M) | No function runs |
| Active CPU ($0.128/hr) | No compute |
| Provisioned Memory ($0.0106/GB-hr) | No function instance started |
| Fast Origin Transfer ($0.06/GB) | No CDN → compute data transfer |
| ISR Reads | Only applies to ISR-cached pages, not purely static ones |

### The practical difference is enormous

For a static page, you're paying for what is essentially standard CDN costs — edge request routing and bandwidth. These are the cheapest resources Vercel bills for, and the included amounts are generous (10M requests and 1TB bandwidth on Pro).

For a dynamic page (or a static page with the draft mode bypass cookie), every single request also incurs function invocations, CPU time, memory, and origin transfer on top of the edge costs.

### Back to the bypassToken routing

The cookie check at the CDN layer is part of the standard edge request processing — it's not a separate compute step. Vercel's edge network inspects the __prerender_bypass cookie as part of its routing logic (just like it inspects the Host header or URL path to decide which deployment to route to). This is baked into the infrastructure, not billed as compute. The bypassToken in the build output tells Vercel's routing layer what cookie value to look for — it's configuration, not code execution.

So for your Sanity page: normal visitors get the static HTML for the cost of an edge request + bandwidth. Only editors with draft mode enabled trigger an actual serverless function.

**Santiago Galán** (sgalanb) commented:

Great. In this thread (https://github.com/brunorzn/nextjs-draftmode-devmode-example/pull/1) the proposed solution by a member of the Next.js team is to wrap things in a suspense when draft mode is enabled so we can access cookies. But since adding the suspense doesn't make the page dynamic, can't we just always have the suspense?

**Claude Opus 4.6** (anthropic/claude-opus-4.6) commented:

Now I have the complete picture. The answer is nuanced:


## Yes, you can always have the Suspense — and the page stays static. But there's a tradeoff.

You're right that adding <Suspense> doesn't make the page dynamic. As we traced earlier, if everything inside resolves within a single task during prerender, the Suspense boundary is collapsed into the static output. The fallback is never used in production for non-draft-mode visitors.

So a version like this would work fine:

// Always wrap in Suspense, unconditionally
export default function Page() {
  return (
    <Suspense fallback={null}>
      <DynamicProductsList />
    </Suspense>
  )
}

The page would still be fully static for normal traffic. But the issue isn't about static vs. dynamic — it's about the dev-time validation warning.

### What the Suspense actually does here

With cacheComponents enabled, Next.js has a validation system that checks whether dynamic API access (like cookies(), draftMode()) occurs outside a <Suspense> boundary. From dynamic-rendering.ts:843-849:

const usageDescription =
  kind === DynamicHoleKind.Runtime
    ? `Runtime data such as \`cookies()\`, \`headers()\`, \`params\`, or
       \`searchParams\` was accessed outside of \`<Suspense>\`.`
    : `Uncached data or \`connection()\` was accessed outside of \`<Suspense>\`.`
const message = `Route "${workStore.route}": ${usageDescription} This delays
  the entire page from rendering, resulting in a slow user experience.`

When a <Suspense> boundary is found above the dynamic access in the component stack, the validation marks it as allowed and skips the warning:

const suspenseLocation = hasSuspenseRegex.exec(componentStack)
if (suspenseLocation) {
  const boundaryLocation =
    hasPrefetchValidationBoundaryRegex.exec(componentStack)
  if (boundaryLocation) {
    if (suspenseLocation.index < boundaryLocation.index) {
      dynamicValidation.hasAllowedDynamic = true  // ← no warning
      return
    }
  }
}

So the Suspense is really silencing a dev-time lint/warning, not changing runtime behavior.

### The tradeoff Dan Abramov flagged

This is exactly what Dan Abramov raised in #86739. The <Suspense fallback={null}> workaround above <html> is what he calls "nuclear" — it silences all dynamic access warnings for the entire page, even for places deeper in the tree where you genuinely would benefit from knowing that you're blocking unnecessarily.

His point is that there's a meaningful distinction between:

  • "I intentionally want to block on cookies() to check auth/draft mode" (no warning needed)
  • "I accidentally forgot to wrap a slow uncached fetch in Suspense" (warning is valuable)

An always-on Suspense erases that distinction.

### The conditional approach is more precise

The SuspendedDraftMode pattern from the PR thread is more surgical:

async function SuspendedDraftMode({ children }) {
  const { isEnabled } = await draftMode()
  if (isEnabled) {
    return <Suspense fallback={null}>{children}</Suspense>
  }
  return children
}

This only adds the Suspense when draft mode is actually enabled. During prerender (and for normal visitors), there's no Suspense wrapper at all, so the validation system remains active and you still get useful warnings about accidentally blocking elsewhere on the page.

But you're right that even if you made the Suspense unconditional, the page would remain fully static for normal visitors. It's a developer experience tradeoff, not a runtime one:

| Approach | Page still static? | Dev warnings preserved? |
| --- | --- | --- |
| No Suspense | ✅ | Full warnings (but errors in dev with draft mode) |
| Conditional Suspense (draft mode only) | ✅ | Warnings work when not in draft mode |
| Always Suspense | ✅ | ❌ All dynamic-outside-Suspense warnings suppressed |

The real fix, as #87742 proposes, would be for Next.js to let dynamic APIs like cookies() work inside "use cache" when draft mode is enabled — since the cache is already bypassed anyway. That would eliminate the need for any of these workarounds.

**Santiago Galán** (sgalanb) commented:

So the Sanity solution doesn't make sense, because it demands a lot of thinking from the end user: it fetches the cookies and draft mode outside the cached component and then passes the values into it. Whereas a solution like wrapping the body in a Suspense when draft mode is enabled only needs to be set up once, and then it works for all routes. The key is that if the cache component is wrapped in a Suspense, we can access cookies directly inside the CMS fetch wrapper (always validating that draft mode is active first).

**Claude Opus 4.6** (anthropic/claude-opus-4.6) commented:

Interesting — private-cache allows cookies(). But 'cache' (public) throws unconditionally. So today, cookies() inside a public "use cache" component will always throw, even at runtime with draft mode enabled.


## You're right — the Sanity pattern puts unnecessary burden on the user

Here's the comparison:

### The Sanity pattern (current)

// Every page has to do this dance:
async function DynamicProductsList() {
  const {perspective, stega} = await getDynamicFetchOptions() // reads draftMode + cookies OUTSIDE cache
  return <CachedProductsList perspective={perspective} stega={stega} />
}

async function CachedProductsList({perspective, stega}: DynamicFetchOptions) {
  'use cache'
  const {data} = await sanityFetch({query, perspective, stega}) // can't read cookies in here
  // ...
}

Every page, every component that fetches data needs this two-layer wrapper: an outer component to extract dynamic values, and an inner cached component that receives them as props. The end user has to understand the "use cache" boundary rules and manually shuttle data across.
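In principle, a data-fetching library could package that two-layer dance once. A hypothetical pure-TypeScript sketch, with components modeled as plain async functions returning strings (no React types needed):

```typescript
// Hypothetical helper a CMS SDK could ship: it performs the dynamic
// extraction step once, so page authors never handle the options manually.
type Options = { perspective: string; stega: boolean }

// Stand-in for getDynamicFetchOptions(); the real one reads draftMode()
// and cookies() outside the cache boundary.
async function getOptions(): Promise<Options> {
  return { perspective: 'published', stega: false }
}

// Wraps a cached component so callers never see the Options plumbing.
function withDynamicOptions<P>(
  cached: (props: P, opts: Options) => Promise<string>
): (props: P) => Promise<string> {
  return async (props: P) => {
    const opts = await getOptions() // dynamic read stays OUTSIDE "use cache"
    return cached(props, opts) // values cross the boundary as arguments
  }
}

// The "use cache" directive would sit inside this inner function.
const ProductsList = withDynamicOptions(
  async ({ limit }: { limit: number }, opts) =>
    `list(limit=${limit}, perspective=${opts.perspective})`
)

ProductsList({ limit: 10 }).then(console.log) // list(limit=10, perspective=published)
```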

### The SuspendedDraftMode pattern (setup once)

// layout.tsx — setup ONCE
export default async function RootLayout({children}) {
  return (
    <html>
      <SuspendedDraftMode>
        <body>{children}</body>
      </SuspendedDraftMode>
    </html>
  )
}

But here's the catch — this doesn't actually solve the cookies() inside "use cache" problem. As we just confirmed, calling cookies() inside a 'cache' store type throws unconditionally:

case 'cache':
  throw new Error(`Route used \`cookies()\` inside "use cache"...`)

The SuspendedDraftMode wrapper silences the dev validation warning about dynamic access outside Suspense. It doesn't change the fact that cookies() inside "use cache" is a hard error.

So a CMS SDK that wants to read cookies for a preview token inside a "use cache" function would still explode — the Suspense wrapper above doesn't help with that.

###What would actually solve this

Issue #87742 proposes the real fix: when draft mode is enabled, the 'cache' store type in cookies.ts should not throw but instead behave like 'private-cache' (or 'request'), since the cache is already being bypassed anyway. The logic is already half-there:

  • ✅ Cache reads are skipped when draft mode is enabled
  • ✅ Cache writes are skipped when draft mode is enabled
  • ✅ draftMode().isEnabled is allowed inside "use cache"
  • ❌ cookies() still throws inside "use cache" even when draft mode is on

Until that's fixed, the two-layer Sanity pattern (read dynamic values outside "use cache", pass them as arguments) is actually the only way to access request-specific data like cookies from within a cached component. The SuspendedDraftMode wrapper is complementary — it handles the dev warning and makes the Suspense boundary available for the draft mode runtime path — but it can't make cookies() work inside "use cache".

**Santiago Galán** (sgalanb) commented:

Right. Create a plan/prompt so another agent can propose a fix that makes Next.js allow accessing cookies inside a cache component when draft mode is enabled. It should also analyze any implications, side effects, or things that might break by doing this (because it's odd the Next.js team hasn't fixed it yet if it is so simple).

**Claude Opus 4.6** (anthropic/claude-opus-4.6) commented:

Now I have a complete understanding. Here's the plan:


## Plan: Allow cookies() and headers() inside "use cache" when Draft Mode is enabled

### Context & Goal

Issue: vercel/next.js#87742

When Draft Mode is enabled, "use cache" functions already bypass caching entirely: cache reads are skipped (shouldForceRevalidate returns true), cache writes are skipped (!workStore.isDraftMode guard), and draftMode().isEnabled is allowed inside cache scopes. However, calling cookies() or headers() inside a "use cache" function unconditionally throws, even though the cache is already being bypassed. This forces CMS SDK authors (Sanity, Prismic, Contentful, etc.) to architect two-layer components where dynamic values are extracted outside the cache boundary and threaded through as props — a pattern that leaks framework internals to every end user.

The goal is to allow cookies() and headers() to work inside "use cache" functions only when Draft Mode is enabled, since in that context the function behaves as if the "use cache" boundary doesn't exist.

### Files to modify

#### 1. packages/next/src/server/app-render/work-unit-async-storage.external.ts

The PublicUseCacheStore interface (line ~297) does not carry cookies or headers fields, unlike PrivateUseCacheStore (line ~301). The fix needs to make cookies and headers optionally available on PublicUseCacheStore when draft mode is on.

  • Add optional cookies and headers fields to PublicUseCacheStore (or to CommonUseCacheStore), gated by draft mode. These should only be populated when draftMode is set (i.e., draft mode is enabled).

#### 2. packages/next/src/server/use-cache/use-cache-wrapper.ts: createUseCacheStore()

In the public cache branch (line ~236, the else block), the store is created without cookies or headers. When workStore.isDraftMode is true and the outer work unit store is a RequestStore or PrivateUseCacheStore (i.e., has cookies/headers available), propagate them onto the public cache store.

// Pseudocode for the change:
const publicStore = {
  type: 'cache',
  // ... existing fields ...
  draftMode: getDraftModeProviderForCacheScope(workStore, outerWorkUnitStore),
  // NEW: only populated when draft mode is on
  cookies: workStore.isDraftMode ? outerWorkUnitStore?.cookies : undefined,
  headers: workStore.isDraftMode ? outerWorkUnitStore?.headers : undefined,
}

The cookies and headers values would come from the outer work unit store, similar to how PrivateUseCacheStore gets them (line ~232-233).

#### 3. packages/next/src/server/request/cookies.ts: the cookies() function

The case 'cache': branch (line ~65) currently throws unconditionally. Change it to check for draft mode:

case 'cache':
  if (workUnitStore.draftMode?.isEnabled && workUnitStore.cookies) {
    // Draft mode is enabled — the cache is already bypassed,
    // so we can safely return the request cookies.
    return makeUntrackedCookies(workUnitStore.cookies)
  }
  // Otherwise, throw as before
  const error = new Error(...)
  ...
  throw error

#### 4. packages/next/src/server/request/headers.ts: the headers() function

Same pattern as cookies. The case 'cache': branch (line ~65) currently throws. Change to:

case 'cache':
  if (workUnitStore.draftMode?.isEnabled && workUnitStore.headers) {
    return makeUntrackedHeaders(workUnitStore.headers)
  }
  // Otherwise, throw as before
  ...

#### 5. test/e2e/app-dir/use-cache/app/(partially-static)/draft-mode/[mode]/page.tsx

The existing test fixture (line ~44-45) explicitly asserts that cookies() throws inside "use cache" with draft mode enabled:

// Accessing request-scoped data in "use cache" should not be allowed, even if
// draft mode is enabled. We expect the access to throw.
if (isEnabled && mode === 'with-cookies') {
  await cookies()
}

Update this test to expect success instead of an error when draft mode is enabled.

#### 6. test/e2e/app-dir/use-cache/use-cache.test.ts

The test at line ~927 asserts the error message when mode is with-cookies. Change the assertion to expect the cookies to be returned successfully when draft mode is enabled.

### Implications, side-effects, and things that could break

#### A. Cache key poisoning risk (IMPORTANT — likely why this wasn't done yet)

This is the biggest concern. The "use cache" cache key is [buildId, functionId, args] — it does not include cookies. If a cache entry were accidentally saved with cookie-dependent output, it would be served to all users with the same args regardless of their cookies.

Why this is safe in the proposed fix: When draft mode is enabled, workStore.isDraftMode is true, and the use-cache-wrapper.ts already has a guard at line ~1722 that skips saving the cache entry:

if (!workStore.isDraftMode) {
  // save to cache handler...
}

Additionally, shouldForceRevalidate returns true, skipping cache reads. So the function always re-executes and never persists results. However, verify this holds for all code paths — particularly the prerender resume data cache path (prerenderResumeDataCache.cache.set(...) at line ~1730). If draft mode is ever active during a prerender (it shouldn't be, but verify), a cookie-dependent result could leak into the prerender cache.
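The guard structure described above can be modeled in a few lines (a hypothetical simplification of the read and write paths in use-cache-wrapper.ts):

```typescript
// Toy model: draft mode disables both the cache read and the cache write,
// so a cookie-dependent draft render can never be served to other users.
interface WorkStore { isDraftMode: boolean }

const cacheHandler = new Map<string, string>()

async function runCached(
  store: WorkStore,
  key: string,
  compute: () => Promise<string>
): Promise<string> {
  // Read path: draft mode forces revalidation (mirrors shouldForceRevalidate).
  if (!store.isDraftMode) {
    const hit = cacheHandler.get(key)
    if (hit !== undefined) return hit
  }
  const result = await compute()
  // Write path: draft output is never persisted (mirrors the isDraftMode guard).
  if (!store.isDraftMode) cacheHandler.set(key, result)
  return result
}

async function demo() {
  await runCached({ isDraftMode: true }, 'k', async () => 'draft output')
  console.log(cacheHandler.has('k')) // false: the draft result was not saved
  await runCached({ isDraftMode: false }, 'k', async () => 'published output')
  console.log(cacheHandler.get('k')) // published output
}
demo()
```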

#### B. The generateCacheEntry clean snapshot isolation

At line ~163, the "use cache" function body runs inside workStore.runInCleanSnapshot(), which explicitly strips the RequestStore and PrerenderStore to prevent request-specific data from leaking. The comment at line ~185 says:

"We don't want any request specific information leaking and we don't want to create a bloated fake request mock for every cache call. So any feature that currently lives in RequestStore but should be available to Caches needs to move to WorkStore."

Adding cookies/headers to the cache store when draft mode is on contradicts this design principle. The fix should add a code comment explaining that draft mode is a special case where cache isolation is intentionally relaxed because the cache is bypassed entirely.

#### C. Nested "use cache" boundaries

When a "use cache" function calls another "use cache" function, the inner one gets a PublicUseCacheStore with outerWorkUnitStore pointing to the outer UseCacheStore. If the outer store has cookies (because draft mode is on), the inner store would need to propagate them too. Check that createUseCacheStore handles this recursion correctly — particularly the case 'cache': branch in the outer store type switch.
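The propagation concern can be expressed as a small model (hypothetical; the real stores carry many more fields):

```typescript
// Toy model of nested cache store creation: request data flows inward only
// while draft mode is enabled, including through nested "use cache" scopes.
interface Store {
  type: 'request' | 'cache'
  cookies?: Record<string, string>
}

function createCacheStore(outer: Store, isDraftMode: boolean): Store {
  return {
    type: 'cache',
    // Works for arbitrary nesting: the outer store may itself be a cache
    // store that already received the cookies from the request store.
    cookies: isDraftMode ? outer.cookies : undefined,
  }
}

const request: Store = { type: 'request', cookies: { 'sanity-perspective': 'drafts' } }
const outerCache = createCacheStore(request, true)
const innerCache = createCacheStore(outerCache, true) // nested "use cache"
console.log(innerCache.cookies?.['sanity-perspective']) // drafts

const prodCache = createCacheStore(request, false) // draft mode off
console.log(prodCache.cookies) // undefined
```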

#### D. The workStore.invalidDynamicUsageError side effect

Currently, when cookies() throws inside "use cache", it sets workStore.invalidDynamicUsageError ??= error. This error is checked at multiple points:

  • During prerender: causes StaticGenBailoutError (line 2155 of app-render.tsx)
  • During dev dynamic render: throws the error (line 2344)
  • In the cache entry stream: errors the stream (line 455 of use-cache-wrapper.ts)

When the fix allows cookies() to succeed, this error is never set, so these code paths are simply not triggered. This should be safe, but verify that no downstream code depends on this error being set for draft mode requests.

#### E. Prerender behavior should be unaffected

During prerender (build time), workStore.isDraftMode is false (draft mode is determined by the __prerender_bypass cookie, which is absent during build). So the case 'cache': branch in cookies() would still throw during prerender, which is correct — you can't access cookies during static generation. Verify this by confirming that isDraftMode is never true during isStaticGeneration.

#### F. searchParams inside "use cache"

The issue (#87742) also mentions searchParams. These are handled differently — they're created via createSearchParams* functions in search-params.ts, not via a simple case 'cache': check. The searchParams case may need separate handling and could be left out of this initial fix. The searchParams architecture is also more complex because they affect the cache key for page-level caches.

#### G. Mutable cookies

The fix should return read-only cookies via makeUntrackedCookies, not mutable ones. Calling cookies().set() inside "use cache" during draft mode should still error. The DraftMode.enable() / DraftMode.disable() calls already have their own trackDynamicDraftMode guards, so those are fine. But verify that areCookiesMutableInCurrentPhase returns false for the cache store type, preventing any mutation path.
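The read-only requirement can be sketched as follows (hypothetical; the real makeUntrackedCookies in Next.js wraps the actual cookie store type):

```typescript
// Toy model of a read-only cookie view: reads pass through, writes throw,
// matching the rule that cookies().set() must still fail inside "use cache".
interface ReadonlyCookieJar {
  get(name: string): string | undefined
  set(name: string, value: string): void
}

function makeReadOnly(jar: Map<string, string>): ReadonlyCookieJar {
  return {
    get: (name) => jar.get(name),
    set: () => {
      throw new Error('Cookies cannot be modified inside "use cache"')
    },
  }
}

const jar = makeReadOnly(new Map([['sanity-perspective', 'drafts']]))
console.log(jar.get('sanity-perspective')) // drafts
try {
  jar.set('sanity-perspective', 'published')
} catch (e) {
  console.log((e as Error).message) // Cookies cannot be modified inside "use cache"
}
```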

#### H. Why the Next.js team likely hasn't done this yet

  1. Architectural purity: The "use cache" boundary is designed as a hermetic isolation layer. Allowing request-specific data (even conditionally) breaks the mental model. The team may prefer the explicit "pass as arguments" pattern.

  2. The existing test explicitly asserts the throw (line 44-45 of the fixture): "Accessing request-scoped data in 'use cache' should not be allowed, even if draft mode is enabled." — This was a deliberate design decision, not an oversight.

  3. Surface area: Once you allow cookies() in "use cache" for draft mode, the next request will be headers(), then searchParams, then connection(). Each has different implications. The team may want a more holistic solution (like the "use cache: private" directive) rather than poking conditional holes in the public cache.

  4. Alternative considered: "use cache: private" already allows cookies() and headers(). The team may have intended CMS previewing to use private caches rather than modifying public cache semantics. However, private caches are limited to request-scoped storage (not persisted to cache handlers), which may not suit all use cases.



Is this page static or dynamic with cache components? — vercel/next.js