How are AI Gateway model strings converted to language models?


How does the AI Gateway string "anthropic/claude-sonnet-4.5" get converted into a usable language model? Does it happen inside streamText?

Why isn't it the same behavior for other providers? Like, could it work with "bedrock:anthropic/..." potentially?

Claude Opus 4.5 commented:

Now I have a complete picture. Let me summarize what I've found:


## How "anthropic/claude-sonnet-4.5" Gets Converted to a Usable Model

### The Flow

  1. In streamText (or generateText, etc.): When you pass a string like "anthropic/claude-sonnet-4.5" to streamText, it calls resolveLanguageModel():

    // packages/ai/src/generate-text/stream-text.ts:465
    model: resolveLanguageModel(model),
    
  2. In resolveLanguageModel (packages/ai/src/model/resolve-model.ts):

    export function resolveLanguageModel(model: LanguageModel): LanguageModelV3 {
      if (typeof model !== 'string') {
        // If it's already a model object, use it directly
        return asLanguageModelV3(model);
      }
      // If it's a string, use the global provider
      return getGlobalProvider().languageModel(model);
    }
    
    function getGlobalProvider(): ProviderV3 {
      return globalThis.AI_SDK_DEFAULT_PROVIDER ?? gateway;  // defaults to gateway!
    }
    
  3. The gateway is the default global provider (packages/gateway/src/gateway-provider.ts):

    export const gateway = createGatewayProvider();
    
  4. The GatewayLanguageModel doesn't parse the string locally — it just sends it to Vercel's AI Gateway API as a header:

    // packages/gateway/src/gateway-language-model.ts:200-205
    private getModelConfigHeaders(modelId: string, streaming: boolean) {
      return {
        'ai-language-model-specification-version': '3',
        'ai-language-model-id': modelId,  // <-- sent as a header!
        'ai-language-model-streaming': String(streaming),
      };
    }
    

    The actual model resolution (routing to Anthropic, OpenAI, etc.) happens server-side at https://ai-gateway.vercel.sh/v3/ai.
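Put together, the string path can be sketched as a small self-contained simulation. This is a hypothetical, simplified model of the behavior described above — `FakeProvider`/`FakeLanguageModel` are stand-ins, not the real `@ai-sdk/provider` types:

```typescript
// Hypothetical, simplified sketch of the resolution path described above —
// not the real SDK types, just enough structure to run.

interface FakeLanguageModel {
  provider: string;
  modelId: string;
}

interface FakeProvider {
  languageModel(modelId: string): FakeLanguageModel;
}

// Stand-in for the gateway provider: no local parsing, it just records
// the id so the Gateway server can route it later.
const fakeGateway: FakeProvider = {
  languageModel: (modelId) => ({ provider: 'gateway', modelId }),
};

function resolveLanguageModel(
  model: string | FakeLanguageModel,
): FakeLanguageModel {
  if (typeof model !== 'string') {
    return model; // already a model object: pass it through
  }
  // Same fallback as the SDK: a global override if set, else the gateway.
  const provider =
    ((globalThis as any).AI_SDK_DEFAULT_PROVIDER as FakeProvider | undefined) ??
    fakeGateway;
  return provider.languageModel(model);
}

const m = resolveLanguageModel('anthropic/claude-sonnet-4.5');
console.log(m.provider, m.modelId); // gateway anthropic/claude-sonnet-4.5
```

The key point the sketch illustrates: the string reaches the provider unmodified, so nothing about "anthropic/..." is special to the SDK itself.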


## Why Doesn't "bedrock:anthropic/..." Work The Same Way?

It's a design difference:

  1. AI Gateway strings (like "anthropic/claude-sonnet-4.5") are interpreted by Vercel's Gateway server. The SDK just forwards them as-is.

  2. Provider registry strings (like "bedrock:anthropic.claude-v2") use a local separator-based routing mechanism via createProviderRegistry():

    // packages/ai/src/registry/provider-registry.ts:204-217
    private splitId(id: string, modelType): [string, string] {
      const index = id.indexOf(this.separator);  // default separator is ':'
      // ...
      return [id.slice(0, index), id.slice(index + this.separator.length)];
    }
    

    This requires you to explicitly register providers in a registry first:

    const registry = createProviderRegistry({
      bedrock: amazonBedrock,
      openai: openai,
    });
    
    registry.languageModel('bedrock:anthropic.claude-v2');
    
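The separator logic is easy to see in isolation. Here is a standalone version of the splitting step (simplified from the excerpt above; function signature and error handling are illustrative, with ':' as the default separator as in the SDK):

```typescript
// Standalone sketch of the registry's separator-based routing.
// Splits "provider:model" into its two halves; only the first
// occurrence of the separator is significant, so model ids may
// themselves contain further punctuation (e.g. dots or slashes).

function splitId(
  id: string,
  separator = ':',
): [providerId: string, modelId: string] {
  const index = id.indexOf(separator);
  if (index === -1) {
    throw new Error(
      `invalid model id: ${id} (expected "provider${separator}model")`,
    );
  }
  return [id.slice(0, index), id.slice(index + separator.length)];
}

const [providerId, modelId] = splitId('bedrock:anthropic.claude-v2');
console.log(providerId); // bedrock
console.log(modelId);    // anthropic.claude-v2
```

The provider half is then looked up in the registry you built with `createProviderRegistry()`, which is why an unregistered prefix like "bedrock:" fails locally rather than being forwarded anywhere.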

### Could "bedrock:anthropic/..." Work?

Not automatically with the current design. You have a few options:

  1. Use a provider registry:

    const registry = createProviderRegistry({
      bedrock: amazonBedrock,
    });
    
    streamText({
      model: registry.languageModel('bedrock:anthropic.claude-v2'),
      // ...
    });
    
  2. Set a custom global provider:

    import { amazonBedrock } from '@ai-sdk/amazon-bedrock';
    
    globalThis.AI_SDK_DEFAULT_PROVIDER = amazonBedrock;
    
    // Now string model IDs go to Bedrock
    streamText({
      model: 'anthropic.claude-v2',  // interpreted by Bedrock
    });
    
  3. Or wait for the AI Gateway to add Bedrock support — if Vercel's Gateway supported Bedrock, you could potentially use "bedrock/anthropic.claude-v2" the same way you use "anthropic/claude-sonnet-4.5".


### Key Takeaway

The magic of "anthropic/claude-sonnet-4.5" isn't in streamText — it's that the Vercel AI Gateway is the default global provider, and the Gateway server handles the actual provider/model resolution. Other providers require explicit instantiation or registry setup because there's no centralized server doing the routing for them.


How are AI Gateway model strings converted to language models? — vercel/ai