Optional additionalModelRequestFields
Additional inference parameters that the model supports, beyond the base set of inference parameters that the Converse API supports in the inferenceConfig field. For more information, see the model parameters link below.
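As a hypothetical illustration (assuming the Anthropic-specific top_k parameter, which the base Converse inferenceConfig set does not cover, is passed through this field):

```typescript
// Sketch: model-specific parameters that Converse's inferenceConfig does
// not expose are passed through additionalModelRequestFields instead.
// top_k here is an Anthropic Claude parameter used as an example.
const additionalModelRequestFields = { top_k: 250 };
```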
Optional authorizationToken
BROWSER ONLY.
Providing this value will set an "Authorization" request header value on the GET request.
Optional awsContainerAuthorizationToken
An alternative to awsContainerAuthorizationTokenFile; this is the token value itself.
For browser environments, use authorizationToken instead.
Optional awsContainerAuthorizationTokenFile
Will be read on each credentials request to add an Authorization request header value.
Not supported in browsers.
Optional awsContainerCredentialsFullUri
If this value is provided, it will be used as-is.
For browser environments, use credentialsFullUri instead.
Optional awsContainerCredentialsRelativeUri
If this value is provided instead of the full URI, it will be appended to the default link-local host of 169.254.170.2.
Not supported in browsers.
Optional configFilepath
The path at which to locate the ini config file. Defaults to the value of the AWS_CONFIG_FILE environment variable (if defined) or ~/.aws/config otherwise.
Optional credentials
AWS credentials. If no credentials are provided, the default credentials from @aws-sdk/credential-provider-node will be used.
Optional credentialsFullUri
BROWSER ONLY.
In browsers, a relative URI is not allowed and a full URI must be provided. HTTPS is required.
This value is required in the browser environment.
Optional ec2MetadataV1Disabled
Only used in the IMDS credential provider.
Optional endpointHost
Override the default endpoint hostname.
Optional filepath
The path at which to locate the ini credentials file. Defaults to the value of the AWS_SHARED_CREDENTIALS_FILE environment variable (if defined) or ~/.aws/credentials otherwise.
Optional guardrailConfig
Configuration information for a guardrail that you want to use in the request.
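A sketch of what such a configuration might look like, following the shape of the Converse API's GuardrailConfiguration (the identifier below is a placeholder, not a real guardrail):

```typescript
// Sketch of a guardrail configuration object. Field names follow the
// Converse API's GuardrailConfiguration shape; the identifier value is
// a placeholder.
const guardrailConfig = {
  guardrailIdentifier: "my-guardrail-id", // placeholder id
  guardrailVersion: "1",                  // guardrail version to apply
  trace: "enabled" as const,              // include guardrail trace output
};
```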
Optional ignoreCache
Configuration files are normally cached after the first time they are loaded. When this property is set, the provider will always reload any configuration files loaded before.
Optional logger
For credential resolution trace logging.
Optional maxRetries
The default is 3 retry attempts, i.e. 4 total attempts.
Optional maxTokens
The maximum number of tokens to generate.
Optional mfaCodeProvider
A function that returns a promise fulfilled with an MFA token code for the provided MFA serial code. If a profile requires an MFA code and mfaCodeProvider is not a valid function, the credential provider promise will be rejected. Its argument is the serial code of the MFA device specified.
Optional model
The model to use. For example, "anthropic.claude-3-haiku-20240307-v1:0". This is equivalent to the modelId property in the list-foundation-models API. See the link below for a full list of models.
https://docs.aws.amazon.com/bedrock/latest/userguide/model-ids.html#model-ids-arns
Default: anthropic.claude-3-haiku-20240307-v1:0
Optional profile
The configuration profile to use.
Optional region
The AWS region, e.g. us-west-2. Falls back to the AWS_DEFAULT_REGION environment variable or the region specified in ~/.aws/config if not provided here.
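A minimal configuration sketch combining the model and region fields above. This is a plain object mirroring the documented input shape; constructing the actual ChatBedrockConverse class is not shown here:

```typescript
// Sketch of a minimal input object for ChatBedrockConverse
// (plain object only; the class itself is not imported here).
const config = {
  model: "anthropic.claude-3-haiku-20240307-v1:0",
  region: "us-west-2",
  // credentials omitted: the default chain from
  // @aws-sdk/credential-provider-node is used when none are provided.
};
```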
Optional roleAssumer
A function that assumes a role and returns a promise fulfilled with credentials for the assumed role. Its argument is the credentials with which to assume the role.
Optional roleAssumerWithWebIdentity
A function that assumes a role with web identity and returns a promise fulfilled with credentials for the assumed role.
Optional roleSessionName
The IAM session name used to distinguish sessions.
Optional ssoAccountId
The ID of the AWS account to use for temporary credentials.
Optional ssoRegion
The AWS region to use for temporary credentials.
Optional ssoRoleName
The name of the AWS role to assume.
Optional ssoSession
SSO session identifier. Its presence implies usage of the SSOTokenProvider.
Optional ssoStartUrl
The URL of the AWS SSO service.
Optional streamUsage
Whether or not to include usage data, like token counts, in the streamed response chunks. Passing this as a call option will take precedence over the class-level setting.
Optional streaming
Whether or not to stream responses.
Optional supportsToolChoiceValues
Which types of tool_choice values the model supports. Inferred if not specified: ['auto', 'any', 'tool'] if a 'claude-3' model is used, ['auto', 'any'] if a 'mistral-large' model is used, and empty otherwise.
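The inference rule above can be sketched as a small helper (an assumption: plain substring matching on the model id, not the library's actual implementation):

```typescript
// Sketch of the tool_choice inference rule described above, using
// simple substring matching on the model id.
type ToolChoiceValue = "auto" | "any" | "tool";

function inferSupportedToolChoiceValues(model: string): ToolChoiceValue[] {
  if (model.includes("claude-3")) return ["auto", "any", "tool"];
  if (model.includes("mistral-large")) return ["auto", "any"];
  return [];
}
```

For example, inferSupportedToolChoiceValues("anthropic.claude-3-haiku-20240307-v1:0") yields ["auto", "any", "tool"], while an unrecognized model id yields an empty array.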
Optional temperature
The sampling temperature to use.
Optional timeout
Time in milliseconds to spend waiting between retry attempts. The default is 1000 ms.
Optional topP
The percentage of most-likely candidates that the model considers for the next token. For example, if you choose a value of 0.8 for topP, the model selects from the top 80% of the probability distribution of tokens that could be next in the sequence. The default is the default value for the model you are using. For more information, see the inference parameters for foundation models link below.
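The sampling-related fields above can be gathered in one options object, as in this sketch (the values are illustrative, not recommendations):

```typescript
// Sketch of the sampling-related options documented above.
const samplingOptions = {
  temperature: 0.5, // higher values produce more random output
  topP: 0.8,        // sample from the top 80% of the probability mass
  maxTokens: 512,   // cap on the number of generated tokens
};
```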
Optional webIdentityTokenFile
File location where the OIDC token is stored.
Inputs for ChatBedrockConverse.