@anthropic-ai/bedrock-sdk
The official TypeScript library for the Anthropic Bedrock API
This library provides convenient access to the Anthropic Bedrock REST API from server-side TypeScript or JavaScript.
For the non-Bedrock Anthropic API at api.anthropic.com, see @anthropic-ai/sdk.
The API documentation can be found here.
npm install --save @anthropic-ai/bedrock-sdk
# or
yarn add @anthropic-ai/bedrock-sdk
The full API of this library can be found in api.md.
import AnthropicBedrock from '@anthropic-ai/bedrock-sdk';
const client = new AnthropicBedrock({
// Authenticate either by providing the keys below or by using the default AWS credential providers, such as
// using ~/.aws/credentials or the "AWS_SECRET_ACCESS_KEY" and "AWS_ACCESS_KEY_ID" environment variables.
awsAccessKey: '<access key>',
awsSecretKey: '<secret key>',
// Temporary credentials can be used with awsSessionToken.
// Read more at https://docs.aws.amazon.com/IAM/latest/UserGuide/id_credentials_temp.html.
awsSessionToken: '<session_token>',
// awsRegion changes the aws region to which the request is made. By default, we read AWS_REGION,
// and if that's not present, we default to us-east-1. Note that we do not read ~/.aws/config for the region.
awsRegion: 'us-east-2',
});
async function main() {
const completion = await client.completions.create({
model: 'anthropic.claude-v2:1',
max_tokens_to_sample: 256,
prompt: `${AnthropicBedrock.HUMAN_PROMPT} how does a court case get to the Supreme Court? ${AnthropicBedrock.AI_PROMPT}`,
});
console.log(completion.completion);
}
main().catch(console.error);
We provide support for streaming responses using Server-Sent Events (SSE).
import AnthropicBedrock from '@anthropic-ai/bedrock-sdk';
const client = new AnthropicBedrock();
const stream = await client.completions.create({
prompt: `${AnthropicBedrock.HUMAN_PROMPT} Your prompt here${AnthropicBedrock.AI_PROMPT}`,
model: 'anthropic.claude-v2:1',
stream: true,
max_tokens_to_sample: 300,
});
for await (const completion of stream) {
console.log(completion.completion);
}
If you need to cancel a stream, you can break from the loop or call stream.controller.abort().
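For instance, here is a minimal sketch that stops the stream after the first chunk, reusing the stream from the example above:
for await (const completion of stream) {
  console.log(completion.completion);
  // Either of these is enough on its own: abort the underlying request, or break out of the loop.
  stream.controller.abort();
  break;
}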
This library includes TypeScript definitions for all request params and response fields. You may import and use them like so:
import AnthropicBedrock from '@anthropic-ai/bedrock-sdk';
const client = new AnthropicBedrock();
async function main() {
const params: AnthropicBedrock.CompletionCreateParams = {
model: 'anthropic.claude-v2:1',
prompt: `${AnthropicBedrock.HUMAN_PROMPT} how does a court case get to the Supreme Court? ${AnthropicBedrock.AI_PROMPT}`,
max_tokens_to_sample: 256,
};
const completion: AnthropicBedrock.Completion = await client.completions.create(params);
}
main().catch(console.error);
Documentation for each method, request param, and response field is available in docstrings and will appear on hover in most modern editors.
This library uses @smithy/signature-v4 internally for authentication; you can read more about default providers here.
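As a sketch of what relying on the default providers looks like in practice (assuming valid credentials are already available via environment variables or ~/.aws/credentials), the client can be constructed without passing any keys:
import AnthropicBedrock from '@anthropic-ai/bedrock-sdk';

// No explicit keys: credentials are resolved by the default AWS providers,
// e.g. the AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY environment variables or ~/.aws/credentials.
// The region comes from AWS_REGION, falling back to us-east-1.
const client = new AnthropicBedrock();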
We provide a separate package for counting how many tokens a given piece of text contains.
See the repository documentation for more details.
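A minimal sketch, assuming the separate package is @anthropic-ai/tokenizer and that it exposes a countTokens helper (check the repository documentation to confirm the package name and API):
// Hypothetical usage of the separate tokenizer package; the package name and
// the countTokens signature are assumptions, not guaranteed by this README.
import { countTokens } from '@anthropic-ai/tokenizer';

const text = 'How does a court case get to the Supreme Court?';
console.log(`${countTokens(text)} tokens`);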
When the library is unable to connect to the API, or if the API returns a non-success status code (i.e., 4xx or 5xx response), a subclass of APIError will be thrown:
async function main() {
const completion = await client.completions
.create({
model: 'anthropic.claude-v2:1',
prompt: `${AnthropicBedrock.HUMAN_PROMPT} your prompt here ${AnthropicBedrock.AI_PROMPT}`,
max_tokens_to_sample: 256,
})
.catch((err) => {
if (err instanceof AnthropicBedrock.APIError) {
console.log(err.status); // 400
console.log(err.name); // BadRequestError
console.log(err.headers); // {server: 'nginx', ...}
}
});
}
main().catch(console.error);
Error codes are as follows:
| Status Code | Error Type               |
| ----------- | ------------------------ |
| 400         | BadRequestError          |
| 401         | AuthenticationError      |
| 403         | PermissionDeniedError    |
| 404         | NotFoundError            |
| 422         | UnprocessableEntityError |
| 429         | RateLimitError           |
| >=500       | InternalServerError      |
| N/A         | APIConnectionError       |
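As a sketch, the table above can be combined with instanceof checks; this assumes each error type is exported on the AnthropicBedrock namespace in the same way APIError is:
try {
  await client.completions.create({
    model: 'anthropic.claude-v2:1',
    prompt: `${AnthropicBedrock.HUMAN_PROMPT} your prompt here ${AnthropicBedrock.AI_PROMPT}`,
    max_tokens_to_sample: 256,
  });
} catch (err) {
  if (err instanceof AnthropicBedrock.RateLimitError) {
    // 429: wait and retry later.
  } else if (err instanceof AnthropicBedrock.APIConnectionError) {
    // The request may never have reached the API (network problem, etc.).
  } else {
    throw err;
  }
}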
Certain errors will be automatically retried 2 times by default, with a short exponential backoff. Connection errors (for example, due to a network connectivity problem), 408 Request Timeout, 409 Conflict, 429 Rate Limit, and >=500 Internal errors will all be retried by default.
You can use the maxRetries option to configure or disable this:
// Configure the default for all requests:
const client = new AnthropicBedrock({
maxRetries: 0, // default is 2
});
// Or, configure per-request:
await client.completions.create(
{
prompt: `${AnthropicBedrock.HUMAN_PROMPT} Can you help me effectively ask for a raise at work?${AnthropicBedrock.AI_PROMPT}`,
max_tokens_to_sample: 300,
model: 'anthropic.claude-v2:1',
},
{
maxRetries: 5,
},
);
Requests time out after 10 minutes by default. You can configure this with a timeout option:
// Configure the default for all requests:
const client = new AnthropicBedrock({
timeout: 20 * 1000, // 20 seconds (default is 10 minutes)
});
// Override per-request:
await client.completions.create(
{
prompt: `${AnthropicBedrock.HUMAN_PROMPT} Where can I get a good coffee in my neighbourhood?${AnthropicBedrock.AI_PROMPT}`,
max_tokens_to_sample: 300,
model: 'anthropic.claude-v2:1',
},
{
timeout: 5 * 1000,
},
);
On timeout, an APIConnectionTimeoutError is thrown.
Note that requests which time out will be retried twice by default.
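For example, a sketch that combines a short per-request timeout with maxRetries: 0, so that a request which times out fails immediately instead of being retried:
// Fail fast: a 5-second timeout with retries disabled, for this request only.
await client.completions.create(
  {
    prompt: `${AnthropicBedrock.HUMAN_PROMPT} Your prompt here${AnthropicBedrock.AI_PROMPT}`,
    max_tokens_to_sample: 300,
    model: 'anthropic.claude-v2:1',
  },
  {
    timeout: 5 * 1000,
    maxRetries: 0,
  },
);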
The "raw" Response
returned by fetch()
can be accessed through the .asResponse()
method on the APIPromise
type that all methods return.
You can also use the .withResponse()
method to get the raw Response
along with the parsed data.
const response = await client.completions
.create({
model: 'anthropic.claude-v2:1',
prompt: `${AnthropicBedrock.HUMAN_PROMPT} your prompt here ${AnthropicBedrock.AI_PROMPT}`,
max_tokens_to_sample: 256,
})
.asResponse();
console.log(response.headers.get('X-My-Header'));
console.log(response.statusText); // access the underlying Response object
const { data: completions, response: raw } = await client.completions
.create({
model: 'anthropic.claude-v2:1',
prompt: `${AnthropicBedrock.HUMAN_PROMPT} your prompt here ${AnthropicBedrock.AI_PROMPT}`,
max_tokens_to_sample: 256,
})
.withResponse();
console.log(raw.headers.get('X-My-Header'));
console.log(completions.completion);
By default, this library uses node-fetch in Node, and expects a global fetch function in other environments. If you would prefer to use a global, web-standards-compliant fetch function even in a Node environment (for example, if you are running Node with --experimental-fetch or using NextJS which polyfills with undici), add the following import before your first import from "AnthropicBedrock":
// Tell TypeScript and the package to use the global web fetch instead of node-fetch.
// Note, despite the name, this does not add any polyfills, but expects them to be provided if needed.
import '@anthropic-ai/bedrock-sdk/shims/web';
import AnthropicBedrock from '@anthropic-ai/bedrock-sdk';
To do the inverse, add import "@anthropic-ai/bedrock-sdk/shims/node" (which does import polyfills). This can also be useful if you are getting the wrong TypeScript types for Response; more details here.
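A minimal sketch of the inverse, mirroring the web-shim example above:
// Use the node-fetch-based types and polyfills explicitly.
import '@anthropic-ai/bedrock-sdk/shims/node';
import AnthropicBedrock from '@anthropic-ai/bedrock-sdk';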
You may also provide a custom fetch function when instantiating the client, which can be used to inspect or alter the Request or Response before/after each request:
import { fetch } from 'undici'; // as one example
import AnthropicBedrock from '@anthropic-ai/bedrock-sdk';
const client = new AnthropicBedrock({
fetch: async (url: RequestInfo, init?: RequestInit): Promise<Response> => {
console.log('About to make request', url, init);
const response = await fetch(url, init);
console.log('Got response', response);
return response;
},
});
Note that if given a DEBUG=true environment variable, this library will log all requests and responses automatically.
This is intended for debugging purposes only and may change in the future without notice.
By default, this library uses a stable agent for all http/https requests to reuse TCP connections, eliminating many TCP & TLS handshakes and shaving around 100ms off most requests.
If you would like to disable or customize this behavior, for example to use the API behind a proxy, you can pass an httpAgent which is used for all requests (be they http or https), for example:
import http from 'http';
import AnthropicBedrock from '@anthropic-ai/bedrock-sdk';
import HttpsProxyAgent from 'https-proxy-agent';
// Configure the default for all requests:
const client = new AnthropicBedrock({
httpAgent: new HttpsProxyAgent(process.env.PROXY_URL),
});
// Override per-request:
await client.completions.create(
{
prompt: `${AnthropicBedrock.HUMAN_PROMPT} How does a court case get to the Supreme Court?${AnthropicBedrock.AI_PROMPT}`,
max_tokens_to_sample: 300,
model: 'anthropic.claude-v2:1',
},
{
baseURL: 'http://localhost:8080/test-api',
httpAgent: new http.Agent({ keepAlive: false }),
},
);
This package generally follows SemVer conventions, though certain backwards-incompatible changes may be released as minor versions.
We take backwards-compatibility seriously and work hard to ensure you can rely on a smooth upgrade experience.
We are keen for your feedback; please open an issue with questions, bugs, or suggestions.
TypeScript >= 4.5 is supported.
The following runtimes are supported:
- Node.js
- Deno, using import AnthropicBedrock from "npm:@anthropic-ai/bedrock-sdk"
- Jest with the "node" environment ("jsdom" is not supported at this time)
Note that React Native is not supported at this time.
If you are interested in other runtime environments, please open or upvote an issue on GitHub.