
anthropic-openai-wrapper
A simple npm package that allows you to use the Anthropic API as a drop-in replacement for the OpenAI API. This package converts OpenAI API calls into Anthropic API calls, enabling software that uses the OpenAI API to seamlessly use the Anthropic Claude model without requiring significant changes.
To install the package, use npm:
npm install anthropic-openai-wrapper
First, require the package and initialize an instance of the AnthropicOpenAIWrapper class with your Anthropic API key:
const AnthropicOpenAIWrapper = require('anthropic-openai-wrapper');
const apiKey = 'YOUR_ANTHROPIC_API_KEY';
const wrapper = new AnthropicOpenAIWrapper(apiKey);
Then, you can use the callAPI function to make API calls with the same parameters you would use for the OpenAI API:
const openAIParams = {
  prompt: 'Hello, how are you?',
  max_tokens: 50,
  temperature: 0.8,
};

wrapper.callAPI(openAIParams)
  .then((response) => {
    console.log(response);
  })
  .catch((error) => {
    console.error('Error:', error);
  });
The callAPI function returns a promise that resolves to the formatted response or rejects with an error if the API call fails.
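Because callAPI returns a promise, it also works with async/await. Below is a minimal, self-contained sketch of that usage pattern; the AnthropicOpenAIWrapper class here is a stub standing in for the real package so the snippet runs on its own, and is not the package's actual implementation:

```javascript
// Stub standing in for require('anthropic-openai-wrapper'); the real class
// would forward the call to the Anthropic API. This stub just echoes the prompt.
class AnthropicOpenAIWrapper {
  constructor(apiKey) {
    this.apiKey = apiKey;
  }

  callAPI(params) {
    // A real implementation would perform an HTTP request here.
    return Promise.resolve({
      choices: [{ text: `Echo: ${params.prompt}` }],
    });
  }
}

async function main() {
  const wrapper = new AnthropicOpenAIWrapper('YOUR_ANTHROPIC_API_KEY');
  try {
    // Same parameters as the .then() example above.
    const response = await wrapper.callAPI({ prompt: 'Hello', max_tokens: 50 });
    console.log(response.choices[0].text);
  } catch (error) {
    console.error('Error:', error);
  }
}

main();
```

The try/catch block plays the role of the .catch() handler in the promise-chain version.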
constructor(apiKey)
- apiKey (string): Your Anthropic API key.

Initializes a new instance of the AnthropicOpenAIWrapper class with the provided API key.
callAPI(openAIParams)
- openAIParams (object): The parameters for the API call, using the same format as the OpenAI API.
  - prompt (string): The prompt text.
  - max_tokens (number, optional): The maximum number of tokens to generate. Default is 100.
  - temperature (number, optional): The sampling temperature. Default is 0.7.
  - top_p (number, optional): The top-p sampling parameter. Default is 1.
  - n (number, optional): The number of completions to generate. Default is 1.
  - stop (string or array, optional): The stop sequence(s). Default is null.

Returns a promise that resolves to the formatted response or rejects with an error if the API call fails.
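To illustrate what "converts OpenAI API calls into Anthropic API calls" could look like, here is a hypothetical sketch of the parameter translation. The function name and the exact mapping are assumptions for illustration, not the package's actual internals; it assumes a mapping onto Anthropic's legacy text-completion parameters (prompt framing, max_tokens_to_sample, stop_sequences):

```javascript
// Hypothetical sketch of an OpenAI-to-Anthropic parameter translation.
// The real package's internal mapping may differ; `n` has no direct
// equivalent in Anthropic's legacy completion API and is ignored here.
function toAnthropicParams(openAIParams) {
  const {
    prompt,
    max_tokens = 100,   // defaults mirror those documented above
    temperature = 0.7,
    top_p = 1,
    stop = null,
  } = openAIParams;

  return {
    // Anthropic's legacy completion endpoint expects a Human/Assistant-framed prompt.
    prompt: `\n\nHuman: ${prompt}\n\nAssistant:`,
    max_tokens_to_sample: max_tokens,
    temperature,
    top_p,
    // OpenAI's `stop` may be a string or an array; Anthropic expects an array.
    stop_sequences: stop == null ? [] : [].concat(stop),
  };
}

console.log(toAnthropicParams({ prompt: 'Hello', stop: '\n' }));
```

This kind of translation layer is what lets existing OpenAI-style call sites run unmodified.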
The callAPI function returns a response object formatted to match the structure of the OpenAI API response:
{
  "id": "cmpl-abc123",
  "object": "text_completion",
  "created": 1621234567,
  "model": "claude-v1",
  "choices": [
    {
      "text": "I'm doing well, thank you! How can I assist you today?",
      "index": 0,
      "logprobs": null,
      "finish_reason": "stop"
    }
  ],
  "usage": {
    "prompt_tokens": 5,
    "completion_tokens": 15,
    "total_tokens": 20
  }
}
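Because the response mirrors the OpenAI completion shape, code written against the OpenAI fields keeps working unchanged. A small sketch, using the example response above as a literal:

```javascript
// The `response` literal mirrors the example response structure above.
const response = {
  id: 'cmpl-abc123',
  object: 'text_completion',
  created: 1621234567,
  model: 'claude-v1',
  choices: [
    {
      text: "I'm doing well, thank you! How can I assist you today?",
      index: 0,
      logprobs: null,
      finish_reason: 'stop',
    },
  ],
  usage: { prompt_tokens: 5, completion_tokens: 15, total_tokens: 20 },
};

// Existing OpenAI-style consumer code reads the same fields as before.
const text = response.choices[0].text;
const totalTokens = response.usage.total_tokens;
console.log(text, totalTokens);
```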
This package is released under the MIT License.
FAQs
A simple npm package that allows you to use the Anthropic API as a drop-in replacement for the OpenAI API. This package converts OpenAI API calls into Anthropic API calls, enabling software that uses the OpenAI API to seamlessly use the Anthropic Claude model without requiring significant changes.
The npm package anthropic-openai-wrapper receives a total of 0 weekly downloads. As such, anthropic-openai-wrapper popularity was classified as not popular.
We found that anthropic-openai-wrapper demonstrates an unhealthy version release cadence and project activity because the last version was released a year ago. It has 1 open source maintainer collaborating on the project.