# Bedrock LLM Bridge

A universal Amazon Bedrock LLM bridge that exposes multiple model families (Anthropic Claude, Meta Llama, and more) through a single unified interface.
## 🚀 Features
- **Multi-Model Support**: Anthropic Claude, Meta Llama, and other Bedrock models
- **Abstract Model Pattern**: Extensible architecture that makes adding new models straightforward
- **Type Safety**: Full TypeScript support with comprehensive type definitions
- **Streaming Support**: Native streaming for models that support it
- **AWS Integration**: Seamless AWS SDK integration via the Bedrock Runtime client
- **Error Handling**: Robust error handling with standardized error types
- **Flexible Configuration**: Easy AWS credential and region management
- **Model Switching**: Switch models at runtime
## 📦 Installation

```bash
# pnpm
pnpm add bedrock-llm-bridge llm-bridge-spec @aws-sdk/client-bedrock-runtime @smithy/node-http-handler zod

# npm
npm install bedrock-llm-bridge llm-bridge-spec @aws-sdk/client-bedrock-runtime @smithy/node-http-handler zod

# yarn
yarn add bedrock-llm-bridge llm-bridge-spec @aws-sdk/client-bedrock-runtime @smithy/node-http-handler zod
```
## 🏗️ Architecture

This package follows the Abstract Model Pattern for maximum extensibility:

```text
bedrock-llm-bridge/
├── models/
│   ├── base/AbstractModel        # Abstract base class
│   ├── anthropic/AnthropicModel  # Claude implementation
│   └── meta/MetaModel            # Llama implementation
├── bridge/BedrockBridge          # Main bridge class
├── factory/                      # Factory functions
└── utils/error-handler           # Error handling
```
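To make the pattern concrete, here is a minimal sketch of the base class, assuming the method names from the "Adding New Models" section below and the type names from `llm-bridge-spec`; the actual signatures in the package may differ:

```typescript
import type { LlmBridgePrompt, LlmBridgeResponse, InvokeOption } from 'llm-bridge-spec';

// Each model family (Anthropic, Meta, ...) subclasses AbstractModel and
// implements the mapping between the unified bridge types and its
// provider-native wire format. BedrockBridge owns the Bedrock Runtime
// client and delegates request/response shaping to the model it wraps.
abstract class AbstractModel {
  protected abstract getModelFamily(): string;
  protected abstract transformRequest(
    prompt: LlmBridgePrompt,
    options?: InvokeOption,
  ): Promise<unknown>;
  protected abstract transformResponse(raw: unknown): LlmBridgeResponse;
}
```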
## 🎯 Quick Start

### Basic Usage with Claude

```typescript
import { createBedrockBridge } from 'bedrock-llm-bridge';

const bridge = createBedrockBridge({
  region: 'us-east-1',
  model: 'anthropic.claude-3-sonnet-20240229-v1:0',
  maxTokens: 1000,
  temperature: 0.7,
});

const response = await bridge.invoke({
  messages: [{ role: 'user', content: [{ type: 'text', text: 'Hello!' }] }],
});

console.log(response.choices[0].message.content[0].text);
```
### Using Meta Llama

```typescript
import { createBedrockBridge } from 'bedrock-llm-bridge';

const bridge = createBedrockBridge({
  region: 'us-west-2',
  model: 'meta.llama2-70b-chat-v1',
  maxTokens: 2048,
  temperature: 0.6,
});

const response = await bridge.invoke({
  messages: [{ role: 'user', content: [{ type: 'text', text: 'Explain quantum computing' }] }],
});
```
### Streaming (Claude 3 Models)

```typescript
const stream = bridge.invokeStream({
  messages: [{ role: 'user', content: [{ type: 'text', text: 'Tell me a story' }] }],
});

for await (const chunk of stream) {
  const text = chunk.choices[0]?.message?.content?.[0]?.text;
  if (text) {
    process.stdout.write(text);
  }
}
```
## 🔧 Factory Functions

### Main Factory

```typescript
import { createBedrockBridge } from 'bedrock-llm-bridge';

const bridge = createBedrockBridge({
  region: 'us-east-1',
  model: 'anthropic.claude-3-sonnet-20240229-v1:0',
  credentials: {
    accessKeyId: 'your-access-key',
    secretAccessKey: 'your-secret-key',
  },
  maxTokens: 1000,
  temperature: 0.7,
});
```

### Convenience Factories

```typescript
import {
  createAnthropicBridge,
  createMetaBridge,
  createDefaultBedrockBridge,
} from 'bedrock-llm-bridge';

// Claude-specific factory
const claudeBridge = createAnthropicBridge({
  region: 'us-east-1',
  model: 'anthropic.claude-3-sonnet-20240229-v1:0',
  temperature: 0.8,
});

// Llama-specific factory
const llamaBridge = createMetaBridge({
  region: 'us-west-2',
  model: 'meta.llama2-70b-chat-v1',
  maxTokens: 2048,
});

// Uses a default model when none is specified
const defaultBridge = createDefaultBedrockBridge({
  region: 'us-east-1',
  temperature: 0.5,
});
```
## 📋 Supported Models

### Anthropic Claude Models

- `anthropic.claude-3-opus-20240229-v1:0` - Most capable Claude 3 model
- `anthropic.claude-3-sonnet-20240229-v1:0` - Balanced performance and speed
- `anthropic.claude-3-haiku-20240307-v1:0` - Fastest Claude 3 model
- `anthropic.claude-v2:1` - Claude 2.1
- `anthropic.claude-v2` - Claude 2.0
- `anthropic.claude-instant-v1` - Fast and cost-effective

### Meta Llama Models

- `meta.llama2-70b-chat-v1` - Llama 2 70B Chat
- `meta.llama2-13b-chat-v1` - Llama 2 13B Chat
- `meta.llama2-7b-chat-v1` - Llama 2 7B Chat

### Amazon Titan Models (Coming Soon)

- `amazon.titan-text-express-v1`
- `amazon.titan-text-lite-v1`
## ⚙️ Configuration

```typescript
interface BedrockConfig {
  region: string;               // AWS region, e.g. 'us-east-1'
  model: string;                // Bedrock model ID (see Supported Models)
  credentials?: {
    accessKeyId: string;
    secretAccessKey: string;
    sessionToken?: string;      // for temporary credentials
  };
  maxTokens?: number;           // maximum tokens to generate
  temperature?: number;         // sampling temperature
  topP?: number;                // nucleus sampling
  topK?: number;                // top-k sampling (model-dependent)
  stopSequences?: string[];     // sequences that stop generation
}
```
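Since `zod` is among the installed dependencies, configuration can be validated up front. A minimal sketch, assuming a schema that mirrors `BedrockConfig` above (the package's own validation, if any, may differ):

```typescript
import { z } from 'zod';

// Hypothetical schema mirroring BedrockConfig; bounds are illustrative.
const BedrockConfigSchema = z.object({
  region: z.string(),
  model: z.string(),
  credentials: z
    .object({
      accessKeyId: z.string(),
      secretAccessKey: z.string(),
      sessionToken: z.string().optional(),
    })
    .optional(),
  maxTokens: z.number().int().positive().optional(),
  temperature: z.number().min(0).max(1).optional(),
  topP: z.number().min(0).max(1).optional(),
  topK: z.number().int().positive().optional(),
  stopSequences: z.array(z.string()).optional(),
});

// Throws a descriptive ZodError if the config is malformed.
const config = BedrockConfigSchema.parse({
  region: 'us-east-1',
  model: 'anthropic.claude-3-sonnet-20240229-v1:0',
});
```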
## 🔐 AWS Authentication

### Method 1: Environment Variables

```bash
export AWS_ACCESS_KEY_ID=your-access-key
export AWS_SECRET_ACCESS_KEY=your-secret-key
export AWS_REGION=us-east-1
```

```typescript
// Credentials are picked up automatically from the environment.
const bridge = createBedrockBridge({
  region: 'us-east-1',
  model: 'anthropic.claude-3-sonnet-20240229-v1:0',
});
```

### Method 2: AWS Credentials File

```ini
# ~/.aws/credentials
[default]
aws_access_key_id = your-access-key
aws_secret_access_key = your-secret-key
```

### Method 3: IAM Roles (EC2/Lambda)

```typescript
// No explicit credentials needed; the instance or execution role is used.
const bridge = createBedrockBridge({
  region: 'us-east-1',
  model: 'anthropic.claude-3-sonnet-20240229-v1:0',
});
```

### Method 4: Explicit Credentials

```typescript
const bridge = createBedrockBridge({
  region: 'us-east-1',
  model: 'anthropic.claude-3-sonnet-20240229-v1:0',
  credentials: {
    accessKeyId: process.env.AWS_ACCESS_KEY_ID!,
    secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!,
  },
});
```
## 🎭 Model Capabilities

```typescript
// Inspect what the current model supports.
const metadata = await bridge.getMetadata();
console.log(metadata);

const manifest = bridge.getManifest();
console.log(manifest.capabilities.supportsStreaming);
console.log(manifest.capabilities.supportsMultiTurn);
console.log(manifest.capabilities.supportsToolCall);
```
## 🚦 Error Handling

The bridge provides comprehensive error handling with standardized error types:

```typescript
import {
  ServiceUnavailableError,
  InvalidRequestError,
  AuthenticationError,
  ModelNotSupportedError,
  NetworkError,
} from 'llm-bridge-spec';

try {
  const response = await bridge.invoke(prompt);
} catch (error) {
  if (error instanceof AuthenticationError) {
    console.error('AWS credentials invalid or insufficient permissions');
  } else if (error instanceof ModelNotSupportedError) {
    console.error('Model not available in region:', error.requestedModel);
    console.log('Supported models:', error.supportedModels);
  } else if (error instanceof ServiceUnavailableError) {
    console.error('Bedrock service temporarily unavailable');
    console.log('Retry after:', error.retryAfter);
  } else if (error instanceof InvalidRequestError) {
    console.error('Invalid request parameters:', error.message);
  }
}
```
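Because `ServiceUnavailableError` carries a `retryAfter` hint, a simple retry wrapper is easy to build. A hedged sketch (this helper is not part of the package):

```typescript
import { ServiceUnavailableError } from 'llm-bridge-spec';

// Retries only transient availability errors, honoring retryAfter when present.
async function invokeWithRetry<T>(call: () => Promise<T>, maxAttempts = 3): Promise<T> {
  for (let attempt = 1; ; attempt++) {
    try {
      return await call();
    } catch (error) {
      if (!(error instanceof ServiceUnavailableError) || attempt >= maxAttempts) {
        throw error;
      }
      // Assumes retryAfter is in seconds; falls back to exponential backoff.
      const delayMs = (error.retryAfter ?? 2 ** attempt) * 1000;
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
}

// Usage:
const response = await invokeWithRetry(() => bridge.invoke(prompt));
```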
## 🔄 Model Switching

```typescript
const bridge = createBedrockBridge({
  region: 'us-east-1',
  model: 'anthropic.claude-3-sonnet-20240229-v1:0',
});

// Switch to a faster model at runtime
bridge.setModel('anthropic.claude-3-haiku-20240307-v1:0');

console.log(bridge.getCurrentModel());
console.log(bridge.getSupportedModels());
```
## 🌍 Multi-Region Support

```typescript
const usEastBridge = createBedrockBridge({
  region: 'us-east-1',
  model: 'anthropic.claude-3-sonnet-20240229-v1:0',
});

const usWestBridge = createBedrockBridge({
  region: 'us-west-2',
  model: 'meta.llama2-70b-chat-v1',
});

// List the models available in a given region
const availableModels = await usEastBridge.getModelsForRegion('us-east-1');
console.log(availableModels);
```
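One common use of multiple regions is failover. A minimal sketch, assuming the same model is enabled in both regions (this helper is not part of the package):

```typescript
import { createBedrockBridge } from 'bedrock-llm-bridge';
import { ServiceUnavailableError } from 'llm-bridge-spec';

// Two bridges pointing at the same model in different regions.
const primary = createBedrockBridge({
  region: 'us-east-1',
  model: 'anthropic.claude-3-sonnet-20240229-v1:0',
});
const fallback = createBedrockBridge({
  region: 'us-west-2',
  model: 'anthropic.claude-3-sonnet-20240229-v1:0',
});

async function invokeWithFailover(prompt: Parameters<typeof primary.invoke>[0]) {
  try {
    return await primary.invoke(prompt);
  } catch (error) {
    // Fail over only on availability errors; rethrow everything else.
    if (error instanceof ServiceUnavailableError) {
      return fallback.invoke(prompt);
    }
    throw error;
  }
}
```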
## 💰 Cost Optimization

```typescript
const response = await bridge.invoke(prompt);

if (response.usage) {
  console.log('Input tokens:', response.usage.promptTokens);
  console.log('Output tokens:', response.usage.completionTokens);

  // Example rates for Claude 3 Sonnet (USD per 1K tokens); check current AWS pricing.
  const inputCost = (response.usage.promptTokens * 0.003) / 1000;
  const outputCost = (response.usage.completionTokens * 0.015) / 1000;
  console.log(`Estimated cost: $${(inputCost + outputCost).toFixed(6)}`);
}
```
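For multi-model workloads, a small rate table keeps the estimate in one place. A sketch with illustrative rates only; always check current Amazon Bedrock pricing:

```typescript
// Illustrative USD rates per 1K tokens; not authoritative.
const PRICING_PER_1K: Record<string, { input: number; output: number }> = {
  'anthropic.claude-3-sonnet-20240229-v1:0': { input: 0.003, output: 0.015 },
  'anthropic.claude-3-haiku-20240307-v1:0': { input: 0.00025, output: 0.00125 },
};

function estimateCost(model: string, promptTokens: number, completionTokens: number): number {
  const rates = PRICING_PER_1K[model];
  if (!rates) return NaN; // unknown model: no estimate
  return (promptTokens * rates.input + completionTokens * rates.output) / 1000;
}
```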
## 🧪 Testing

```bash
# Run the full test suite
pnpm test

# With coverage
pnpm test:coverage

# Watch mode
pnpm test:watch

# Run a single test file
pnpm test bedrock-bridge.test.ts
```
## 🔧 Adding New Models

The abstract model pattern makes it easy to add support for new Bedrock models:

```typescript
export class NewProviderModel extends AbstractModel {
  protected getModelFamily(): string {
    return 'new-provider';
  }

  protected async transformRequest(prompt: LlmBridgePrompt, options?: InvokeOption) {
    // Map the unified LlmBridgePrompt to the provider's native request body.
    throw new Error('TODO: implement request mapping');
  }

  protected transformResponse(response: any): LlmBridgeResponse {
    // Map the provider's raw response back to the unified LlmBridgeResponse.
    throw new Error('TODO: implement response mapping');
  }
}

export function createNewModelBridge(config: BedrockConfig) {
  return new BedrockBridge(new NewProviderModel(config));
}
```
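Once the factory is exported, the new model is used like any built-in one (hypothetical model ID):

```typescript
const bridge = createNewModelBridge({
  region: 'us-east-1',
  model: 'new-provider.example-model-v1', // hypothetical model ID
});
```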
## 📊 Performance Considerations

### Model Selection

```typescript
// Fastest and cheapest: good for simple or latency-sensitive tasks
const fastBridge = createBedrockBridge({
  region: 'us-east-1',
  model: 'anthropic.claude-3-haiku-20240307-v1:0',
});

// Most capable: good for complex reasoning
const capableBridge = createBedrockBridge({
  region: 'us-east-1',
  model: 'anthropic.claude-3-opus-20240229-v1:0',
});

// Balanced performance and cost
const balancedBridge = createBedrockBridge({
  region: 'us-east-1',
  model: 'anthropic.claude-3-sonnet-20240229-v1:0',
});
```
### Streaming for Long Responses

```typescript
// Prefer streaming when a long response is expected, so output can be
// shown incrementally instead of waiting for the full completion.
if (expectedResponseLength > 500) {
  const stream = bridge.invokeStream(prompt);
  // consume the stream as shown in Quick Start
} else {
  const response = await bridge.invoke(prompt);
}
```
## 🔗 Related Packages

## 🔮 Roadmap

## 🤝 Contributing

This project follows the Git Workflow Guide.

## 📄 License

This project is licensed under the MIT License.

## 🙏 Acknowledgments

Made with ❤️ by the LLM Bridge Team