Cloud Security Alliance AI Foundation Model API Clients

You can get this library at https://pypi.org/project/csa-ai-foundation-model-api-clients/ or via:

pip install csa-ai-foundation-model-api-clients

This Python library (csa_ai_foundation_model_api_clients) provides API access to text completions for:

  • Anthropic Claude 3
  • Google Gemini 1.5
  • OpenAI ChatGPT 4 (gpt-4o)

with plans to add:

  • OpenAI ChatGPT 4 batch mode

You can set the following options:

  • system prompt (aka developer prompt, persona)
  • user prompt (aka instructions)
  • user data (as part of the user prompt)
  • temperature
  • max_tokens

with plans to add:

  • top_p
  • top_k
  • model-specific parameters

Please note this code does not have tests or good error handling, but it works. Rate-limit handling is also on the to-do list; for now, if you use this tool, add a sleep between calls to slow it down (see the pacing sketch after the library example below).

The code looks for the API key in an environment variable:

  • ANTHROPIC_CLAUDE_API_KEY
  • GOOGLE_GEMINI_API_KEY
  • OPENAI_CHATGPT_API_KEY
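
A quick up-front check avoids a wasted run. This minimal sketch is not part of the library; it simply verifies the Claude key (listed above) is set before you construct a client:

import os
import sys

# Fail early if the key for the chosen provider is missing
# (variable names as listed above).
if 'ANTHROPIC_CLAUDE_API_KEY' not in os.environ:
    sys.exit("Please export ANTHROPIC_CLAUDE_API_KEY before running.")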

Examples:

Using this as a library:

#!/usr/bin/env python3

from csa_ai_foundation_model_api_clients import FoundationModelAPIClient

def main():
    model = 'claude'
    system_prompt = "You are a helpful assistant who answers in rhyme."
    user_prompt = "What is the capital of "
    user_data = "France?"
    output_file = 'claude-response.json'

    FoundationModelAPIClient(
        model=model,
        system_prompt=system_prompt,
        system_prompt_type="text",
        user_prompt=user_prompt,
        user_prompt_type="text",
        user_data=user_data,
        user_data_type="text",
        temperature=0.7,
        max_tokens=100,
        output_file=output_file
    )

if __name__ == '__main__':
    main()
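
If you are making many calls, the crude pacing mentioned above can look like the sketch below. The one-second delay is an arbitrary choice rather than a documented provider limit, and the constructor arguments simply mirror the example above:

#!/usr/bin/env python3

import time

from csa_ai_foundation_model_api_clients import FoundationModelAPIClient

def main():
    # Same arguments as the example above, repeated for a small batch of
    # inputs, with a sleep between calls as crude rate-limit pacing.
    questions = ["France?", "Japan?", "Brazil?"]

    for i, question in enumerate(questions):
        FoundationModelAPIClient(
            model='claude',
            system_prompt="You are a helpful assistant who answers in rhyme.",
            system_prompt_type="text",
            user_prompt="What is the capital of ",
            user_prompt_type="text",
            user_data=question,
            user_data_type="text",
            temperature=0.7,
            max_tokens=100,
            output_file=f'claude-response-{i}.json'
        )
        time.sleep(1)  # arbitrary delay; tune for your provider's limits

if __name__ == '__main__':
    main()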

Using this as a command-line tool by calling it as a module (note that when used as a command-line tool, all prompts MUST currently be supplied as files):

python3 -m csa_ai_foundation_model_api_clients.csa_ai_foundation_model_api_clients \
    --model chatgpt \
    --system system-prompt.txt \
    --user-prompt user-prompt.txt \
    --user-data user-data.txt \
    --output output.json \
    --temperature 0.9 \
    --max_tokens 2000
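
The response is written as JSON to the file given by output_file (or --output on the command line). The structure of that JSON is not documented here, so this sketch simply loads the file and pretty-prints whatever was saved:

import json

# Load the response file produced by the command above and pretty-print it.
# Nothing specific is assumed about its keys.
with open('output.json') as f:
    response = json.load(f)

print(json.dumps(response, indent=2))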
