llamaapi

Llama API python SDK

  • 0.1.36
  • PyPI

Llama API Client

LlamaAPI is a Python SDK for interacting with the Llama API. It abstracts away the handling of aiohttp sessions and headers, simplifying interaction with the API.

Installation

You can install the LlamaAPI SDK using pip:

pip install llamaapi

Usage

After installing the SDK, you can use it in your Python projects like so:

import json
from llamaapi import LlamaAPI

# Initialize the llamaapi with your api_token
llama = LlamaAPI("<your_api_token>")

# Define your API request
api_request_json = {
    "messages": [
        {"role": "user", "content": "Extract the desired information from the following passage:\n\nHi!"},
    ],
    "functions": [
        {
            "name": "information_extraction",
            "description": "Extracts the relevant information from the passage.",
            "parameters": {
                "type": "object",
                "properties": {
                    "sentiment": {
                        "title": "sentiment",
                        "type": "string",
                        "description": "the sentiment encountered in the passage",
                    },
                    "aggressiveness": {
                        "title": "aggressiveness",
                        "type": "integer",
                        "description": "a 0-10 score of how aggressive the passage is",
                    },
                    "language": {
                        "title": "language",
                        "type": "string",
                        "description": "the language of the passage",
                    },
                },
                "required": ["sentiment", "aggressiveness", "language"],
            },
        }
    ],
    "stream": False,
    "function_call": {"name": "information_extraction"},
}

# Make your request and handle the response
response = llama.run(api_request_json)
print(json.dumps(response.json(), indent=2))
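The README does not document the shape of the parsed response. Assuming it follows an OpenAI-style chat-completion schema (an assumption, not confirmed above), the function-call arguments could be pulled out like this, shown here against a hypothetical response body so it runs without an API token:

```python
import json

# Hypothetical response body. This structure is an ASSUMPTION: the
# README does not document the response schema, so this mimics an
# OpenAI-style chat-completion payload for illustration only.
response_body = {
    "choices": [
        {
            "message": {
                "function_call": {
                    "name": "information_extraction",
                    # arguments typically arrive as a JSON-encoded string
                    "arguments": json.dumps({
                        "sentiment": "neutral",
                        "aggressiveness": 0,
                        "language": "English",
                    }),
                }
            }
        }
    ]
}

# With a live call this would instead be: response_body = response.json()
call = response_body["choices"][0]["message"]["function_call"]
arguments = json.loads(call["arguments"])
print(arguments)
```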

Other parameters that you can pass in the request JSON are:

{
  ...
  "max_length": 500,
  "temperature": 0.1,
  "top_p": 1.0,
  "frequency_penalty": 1.0,
  ...
}

Note: streaming is not yet supported, so it is recommended to submit requests with "stream": False.
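As a sketch (the exact semantics of each parameter are not documented above, so the comments are assumptions), these optional parameters are merged into the same request dict that is passed to llama.run:

```python
# A request dict combining the chat messages with the optional
# sampling parameters listed above. It is plain data, so it can be
# built and inspected without an API token or network access.
api_request_json = {
    "messages": [
        {"role": "user", "content": "Hi!"},
    ],
    "stream": False,            # streaming is not yet supported
    "max_length": 500,          # assumed: cap on generated output length
    "temperature": 0.1,         # assumed: lower = more deterministic text
    "top_p": 1.0,               # assumed: nucleus-sampling threshold
    "frequency_penalty": 1.0,   # assumed: discourages repeated tokens
}

# The request would then be sent exactly as in the example above:
# response = llama.run(api_request_json)
print(sorted(api_request_json.keys()))
```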

Change Log

Version 0.1: Initial release

Contributing

We welcome contributions to this project. Please see the Contributing Guidelines for more details.

License

llamaapi SDK is licensed under the MIT License. Please see the License File for more details.
