Chat Experiments, Simplified
💬🔬
ChatLab is a Python package that makes it easy to experiment with OpenAI's chat models. It provides a simple interface for chatting with the models and a way to register functions that can be called from the chat model.
Best yet, it's interactive in the notebook!
```python
import chatlab
import random

def flip_a_coin():
    '''Returns heads or tails'''
    return random.choice(['heads', 'tails'])

chat = chatlab.Chat()
chat.register(flip_a_coin)
await chat("Please flip a coin for me")
```
Input:
{}
Output:
"tails"
It landed on tails!
In the notebook, text will stream into a Markdown output and function inputs and outputs are a nice collapsible display, like with ChatGPT Plugins.
TODO: Include GIF/mp4 of this in action
```
pip install chatlab
```
You'll need to set your OPENAI_API_KEY environment variable. You can find your API key on your OpenAI account page. I recommend setting it in a .env file when working locally. On hosted notebook environments, set it in your Secrets to keep it safe from prying LLM eyes.
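To fail fast when the key is missing, you could drop a small check into the notebook's first cell. A minimal sketch (`require_api_key` is a hypothetical helper name, not part of chatlab):

```python
import os

def require_api_key() -> str:
    # Hypothetical helper: raise a clear error up front instead of failing
    # mid-conversation when the OpenAI client first needs the key.
    key = os.environ.get("OPENAI_API_KEY")
    if not key:
        raise RuntimeError("Set the OPENAI_API_KEY environment variable")
    return key
```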
What do Chats enable you to do? 💬

Where Chats take it to the next level is with Chat Functions. You can register functions for the Chat to call. You may recall this kind of behavior from ChatGPT Plugins. Now you can take this even further with your own custom code.
As an example, let's give the large language models the ability to tell time.
```python
from datetime import datetime
from typing import Optional

from pydantic import BaseModel
from pytz import all_timezones, timezone, utc

def what_time(tz: Optional[str] = None):
    '''Current time, defaulting to UTC'''
    if tz is None:
        tz = utc
    elif tz in all_timezones:
        tz = timezone(tz)
    else:
        return 'Invalid timezone'
    return datetime.now(tz).strftime('%I:%M %p')

class WhatTime(BaseModel):
    tz: Optional[str] = None
```
Let's break this down. what_time is the function we're going to provide access to. Its docstring forms the description for the model, while the schema comes from the pydantic BaseModel called WhatTime.
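Before registering it, it can be worth sanity-checking the function directly. Reproducing `what_time` here (with `utc` as the explicit default, so the docstring holds), its output varies with the clock:

```python
from datetime import datetime
from typing import Optional

from pytz import all_timezones, timezone, utc

def what_time(tz: Optional[str] = None):
    '''Current time, defaulting to UTC'''
    if tz is None:
        tz = utc
    elif tz in all_timezones:
        tz = timezone(tz)
    else:
        return 'Invalid timezone'
    return datetime.now(tz).strftime('%I:%M %p')

# Direct calls, no chat model involved:
print(what_time())              # current UTC time
print(what_time("US/Pacific"))  # current time in that zone
print(what_time("Not/AZone"))   # prints "Invalid timezone"
```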
```python
import chatlab

chat = chatlab.Chat()

# Register our function
chat.register(what_time, WhatTime)
```
After that, we can call chat with direct strings (which are turned into user messages) or using simple message makers from chatlab named user and system.
```python
await chat("What time is it?")
```
Input:
{}
Output:
"11:19 AM"
The current time is 11:19 AM.
The chatlab package exports:

Chat

The Chat class is the main way to chat using OpenAI's models. It keeps a history of your chat in Chat.messages.

Chat.submit

submit is how you send all the currently built-up messages over to OpenAI. Markdown output will display responses from the assistant.
```python
await chat.submit('What would a parent who says "I have to play zone defense" mean?')
# Markdown response inline

chat.messages
```
[{'role': 'user',
'content': 'What does a parent of three kids mean by "I have to play zone defense"?'},
{'role': 'assistant',
'content': 'When a parent of three kids says "I have to play zone defense," it means that they...
Chat.register

You can register functions with Chat.register to make them available to the chat model. The function's docstring becomes the description of the function, while the schema is derived from the pydantic.BaseModel passed in.
```python
from datetime import datetime
from typing import Optional

from pydantic import BaseModel
from pytz import all_timezones, timezone, utc

class WhatTime(BaseModel):
    tz: Optional[str] = None

def what_time(tz: Optional[str] = None):
    '''Current time, defaulting to UTC'''
    if tz is None:
        tz = utc
    elif tz in all_timezones:
        tz = timezone(tz)
    else:
        return 'Invalid timezone'
    return datetime.now(tz).strftime('%I:%M %p')

chat.register(what_time, WhatTime)
```
Chat.messages

The raw messages sent to and received from OpenAI. If you hit a token limit, you can remove old messages from the list to make room for more.

```python
chat.messages = chat.messages[-100:]
```
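Trimming by message count is a blunt instrument, since messages vary widely in length. A token-aware trim could look like this sketch (the chars-per-token estimate is a rough assumption; swap in a real tokenizer such as tiktoken for accurate counts):

```python
def trim_messages(messages, max_tokens, count_tokens=None):
    """Keep the most recent messages whose estimated token total fits max_tokens."""
    if count_tokens is None:
        # Rough heuristic: ~4 characters per token for English text.
        count_tokens = lambda m: max(1, len(m.get("content", "")) // 4)
    kept, total = [], 0
    for message in reversed(messages):  # walk newest to oldest
        cost = count_tokens(message)
        if total + cost > max_tokens:
            break
        kept.append(message)
        total += cost
    return list(reversed(kept))  # restore chronological order

# e.g. chat.messages = trim_messages(chat.messages, max_tokens=3000)
```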
human/user

These functions create a message from the user to the chat model.

```python
from chatlab import human

human("How are you?")
```

{ "role": "user", "content": "How are you?" }
narrate/system

system messages, also called narrate in chatlab, allow you to steer the model in a direction. You can use these to provide context without it being shown to the user. One common use is to include them as initial context for the conversation.

```python
from chatlab import narrate

narrate("You are a large bird")
```

{ "role": "system", "content": "You are a large bird" }
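Putting the two makers together, a system message typically seeds the conversation before the user's first visible turn. A sketch using stand-ins that match the documented message shapes (in real code, `from chatlab import human, narrate`; here we only build the list):

```python
# Stand-ins matching chatlab's documented message shapes:
def human(content: str) -> dict:
    return {"role": "user", "content": content}

def narrate(content: str) -> dict:
    return {"role": "system", "content": content}

# Hidden steering context followed by the user's first turn:
opening = [
    narrate("You are a large bird"),
    human("What do you eat?"),
]
```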
This project uses poetry for dependency management. To get started, clone the repo and run

```
poetry install -E dev -E test
```

We use ruff for linting and mypy for type checking.
Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.