
Granite IO Processing is a framework that enables you to transform how a user calls or infers an IBM Granite model and how the output from the model is returned to the user. In other words, the framework allows you to extend the functionality of calling the model.
We recommend using a Python virtual environment with Python 3.10+. Here is how to set up a virtual environment using Python venv:
python3 -m venv granite_io_venv
source granite_io_venv/bin/activate
[!TIP] If you use pyenv, Conda Miniforge, or other such tools for Python version management, create the virtual environment with that tool instead of venv. Otherwise, you may have issues with installed packages not being found, as they are linked to your Python version management tool and not to venv.
There are two ways to install the Granite IO Processor:
To install from release (PyPI package):
python3 -m venv granite_io_venv
source granite_io_venv/bin/activate
pip install granite-io
python -m nltk.downloader punkt_tab
[!NOTE] granite-io uses the NLTK Punkt sentence tokenizer for extracting contents when parsing output from a model. The command above shows how to install the required NLTK data. Check out Installing NLTK Data for more detailed instructions.
To install from source (GitHub repository):
python3 -m venv granite_io_venv
source granite_io_venv/bin/activate
git clone https://github.com/ibm-granite/granite-io
cd granite-io
pip install -e .
python -m nltk.downloader punkt_tab
[!NOTE] granite-io uses the NLTK Punkt sentence tokenizer for extracting contents when parsing output from a model. The command above shows how to install the required NLTK data. Check out Installing NLTK Data for more detailed instructions.
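Whichever installation route you choose, you can also fetch the tokenizer data from within Python instead of the command line. The snippet below is a minimal equivalent using the standard NLTK API; it downloads the same punkt_tab resource as the command above.
import nltk

# Download the Punkt sentence tokenizer data that granite-io uses when parsing model output
nltk.download("punkt_tab")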
Sample code snippet showing how to use the framework:
from granite_io import make_backend, make_io_processor
from granite_io.types import ChatCompletionInputs, UserMessage
model_name = "granite3.2:8b"
io_processor = make_io_processor(
    model_name, backend=make_backend("openai", {"model_name": model_name})
)
messages = [
    UserMessage(
        content="What's the fastest way for a seller to visit all the cities in their region?",
    )
]
# Without Thinking
outputs = io_processor.create_chat_completion(ChatCompletionInputs(messages=messages))
print("------ WITHOUT THINKING ------")
print(outputs.results[0].next_message.content)
# With Thinking
outputs = io_processor.create_chat_completion(
    ChatCompletionInputs(messages=messages, thinking=True)
)
print("------ WITH THINKING ------")
print(">> Thoughts:")
print(outputs.results[0].next_message.reasoning_content)
print(">> Response:")
print(outputs.results[0].next_message.content)
[!IMPORTANT] To get started with the examples, make sure you have followed the Installation steps first. You will need additional packages to be able to run the OpenAI example. They can be installed by running pip install -e "granite-io[openai]". Replace the package name granite-io with a dot (.) if installing from source. To be able to run the above code snippet, you will also need an Ollama server running locally and the IBM Granite 3.2 model cached (ollama pull granite3.2:8b).
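If your Ollama server is not reachable at the backend's default endpoint, you can point the OpenAI-compatible backend at it explicitly. The sketch below is illustrative only: the openai_base_url and openai_api_key configuration keys are assumptions about the backend's options rather than settings documented here, while http://localhost:11434/v1 is Ollama's usual OpenAI-compatible endpoint.
from granite_io import make_backend, make_io_processor

model_name = "granite3.2:8b"
# NOTE: the configuration keys below are assumed; check the backend options
# supported by your installed version of granite-io.
backend = make_backend(
    "openai",
    {
        "model_name": model_name,
        "openai_base_url": "http://localhost:11434/v1",  # Ollama's OpenAI-compatible API
        "openai_api_key": "ollama",  # placeholder; Ollama does not validate the key
    },
)
io_processor = make_io_processor(model_name, backend=backend)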
To help you get up and running as quickly as possible with the Granite IO Processing framework, check out the following resources, which further demonstrate how to use the framework:
[!IMPORTANT] To get started with the examples, make sure you have followed the Installation steps first. You will need additional packages to be able to run the examples. They can be installed by running pip install -e "granite-io[openai]" and pip install -e "granite-io[litellm]". Replace the package name granite-io with a dot (.) if installing from source. You will also need an Ollama server running locally and the IBM Granite 3.2 model cached (ollama pull granite3.2:8b).
[!IMPORTANT] To get started with the examples, make sure you have followed the Installation steps first. You will also need additional packages to be able to run the Jupyter notebooks. They can be installed by running pip install -e "granite-io[transformers]" and pip install -e "granite-io[notebook]". Replace the package name granite-io with a dot (.) if installing from source. The notebooks can then be run with the following command: jupyter notebook <path_to_notebook>.
For more information about architecture and design decisions, refer to docs/design.md.
Check out our contributing guide to learn how to contribute.