
Strumento is Partitura's built-in AI model router: the tool each agent uses to play its role. It routes Claude Code requests to your configured model providers and lets you switch models on the fly with the /model command.
First, ensure you have Claude Code installed:
npm install -g @anthropic-ai/claude-code
Then, install Strumento:
npm install -g strumento-cli
Create and configure your ~/.strumento/config.json file. For more details, you can refer to config.example.json.
The config.json file has several key sections:
PROXY_URL (optional): You can set a proxy for API requests, for example: "PROXY_URL": "http://127.0.0.1:7890".
LOG (optional): You can enable logging by setting it to true. When set to false, no log files will be created. Default is true.
LOG_LEVEL (optional): Set the logging level. Available options are: "fatal", "error", "warn", "info", "debug", "trace". Default is "debug".
Logging Systems: Strumento uses two separate logging systems: one writes to the ~/.strumento/logs/ directory with filenames like strumento-*.log, the other to ~/.strumento/strumento.log.
APIKEY (optional): You can set a secret key to authenticate requests. When set, clients must provide this key in the Authorization header (e.g., Bearer your-secret-key) or the x-api-key header. Example: "APIKEY": "your-secret-key".
HOST (optional): You can set the host address for the server. If APIKEY is not set, the host will be forced to 127.0.0.1 for security reasons to prevent unauthorized access. Example: "HOST": "0.0.0.0".
NON_INTERACTIVE_MODE (optional): When set to true, enables compatibility with non-interactive environments like GitHub Actions, Docker containers, or other CI/CD systems. This sets appropriate environment variables (CI=true, FORCE_COLOR=0, etc.) and configures stdin handling to prevent the process from hanging in automated environments. Example: "NON_INTERACTIVE_MODE": true.
Providers: Used to configure different model providers.
Router: Used to set up routing rules. default specifies the default model, which will be used for all requests if no other route is configured.
API_TIMEOUT_MS: Specifies the timeout for API calls in milliseconds.
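To make NON_INTERACTIVE_MODE more concrete, here is a hypothetical sketch of the environment setup it describes. It is an assumption based only on the variables named above (CI, FORCE_COLOR), not Strumento's actual implementation, which may set more.

```javascript
// Hypothetical sketch of what NON_INTERACTIVE_MODE sets up, based only on
// the variables named in the description above; the real code may differ.
function applyNonInteractiveMode(env) {
  env.CI = "true";        // signal CI mode to downstream tools
  env.FORCE_COLOR = "0";  // disable ANSI color codes in output
  return env;
}

const env = applyNonInteractiveMode({});
console.log(env.CI, env.FORCE_COLOR); // prints: true 0
```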
Strumento supports environment variable interpolation for secure API key management. You can reference environment variables in your config.json using either $VAR_NAME or ${VAR_NAME} syntax:
{
  "OPENAI_API_KEY": "$OPENAI_API_KEY",
  "GEMINI_API_KEY": "${GEMINI_API_KEY}",
  "Providers": [
    {
      "name": "openai",
      "api_base_url": "https://api.openai.com/v1/chat/completions",
      "api_key": "$OPENAI_API_KEY",
      "models": ["gpt-5", "gpt-5-mini"]
    }
  ]
}
This allows you to keep sensitive API keys in environment variables instead of hardcoding them in configuration files. The interpolation works recursively through nested objects and arrays.
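To illustrate how recursive interpolation could work, here is a minimal sketch. It is illustrative only (not Strumento's actual code), and assumes the two documented syntaxes, $VAR_NAME and ${VAR_NAME}:

```javascript
// Minimal sketch of $VAR / ${VAR} interpolation over nested config values.
// Illustrative only; not Strumento's actual implementation.
function interpolate(value, env = process.env) {
  if (typeof value === "string") {
    // Expand ${VAR} first, then bare $VAR references.
    return value
      .replace(/\$\{([A-Z0-9_]+)\}/gi, (_, name) => env[name] ?? "")
      .replace(/\$([A-Z0-9_]+)/gi, (_, name) => env[name] ?? "");
  }
  if (Array.isArray(value)) return value.map((v) => interpolate(v, env));
  if (value && typeof value === "object") {
    // Recurse through nested objects, keeping keys unchanged.
    return Object.fromEntries(
      Object.entries(value).map(([k, v]) => [k, interpolate(v, env)])
    );
  }
  return value; // numbers, booleans, null pass through unchanged
}

const config = interpolate(
  { Providers: [{ api_key: "$OPENAI_API_KEY" }] },
  { OPENAI_API_KEY: "sk-test" }
);
console.log(config.Providers[0].api_key); // prints: sk-test
```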
Here is a comprehensive example:
{
  "APIKEY": "your-secret-key",
  "PROXY_URL": "http://127.0.0.1:7890",
  "LOG": true,
  "API_TIMEOUT_MS": 600000,
  "NON_INTERACTIVE_MODE": false,
  "Providers": [
    {
      "name": "openrouter",
      "api_base_url": "https://openrouter.ai/api/v1/chat/completions",
      "api_key": "sk-xxx",
      "models": [
        "google/gemini-2.5-pro-preview",
        "openai/gpt-5-mini",
        "openai/gpt-5-nano",
        "anthropic/claude-3.7-sonnet:thinking"
      ],
      "transformer": {
        "use": ["openrouter"]
      }
    },
    {
      "name": "deepseek",
      "api_base_url": "https://api.deepseek.com/chat/completions",
      "api_key": "sk-xxx",
      "models": ["deepseek-chat", "deepseek-reasoner"],
      "transformer": {
        "use": ["deepseek"],
        "deepseek-chat": {
          "use": ["tooluse"]
        }
      }
    },
    {
      "name": "ollama",
      "api_base_url": "http://localhost:11434/v1/chat/completions",
      "api_key": "ollama",
      "models": ["qwen2.5-coder:latest"]
    },
    {
      "name": "gemini",
      "api_base_url": "https://generativelanguage.googleapis.com/v1beta/models/",
      "api_key": "sk-xxx",
      "models": ["gemini-2.5-flash", "gemini-2.5-pro"],
      "transformer": {
        "use": ["gemini"]
      }
    },
    {
      "name": "volcengine",
      "api_base_url": "https://ark.cn-beijing.volces.com/api/v3/chat/completions",
      "api_key": "sk-xxx",
      "models": ["deepseek-v3-250324", "deepseek-r1-250528"],
      "transformer": {
        "use": ["deepseek"]
      }
    },
    {
      "name": "modelscope",
      "api_base_url": "https://api-inference.modelscope.cn/v1/chat/completions",
      "api_key": "",
      "models": ["Qwen/Qwen3-Coder-480B-A35B-Instruct", "Qwen/Qwen3-235B-A22B-Thinking-2507"],
      "transformer": {
        "use": [
          ["maxtoken", { "max_tokens": 65536 }],
          "enhancetool"
        ],
        "Qwen/Qwen3-235B-A22B-Thinking-2507": {
          "use": ["reasoning"]
        }
      }
    },
    {
      "name": "dashscope",
      "api_base_url": "https://dashscope.aliyuncs.com/compatible-mode/v1/chat/completions",
      "api_key": "",
      "models": ["qwen3-coder-plus"],
      "transformer": {
        "use": [
          ["maxtoken", { "max_tokens": 65536 }],
          "enhancetool"
        ]
      }
    },
    {
      "name": "aihubmix",
      "api_base_url": "https://aihubmix.com/v1/chat/completions",
      "api_key": "sk-",
      "models": [
        "Z/glm-4.5",
        "claude-opus-4-20250514",
        "gemini-2.5-pro"
      ]
    }
  ],
  "Router": {
    "default": "openrouter,openai/gpt-5-mini",
    "woodwind": "openrouter,openai/gpt-5-mini",
    "percussion": "openrouter,openai/gpt-5-mini",
    "brass": "openrouter,openai/gpt-5-nano",
    "strings": "openrouter,openai/gpt-5-nano"
  }
}
Start Claude Code using the router:
strumento
Note: After modifying the configuration file, you need to restart the service for the changes to take effect:
strumento restart
For users who prefer terminal-based workflows, you can use the interactive CLI model selector:
strumento model

This command provides an interactive interface for selecting models. It validates all inputs and provides helpful prompts to guide you through the configuration process, making it easy to manage complex setups without editing JSON files manually.
The Providers array is where you define the different model providers you want to use. Each provider object requires:
name: A unique name for the provider.
api_base_url: The full API endpoint for chat completions.
api_key: Your API key for the provider.
models: A list of model names available from this provider.
transformer (optional): Specifies transformers to process requests and responses.
Transformers allow you to modify the request and response payloads to ensure compatibility with different provider APIs.
For example, here the openrouter transformer is applied to all models under the openrouter provider:
{
  "name": "openrouter",
  "api_base_url": "https://openrouter.ai/api/v1/chat/completions",
  "api_key": "sk-xxx",
  "models": [
    "google/gemini-2.5-pro-preview",
    "openai/gpt-5-mini",
    "openai/gpt-5-nano"
  ],
  "transformer": { "use": ["openrouter"] }
}
Available Built-in Transformers:
Anthropic: If you use only the Anthropic transformer, it will preserve the original request and response parameters (you can use it to connect directly to an Anthropic endpoint).
openrouter: Adapts requests/responses for the OpenRouter API. It can also accept a provider routing parameter to specify which underlying providers OpenRouter should use. For more details, refer to the OpenRouter documentation. See an example below:
"transformer": {
  "use": ["openrouter"],
  "moonshotai/kimi-k2": {
    "use": [
      [
        "openrouter",
        {
          "provider": {
            "only": ["moonshotai/fp8"]
          }
        }
      ]
    ]
  }
}
Custom Transformers:
You can also create your own transformers and load them via the transformers field in config.json.
{
  "transformers": [
    {
      "path": "/Users/xxx/.strumento/plugins/gemini-cli.js",
      "options": {
        "project": "xxx"
      }
    }
  ]
}
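As a rough idea of what such a plugin file might contain, here is a hypothetical sketch. The class shape, method names (transformRequestIn, transformResponseOut), and options handling are all assumptions for illustration; check the project's transformer documentation for the real plugin interface.

```javascript
// Hypothetical custom transformer sketch; the actual plugin interface
// expected by Strumento may differ. Method names are illustrative only.
class ProjectTagTransformer {
  name = "project-tag";

  constructor(options = {}) {
    this.options = options; // the "options" object from config.json
  }

  // Adjust the outgoing request before it is sent to the provider.
  async transformRequestIn(request) {
    return {
      ...request,
      metadata: { ...request.metadata, project: this.options.project },
    };
  }

  // Adjust the provider's response before it is returned to the client.
  async transformResponseOut(response) {
    return response; // pass through unchanged in this sketch
  }
}

module.exports = ProjectTagTransformer;
```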
The Router object defines which model to use for different scenarios:
default: The default model for general tasks.
woodwind: A model for frontend/UI tasks, ideally one with recent training data that is strong at design-heavy work.
percussion: A model for reasoning-heavy tasks: logical structures, backend, architecture, and system design.
brass: A model for handling bugs: debugging, checking code for errors and unintended behaviour, and testing. This can be a smaller but faster model.
strings: A model for integrating the other agents' work into a unified, coherent system. Following Spalla, this route should be prioritized for more expensive but still fast models that can handle logic and harder processes efficiently.
You can also switch models dynamically in the terminal tab on Partitura with the /model command:
/model provider_name,model_name
Example: /model openrouter,x-ai/grok-code-fast-1
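Both the Router values and the /model command use the same "provider,model" string format. A minimal parser sketch (illustrative, not Strumento's code) shows how such a route string splits into its parts; note that only the first comma separates provider from model, since model names may themselves contain commas:

```javascript
// Illustrative parser for the "provider,model" route format.
// Only the first comma is a separator; the rest belongs to the model name.
function parseRoute(route) {
  const [provider, ...rest] = route.split(",");
  return { provider, model: rest.join(",") };
}

console.log(parseRoute("openrouter,openai/gpt-5-mini"));
// { provider: 'openrouter', model: 'openai/gpt-5-mini' }
```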
Here is an example of a custom-router.js based on custom-router.example.js:
// /Users/xxx/.strumento/custom-router.js

/**
 * A custom router function to determine which model to use based on the request.
 *
 * @param {object} req - The request object from Claude Code, containing the request body.
 * @param {object} config - The application's config object.
 * @returns {Promise<string|null>} - A promise that resolves to the "provider,model_name" string, or null to use the default router.
 */
module.exports = async function router(req, config) {
  const userMessage = req.body.messages.find((m) => m.role === "user")?.content;
  if (userMessage && userMessage.includes("explain this code")) {
    // Use a powerful model for code explanation
    return "openrouter,openai/gpt-5-nano";
  }
  // Fall back to the default router configuration
  return null;
};
Strumento can also run in non-interactive environments such as GitHub Actions. Here is an example workflow:
name: Claude Code
on:
  issue_comment:
    types: [created]
  # ... other triggers
jobs:
  claude:
    if: |
      (github.event_name == 'issue_comment' && contains(github.event.comment.body, '@claude')) ||
      # ... other conditions
    runs-on: ubuntu-latest
    permissions:
      contents: read
      pull-requests: read
      issues: read
      id-token: write
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
        with:
          fetch-depth: 1
      - name: Prepare Environment
        run: |
          curl -fsSL https://bun.sh/install | bash
          mkdir -p $HOME/.strumento
          cat << 'EOF' > $HOME/.strumento/config.json
          {
            "log": true,
            "NON_INTERACTIVE_MODE": true,
            "OPENAI_API_KEY": "${{ secrets.OPENAI_API_KEY }}",
            "OPENAI_BASE_URL": "https://api.deepseek.com",
            "OPENAI_MODEL": "deepseek-chat"
          }
          EOF
        shell: bash
      - name: Start Strumento
        run: |
          nohup ~/.bun/bin/bunx @gabriel-feang/strumento@1.0.8 start &
        shell: bash
      - name: Run Claude Code
        id: claude
        uses: anthropics/claude-code-action@beta
        env:
          ANTHROPIC_BASE_URL: http://localhost:3456
        with:
          anthropic_api_key: "any-string-is-ok"