
sengpt
ChatGPT in your terminal, runs on re-gpt so no OpenAI API key required
Installation • Configuration • Usage • Building from source • Support
Installation
Ensure you have Python 3.11 and Glow installed.
pip install sengpt

Configuration
Set your session token:
sengpt --session_token <your-session-token-goes-here>
Run sengpt --config_file to see the config file's contents and location.
To add preconfigured prompts, add a preconfigured_prompts key to the config file and set its value to key-value pairs of prompt name and prompt, e.g.,
{
"preconfigured_prompts": {
"readme": "generate a README.md for this project",
"explain": "briefly explain what this code does",
"refactor": "refactor this code to improve readability"
}
}
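As a sketch, the preconfigured_prompts key could also be merged into the config programmatically. This assumes the config file is plain JSON (as the snippets here suggest); the filename below is hypothetical, so run sengpt --config_file to find the real location on your machine.

```python
import json
from pathlib import Path

# Hypothetical path; run `sengpt --config_file` to find the real location.
config_path = Path("sengpt_config.json")

# Load the existing config, or start fresh if the file doesn't exist yet.
config = json.loads(config_path.read_text()) if config_path.exists() else {}

# Merge prompt-name -> prompt pairs without clobbering existing entries.
config.setdefault("preconfigured_prompts", {}).update({
    "readme": "generate a README.md for this project",
    "explain": "briefly explain what this code does",
})

config_path.write_text(json.dumps(config, indent=4))
```

Using setdefault plus update means re-running the script only adds or overwrites the named prompts, leaving any others you have defined untouched.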
To use a preconfigured prompt run sengpt --prompt_name, or sengpt -pn for short. Avoid prompt names like script_tags whose short form (-st) will clash with session_token: every time you try to use it, sengpt will think you want to set your session token.
Preconfigured prompts can be combined with piped input and extra text, e.g.,
cat some_project.py | sengpt --readme make it as brief as possible

Interactive mode
Back and forth interaction with ChatGPT; saves the conversation on exit.
Interactive mode currently doesn't support piped inputs, e.g., cat README.md | sengpt summarise this document; if piped input is passed, query mode will be used instead.
Press Ctrl + C to exit.
Query mode
Prints ChatGPT's response, deletes the conversation, and exits.
The default mode is interactive mode, but you can change this in the config file:
{
"default_mode": "query"
}
With this configuration, to use interactive mode run sengpt --interactive.
Either gpt-3.5 or gpt-4 can be used; the default is gpt-3.5. gpt-4 requires a ChatGPT Plus account and is slower. To switch to gpt-4, add this to your config file:
{
"model": "gpt-4"
}
How your username appears in the conversation; the default is You.
{
"username": "Sen"
}
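Putting the options above together, a complete config could be written out in one go. This is a minimal sketch assuming the config is plain JSON; every key comes from the snippets above, but the filename is hypothetical (run sengpt --config_file for the real path).

```python
import json
from pathlib import Path

# All keys below are the ones documented above; the path is hypothetical --
# run `sengpt --config_file` to find the real location on your machine.
config = {
    "default_mode": "query",   # "interactive" is the built-in default
    "model": "gpt-4",          # requires a ChatGPT Plus account
    "username": "Sen",         # how your name appears in the conversation
    "preconfigured_prompts": {
        "refactor": "refactor this code to improve readability",
    },
}

Path("sengpt_full_config.json").write_text(json.dumps(config, indent=4))
```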
Usage: sengpt [prompt] [options]
-h, --help Show help message and exit
-v, --version Show the version information
-cf, --config_file Show the config file's contents and location
-st, --session_token Set session token
-ng, --no_glow Disable pretty printing with Glow,
this can be set to be the default behaviour in the config file
-c, --copy Copy the prompt response to the clipboard,
this can be set to be the default behaviour in the config file
-p, --paste Append the most recently copied clipboard text to the sent prompt
-rc, --recent_conversation Use the most recently saved conversation as context
-pp, --preconfigured_prompt Append a preconfigured prompt to the sent prompt,
replace "preconfigured_prompt" with the prompt's name
as it appears in the config file
-q, --query Use query mode i.e., print ChatGPT's response and exit,
this flag is only necessary if "default_mode" in config file is interactive
-s, --save By default conversations in query mode are deleted on exit,
this saves the conversation instead,
this can be set to be the default behaviour in the config file
-i, --interactive Use interactive mode i.e., back and forth interaction with ChatGPT,
this flag is only necessary if "default_mode" in the config file is query
-d, --delete By default conversations in interactive mode are saved on exit,
this deletes then exits the interactive mode session,
this can be set to be the default behaviour in the config file
Building from source
Ensure you have Python 3.11 and Git installed.
git clone https://github.com/SenZmaKi/Sengpt && cd Sengpt && pip install poetry && poetry install
poetry run sengpt
poetry build
The tar and wheel will be built at Sengpt/dist.