
Ollama Chat is a conversational AI chat client that uses Ollama to interact with local large language models (LLMs).
To get up and running with Ollama Chat, follow these steps:
Install and start Ollama
Install Ollama Chat
pip install ollama-chat
To start Ollama Chat, open a terminal prompt and run the Ollama Chat application:
ollama-chat
A web browser is launched and opens the Ollama Chat application.
By default, a configuration file, "ollama-chat.json", is created in the user's home directory.
To start a conversation from the command line, use the -m argument:
ollama-chat -m "Why is the sky blue?"
To start a named template from the command line, use the -t and -v arguments:
ollama-chat -t askAristotle -v question "Why is the sky blue?"
Conversation Templates allow you to repeat the same prompts with different models. Templates can define variables for use in the template title and prompt text (e.g., {{var}}).
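The variable substitution above can be sketched as follows. This is a minimal, hypothetical illustration of how a {{var}} placeholder gets filled in with a value, not Ollama Chat's actual implementation:

```python
import re

def fill_template(text, variables):
    # Replace each {{name}} placeholder with its value from the variables dict.
    return re.sub(r"\{\{(\w+)\}\}", lambda m: variables[m.group(1)], text)

# Mirrors the command line: ollama-chat -t askAristotle -v question "Why is the sky blue?"
prompt = fill_template("Ask Aristotle: {{question}}", {"question": "Why is the sky blue?"})
print(prompt)
```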
There are two ways to create a template: click "Add Template" on the index page to create a new template and open it in the template editor, or click "Template" in a conversation view's menu.
Ollama Chat supports special prompt commands that allow you to include files, images, and URL content in your prompt, among other things. The following prompt commands are available:
/file - include a file
/file README.md
Please summarize the README file.
/image - include an image
/image image.jpeg
Please summarize the image.
/dir - include files from a directory
/dir src/ollama_chat py
Please provide a summary for each Ollama Chat source file.
/url - include a URL resource
/url https://craigahobbs.github.io/ollama-chat/README.md
Please summarize the README file.
/do - execute a conversation template by name
/do city-report -v CityState "Seattle, WA"
/? - list available prompt commands
/?
To get prompt command help, use the -h option:
/file -h
This package is developed using python-build. It was started using python-template as follows:
template-specialize python-template/template/ ollama-chat/ -k package ollama-chat -k name 'Craig A. Hobbs' -k email 'craigahobbs@gmail.com' -k github 'craigahobbs' -k noapi 1