# `@jupyter-ai/core` Changelog
## 2.28.2

- `@file` and reply gracefully in chat #1106 (@srdas)
## 2.28.0
This release notably includes the following changes:
- Models from the `Anthropic` and `ChatAnthropic` providers are now merged in the config UI, so all Anthropic models are shown in the same place in the "Language model" dropdown.
- Anthropic Claude v1 LLMs have been removed, as the models are retired and no longer available from the API.
- The chat system prompt has been updated to encourage the LLM to express dollar quantities in LaTeX, i.e. the LLM should prefer returning `\(\$100\)` instead of `$100`. For the latest LLMs, this generally fixes a rendering issue when multiple dollar quantities are given literally in the same sentence (see the short illustration after this list).
- `/export` has been fixed to include streamed replies, which were previously omitted.
- Calling non-chat providers with history has been fixed to behave properly in magics.
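To illustrate the rendering issue mentioned above, here is a minimal, made-up example (not taken from the release notes). Depending on a renderer's inline-math rules, the text between two literal `$` signs in one sentence can be parsed as math, while the escaped LaTeX form is unambiguous:

```markdown
<!-- Risky: the span between the two "$" signs may be rendered as inline math -->
The total is $100 and shipping adds $20.

<!-- Preferred per the updated system prompt: escaped "$" inside inline-math delimiters -->
The total is \(\$100\) and shipping adds \(\$20\).
```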
- `Haiku-3.5` model in Anthropic #1092 (@srdas)
- `$` symbols to delimit inline math in human messages #1094 (@dlqqq)
- `/export` by including streamed agent messages #1077 (@mcavdar)
- `$` to literally denote quantities of USD in chat #1068 (@dlqqq)

(GitHub contributors page for this release)
@alanmeeson | @dlqqq | @krassowski | @mcavdar | @srdas
## 2.27.0

## 2.26.0
This release notably includes the addition of a "Stop streaming" button, which takes over the "Send" button when a reply is streaming and the chat input is empty. While Jupyternaut is streaming a reply to a user, the user has the option to click the "Stop streaming" button to interrupt Jupyternaut and stop it from streaming further. Thank you @krassowski for contributing this feature! 🎉
- `/learn` #1047 (@dlqqq)
- `BaseChatHandler` #1039 (@dlqqq)
- `username` from `IdentityProvider` #1034 (@krassowski)
- `jupyter-collaboration` v3 #1035 (@krassowski)

(GitHub contributors page for this release)
@dlqqq | @JasonWeill | @jlsajfj | @krassowski | @michaelchia | @pre-commit-ci
## 2.25.0
(GitHub contributors page for this release)
@akaihola | @dlqqq | @jtpio | @pre-commit-ci
## 2.24.1

## 2.24.0
This release notably introduces a new context command `@file:<file-path>` to the chat UI, which includes the content of the target file with your prompt when sent. This allows you to ask questions like:

- What does `@file:src/components/ActionButton.tsx` do?
- Can you refactor `@file:src/index.ts` to use async/await syntax?
- How do I add an optional dependency to `@file:pyproject.toml`?
The context command feature also includes an autocomplete menu UI to help navigate your filesystem with fewer keystrokes.
Thank you @michaelchia for developing this feature!
- `ChatOllama` base class in Ollama provider #1015 (@srdas)
- `metadata` field to agent messages #1013 (@dlqqq)
- `/learn` #989 (@andrewfulton9)

(GitHub contributors page for this release)
@akaihola | @andrewfulton9 | @dlqqq | @ellisonbg | @hockeymomonow | @krassowski | @michaelchia | @srdas
## 2.23.0
- `RunnableWithMessageHistory` import #980 (@krassowski)

(GitHub contributors page for this release)
@dlqqq | @krassowski | @michaelchia | @srdas
## 2.22.0
(GitHub contributors page for this release)
@dlqqq | @krassowski | @michaelchia | @pre-commit-ci | @srdas | @trducng