# @markprompt/core
`@markprompt/core` is the core library for Markprompt, a conversational AI component for your website, trained on your data. It contains the core functionality of Markprompt and allows you to build abstractions on top of it.
## Installation
```sh
npm install @markprompt/core
```
In browsers with esm.sh:
```html
<script type="module">
  import { submitPrompt } from 'https://esm.sh/@markprompt/core';
</script>
```
## Usage
```js
import { submitPrompt } from '@markprompt/core';

const prompt = 'Hello, Markprompt!';
const projectKey = 'YOUR-PROJECT-KEY';

function onAnswerChunk(chunk) {
  // Called for each streamed chunk of the answer, e.g. append it to the UI.
}

function onReferences(references) {
  // Called when a chunk includes references to source sections.
}

function onError(error) {
  // Called when an error occurs.
}

const options = {
  model: 'gpt-3.5-turbo',
  iDontKnowMessage: 'Sorry, I am not sure how to answer that.',
  completionsUrl: 'https://api.markprompt.com/v1/completions',
};

await submitPrompt(prompt, projectKey, onAnswerChunk, onReferences, onError, options);
```
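In this example, `onAnswerChunk` is invoked repeatedly as the answer streams in, `onReferences` receives the references associated with the answer, and `onError` is invoked if the request fails.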
## API
### `submitPrompt(prompt, projectKey, onAnswerChunk, onReferences, onError, options?)`
Submit a prompt to the Markprompt API.
#### Arguments
- `prompt` (`string`): Prompt to submit to the model
- `projectKey` (`string`): The key of your project
- `onAnswerChunk` (`function`): Answers come in via streaming. This function is called when a new chunk arrives (see the sketch after this list)
- `onReferences` (`function`): This function is called when a chunk includes references
- `onError` (`function`): Called when an error occurs
- `options` (`object`): Optional options object
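A minimal sketch of how the callbacks can work together, assuming each chunk is a plain text fragment that can be concatenated onto the running answer (the accumulator variables below are illustrative, not part of the API):

```js
import { submitPrompt } from '@markprompt/core';

let answer = '';
let references = [];

await submitPrompt(
  'How do I get started?',
  'YOUR-PROJECT-KEY',
  (chunk) => {
    // Assumption: each chunk is partial answer text that can simply be appended.
    answer += chunk;
  },
  (refs) => {
    // Assumption: the latest list of references replaces the previous one.
    references = refs;
  },
  (error) => {
    console.error(error);
  },
);

console.log(answer, references);
```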
#### Options
- `completionsUrl` (`string`): URL at which to fetch completions
- `iDontKnowMessage` (`string`): Message returned when the model does not have an answer
- `model` (`OpenAIModelId`): The OpenAI model to use
- `promptTemplate` (`string`): The prompt template
- `temperature` (`number`): The model temperature
- `topP` (`number`): The model top P
- `frequencyPenalty` (`number`): The model frequency penalty
- `presencePenalty` (`number`): The model presence penalty
- `maxTokens` (`number`): The maximum number of tokens to include in the response
- `sectionsMatchCount` (`number`): The number of sections to include in the prompt context
- `sectionsMatchThreshold` (`number`): The similarity threshold between the input question and selected sections
- `signal` (`AbortSignal`): AbortController signal (see the example after this list)
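For instance, passing an `AbortController`'s signal via the `signal` option makes it possible to cancel a pending request. A small sketch, assuming the remaining options keep their defaults (the timeout value is illustrative):

```js
import { submitPrompt } from '@markprompt/core';

const controller = new AbortController();

// Illustrative: abort the request if it takes longer than 10 seconds.
const timeout = setTimeout(() => controller.abort(), 10_000);

try {
  await submitPrompt(
    'Hello, Markprompt!',
    'YOUR-PROJECT-KEY',
    (chunk) => console.log(chunk),
    (references) => console.log(references),
    (error) => console.error(error),
    { signal: controller.signal },
  );
} finally {
  clearTimeout(timeout);
}
```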
#### Returns
A promise that resolves when the response is fully handled.
## Authors
This library is created by the team behind Markprompt (@markprompt).
## License
MIT © Motif