
@markprompt/core
`@markprompt/core` is the core library for Markprompt, a conversational AI component for your website, trained on your data.
It contains core functionality for Markprompt and allows you to build abstractions on top of it.
npm install @markprompt/core
In browsers with esm.sh:
<script type="module">
import { submitPrompt } from 'https://esm.sh/@markprompt/core';
</script>
Usage:

import { submitPrompt } from '@markprompt/core';
// User input
const prompt = 'What is Markprompt?';
// Can be obtained in your project settings on markprompt.com
const projectKey = 'YOUR-PROJECT-KEY';
// Called when a new answer chunk is available
// Should be concatenated to previous chunks
function onAnswerChunk(chunk) {
// Process an answer chunk
}
// Called when references are available
function onReferences(references) {
// Process references
}
// Called when submitPrompt encounters an error
function onError(error) {
// Handle errors
}
// Optional parameters, defaults displayed
const options = {
model: 'gpt-3.5-turbo', // Supports all OpenAI models
iDontKnowMessage: 'Sorry, I am not sure how to answer that.',
apiUrl: 'https://api.markprompt.com/v1/completions', // Or your own completions API endpoint
};
await submitPrompt(prompt, projectKey, onAnswerChunk, onReferences, onError, options);
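Since answers arrive as streamed chunks, a common pattern is to concatenate them into the full answer as they come in. Here is a minimal sketch of such an accumulator; `createAnswerAccumulator` is a hypothetical helper for illustration, not part of `@markprompt/core`:

```javascript
// Hypothetical helper (not part of @markprompt/core): collects streamed
// answer chunks into a single string.
function createAnswerAccumulator() {
  let answer = '';
  return {
    // Pass this as the onAnswerChunk callback.
    onAnswerChunk(chunk) {
      answer += chunk;
    },
    // Read the accumulated answer once streaming completes.
    getAnswer() {
      return answer;
    },
  };
}

const acc = createAnswerAccumulator();
acc.onAnswerChunk('Markprompt is a conversational ');
acc.onAnswerChunk('AI component.');
// acc.getAnswer() → 'Markprompt is a conversational AI component.'
```

The same callback can also trigger UI updates on each chunk, so the answer renders progressively instead of appearing all at once.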
submitPrompt(prompt, projectKey, onAnswerChunk, onReferences, onError, options?)

Submit a prompt to the Markprompt Completions API.

- prompt (string): Prompt to submit to the model
- projectKey (string): Project key for the project
- onAnswerChunk (function): Answers come in via streaming. This function is called when a new chunk arrives.
- onReferences (function): Called when receiving the list of references from which the response was created.
- onError (function): Called when an error occurs
- options (SubmitPromptOptions): Optional parameters
  - apiUrl (string): URL at which to fetch completions
  - iDontKnowMessage (string): Message returned when the model does not have an answer
  - model (OpenAIModelId): The OpenAI model to use
  - systemPrompt (string): The prompt template
  - temperature (number): The model temperature
  - topP (number): The model top P
  - frequencyPenalty (number): The model frequency penalty
  - presencePenalty (number): The model presence penalty
  - maxTokens (number): The maximum number of tokens to include in the response
  - sectionsMatchCount (number): The number of sections to include in the prompt context
  - sectionsMatchThreshold (number): The similarity threshold between the input question and selected sections
  - signal (AbortSignal): AbortController signal

submitChat(messages, projectKey, onAnswerChunk, onReferences, onError, options?)

Submit chat messages to the Markprompt Completions API.
- messages (ChatMessage[]): Chat messages to submit to the model
- projectKey (string): Project key for the project
- onAnswerChunk (function): Answers come in via streaming. This function is called when a new chunk arrives.
- onReferences (function): Called when receiving the list of references from which the response was created.
- onError (function): Called when an error occurs
- options (SubmitChatOptions): Optional parameters
  - apiUrl (string): URL at which to fetch completions
  - iDontKnowMessage (string): Message returned when the model does not have an answer
  - model (OpenAIModelId): The OpenAI model to use
  - systemPrompt (string): The prompt template
  - temperature (number): The model temperature
  - topP (number): The model top P
  - frequencyPenalty (number): The model frequency penalty
  - presencePenalty (number): The model presence penalty
  - maxTokens (number): The maximum number of tokens to include in the response
  - sectionsMatchCount (number): The number of sections to include in the prompt context
  - sectionsMatchThreshold (number): The similarity threshold between the input question and selected sections
  - signal (AbortSignal): AbortController signal

Returns a promise that resolves when the response is fully handled.
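The `signal` option can be used to cancel an in-flight request, for example after a timeout. A minimal sketch, assuming a 10-second cutoff; `createTimeoutSignal` is a hypothetical helper, not part of the library:

```javascript
// Hypothetical helper: returns an AbortSignal that fires after `ms`
// milliseconds, plus a cancel() to clear the timer once the request settles.
function createTimeoutSignal(ms) {
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), ms);
  return { signal: controller.signal, cancel: () => clearTimeout(timer) };
}

const { signal, cancel } = createTimeoutSignal(10_000);
// Pass the signal via the options object, e.g.:
// await submitChat(messages, projectKey, onAnswerChunk, onReferences, onError, { signal });
// cancel(); // clear the timer once the response is fully handled
```

Clearing the timer after the response settles avoids a stray abort firing against a request that already completed.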
submitSearchQuery(query, projectKey, options?)

Submit a search query to the Markprompt Search API.

- query (string): Search query
- projectKey (string): Project key for the project
- options (object): Optional parameters
  - apiUrl (string): URL at which to fetch search results
  - limit (number): Maximum amount of results to return
  - signal (AbortSignal): AbortController signal

Returns a list of search results.
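Search results can then be shaped for display on the client. A minimal sketch, assuming each result exposes `title` and `path` fields (this shape is illustrative, not guaranteed by the API; check the actual response for your project):

```javascript
// Hypothetical helper: turn raw search results into display strings,
// applying a client-side cap on top of the server-side `limit` option.
function formatResults(results, limit = 5) {
  return results.slice(0, limit).map((r) => `${r.title} (${r.path})`);
}

const display = formatResults(
  [
    { title: 'Getting started', path: '/docs/getting-started' },
    { title: 'API reference', path: '/docs/api' },
  ],
  1,
);
// display → ['Getting started (/docs/getting-started)']
```

Keeping the server-side `limit` small and trimming further on the client keeps responses light while letting the UI decide how many results to show.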
This library is created by the team behind Markprompt (@markprompt).
MIT © Markprompt