@xrpl/ai-core
@xrpl/ai-core is the core library for XRPL AI, a conversational AI component for your website, trained on your data. It contains core functionality for XRPL AI and allows you to build abstractions on top of it.
Installation
npm install @xrpl/ai-core
In browsers with esm.sh:
<script type="module">
  import {
    submitChat,
    submitSearchQuery,
    submitFeedback,
  } from 'https://esm.sh/@xrpl/ai-core';
</script>
Usage
import { submitChat } from '@xrpl/ai-core';

const prompt = 'What is XRPL AI?';
const projectKey = 'YOUR-PROJECT-KEY';

function onAnswerChunk(chunk) {
  // Answers come in via streaming; append each chunk to the previous ones.
}

function onReferences(references) {
  // Handle the list of references from which the answer was created.
}

function onConversationId(conversationId) {
  // Store the conversation ID, e.g. to keep track of the conversation.
}

function onPromptId(promptId) {
  // Store the prompt ID, e.g. to submit feedback about this answer later.
}

function onError(error) {
  // Handle errors.
}

const options = {
  model: 'gpt-4.1-mini-2025-04-14',
  iDontKnowMessage: 'Sorry, I am not sure how to answer that.',
  apiUrl: 'https://api.ai.xrpl.org/v1/completions',
};

await submitChat(
  [{ content: prompt, role: 'user' }],
  projectKey,
  onAnswerChunk,
  onReferences,
  onConversationId,
  onPromptId,
  onError,
  options,
);
API
submitChat(messages: ChatMessage[], projectKey: string, onAnswerChunk, onReferences, onConversationId, onPromptId, onError, options?)
Submit a prompt to the XRPL AI Completions API.
Arguments
messages (ChatMessage[]): Chat messages to submit to the model
projectKey (string): Project key for the project
onAnswerChunk (function(chunk: string)): Answers come in via streaming. This function is called when a new chunk arrives. Chunks should be concatenated to previous chunks of the same answer response.
onReferences (function(references: FileSectionReference[])): This function is called when receiving the list of references from which the response was created.
onConversationId (function(conversationId: string)): This function is called with the conversation ID returned by the API. Used to keep track of conversations.
onPromptId (function(promptId: string)): This function is called with the prompt ID returned by the API. Used to submit feedback.
onError (function): Called when an error occurs
options (SubmitChatOptions): Optional parameters
Options
All options are optional.
apiUrl (string): URL at which to fetch completions
conversationId (string): Conversation ID
iDontKnowMessage (string): Message returned when the model does not have an answer
model (OpenAIModelId): The OpenAI model to use
systemPrompt (string): The prompt template
temperature (number): The model temperature
topP (number): The model top P
frequencyPenalty (number): The model frequency penalty
presencePenalty (number): The model presence penalty
maxTokens (number): The max number of tokens to include in the response
sectionsMatchCount (number): The number of sections to include in the prompt context
sectionsMatchThreshold (number): The similarity threshold between the input question and selected sections
signal (AbortSignal): AbortController signal
Returns
A promise that resolves when the response is fully handled.
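As an illustration, here is a minimal sketch that uses the callbacks and options documented above to accumulate the streamed chunks into a full answer and to abort the request if needed; the variable names and option values are illustrative, not part of the API:

import { submitChat } from '@xrpl/ai-core';

const controller = new AbortController();
let answer = '';

await submitChat(
  [{ content: 'What is XRPL AI?', role: 'user' }],
  'YOUR-PROJECT-KEY',
  (chunk) => {
    // Concatenate each streamed chunk onto the answer received so far.
    answer += chunk;
  },
  (references) => {
    // References from which the answer was created.
    console.log('References:', references);
  },
  (conversationId) => console.log('Conversation ID:', conversationId),
  (promptId) => console.log('Prompt ID:', promptId),
  (error) => console.error(error),
  { sectionsMatchCount: 5, signal: controller.signal },
);

console.log('Answer:', answer);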
submitSearchQuery(query, projectKey, options?)
Submit a search query to the XRPL AI Search API.
Arguments
query (string): Search query
projectKey (string): Project key for the project
options (object): Optional parameters
Options
apiUrl (string): URL at which to fetch search results
limit (number): Maximum amount of results to return
signal (AbortSignal): AbortController signal
Returns
A list of search results.
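A minimal usage sketch, using only the options documented above:

import { submitSearchQuery } from '@xrpl/ai-core';

// Abort the request if it takes longer than 10 seconds.
const controller = new AbortController();
const timeout = setTimeout(() => controller.abort(), 10_000);

const results = await submitSearchQuery('What is XRPL AI?', 'YOUR-PROJECT-KEY', {
  limit: 5,
  signal: controller.signal,
});

clearTimeout(timeout);
console.log(results);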
submitFeedback(feedback, projectKey, options?)
Submit feedback to the XRPL AI Feedback API about a specific prompt.
Arguments
feedback (object): Feedback to submit
feedback.feedback (object): Feedback data
feedback.feedback.vote ("1" | "-1"): Vote
feedback.promptId (string): Prompt ID
projectKey (string): Project key for the project
options (object): Optional parameters
options.apiUrl (string): URL at which to post feedback
options.onFeedbackSubmitted (function): Callback function when feedback is submitted
options.signal (AbortSignal): AbortController signal
Returns
A promise that resolves when the feedback is submitted. Has no return value.
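A minimal usage sketch, following the argument shape documented above; the promptId value is assumed to be the one received via the onPromptId callback of submitChat:

import { submitFeedback } from '@xrpl/ai-core';

// promptId: the ID received via the onPromptId callback of submitChat.
await submitFeedback(
  { feedback: { vote: '1' }, promptId },
  'YOUR-PROJECT-KEY',
  {
    onFeedbackSubmitted: () => {
      console.log('Feedback submitted');
    },
  },
);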
License
MIT © XRPL AI Devs