@copilot-extensions/preview-sdk - npm package version comparison

Comparing version 2.2.0 to 2.3.0

lib/prompt.js

17

dreamcode.md

@@ -73,2 +73,18 @@ # Copilot Extension Dreamcode

For other environments, these methods are available:
```js
// verify the payload and call handlers
await copilotExtension.verifyAndReceive({ payload, signature, keyId });
// same, but skip verification
await copilotExtension.receive({ payload });
// and if you don't want to use the event-based API
const { isValidRequest, payload } = await copilotExtension.verifyAndParse(
  payload,
  signature,
  keyId
);
```
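The event-based API referenced in the last comment above isn't shown in this hunk. For orientation, here is a minimal sketch of what handler registration might look like; the `on("message", ...)` event name and the handler's `respond`/`log` arguments are assumptions based on the notes below, not confirmed by this diff:

```js
// sketch only: the event name and handler arguments are assumptions
copilotExtension.on("message", async ({ message, respond, log }) => {
  log.info("received:", message.content);
  await respond.text("Hello from my extension!");
});
```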
## Notes

@@ -81,2 +97,3 @@

- `prompt` is based on my work at https://github.com/github/gr2m-projects/blob/167/github-models/167-github-models/README.md. A simple API to interact with GitHub models. I assume we will default the prompt URL to `https://api.githubcopilot.com/chat/completions` and the model to `gpt-4o` (or whatever our CAPI name for that is?)
- The `prompt` API will automatically apply interop transformations if the request is sent to an endpoint other than Copilot's chat completions endpoint (see the sketch after these notes).
- `respond` is an API to send different types of responses to the user

@@ -83,0 +100,0 @@ - `log` is the logger as we use it in Octokit. See https://github.com/octokit/core.js?tab=readme-ov-file#logging
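To make the interop note above concrete (see also the `transformPayloadForOpenAICompatibility` test further down, which strips Copilot-specific keys from the payload), here is a sketch of pointing `prompt` at a non-Copilot endpoint. The `endpoint` option is part of the dreamcode README below and may not be implemented yet:

```js
// sketch: `endpoint` comes from the dreamcode and may not be implemented yet;
// sending to a non-Copilot endpoint is what triggers the interop transformations
const { message } = await prompt("What is the capital of France?", {
  model: "gpt-4o",
  token: process.env.TOKEN,
  endpoint: "https://models.inference.ai.azure.com/chat/completions",
});
```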

28

index.d.ts

@@ -247,2 +247,26 @@ import { request } from "@octokit/request";

// prompt
/** model names supported by Copilot API */
export type ModelName =
  | "gpt-4"
  | "gpt-3.5-turbo"

export type PromptOptions = {
  model: ModelName
  token: string
  request?: {
    fetch?: Function
  }
}

export type PromptResult = {
  requestId: string
  message: Message
}

interface PromptInterface {
  (userPrompt: string, options: PromptOptions): Promise<PromptResult>;
}
// exported methods

@@ -265,2 +289,4 @@

export declare const getUserMessage: GetUserMessageInterface;
export declare const getUserConfirmation: GetUserConfirmationInterface;
export declare const getUserConfirmation: GetUserConfirmationInterface;
export declare const prompt: PromptInterface;

@@ -6,1 +6,2 @@ // @ts-check

export * from "./lib/verification.js";
export * from "./lib/prompt.js";

@@ -21,2 +21,3 @@ import { expectType } from "tsd";

CopilotRequestPayload,
prompt,
} from "./index.js";

@@ -271,1 +272,29 @@

}
export async function promptTest() {
  const result = await prompt("What is the capital of France?", {
    model: "gpt-4",
    token: "secret",
  })

  expectType<string>(result.requestId)
  expectType<string>(result.message.content)

  // with custom fetch
  await prompt("What is the capital of France?", {
    model: "gpt-4",
    token: "secret",
    request: {
      fetch: () => {},
    },
  })

  // @ts-expect-error - 2nd argument is required
  prompt("What is the capital of France?")

  // @ts-expect-error - model argument is required
  prompt("What is the capital of France?", { token: "" })

  // @ts-expect-error - token argument is required
  prompt("What is the capital of France?", { model: "" })
}

2

package.json

@@ -8,3 +8,3 @@ {

"type": "module",
"version": "2.2.0",
"version": "2.3.0",
"keywords": [

@@ -11,0 +11,0 @@ "ai",

@@ -17,3 +17,3 @@ # `@copilot-extensions/preview-sdk`

signature,
key,
keyId,
{

@@ -42,2 +42,19 @@ token: process.env.GITHUB_TOKEN,

### Send a custom prompt
```js
import { prompt } from "@copilot-extensions/preview-sdk";
try {
  const { message } = await prompt("What is the capital of France?", {
    model: "gpt-4",
    token: process.env.TOKEN,
  });

  console.log(message.content);
} catch (error) {
  console.error(error);
}
```
## API

@@ -290,2 +307,82 @@

## Prompt (Custom Chat completion calls)
#### `prompt(message, options)`
Send a prompt to Copilot's chat completions endpoint and receive the model's response. The `message` argument must be a string and may include markdown.
The `options` argument is required and must include the `model` to use and a `token` to authenticate the request to GitHub's API. A custom `request.fetch` instance can optionally be passed to use for the request.
```js
import { prompt } from "@copilot-extensions/preview-sdk";
const { message } = await prompt("What is the capital of France?", {
  model: "gpt-4",
  token: process.env.TOKEN,
});

console.log(message.content);
```
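Per the `PromptResult` type added in index.d.ts above, the resolved value also carries a `requestId` alongside the `message`, which can be useful for logging (a small sketch):

```js
const result = await prompt("What is the capital of France?", {
  model: "gpt-4",
  token: process.env.TOKEN,
});

// `requestId` and `message` come from the PromptResult type
console.log(result.requestId, result.message.content);
```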
⚠️ Not all of the arguments below are implemented yet.
```js
await prompt({
  model: "gpt-4o",
  token: process.env.TOKEN,
  system: "You are a helpful assistant.",
  messages: [
    { role: "user", content: "What is the capital of France?" },
    { role: "assistant", content: "The capital of France is Paris." },
    {
      role: "user",
      content: [
        { type: "text", text: "What about this country?" },
        {
          type: "image_url",
          image_url: urlToImageOfFlagOfSpain,
        },
      ],
    },
  ],
  // GitHub recommends using your GitHub username or the name of your application
  // https://docs.github.com/en/rest/using-the-rest-api/getting-started-with-the-rest-api?apiVersion=2022-11-28#user-agent
  userAgent: "gr2m/my-app v1.2.3",
  // set an alternative chat completions endpoint
  endpoint: "https://models.inference.ai.azure.com/chat/completions",
  // compare https://platform.openai.com/docs/guides/function-calling/configuring-function-calling-behavior-using-the-tool_choice-parameter
  toolChoice: "auto",
  tools: [
    {
      type: "function",
      function: {
        name: "get_weather",
        strict: true,
        parameters: {
          type: "object",
          properties: {
            location: { type: "string" },
            unit: { type: "string", enum: ["c", "f"] },
          },
          required: ["location", "unit"],
          additionalProperties: false,
        },
      },
    },
  ],
  // configuration related to the request transport layer
  request: {
    // for mocking, proxying, client certificates, etc.
    fetch: myCustomFetch,
    // hook into the request life cycle for complex authentication strategies, retries, throttling, etc.
    // compare options.request.hook from https://github.com/octokit/request.js
    hook: myCustomHook,
    // use an `AbortController` instance to cancel a request
    signal: myAbortController.signal,
  },
});
```
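The `request.fetch` option (optional per `PromptOptions` above, and exercised in the tsd tests) is handy for stubbing the transport in tests. A sketch follows; the chat-completions response shape used by the stub is an assumption, not confirmed by this diff:

```js
// sketch: the stubbed body mimics an OpenAI-style chat completions response;
// the exact shape the SDK expects is an assumption here
const { message } = await prompt("What is the capital of France?", {
  model: "gpt-4",
  token: "fake-token",
  request: {
    fetch: async () =>
      new Response(
        JSON.stringify({
          id: "fake-request-id",
          choices: [{ message: { role: "assistant", content: "Paris" } }],
        }),
        { status: 200, headers: { "content-type": "application/json" } }
      ),
  },
});

console.log(message.content); // "Paris" (from the stub)
```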
## Dreamcode

@@ -292,0 +389,0 @@

@@ -31,6 +31,19 @@ import { test, suite } from "node:test";

const payload = transformPayloadForOpenAICompatibility({
messages: [],
messages: [
{
role: "role",
name: "name",
content: "content",
someCopilotKey: "value",
},
],
someCopilotKey: "value",
});
t.assert.deepStrictEqual(payload.messages, []);
t.assert.deepStrictEqual(payload.messages, [
{
role: "role",
name: "name",
content: "content",
},
]);
});

@@ -37,0 +50,0 @@

@@ -31,2 +31,9 @@ import { test, suite } from "node:test";

test("createTextEvent()", (t) => {
  const event = createTextEvent("test");

  t.assert.equal(undefined, event.event);
  t.assert.snapshot(event.data);
  t.assert.snapshot(event.toString());
});
test("createConfirmationEvent()", (t) => {

@@ -33,0 +40,0 @@ const event = createConfirmationEvent({

