llm-report-node

Package Overview

Maintainers: 1
Versions: 6

llm-report-node - npm Package Compare versions

Comparing version 1.0.3 to 1.0.4

1. dist/exporter.js

@@ -75,2 +75,3 @@ "use strict";
     data = __assign(__assign({}, convertToOutputFormat(span.attributes)), { duration_in_ms: (0, utils_1.hrTimeToMilliseconds)(span.duration) });
+    console.log("Sending data to LLM Report");
     return [4 /*yield*/, axios_1.default.post(this.serverAddress, data, {
         headers: {
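The compiled hunk above spreads the converted span attributes into the payload and attaches a `duration_in_ms` field via OpenTelemetry's `hrTimeToMilliseconds`. A self-contained sketch of that shaping, with the conversion inlined (OpenTelemetry's `HrTime` is a `[seconds, nanoseconds]` tuple); `convertToOutputFormat` is not visible in the diff, so attributes pass through unchanged here:

```typescript
// HrTime is OpenTelemetry's high-resolution timestamp: [seconds, nanoseconds].
type HrTime = [number, number];

interface SpanLike {
  attributes: Record<string, unknown>;
  duration: HrTime;
}

// Inline equivalent of @opentelemetry/core's hrTimeToMilliseconds:
// seconds * 1000 plus nanoseconds / 1e6.
function hrTimeToMilliseconds(time: HrTime): number {
  return time[0] * 1000 + time[1] / 1e6;
}

// Mirrors the payload built in the hunk above: spread the span attributes
// and add duration_in_ms. The real code first runs attributes through
// convertToOutputFormat, which is not shown in this diff.
function buildPayload(span: SpanLike): Record<string, unknown> {
  return {
    ...span.attributes,
    duration_in_ms: hrTimeToMilliseconds(span.duration),
  };
}

console.log(buildPayload({ attributes: { model: "gpt-4" }, duration: [1, 500_000_000] }));
// { model: 'gpt-4', duration_in_ms: 1500 }
```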

@@ -52,2 +52,3 @@ "use strict";
     if (loggingApiUrl === void 0) { loggingApiUrl = "https://llm.report/api/v1/log/openai"; }
+    console.log("Initializing LLM Report SDK");
     return new sdk_node_1.NodeSDK({
         traceExporter: new exporter_1.LlmReportExporter(apiKey, loggingApiUrl),
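The `if (loggingApiUrl === void 0)` guard in the hunk above is TypeScript's down-leveled form of a default parameter. A minimal sketch of the corresponding source shape; the function name `initLlmReport` is a hypothetical stand-in, since the enclosing declaration is not shown in the diff, and only the default URL comes from the compiled output:

```typescript
// Hypothetical stand-in for the function compiled above; only the
// default-parameter pattern and the default URL come from the diff.
function initLlmReport(
  apiKey: string,
  loggingApiUrl: string = "https://llm.report/api/v1/log/openai"
): { apiKey: string; loggingApiUrl: string } {
  // The real code returns a NodeSDK wired with LlmReportExporter;
  // here we just surface the resolved configuration.
  return { apiKey, loggingApiUrl };
}

console.log(initLlmReport("sk-test").loggingApiUrl);
// https://llm.report/api/v1/log/openai
```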

2. exporter.ts

@@ -31,3 +31,3 @@ import { ReadableSpan, SpanExporter } from "@opentelemetry/sdk-trace-base";
     };
+    console.log("Sending data to LLM Report");
     await axios.post(this.serverAddress, data, {
       headers: {

@@ -62,2 +62,3 @@ /*instrumentation.ts*/
   ) => {
+  console.log("Initializing LLM Report SDK");
   return new NodeSDK({
     traceExporter: new LlmReportExporter(apiKey, loggingApiUrl),
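Putting the two TypeScript hunks together: the package implements OpenTelemetry's `SpanExporter` and POSTs each span's payload to the configured endpoint with axios. A dependency-free sketch of that send path, with the HTTP call injected so it runs standalone; the header contents, auth scheme, and method name `send` are assumptions, since only `axios.post(this.serverAddress, data, { headers: ... })` is visible in the diff:

```typescript
type PostFn = (url: string, body: unknown, headers: Record<string, string>) => void;

// Dependency-free stand-in for the exporter's send path. The real class
// implements SpanExporter from @opentelemetry/sdk-trace-base and posts
// with axios; the header contents below are assumptions.
class LlmReportExporterSketch {
  constructor(
    private apiKey: string,
    private serverAddress: string,
    private post: PostFn
  ) {}

  send(data: Record<string, unknown>): void {
    // 1.0.4 adds this debug line before every request.
    console.log("Sending data to LLM Report");
    this.post(this.serverAddress, data, {
      "Content-Type": "application/json",
      Authorization: `Bearer ${this.apiKey}`, // assumed auth scheme
    });
  }
}

// Usage with a recording stub in place of a real HTTP client:
const calls: Array<{ url: string; body: unknown }> = [];
const exporter = new LlmReportExporterSketch(
  "sk-test",
  "https://llm.report/api/v1/log/openai",
  (url, body) => { calls.push({ url, body }); }
);
exporter.send({ model: "gpt-4", duration_in_ms: 1500 });
```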

3. package.json

 {
   "name": "llm-report-node",
-  "version": "1.0.3",
+  "version": "1.0.4",
   "description": "Log OpenAI requests to llm.report",
   "main": "dist/index.js",
