@empiricalrun/llm - npm Package Compare versions

Comparing version 0.9.0 to 0.9.1

CHANGELOG.md
 # @empiricalrun/llm
+## 0.9.1
+### Patch Changes
+- be30850: fix: expose input/output tokens from LLM class
 ## 0.9.0
@@ -4,0 +10,0 @@
dist/index.d.ts
@@ -15,2 +15,4 @@ import { LangfuseGenerationClient, LangfuseSpanClient, LangfuseTraceClient } from "langfuse";
 private _maxTokens;
+completionTokens: number;
+promptTokens: number;
 constructor({ trace, provider, providerApiKey, traceName, maxTokens, defaultModel, }: {
@@ -17,0 +19,0 @@ trace?: TraceClient;

dist/index.js
@@ -24,2 +24,4 @@ "use strict";
 _maxTokens;
+completionTokens = 0;
+promptTokens = 0;
 constructor({ trace, provider, providerApiKey, traceName = "get-llm-result", maxTokens, defaultModel, }) {
@@ -88,2 +90,4 @@ this._trace = trace;
 }
+this.completionTokens = completion?.usage?.completion_tokens || 0;
+this.promptTokens = completion?.usage?.prompt_tokens || 0;
 this._usedTokens += completion?.usage?.total_tokens || 0;
@@ -90,0 +94,0 @@ return output;

package.json
 {
 "name": "@empiricalrun/llm",
-"version": "0.9.0",
+"version": "0.9.1",
 "main": "dist/index.js",
@@ -5,0 +5,0 @@ "exports": {

