Comparing version 0.3.3 to 0.3.4
package.json:
 {
   "name": "openpipe",
-  "version": "0.3.3",
+  "version": "0.3.4",
   "type": "module",
   "description": "Metrics and auto-evaluation for LLM calls",
README.md:
@@ -62,7 +62,7 @@ # OpenPipe Node API Library
-<i>How do I report calls to my self-hosted instance?</i>
+#### <i>How do I report calls to my self-hosted instance?</i>
 Start an instance by following the instructions on [Running Locally](https://github.com/OpenPipe/OpenPipe#running-locally). Once it's running, point your `OPENPIPE_BASE_URL` to your self-hosted instance.
-<i>What if my `OPENPIPE_BASE_URL` is misconfigured or my instance goes down? Will my OpenAI calls stop working?</i>
+#### <i>What if my `OPENPIPE_BASE_URL` is misconfigured or my instance goes down? Will my OpenAI calls stop working?</i>
 Your OpenAI calls will continue to function as expected no matter what. The SDK handles logging errors gracefully without affecting OpenAI inference.
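The graceful-failure behavior described in the README hunk can be sketched as a fire-and-forget wrapper. This is an illustrative sketch, not the SDK's actual internals: `withLogging` and `failingLog` are hypothetical names, and the real library's reporting path may differ.

```typescript
// Sketch of fire-and-forget logging: the inference result is returned
// regardless of whether the logger succeeds, so a misconfigured
// OPENPIPE_BASE_URL or a downed instance cannot break OpenAI calls.
type Logger = (payload: unknown) => Promise<void>;

async function withLogging<T>(
  infer: () => Promise<T>,
  log: Logger,
): Promise<T> {
  const result = await infer();
  // Fire and forget: any rejection from log() is swallowed.
  log(result).catch(() => {});
  return result;
}

// A logger that always fails, simulating an unreachable base URL.
const failingLog: Logger = async () => {
  throw new Error("OPENPIPE_BASE_URL unreachable");
};

withLogging(async () => "completion text", failingLog).then((r) =>
  console.log(r), // prints "completion text" despite the logging failure
);
```

The key design choice is that `log(result)` is not awaited on the return path, so logging latency and logging errors are both invisible to the caller.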