
inferless-cli
inferless

Inferless - Deploy Machine Learning Models in Minutes.
See the website at https://inferless.com/ for documentation and more information about running code on Inferless.
Usage:
$ inferless [OPTIONS] COMMAND [ARGS]...
Options:
- -v, --version
- --install-completion: Install completion for the current shell.
- --show-completion: Show completion for the current shell, to copy it or customize the installation.
- --help: Show this message and exit.

Commands:
- deploy: Deploy a model to Inferless
- export: Export the runtime configuration of...
- init: Initialize a new Inferless model
- integration: Manage Inferless integrations
- log: Inferless model logs (view build logs or...
- login: Login to Inferless
- mode: Change mode
- model: Manage Inferless models (list, delete,...
- region: Manage Inferless regions
- remote-run: Remotely run code on Inferless
- run: Run a model locally
- runtime: Manage Inferless runtimes (can be used to...
- scaffold: Scaffold a demo Inferless project
- secret: Manage Inferless secrets (list secrets)
- token: Manage Inferless tokens
- volume: Manage Inferless volumes (can be used to...
- workspace: Manage Inferless workspaces (can be used...

inferless deploy

Deploy a model to Inferless
Usage:
$ inferless deploy [OPTIONS]
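For example, a hypothetical deployment pinning a GPU type and replica bounds (the GPU choice, replica counts, and environment variable are illustrative, not defaults):

```console
$ inferless deploy --gpu T4 --min-replica 0 --max-replica 2 --env "LOG_LEVEL=debug"
```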
Options:
- --gpu TEXT: Denotes the machine type (A10/A100/T4). [required]
- --region TEXT: Inferless region. Defaults to the Inferless default region.
- --beta: Deploys the model with v2 endpoints.
- --fractional: Use fractional machine type (default: dedicated).
- --runtime TEXT: Runtime name or file location. If not provided, the default Inferless runtime will be used.
- --volume TEXT: Volume name.
- --volume-mount-path TEXT: Custom volume mount path.
- --env TEXT: Key-value pairs for model environment variables.
- --inference-timeout INTEGER: Inference timeout in seconds. [default: 180]
- --scale-down-timeout INTEGER: Scale-down timeout in seconds. [default: 600]
- --container-concurrency INTEGER: Container concurrency level. [default: 1]
- --secret TEXT: Secret names to attach to the deployment.
- --runtimeversion TEXT: Runtime version (default: latest version of the runtime).
- --max-replica INTEGER: Maximum number of replicas. [default: 1]
- --min-replica INTEGER: Minimum number of replicas. [default: 0]
- -t, --runtime-type TEXT: Type of runtime to deploy [fastapi, triton]. [default: triton]
- -c, --config TEXT: Inferless config file path to override from inferless.yaml. [default: inferless.yaml]
- --help: Show this message and exit.

inferless export

Export the runtime configuration of another provider to an Inferless runtime config
Usage:
$ inferless export [OPTIONS]
Options:
- -r, --runtime TEXT: The runtime configuration file of another provider. [default: cog.yaml]
- -d, --destination TEXT: The destination file for the Inferless runtime configuration. [default: inferless-runtime-config.yaml]
- -f, --from TEXT: The provider from which to export the runtime configuration. [default: replicate]
- --help: Show this message and exit.

inferless init

Initialize a new Inferless model
Usage:
$ inferless init [OPTIONS] COMMAND [ARGS]...
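For example, a hypothetical local initialization (the model name is a placeholder):

```console
$ inferless init -n my-model -s local
```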
Options:
- -n, --name TEXT: Denotes the name of the model.
- -s, --source TEXT: Not needed if local; else provide github/gitlab. [default: local]
- -u, --url TEXT: Denotes the URL of the repo. Required if source is not local.
- -b, --branch TEXT: Denotes the branch where the model is located. Required if source is not local.
- -a, --autobuild: Enable autobuild for the model. Will be False for local source.
- --help: Show this message and exit.

Commands:
- docker: Initialize with Docker.
- file: Import a PyTorch, ONNX, or TensorFlow file...
- hf: Load a model from Hugging Face.
- pythonic: (Default) Deploy a Python workflow.

inferless init docker

Initialize with Docker.
Usage:
$ inferless init docker [OPTIONS]
Options:
- -n, --name TEXT: Denotes the name of the model. [required]
- -t, --type TEXT: Type for import: dockerimage/dockerfile. [required]
- -p, --provider TEXT: Provider for the model: dockerimage = (dockerhub/ecr), dockerfile = (github/gitlab). [required]
- -u, --url TEXT: Docker image URL or GitHub/GitLab URL. [required]
- -b, --branch TEXT: Branch for Dockerfile import (GitHub/GitLab). Required if type is dockerfile.
- -d, --dockerfilepath TEXT: Path to the Dockerfile. Required if type is dockerfile.
- -h, --healthapi TEXT: Health check API endpoint. [required]
- -i, --inferapi TEXT: Inference API endpoint. [required]
- -s, --serverport INTEGER: Server port. [required]
- -a, --autobuild: Enable autobuild for the model.
- --help: Show this message and exit.

inferless init file

Import a PyTorch, ONNX, or TensorFlow file for inference with Triton server.
The folder structure for the zip file should be as follows:

.
├── config.pbtxt (optional)
├── input.json
├── output.json
└── 1/
    └── model.xxx (pt/onnx/savedmodel)
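For reference, a minimal config.pbtxt for an ONNX model might look like the sketch below. The model name, tensor names, and dimensions are illustrative assumptions, not values this CLI requires; Triton can often derive this configuration automatically for ONNX/TensorFlow models, which is why the file is optional.

```
name: "my-model"
platform: "onnxruntime_onnx"
max_batch_size: 1
input [
  {
    name: "input__0"
    data_type: TYPE_FP32
    dims: [ 3, 224, 224 ]
  }
]
output [
  {
    name: "output__0"
    data_type: TYPE_FP32
    dims: [ 1000 ]
  }
]
```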
Usage:
$ inferless init file [OPTIONS]
Options:
- -n, --name TEXT: Denotes the name of the model. [required]
- -f, --framework TEXT: Framework of the model [pytorch, onnx, tensorflow]. [default: pytorch]
- -p, --provider TEXT: Provider for the model (local/gcs/s3). [default: local]
- --url TEXT: Provider URL. Required if provider is not local.
- --help: Show this message and exit.

inferless init hf

Load a model from Hugging Face.
This creates new files called app.py, inferless_runtime_config.yaml, and input_schema.py in your current directory.
Transformers options: audio-classification, automatic-speech-recognition, conversational, depth-estimation, document-question-answering, feature-extraction, fill-mask, image-classification, image-segmentation, image-to-text, object-detection, question-answering, summarization, table-question-answering, text-classification, text-generation, text2text-generation, token-classification, translation, video-classification, visual-question-answering, zero-shot-classification, zero-shot-image-classification, zero-shot-object-detection
Diffusers options: Depth-to-Image, Image-Variation, Image-to-Image, Inpaint, InstructPix2Pix, Stable-Diffusion-Latent-Upscaler
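The generated input_schema.py describes the model's request inputs. A minimal sketch of what such a schema can look like for a text-generation task (the input name, example value, and exact fields are assumptions; the file generated for your model may differ):

```python
# Hypothetical input_schema.py for a text-generation model.
# Each entry maps an input name to its datatype, shape, and an example value.
INPUT_SCHEMA = {
    "prompt": {
        "datatype": "STRING",   # Triton-style datatype name
        "required": True,
        "shape": [1],           # one string per request
        "example": ["Write a haiku about the sea"],
    }
}
```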
Usage:
$ inferless init hf [OPTIONS]
Options:
- -n, --name TEXT: Denotes the name of the model. [required]
- -m, --hfmodelname TEXT: Name of the Hugging Face repo. [required]
- -t, --modeltype TEXT: Type of the model (transformer/diffuser). [required]
- -k, --tasktype TEXT: Task type of the model (e.g. text-generation). [required]
- --help: Show this message and exit.

inferless init pythonic

(Default) Deploy a Python workflow.
Usage:
$ inferless init pythonic [OPTIONS]
Options:
- -n, --name TEXT: Denotes the name of the model. [required]
- -s, --source TEXT: Not needed if local; else provide github/gitlab. [default: local]
- -u, --url TEXT: Denotes the URL of the repo. Required if source is not local.
- -b, --branch TEXT: Denotes the branch where the model is located. Required if source is not local.
- -a, --autobuild: Enable autobuild for the model. Will be False for local source.
- --help: Show this message and exit.

inferless integration

Manage Inferless integrations
Usage:
$ inferless integration [OPTIONS] COMMAND [ARGS]...
Options:
- --help: Show this message and exit.

Commands:
- add: Add an integration to your workspace
- list: List all integrations

inferless integration add

Add an integration to your workspace
Usage:
$ inferless integration add [OPTIONS] COMMAND [ARGS]...
Options:
- --help: Show this message and exit.

Commands:
- DOCKERHUB: Add Dockerhub integration to your workspace
- ECR: Add ECR integration to your workspace
- GCS: Add Google cloud storage integration to...
- HF: Add Huggingface integration to your workspace
- S3: Add S3/ECR integration to your workspace

inferless integration add DOCKERHUB

Add Dockerhub integration to your workspace
Usage:
$ inferless integration add DOCKERHUB [OPTIONS]
Options:
- -n, --name TEXT: Integration name [required]
- --username TEXT: Username for dockerhub integration [required]
- --access-token TEXT: Access token for dockerhub integration [required]
- --help: Show this message and exit.

inferless integration add ECR

Add ECR integration to your workspace
Usage:
$ inferless integration add ECR [OPTIONS]
Options:
- -n, --name TEXT: Integration name [required]
- --access-key TEXT: Access key for the AWS integration. [required]
- --secret-key TEXT: Secret key for the AWS integration. [required]
- --help: Show this message and exit.

inferless integration add GCS

Add Google cloud storage integration to your workspace
Usage:
$ inferless integration add GCS [OPTIONS]
Options:
- -n, --name TEXT: Integration name [required]
- --gcp-json-path TEXT: Path to the GCP JSON key file [required]
- --help: Show this message and exit.

inferless integration add HF

Add Huggingface integration to your workspace
Usage:
$ inferless integration add HF [OPTIONS]
Options:
- -n, --name TEXT: Integration name [required]
- --api-key TEXT: API key for huggingface integration [required]
- --help: Show this message and exit.

inferless integration add S3

Add S3/ECR integration to your workspace
Usage:
$ inferless integration add S3 [OPTIONS]
Options:
- -n, --name TEXT: Integration name [required]
- --access-key TEXT: Access key for the AWS integration. [required]
- --secret-key TEXT: Secret key for the AWS integration. [required]
- --help: Show this message and exit.

inferless integration list

List all integrations
Usage:
$ inferless integration list [OPTIONS]
Options:
- --help: Show this message and exit.

inferless log

Inferless model logs (view build logs or call logs)
Usage:
$ inferless log [OPTIONS] [MODEL_ID]
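For example, viewing call logs for a deployed model (the model id is a placeholder):

```console
$ inferless log 1234-abcd -t CALL
```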
Arguments:
[MODEL_ID]: Model id or model import id

Options:
- -i, --import-logs: Import logs
- -t, --type TEXT: Logs type [BUILD, CALL] [default: BUILD]
- --help: Show this message and exit.

inferless login

Login to Inferless
Usage:
$ inferless login [OPTIONS]
Options:
- --help: Show this message and exit.

inferless mode

Change mode
Usage:
$ inferless mode [OPTIONS] MODE
Arguments:
MODE: The mode to run the application in, either 'DEV' or 'PROD'. [required]

Options:
- --help: Show this message and exit.

inferless model

Manage Inferless models (list, delete, activate, deactivate, rebuild the models)
Usage:
$ inferless model [OPTIONS] COMMAND [ARGS]...
Options:
- --help: Show this message and exit.

Commands:
- activate: Activate a model.
- deactivate: Deactivate a model.
- delete: Delete a model.
- info: Get model details.
- list: List all models.
- patch: Patch model configuration.
- rebuild: Rebuild a model.

inferless model activate

Activate a model.
Usage:
$ inferless model activate [OPTIONS]
Options:
- --model-id TEXT: Model ID
- --help: Show this message and exit.

inferless model deactivate

Deactivate a model.
Usage:
$ inferless model deactivate [OPTIONS]
Options:
- --model-id TEXT: Model ID
- --help: Show this message and exit.

inferless model delete

Delete a model.
Usage:
$ inferless model delete [OPTIONS]
Options:
- --model-id TEXT: Model ID
- --help: Show this message and exit.

inferless model info

Get model details.
Usage:
$ inferless model info [OPTIONS]
Options:
- --model-id TEXT: Model ID
- --help: Show this message and exit.

inferless model list

List all models.
Usage:
$ inferless model list [OPTIONS]
Options:
- --help: Show this message and exit.

inferless model patch

Patch model configuration.
Usage:
$ inferless model patch [OPTIONS]
Options:
- --model-id TEXT: Model ID
- --gpu TEXT: Denotes the machine type (A10/A100/T4).
- --fractional: Use fractional machine type (default: dedicated).
- --volume TEXT: Volume name.
- --mount-path TEXT: Mount path for the volume.
- --env TEXT: Key-value pairs for model environment variables.
- --inference-timeout INTEGER: Inference timeout in seconds.
- --scale-down-timeout INTEGER: Scale-down timeout in seconds.
- --container-concurrency INTEGER: Container concurrency level.
- --secret TEXT: Secret names to attach to the deployment.
- --runtimeversion TEXT: Runtime version (default: latest).
- --max-replica INTEGER: Maximum number of replicas.
- --min-replica INTEGER: Minimum number of replicas.
- --help: Show this message and exit.

inferless model rebuild

Rebuild a model. (If you have an inferless.yaml file in your current directory, you can use the --local or -l flag to redeploy the model locally.)
Usage:
$ inferless model rebuild [OPTIONS]
Options:
- --model-id TEXT: Model ID [required]
- -l, --local: Local rebuild
- -r, --runtime-path TEXT: Runtime file path.
- -rv, --runtime-version TEXT: Runtime version.
- --help: Show this message and exit.

inferless region

Manage Inferless regions
Usage:
$ inferless region [OPTIONS] COMMAND [ARGS]...
Options:
- --help: Show this message and exit.

Commands:
- list: List available regions

inferless region list

List available regions
Usage:
$ inferless region list [OPTIONS]
Options:
- --help: Show this message and exit.

inferless remote-run

Remotely run code on Inferless
Usage:
$ inferless remote-run [OPTIONS] [FILE_PATH]
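For example, running a local entrypoint file remotely (the file and config names are illustrative):

```console
$ inferless remote-run app.py -c inferless.yaml
```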
Arguments:
[FILE_PATH]: The path to the file to run on Inferless

Options:
- -c, --config TEXT: The path to the Inferless config file
- -e, --exclude TEXT: The path to a file listing paths to exclude from the run, in .gitignore format. If not provided, .gitignore will be used if present in the directory.
- --help: Show this message and exit.

inferless run

Run a model locally
Usage:
$ inferless run [OPTIONS]
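The -f/--env-file option expects one KEY=VALUE pair per line. A hypothetical env file (variable names are placeholders):

```
MODEL_NAME=my-model
LOG_LEVEL=debug
```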
Options:
- -r, --runtime TEXT: Custom runtime name or file location. If not provided, the default Inferless runtime will be used.
- -t, --runtime-type TEXT: Type of runtime to deploy [fastapi, triton]. [default: triton]
- -n, --name TEXT: Name of the model to deploy on Inferless [default: inferless-model]
- -f, --env-file TEXT: Path to an env file containing environment variables (one per line in KEY=VALUE format)
- -e, --env TEXT: Environment variables to set for the runtime (e.g. 'KEY=VALUE'). If the env variable contains special chars, please escape them.
- -u, --docker-base-url TEXT: Docker base URL. Defaults to the system default, fetched from the environment.
- --volume TEXT: Volume name.
- -f, --framework TEXT: Framework type (PYTORCH, ONNX, TENSORFLOW) [default: PYTORCH]
- -i, --input-schema TEXT: Input schema path. [default: input_schema.py]
- -i, --input TEXT: Input JSON path
- -o, --output TEXT: Output JSON path
- --runtimeversion TEXT: Runtime version (default: latest).
- --help: Show this message and exit.

inferless runtime

Manage Inferless runtimes (can be used to list runtimes and upload new runtimes)
Usage:
$ inferless runtime [OPTIONS] COMMAND [ARGS]...
Options:
- --help: Show this message and exit.

Commands:
- create: Create a runtime.
- generate: Generate a new runtime from your...
- list: List all runtimes.
- patch: Update the runtime with the config file.
- version-list: List the runtime versions

inferless runtime create

Create a runtime.
Usage:
$ inferless runtime create [OPTIONS]
Options:
- -p, --path TEXT: Path to the runtime
- -n, --name TEXT: Name of the runtime
- --help: Show this message and exit.

inferless runtime generate

Generate a new runtime from your local environment
Usage:
$ inferless runtime generate [OPTIONS]
Options:
- --help: Show this message and exit.

inferless runtime list

List all runtimes.
Usage:
$ inferless runtime list [OPTIONS]
Options:
- --help: Show this message and exit.

inferless runtime patch

Update the runtime with the config file.
Usage:
$ inferless runtime patch [OPTIONS]
Options:
- -p, --path TEXT: Path to the runtime
- -i, --name TEXT: ID of the runtime
- --help: Show this message and exit.

inferless runtime version-list

List the runtime versions
Usage:
$ inferless runtime version-list [OPTIONS]
Options:
- -n, --name TEXT: Runtime name
- --help: Show this message and exit.

inferless scaffold

Scaffold a demo Inferless project
Usage:
$ inferless scaffold [OPTIONS]
Options:
- -d, --demo: Demo name [required]
- --help: Show this message and exit.

inferless secret

Manage Inferless secrets (list secrets)
Usage:
$ inferless secret [OPTIONS] COMMAND [ARGS]...
Options:
- --help: Show this message and exit.

Commands:
- list: List all secrets.

inferless secret list

List all secrets.
Usage:
$ inferless secret list [OPTIONS]
Options:
- --help: Show this message and exit.

inferless token

Manage Inferless tokens
Usage:
$ inferless token [OPTIONS] COMMAND [ARGS]...
Options:
- --help: Show this message and exit.

Commands:
- set: Set account credentials for connecting to...

inferless token set

Set account credentials for connecting to Inferless. If not provided with the command, you will be prompted to enter your credentials.
Usage:
$ inferless token set [OPTIONS]
Options:
- --token-key TEXT: Account CLI key [required]
- --token-secret TEXT: Account CLI secret [required]
- --help: Show this message and exit.

inferless volume

Manage Inferless volumes (can be used to list volumes and create new volumes)
Usage:
$ inferless volume [OPTIONS] COMMAND [ARGS]...
Options:
- --help: Show this message and exit.

Commands:
- cp: Add a file or directory to a volume.
- create: Create a new volume
- list: List all existing volumes
- ls: List files and directories within a volume
- rm: Specify the Inferless path to the file or...

inferless volume cp

Add a file or directory to a volume.
Usage:
$ inferless volume cp [OPTIONS]
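For instance, recursively copying a local directory of weights into a volume (both paths are placeholders; the destination is an Inferless path):

```console
$ inferless volume cp -s ./weights -d <inferless-volume-path>/weights -r
```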
Options:
- -s, --source TEXT: Specify the source path (either a local directory/file path or an Inferless path)
- -d, --destination TEXT: Specify the destination path (either a local directory/file path or an Inferless path)
- -r, --recursive: Recursively copy the contents of a directory to the destination.
- --help: Show this message and exit.

inferless volume create

Create a new volume
Usage:
$ inferless volume create [OPTIONS]
Options:
- -n, --name TEXT: Assign a name to the new volume.
- --help: Show this message and exit.

inferless volume list

List all existing volumes
Usage:
$ inferless volume list [OPTIONS]
Options:
- --help: Show this message and exit.

inferless volume ls

List files and directories within a volume
Usage:
$ inferless volume ls [OPTIONS] PATH
Arguments:
PATH: Specify the Inferless path to the directory [required]

Options:
- -d, --directory: List only directories.
- -f, --files: List only files.
- -r, --recursive: Recursively list contents of directories.
- --help: Show this message and exit.

inferless volume rm

Specify the Inferless path to the file or directory you want to delete.
Usage:
$ inferless volume rm [OPTIONS]
Options:
- -p, --path TEXT: Inferless path to the file/dir you want to delete
- --help: Show this message and exit.

inferless workspace

Manage Inferless workspaces (can be used to switch between workspaces)
Usage:
$ inferless workspace [OPTIONS] COMMAND [ARGS]...
Options:
- --help: Show this message and exit.

Commands:
- use

inferless workspace use

Usage:
$ inferless workspace use [OPTIONS]
Options:
- --help: Show this message and exit.