pnnx

pnnx is an open standard for PyTorch model interoperability.

  • Version: 20240819
  • Registry: PyPI
  • Maintainers: 2

A Python wrapper of pnnx. Only Python 3.7+ is currently supported.

Install from pip

pnnx is available as wheel packages for macOS, Windows, and Linux; you can install it with pip:

pip install pnnx

Build & Install from source

Prerequisites

On Unix (Linux, macOS)

  • A compiler with C++14 support
  • CMake >= 3.4

On Windows

  • Visual Studio 2015 or higher
  • CMake >= 3.4

Build & install

  1. Clone ncnn:
git clone https://github.com/Tencent/ncnn.git
  2. Install PyTorch:

Install PyTorch according to https://pytorch.org/ . Anaconda is strongly recommended, for example:

conda install pytorch
  3. Build and install the wrapper:
cd /pathto/ncnn/tools/pnnx/python
python setup.py install

Note: If torchvision and pnnx2onnx support are needed, you can set the following environment variables before running 'python setup.py install' to enable them, e.g. on Ubuntu:

export TORCHVISION_INSTALL_DIR="/project/torchvision"
export PROTOBUF_INCLUDE_DIR="/project/protobuf/include"
export PROTOBUF_LIBRARIES="/project/protobuf/lib64/libprotobuf.a"
export PROTOBUF_PROTOC_EXECUTABLE="/project/protobuf/bin/protoc"

To use these, you must install Torchvision and Protobuf first.

Tests

cd /pathto/ncnn/tools/pnnx/python
pytest tests

Usage

  1. Export a PyTorch model to pnnx:
import torch
import torchvision.models as models
import pnnx

net = models.resnet18(pretrained=True)
x = torch.rand(1, 3, 224, 224)

# You could try disabling checking when torch tracing raises error
# opt_net = pnnx.export(net, "resnet18.pt", x, check_trace=False)
opt_net = pnnx.export(net, "resnet18.pt", x)
  2. Convert an existing TorchScript model to pnnx:
import torch
import pnnx

x = torch.rand(1, 3, 224, 224)
opt_net = pnnx.convert("resnet18.pt", x)

API Reference

  1. pnnx.export

model (torch.nn.Module): the model to be exported.

ptpath (str): name of the TorchScript file to write.

inputs (torch.Tensor or list of torch.Tensor): expected inputs of the model.

inputs2 (torch.Tensor or list of torch.Tensor): alternative inputs of the model. Usually used together with input_shapes to resolve dynamic shapes.

input_shapes (Optional, list of int, or list of lists of int): shapes of the model inputs, used to resolve tensor shapes in the model graph. For example, [1,3,224,224] for a model with a single input, or [[1,3,224,224],[1,3,224,224]] for a model with two inputs.

input_types (Optional, str or list of str): types of the model inputs; it should have the same length as input_shapes. For example, "f32" for a model with a single input, or ["f32", "f32"] for a model with two inputs.

type name  torch type
f32        torch.float32 or torch.float
f64        torch.float64 or torch.double
f16        torch.float16 or torch.half
u8         torch.uint8
i8         torch.int8
i16        torch.int16 or torch.short
i32        torch.int32 or torch.int
i64        torch.int64 or torch.long
c32        torch.complex32
c64        torch.complex64
c128       torch.complex128

input_shapes2 (Optional, list of int, or list of lists of int): shapes of the alternative model inputs; the format is identical to input_shapes. Usually used together with input_shapes to resolve dynamic shapes (-1) in the model graph.

input_types2 (Optional, str or list of str): types of the alternative model inputs.

device (Optional, str, default="cpu"): device type of the inputs in the TorchScript model, "cpu" or "gpu".

customop (Optional, str or list of str): Torch extensions (dynamic libraries) for custom operators. For example, "/home/nihui/.cache/torch_extensions/fused/fused.so" or ["/home/nihui/.cache/torch_extensions/fused/fused.so",...].

moduleop (Optional, str or list of str): modules to keep as one big operator. For example, "models.common.Focus" or ["models.common.Focus","models.yolo.Detect"].

optlevel (Optional, int, default=2): graph optimization level.

option  optimization level
0       do not apply optimization
1       do not apply optimization
2       optimization more for inference

pnnxparam (Optional, str, default="*.pnnx.param", * is the model name): PNNX graph definition file.

pnnxbin (Optional, str, default="*.pnnx.bin"): PNNX model weight.

pnnxpy (Optional, str, default="*_pnnx.py"): PyTorch script for inference, including model construction and weight initialization code.

pnnxonnx (Optional, str, default="*.pnnx.onnx"): PNNX model in onnx format.

ncnnparam (Optional, str, default="*.ncnn.param"): ncnn graph definition.

ncnnbin (Optional, str, default="*.ncnn.bin"): ncnn model weight.

ncnnpy (Optional, str, default="*_ncnn.py"): pyncnn script for inference.

  2. pnnx.convert

ptpath (str): the TorchScript model file to be converted.

All other parameters are consistent with pnnx.export.
