NerfBaselines


NerfBaselines is a framework for evaluating and comparing existing NeRF and 3DGS methods. Currently, most official implementations use different dataset loaders, evaluation protocols, and metrics, which makes benchmarking difficult. This project therefore provides a unified interface for running and evaluating methods on different datasets in a consistent way, using the same metrics. Rather than reimplementing the methods, we wrap the official implementations so that they can all be run through the same interface.

Please visit the project page to see the results of implemented methods on dataset benchmarks.

🌐 Web  |  📄 Paper  |  📚 Docs

News

[25/01/2025] Added Taming-3DGS method.
[30/12/2024] Added a new viewer implementation.
[22/09/2024] Added mesh export for 2DGS, COLMAP, and GOF.
[17/09/2024] Moved project to nerfbaselines/nerfbaselines repository.
[16/09/2024] Added online demos and demo export for 3DGS-based methods. Check out the benchmark page.
[12/09/2024] Added gsplat, 2D Gaussian Splatting, Scaffold-GS, and COLMAP MVS methods.
[09/09/2024] Method and Dataset APIs refactored in v1.2.x to simplify usage.
[28/08/2024] Implemented faster communication protocols using shared memory.
[20/08/2024] Added documentation page.

Getting started

Start by installing the nerfbaselines pip package on your host system.

pip install nerfbaselines

Now you can use the nerfbaselines cli to interact with NerfBaselines.
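For example, you can list the available commands and their options (the same --help convention used by the subcommands below should also work at the top level):

nerfbaselines --help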

The next step is to choose the backend that will be used to install the different methods. The following backends are currently implemented:

  • docker: Offers good isolation; requires docker (with the NVIDIA container toolkit) to be installed and the user to have access to it (i.e., to be in the docker user group).
  • apptainer: Similar level of isolation as docker, but does not require the user to have privileged access.
  • conda (default): Does not require docker/apptainer to be installed, but offers weaker isolation, and some methods require additional dependencies to be installed. Also, some methods are not implemented for this backend because they rely on dependencies not available on conda.
  • python: Runs everything directly in the current environment; everything needs to be installed in the environment for this backend to work.

The backend can be set as the --backend <backend> argument or using the NERFBASELINES_BACKEND environment variable.
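For example, using the same placeholder convention as the commands below, these are two equivalent ways to select the docker backend:

# Select the backend for a single invocation
nerfbaselines train --method <method> --data <data> --backend docker

# Or set it for the whole session via the environment variable
export NERFBASELINES_BACKEND=docker
nerfbaselines train --method <method> --data <data>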

Downloading data

Some datasets, e.g., Mip-NeRF 360, NerfStudio, Blender, or Tanks and Temples, can be downloaded automatically. You can either pass the argument --data external://dataset/scene during training or download the dataset beforehand by running nerfbaselines download-dataset external://dataset/scene. Examples:

# Downloads the garden scene to the cache folder.
nerfbaselines download-dataset external://mipnerf360/garden

# Downloads all nerfstudio scenes to the cache folder.
nerfbaselines download-dataset external://nerfstudio

# Downloads the kitchen scene to the folder kitchen
nerfbaselines download-dataset external://mipnerf360/kitchen -o kitchen

Training

To start training, use the nerfbaselines train --method <method> --data <data> command. Use the --help argument to learn about all implemented methods and supported features.
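A minimal training run might look as follows. The method identifier gaussian-splatting is an assumption here; run nerfbaselines train --help to list the actual identifiers.

# Train Gaussian Splatting on the Mip-NeRF 360 garden scene
# (the external:// data is downloaded automatically if needed)
nerfbaselines train --method gaussian-splatting --data external://mipnerf360/garden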

Rendering

The nerfbaselines render --checkpoint <checkpoint> command can be used to render images from a trained checkpoint. Again, use --help to learn about the arguments.
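A minimal sketch of rendering test images from a checkpoint; only --checkpoint is documented above, so treat the --data and --output flags as assumptions and confirm the exact names with --help:

# Render images for a trained checkpoint
nerfbaselines render --checkpoint <checkpoint> --data <dataset> --output <output-folder>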

In order to render a camera trajectory (e.g., created using the interactive viewer), use the following command:

nerfbaselines render-trajectory --checkpoint <checkpoint> --trajectory <trajectory> --output <output.mp4>

Interactive viewer

Given a trained checkpoint, the interactive viewer can be launched as follows:

nerfbaselines viewer --checkpoint <checkpoint> --data <dataset>

Even though the argument --data <dataset> is optional, it is recommended, as the camera poses are used to perform gravity alignment and rescaling for a better viewing experience. It also enables visualizing the input camera frustums.
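Since --data is optional, the viewer can also be launched from the checkpoint alone:

nerfbaselines viewer --checkpoint <checkpoint>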

(Screenshot: the NerfBaselines interactive viewer GUI)

Results

In this section, we present the results of the implemented methods on standard benchmark datasets. For detailed results, visit the project page: https://nerfbaselines.github.io

Mip-NeRF 360

Mip-NeRF 360 is a collection of four indoor and five outdoor object-centric scenes. The camera trajectory is an orbit around the object with fixed elevation and radius, and every n-th frame of the trajectory is held out as a test view. Detailed results are available on the project page: https://nerfbaselines.github.io/mipnerf360

| Method | PSNR | SSIM | LPIPS (VGG) | Time | GPU mem. |
|---|---|---|---|---|---|
| Zip-NeRF | 28.553 | 0.829 | 0.218 | 5h 30m 20s | 26.8 GB |
| 3DGS-MCMC | 27.983 | 0.835 | 0.224 | 41m 11s | 28.9 GB |
| Scaffold-GS | 27.714 | 0.813 | 0.262 | 23m 28s | 8.7 GB |
| Mip-NeRF 360 | 27.681 | 0.792 | 0.272 | 30h 14m 36s | 33.6 GB |
| Mip-Splatting | 27.492 | 0.815 | 0.258 | 25m 37s | 11.0 GB |
| Gaussian Splatting | 27.434 | 0.814 | 0.257 | 23m 25s | 11.1 GB |
| Gaussian Opacity Fields | 27.421 | 0.826 | 0.234 | 1h 3m 54s | 28.4 GB |
| gsplat | 27.412 | 0.815 | 0.256 | 29m 19s | 8.3 GB |
| Octree-GS | 27.397 | 0.811 | 0.264 | 28m 3s | 8.5 GB |
| Taming 3DGS | 27.217 | 0.793 | 0.305 | 5m 59s | 8.7 GB |
| PGSR | 27.199 | 0.819 | 0.233 | 39m 58s | 14.3 GB |
| H3DGS | 26.927 | 0.790 | 0.269 | 55m 29s | 9.1 GB |
| 2D Gaussian Splatting | 26.815 | 0.796 | 0.297 | 31m 10s | 13.2 GB |
| NerfStudio | 26.388 | 0.731 | 0.343 | 19m 30s | 5.9 GB |
| Instant NGP | 25.507 | 0.684 | 0.398 | 3m 54s | 7.8 GB |
| COLMAP | 16.670 | 0.445 | 0.590 | 2h 52m 55s | 0 MB |

Blender

Blender (nerf-synthetic) is a synthetic dataset used to benchmark NeRF methods. It consists of eight scenes, each showing an object placed on a white background, with cameras placed on a hemisphere around the object. The scenes are licensed under various CC licenses. Detailed results are available on the project page: https://nerfbaselines.github.io/blender

| Method | PSNR | SSIM | LPIPS (VGG) | Time | GPU mem. |
|---|---|---|---|---|---|
| Zip-NeRF | 33.670 | 0.973 | 0.036 | 5h 21m 57s | 26.2 GB |
| 3DGS-MCMC | 33.637 | 0.971 | 0.036 | 9m 8s | 4.5 GB |
| Gaussian Opacity Fields | 33.451 | 0.969 | 0.038 | 18m 26s | 3.1 GB |
| Mip-Splatting | 33.330 | 0.969 | 0.039 | 6m 49s | 2.7 GB |
| Gaussian Splatting | 33.308 | 0.969 | 0.037 | 6m 6s | 3.1 GB |
| PGSR | 33.274 | 0.968 | 0.039 | 8m 20s | 3.9 GB |
| TensoRF | 33.172 | 0.963 | 0.051 | 10m 47s | 16.4 GB |
| Scaffold-GS | 33.080 | 0.966 | 0.048 | 7m 4s | 3.7 GB |
| K-Planes | 32.265 | 0.961 | 0.062 | 23m 58s | 4.6 GB |
| Instant NGP | 32.198 | 0.959 | 0.055 | 2m 23s | 2.6 GB |
| Tetra-NeRF | 31.951 | 0.957 | 0.056 | 6h 53m 20s | 29.6 GB |
| gsplat | 31.471 | 0.966 | 0.054 | 14m 45s | 2.8 GB |
| Mip-NeRF 360 | 30.345 | 0.951 | 0.060 | 3h 29m 39s | 114.8 GB |
| NerfStudio | 29.191 | 0.941 | 0.095 | 9m 38s | 3.6 GB |
| NeRF | 28.723 | 0.936 | 0.092 | 23h 26m 30s | 10.2 GB |
| COLMAP | 12.123 | 0.766 | 0.214 | 1h 20m 34s | 0 MB |

Tanks and Temples

Tanks and Temples is a benchmark for image-based 3D reconstruction. The benchmark sequences were acquired outside the lab, in realistic conditions. Ground-truth data was captured using an industrial laser scanner. The benchmark includes both outdoor scenes and indoor environments. The dataset is split into three subsets: training, intermediate, and advanced. Detailed results are available on the project page: https://nerfbaselines.github.io/tanksandtemples

| Method | PSNR | SSIM | LPIPS | Time | GPU mem. |
|---|---|---|---|---|---|
| Zip-NeRF | 24.628 | 0.840 | 0.131 | 5h 44m 9s | 26.6 GB |
| Mip-Splatting | 23.930 | 0.833 | 0.166 | 15m 56s | 7.3 GB |
| Gaussian Splatting | 23.827 | 0.831 | 0.165 | 13m 48s | 6.9 GB |
| PGSR | 23.209 | 0.832 | 0.146 | 18m 59s | 10.4 GB |
| Gaussian Opacity Fields | 22.395 | 0.825 | 0.172 | 41m 21s | 24.1 GB |
| NerfStudio | 22.043 | 0.743 | 0.270 | 19m 27s | 3.7 GB |
| Instant NGP | 21.623 | 0.712 | 0.340 | 4m 27s | 4.1 GB |
| 2D Gaussian Splatting | 21.535 | 0.768 | 0.281 | 15m 47s | 7.2 GB |
| COLMAP | 11.919 | 0.436 | 0.606 | 5h 16m 11s | 0 MB |

Implementation status

Methods are evaluated on the following datasets: Blender, Hierarchical 3DGS, LLFF, Mip-NeRF 360, Nerfstudio, Photo Tourism, SeaThru-NeRF, Tanks and Temples, and Zip-NeRF. The table below summarizes the badges each method has earned across these datasets; the full method × dataset matrix is available on the project page.

| Method | Badges |
|---|---|
| 2D Gaussian Splatting | 🥇 gold, 🥇 gold, 🥇 gold, 🥈 silver |
| 3DGS-MCMC | 🥈 silver, 🥇 gold, 🥇 gold, 🥇 gold |
| CamP | - |
| COLMAP | 🥇 gold, 🥇 gold, 🥇 gold, 🥇 gold |
| Gaussian Opacity Fields | 🥇 gold, 🥇 gold, 🥇 gold |
| Gaussian Splatting | 🥇 gold, 🥇 gold, 🥇 gold, 🥇 gold |
| GS-W | - |
| gsplat | 🥇 gold, 🥇 gold, 🥇 gold, 🥇 gold |
| H3DGS | - |
| Instant NGP | 🥇 gold, 🥇 gold, 🥇 gold, 🥇 gold |
| K-Planes | 🥇 gold, 🥈 silver |
| Mip-NeRF 360 | 🥇 gold, 🥇 gold, 🥇 gold |
| Mip-Splatting | 🥇 gold, 🥇 gold, 🥇 gold, 🥇 gold |
| NeRF | 🥇 gold |
| NeRF On-the-go | - |
| NeRF-W (reimplementation) | 🥇 gold |
| NerfStudio | 🥇 gold, 🥇 gold, 🥇 gold |
| Octree-GS | - |
| PGSR | 🥇 gold, 🥇 gold |
| Scaffold-GS | 🥇 gold, 🥇 gold, 🥇 gold, 🥇 gold |
| SeaThru-NeRF | 🥇 gold |
| Student Splatting Scooping | - |
| Taming 3DGS | 🥇 gold, 🥇 gold, 🥇 gold |
| TensoRF | 🥇 gold, 🥇 gold |
| Tetra-NeRF | 🥈 silver, 🥈 silver |
| WildGaussians | 🥇 gold |
| Zip-NeRF | 🥇 gold, 🥇 gold, 🥇 gold |

Contributing

Contributions are very welcome. Please open a PR with the dataset, method, or feature you want to contribute. The goal of this project is to gradually expand by implementing more and more methods.

Citation

If you use this project in your research, please cite the following paper:

@article{kulhanek2024nerfbaselines,
  title={NerfBaselines: Consistent and Reproducible Evaluation of Novel View Synthesis Methods},
  author={Jonas Kulhanek and Torsten Sattler},
  year={2024},
  journal={arXiv},
}

License

This project is licensed under the MIT license. Each implemented method is licensed under the license provided by the authors of that method.

Acknowledgements

A big thanks to the authors of all implemented methods for the great work they have done. We would also like to thank the authors of NerfStudio. We also thank Mark Kellogg for the 3DGS web renderer. This work was supported by the Czech Science Foundation (GAČR) EXPRO (grant no. 23-07973X), the Grant Agency of the Czech Technical University in Prague (grant no. SGS24/095/OHK3/2T/13), and by the Ministry of Education, Youth and Sports of the Czech Republic through the e-INFRA CZ (ID:90254).
