@rerun-io/web-viewer
Embed the Rerun web viewer within your app.
This package is framework-agnostic. A React wrapper is available at https://www.npmjs.com/package/@rerun-io/web-viewer-react.
$ npm i @rerun-io/web-viewer
ℹ️ Note: The package version is equal to the supported Rerun SDK version, and RRD files are not yet stable across different versions. This means that @rerun-io/web-viewer@0.10.0 can only connect to a data source (.rrd file, websocket connection, etc.) that originates from a Rerun SDK with version 0.10.0.
The web viewer is an object which manages a canvas element:
import { WebViewer } from "@rerun-io/web-viewer";
const rrd = "…";
const parentElement = document.body;
const viewer = new WebViewer();
await viewer.start(rrd, parentElement, { width: "800px", height: "600px" });
// …
viewer.stop();
The rrd in the snippet above should be a URL pointing to either:
- a .rrd file, such as https://app.rerun.io/version/0.18.0/examples/dna.rrd
- a WebSocket connection to the SDK, opened via the serve API

If rrd is not set, the Viewer will display the same welcome screen as https://app.rerun.io. This can be disabled by setting hide_welcome_screen to true in the options object of viewer.start.
⚠️ It's important to set the viewer's width and height, as without them the viewer may not display correctly. Setting the values to empty strings is valid, as long as you style the canvas through other means.
For a full example, see https://github.com/rerun-io/web-viewer-example. You can open the example via CodeSandbox: https://codesandbox.io/s/github/rerun-io/web-viewer-example
ℹ️ Note: This package only targets recent versions of browsers. If your target browser does not support Wasm imports or top-level await, you may need to install additional plugins for your bundler.
0.18.0 - Ingestion speed and memory footprint
https://github.com/user-attachments/assets/95380a64-df05-4f85-b40a-0c6b8ec8d5cf
Rerun 0.18 introduces new column-oriented APIs and internal storage datastructures (Chunk & ChunkStore) that can both simplify logging code and improve ingestion speed and memory overhead by a couple of orders of magnitude in many cases (timeseries-heavy workloads in particular).
These improvements come in three broad categories:
- a new send family of APIs, available in all 3 SDKs (Python, C++, Rust),
- continuous compaction of data in the datastore as it is ingested,
- new CLI commands for working with RRD files (merging, pruning, compacting).

Furthermore, we started cleaning up our data schema, leading to various changes in the way we represent transforms & images.
The send APIs

Unlike the regular row-oriented log APIs, the new send APIs let you submit data in columnar form, even if the data extends over multiple timestamps. This can both greatly simplify logging code and drastically improve performance for some workloads, in particular timeseries, although we have already seen it used for other purposes!
API documentation:
API usage examples:
<details>
<summary>Python timeseries</summary>

Using log() (slow, memory inefficient):

import math
import rerun as rr

rr.init("rerun_example_scalar", spawn=True)
for step in range(0, 64):
    rr.set_time_sequence("step", step)
    rr.log("scalar", rr.Scalar(math.sin(step / 10.0)))
Using send() (fast, memory efficient):

import numpy as np
import rerun as rr

rr.init("rerun_example_send_columns", spawn=True)
times = np.arange(0, 64)
rr.send_columns(
    "scalars",
    times=[rr.TimeSequenceColumn("step", times)],
    components=[rr.components.ScalarBatch(np.sin(times / 10.0))],
)
</details>
<details>
<summary>C++ timeseries</summary>
Using log() (slow, memory inefficient):

const auto rec = rerun::RecordingStream("rerun_example_scalar");
rec.spawn().exit_on_failure();
for (int step = 0; step < 64; ++step) {
    rec.set_time_sequence("step", step);
    rec.log("scalar", rerun::Scalar(std::sin(static_cast<double>(step) / 10.0)));
}
Using send() (fast, memory efficient):

const auto rec = rerun::RecordingStream("rerun_example_send_columns");
rec.spawn().exit_on_failure();

std::vector<double> scalar_data(64);
for (size_t i = 0; i < 64; ++i) {
    scalar_data[i] = sin(static_cast<double>(i) / 10.0);
}
std::vector<int64_t> times(64);
std::iota(times.begin(), times.end(), 0);

auto time_column = rerun::TimeColumn::from_sequence_points("step", std::move(times));
auto scalar_data_collection =
    rerun::Collection<rerun::components::Scalar>(std::move(scalar_data));
rec.send_columns("scalars", time_column, scalar_data_collection);
</details>
<details>
<summary>Rust timeseries</summary>
Using log() (slow, memory inefficient):

let rec = rerun::RecordingStreamBuilder::new("rerun_example_scalar").spawn()?;
for step in 0..64 {
    rec.set_time_sequence("step", step);
    rec.log("scalar", &rerun::Scalar::new((step as f64 / 10.0).sin()))?;
}
Using send() (fast, memory efficient):

let rec = rerun::RecordingStreamBuilder::new("rerun_example_send_columns").spawn()?;
let timeline_values = (0..64).collect::<Vec<_>>();
let scalar_data: Vec<f64> = timeline_values
    .iter()
    .map(|step| (*step as f64 / 10.0).sin())
    .collect();
let timeline_values = TimeColumn::new_sequence("step", timeline_values);
let scalar_data: Vec<Scalar> = scalar_data.into_iter().map(Into::into).collect();
rec.send_columns("scalars", [timeline_values], [&scalar_data as _])?;
</details>
The Rerun datastore now continuously compacts data as it comes in, in order to find a sweet spot between ingestion speed, query performance, and memory overhead. This is very similar to the micro-batching mechanism that already runs on the SDK side.
You can read more about this in the dedicated documentation entry.
To help improve efficiency for completed recordings, Rerun 0.18 introduces some new commands for working with rrd files.
Multiple files can be merged, whole entity paths can be dropped, and chunks can be compacted.
You can read more about it in the new CLI reference manual, but to give a sense of how it works, the example below merges all recordings in a folder and runs chunk compaction using the max-rows and max-bytes settings:
rerun rrd compact --max-rows 4096 --max-bytes=1048576 /my/recordings/*.rrd > output.rrd
As part of improving our arrow schema and in preparation for reading data back in the SDK, we've split up transforms into several parts. This makes it much more performant to log large numbers of transforms, as it allows updating only the parts you're interested in; e.g. logging a translation is now as lightweight as logging a single position.
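For example, a translation-only update could look roughly like this in the Python SDK (a minimal sketch, not taken from the release notes; the app id, entity path, and values are made up for illustration, and it assumes rr.Transform3D accepts a standalone translation argument):

import rerun as rr

rr.init("rerun_example_translation_only", spawn=True)

# Update only the translation part of the transform over time;
# no rotation, scale, or matrix needs to be sent along with it.
for step in range(10):
    rr.set_time_sequence("step", step)
    rr.log("world/robot", rr.Transform3D(translation=[0.1 * step, 0.0, 0.0]))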
There is now additionally an InstancePoses3D archetype, which allows you to do two things:
- Mesh3D/Asset3D/Boxes3D/Ellipsoids3D: instantiate objects several times with different poses, known as "instancing"

All four tetrahedron meshes in the accompanying screenshot share the same vertices and are instanced using an InstancePoses3D archetype with 4 different translations.
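As a rough sketch of what such instancing could look like in the Python SDK (this assumes the InstancePoses3D archetype exposes a translations parameter; the app id, entity path, and mesh data below are made up for illustration):

import rerun as rr

rr.init("rerun_example_instancing", spawn=True)

# One tetrahedron mesh, logged once...
rr.log(
    "world/tetrahedron",
    rr.Mesh3D(
        vertex_positions=[[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]],
        triangle_indices=[[0, 1, 2], [0, 1, 3], [0, 2, 3], [1, 2, 3]],
    ),
)

# ...then instanced four times by logging four translations on the same entity.
rr.log(
    "world/tetrahedron",
    rr.InstancePoses3D(translations=[[0, 0, 0], [2, 0, 0], [0, 2, 0], [0, 0, 2]]),
)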
Breaking changes:
- .rrd files from older versions won't load correctly in Rerun 0.18
- mesh_material: Material has been renamed to albedo_factor: AlbedoFactor #6841
- Transform3D is no longer a single component but is split into its constituent parts. From this follow various smaller API changes
- NV12/YUY2 are now logged with Image
- ImageEncoded is deprecated and replaced with EncodedImage (JPEG, PNG, …) and Image (NV12, YUY2, …)
- DepthImage and SegmentationImage are no longer encoded as tensors, and expect their shape in [width, height] order

🧳 Migration guide: http://rerun.io/docs/reference/migration/migration-0-18
Other changes:
- Ellipsoids3D archetype #6853 (thanks @kpreid!)
- archetypes.ImageEncoded with PNG and JPEG support #6874
- Translation3D & TransformMat3x3 #6866
- DepthImage archetype #6915
- SegmentationImage to the new image archetype style #6928
- RotationAxisAngle and RotationQuat #6929
- TransformRelation component #6944
- LeafTransform3D, replacing OutOfTreeTransform3D #7015
- Scale3D/Transform3D/TranslationRotationScale3D datatypes, remove Transform3D component #7000
- Image archetype #6942
- LeafTranslation (centers), LeafRotationQuat and LeafRotationAxisAngle directly on Boxes3D/Ellipsoids3D #7029
- Rotation3D component & datatype #7030
- rerun::Collection by providing free functions for borrow & take_ownership #7055
- send_columns #7103
- ImageEncoded to ImageEncodedHelper #6882
- ImageChromaDownsampled #6883
- __version__ and __version_info__ to rerun package #7104
- --locked #6868
- TensorBuffer::JPEG, DecodedTensor, TensorDecodeCache #6884
- Spaces and Transforms doc page #6955
- rerun rrd compact: always put blueprints at the start of the recordings #6998
- Boxes3D and Ellipsoids #6953 (thanks @kpreid!)
- --blueprint to plot_dashboard_stress #6996
- Transformables subscriber for improved TransformContext perf #6997
- log_tick #7082
- Chunk concatenation primitives #6857
- Chunk compaction #6858
- ChunkStore: implement new component-less indices and APIs #6879
- Chunks #6989
- Chunk-based time-series views #6995
- send_columns examples for images, fix rust send_columns handling of listarrays #7172
- Events and Timelines doc page #6912
- Blobs, especially those representing images #7128
- ChunkStore::drop_entity_path #6588
- Chunk::cell #6875
- Chunk::iter_indices #6877
- RangeQueryOptions::include_extended_bounds #7132
- Chunk component-level helpers and UnitChunk #6990
- rerun rrd subcommands #7060
- rerun rrd filter #7095
- rerun rrd filter --drop-entity #7185
- rerun rrd <compare|print|compact> subcommand #6861
- RangeQueryOptions directly within RangeQuery #7131
- macaw with fork re_math #6867