apache-arrow
The apache-arrow npm package provides a cross-language development platform for in-memory data. It is designed to improve the performance and efficiency of data processing and analytics by using a columnar memory format. This package is particularly useful for handling large datasets and performing complex data manipulations.
Reading and Writing Arrow Files
This feature allows you to read and write Arrow files, which are efficient for storing and transferring large datasets. The code sample demonstrates how to read an Arrow file into a table and how to write a new table to an Arrow file.
const arrow = require('apache-arrow');
const fs = require('fs');

// Reading an Arrow file into a Table
const arrowFile = fs.readFileSync('data.arrow');
const table = arrow.Table.from(arrowFile);
console.log(table.toString());

// Writing a new Table to an Arrow file (Table.new takes an array of vectors and an array of column names)
const newTable = arrow.Table.new(
  [arrow.Utf8Vector.from(['Alice', 'Bob']), arrow.IntVector.from(new Int32Array([30, 25]))],
  ['name', 'age']
);
fs.writeFileSync('newData.arrow', newTable.serialize());
DataFrame Operations
This feature provides DataFrame-like operations, such as creating tables, selecting columns, and filtering rows. The code sample shows how to create a DataFrame, select a column, and filter rows based on a condition.
const arrow = require('apache-arrow');
const { col } = arrow.predicate;

// Creating a table (the Table class doubles as the DataFrame)
const df = arrow.Table.new(
  [arrow.Utf8Vector.from(['Alice', 'Bob']), arrow.IntVector.from(new Int32Array([30, 25]))],
  ['name', 'age']
);

// Selecting a column
const names = df.getColumn('name');
console.log(names.toArray());

// Filtering rows with a column predicate and counting the matches
const filtered = df.filter(col('age').gt(25));
console.log(filtered.count());
Interoperability with Other Languages
Apache Arrow supports interoperability with other languages like Python, R, and Java. The code sample demonstrates how to build a table in JavaScript and serialize it to the Arrow IPC format so that Python can read it with the pyarrow library.
const arrow = require('apache-arrow');
const fs = require('fs');

// Create a table in JavaScript and serialize it to the Arrow IPC stream format
const table = arrow.Table.new(
  [arrow.Utf8Vector.from(['Alice', 'Bob']), arrow.IntVector.from(new Int32Array([30, 25]))],
  ['name', 'age']
);
fs.writeFileSync('people.arrow', table.serialize());

// The same bytes can then be read from Python with pyarrow, e.g.:
//   import pyarrow as pa
//   with open('people.arrow', 'rb') as f:
//       table = pa.ipc.open_stream(f).read_all()
//   print(table)
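The reverse direction works the same way: a stream-format file written from Python with pyarrow's RecordBatchStreamWriter can be loaded back in JavaScript with Table.from. A minimal sketch; 'from_python.arrow' is a hypothetical file name:
const { Table } = require('apache-arrow');
const { readFileSync } = require('fs');

// 'from_python.arrow' is assumed to have been written by pyarrow in the Arrow IPC stream format
const fromPython = Table.from(readFileSync('from_python.arrow'));
console.log(fromPython.toString());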
Pandas is a powerful data manipulation and analysis library for Python. It provides DataFrame and Series data structures for handling tabular and time series data. While pandas is highly efficient for in-memory data manipulation, it does not use the Arrow columnar memory format, so exchanging data with other systems typically requires conversion and copying, which Arrow is designed to avoid.
Dask is a parallel computing library in Python that scales the existing Python ecosystem, including pandas and NumPy. It provides advanced parallelism for analytics, enabling the processing of large datasets that do not fit into memory. Unlike Apache Arrow, Dask focuses on parallel computing and distributed data processing.
Loading big native dataframes in JavaScript is finally awesome. apache-arrow provides Graphistry's production JavaScript Arrow bindings: an easy, modern, and efficient zero-copy JS interface to parse, iterate, and access Apache Arrow columnar data on CPUs (GPU support via GoAI is being developed in parallel). apache-arrow is tested against Apache's sample Arrow files and MapD Core's Arrow output, and is in active development by Graphistry, where it powers much of the company's GPU client/cloud visual graph analytics platform.
npm install apache-arrow
apache-arrow is written in TypeScript, but the project is compiled to multiple JS versions and common module formats. The base apache-arrow package includes all the compilation targets for convenience, but if you're conscientious about your node_modules footprint, don't worry -- we got you. The targets are also published under the @apache-arrow namespace:
npm install @apache-arrow/es5-cjs # ES5 CommonJS target
npm install @apache-arrow/es5-esm # ES5 ESModules target
npm install @apache-arrow/es5-umd # ES5 UMD target
npm install @apache-arrow/es2015-cjs # ES2015 CommonJS target
npm install @apache-arrow/es2015-esm # ES2015 ESModules target
npm install @apache-arrow/es2015-umd # ES2015 UMD target
npm install @apache-arrow/esnext-cjs # ESNext CommonJS target
npm install @apache-arrow/esnext-esm # ESNext ESModules target
npm install @apache-arrow/esnext-umd # ESNext UMD target
The JS community is a diverse group with a varied list of target environments and tool chains. Publishing multiple packages accommodates projects of all types. Friends targeting the latest JS runtimes can pull in the ESNext + ESM build. Friends needing wide browser support and small download size can use the UMD bundle, which has been run through Google's Closure Compiler with advanced optimizations.
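For example, a Node project that only targets modern runtimes might depend on a single target package directly. A minimal sketch, assuming the target packages expose the same entry points as the base apache-arrow package:
// Same API as the base package, but only the ES2015 CommonJS build gets installed
const { Table } = require('@apache-arrow/es2015-cjs');
const { readFileSync } = require('fs');

const table = Table.from(readFileSync('data.arrow'));
console.log(table.toString());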
If you think we missed a compilation target and it's a blocker for adoption, please open an issue. We're here for you ❤️.
Apache Arrow is a columnar memory layout specification for encoding vectors and table-like containers of flat and nested data. The Arrow spec aligns columnar data in memory to maximize cache locality and take advantage of the latest SIMD (single instruction, multiple data) and GPU operations on modern processors.
Apache Arrow is the emerging standard for large in-memory columnar data (Spark, Pandas, Drill, ...). By standardizing on a common interchange format, big data systems can reduce the costs and friction associated with cross-system communication.
import { readFileSync } from 'fs';
import { Table } from 'apache-arrow';
const table = Table.from(readFileSync('test/arrows/file/simple.arrow'));
console.log(table.toString());
/*
foo, bar, baz
1, 1, aa
null, null, null
3, null, null
4, 4, bbb
5, 5, cccc
*/
import { readFileSync } from 'fs';
import { Table } from 'apache-arrow';
const table = Table.from(...[
  'test/arrows/multi/latlong/schema.arrow',
  'test/arrows/multi/latlong/records.arrow'
].map((file) => readFileSync(file)));
console.log(table.toString());
/*
origin_lat, origin_lon
35.393089294433594, -97.6007308959961
35.393089294433594, -97.6007308959961
35.393089294433594, -97.6007308959961
29.533695220947266, -98.46977996826172
29.533695220947266, -98.46977996826172
*/
import { readFileSync } from 'fs';
import assert from 'assert';
import { Table } from 'apache-arrow';
const table = Table.from(...[
  'test/arrows/file/multi/latlong/schema.arrow',
  'test/arrows/file/multi/latlong/records.arrow'
].map((file) => readFileSync(file)));
const column = table.getColumn('origin_lat');
const columnTyped = column.slice();
assert(columnTyped instanceof Float32Array);
for (let i = -1, n = column.length; ++i < n;) {
  assert(column.get(i) === columnTyped[i]);
}
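Because slice() hands back an ordinary Float32Array, downstream numeric code can operate on the column directly. For example, a small sketch continuing from the snippet above:
// Average the origin_lat column using the typed-array view
const lats = table.getColumn('origin_lat').slice();
const meanLat = lats.reduce((sum, v) => sum + v, 0) / lats.length;
console.log(`mean origin_lat: ${meanLat}`);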
import MapD from 'rxjs-mapd';
import { Table } from 'apache-arrow';
const host = `localhost`, port = 9091, encrypted = false;
const username = `mapd`, password = `HyperInteractive`, dbName = `mapd`, timeout = 5000;
MapD.open(host, port, encrypted)
  .connect(dbName, username, password, timeout)
  .flatMap((session) => session
    .queryDF(`SELECT origin_city FROM flights WHERE dest_city ILIKE 'dallas' LIMIT 5`)
    .map(([schema, records]) => Table.from(schema, records))
    .disconnect()
  )
  .subscribe((table) => {
    console.log(table.toString({ index: true }));
    /*
    Index, origin_city
    0, Oklahoma City
    1, Oklahoma City
    2, Oklahoma City
    3, San Antonio
    4, San Antonio
    */
  });
See develop.md
Please create an issue if you encounter any bugs!
PRs are welcome!
FAQs
Apache Arrow columnar in-memory format
The npm package apache-arrow receives a total of 169,342 weekly downloads. As such, apache-arrow's popularity is classified as popular.
We found that apache-arrow demonstrated a healthy version release cadence and project activity because the last version was released less than a year ago. It has 7 open source maintainers collaborating on the project.