@apla/clickhouse
```sh
npm install @apla/clickhouse
```

```js
const ClickHouse = require('@apla/clickhouse')

const ch = new ClickHouse({ host, port, user, password })

const stream = ch.query("SELECT 1", (err, data) => {})
stream.pipe(process.stdout)

// promise interface, not recommended for selects
// (requires 'util.promisify' for node < 8, Promise shim for node < 4)
await ch.querying("CREATE DATABASE test")
```
new ClickHouse(options: Options)

Options:

| Option | Required | Default | Description |
|---|---|---|---|
| `host` | ✓ | | Host to connect to. |
| `user` | | | Authentication user. |
| `password` | | | Authentication password. |
| `path` (`pathname`) | | `/` | Pathname of the ClickHouse server. |
| `port` | | `8123` | Server port number. |
| `protocol` | | `'http:'` | `'https:'` or `'http:'`. |
| `dataObjects` | | `false` | By default (`false`) you receive an array of values for each row. With `dataObjects: true`, each row becomes an object of the form `{ fieldName: fieldValue, … }`. Alias for `format: 'JSON'`. |
| `format` | | `JSONCompact` | Adds a `FORMAT` clause to the query if it does not already have one. Specifies the format of selected or inserted data. See "Formats for input and output data" for possible values. |
| `queryOptions` | | | Object that can contain any ClickHouse option from Settings, Restrictions and Permissions. See the example below. |
| `readonly` | | `false` | Tells the driver to send queries with the HTTP GET method. Same as the `readonly=1` setting. |
| `timeout`, `headers`, `agent`, `localAddress`, `servername`, etc. | | | Any `http.request` or `https.request` options are also available. |
```js
const ch = new ClickHouse({
  host: "clickhouse.msk",
  dataObjects: true,
  readonly: true,
  queryOptions: {
    profile: "web",
    database: "test",
  },
})
```
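Any `http.request` or `https.request` option (such as `timeout`, `headers` or `agent`) can be passed alongside the connection options. A minimal sketch; the header name and values here are illustrative only, not required by the driver:

```js
// Illustrative values; any http.request/https.request option from the table
// above can be passed the same way.
const ch = new ClickHouse({
  host: "clickhouse.msk",
  protocol: "https:",
  port: 8443,
  timeout: 30 * 1000, // fail the request if the socket stalls
  headers: { "X-Request-Id": "example" },
})
```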
clickHouse.query(query, [options], [callback])
Sends a query statement to the server.

- `query: string` - SQL query statement.
- `options: Options` - the same `Options` as above, excluding connection options.
- `callback: (error, result) => void` - called upon completion.

Returns: `DuplexStream`. It supports `.pipe` to process records.

You should have at least one error handler listening, either via the query callback or via the stream `error` event.
| Stream event | Description |
|---|---|
| `metadata` | Emitted when column information has been parsed. |
| `data` | Emitted when a row is available. |
| `end` | Emitted when the entire response has been processed. |
| `error` | Query execution finished with an error. If you have both a query callback and a stream `error` listener, the error is delivered to both. |
stream.supplemental

After the response has been processed, you can read supplemental response data from it, such as the row count.
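A short sketch of the two error-notification paths described above; the table name is a placeholder:

```js
// 'some_table' is a hypothetical table, used only to illustrate error handling.
const stream = ch.query("SELECT * FROM some_table", (err, result) => {
  if (err) {
    // 1) the error is delivered to the query callback
    console.error("query failed (callback):", err.message)
  }
})

// 2) the same error is also emitted on the stream
stream.on("error", (err) => console.error("query failed (stream):", err.message))

stream.on("data", (row) => { /* process row */ })
stream.on("end", () => console.log("rows:", stream.supplemental.rows))
```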
clickHouse.ping(callback)
Sends an empty query. Doesn't require authorization.

- `callback: (error, result) => void` - called upon completion.
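A short usage sketch:

```js
ch.ping((err) => {
  if (err) {
    console.error("ClickHouse is unreachable:", err)
  } else {
    console.log("ClickHouse is up")
  }
})
```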
Promise interface is not recommended for INSERT and SELECT queries:

- INSERT can't do bulk data loading through the promise interface.
- SELECT will collect the entire query result in memory. See the Memory size section.

With the promise interface the query result is parsed synchronously, so a large query result will occupy a lot of memory and block the process while it is parsed.

Use the promise interface only for queries where the resulting data size is known and extremely small. Good cases are DESCRIBE TABLE or EXISTS TABLE queries.
clickHouse.querying(query, [options])

Similar to `ch.query(query)`, but collects the entire response in memory and resolves with the complete query result. See the Memory size section.

- `options: Options` - the same `Options` as above, excluding connection options.

Returns: `Promise`, resolved with the entire query result.
See the promise interface example below.
clickHouse.pinging()

Promise interface for `.ping`.

Returns: `Promise`.
INSERT statements

INSERT can be used for bulk data loading. There are two formats that are easy to produce from JavaScript: CSV and TabSeparated (TSV).

CSV is useful for loading from a file: you can read the file contents and `.pipe` them into clickhouse. To activate CSV parsing, set the `format` driver option or the query's `FORMAT` clause to `CSV`:
```js
var fs = require('fs')

// 'table' is a placeholder; the FORMAT clause is added by the `format` option
var statement = 'INSERT INTO table'

var csvStream = fs.createReadStream('data.csv')
var clickhouseStream = ch.query(statement, { format: 'CSV' })

csvStream.pipe(clickhouseStream)
```
TSV is useful for loading from files and for bulk loading from external sources, such as other databases. Only `\\`, `\t` and `\n` need to be escaped in strings; numbers, nulls, booleans and Date objects need some minor processing. You can send prepared TSV data as strings (a line ending is appended automatically), as Buffers (always passed as is), or as arrays of fields.

Internally, every field is converted to a format ClickHouse can accept, then escaped and joined with the delimiter for the particular format. If you ever need to store rows (as arrays) and send preformatted data, you can do that too.
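A minimal sketch of this kind of field conversion; this is an illustration of the escaping rules above, not the driver's actual implementation:

```js
// Illustration only: convert a single value to its TSV field representation.
function toTsvField (value) {
  if (value === null || value === undefined) return '\\N' // ClickHouse NULL literal
  if (value instanceof Date) {
    // 'YYYY-MM-DD hh:mm:ss', as in the TSV example below
    return value.toISOString().slice(0, 19).replace('T', ' ')
  }
  if (typeof value === 'boolean') return value ? '1' : '0'
  if (typeof value === 'number') return String(value)
  // strings: escape backslash, tab and newline
  return String(value)
    .replace(/\\/g, '\\\\')
    .replace(/\t/g, '\\t')
    .replace(/\n/g, '\\n')
}

// join the converted fields with tabs to get one TSV row
const row = [1, 2.22, 'erbgwerg', new Date()].map(toTsvField).join('\t')
```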
ClickHouse also supports the JSONEachRow format, which can be useful for inserting JavaScript objects if you already have such a recordset:

```js
const stream = ch.query(statement, { format: 'JSONEachRow' })

stream.write(object) // write as many times as needed
stream.end()         // don't forget to finish the insert query
```
Memory size

You can read all the records into memory in a single call, like this:

```js
var ch = new ClickHouse({ host: host, port: port })

ch.querying("SELECT number FROM system.numbers LIMIT 10", (err, result) => {
  // result will contain all the data you need
})
```

In this case the whole JSON response from the server is read into memory and then parsed, hogging memory and CPU. The default parser instead processes the server response line by line and emits events. This is slower, but much more memory and CPU efficient for larger datasets.
Streaming a large SELECT result into a file:

```js
const fs = require('fs')

const readableStream = ch.query(
  'SELECT * FROM system.contributors FORMAT JSONEachRow',
  (err, result) => {},
)

const writableStream = fs.createWriteStream('./contributors.json')

readableStream.pipe(writableStream)
```
Inserting data from a CSV file:

```js
const fs = require('fs')

const readableStream = fs.createReadStream('./x.csv')
const writableStream = ch.query('INSERT INTO table FORMAT CSV', (err, result) => {})

readableStream.pipe(writableStream)
```
Writing rows directly to an INSERT stream:

```js
const ch = new ClickHouse(options)

const writableStream = ch.query(`INSERT INTO table FORMAT TSV`, (err) => {
  console.log('Insert complete!')
})

// data will be formatted for you
writableStream.write([1, 2.22, "erbgwerg", new Date()])

// or prepare the data yourself
writableStream.write("1\t2.22\terbgwerg\t2017-07-17 17:17:17")

writableStream.end()
```
```js
const ch = new ClickHouse(options)

// it is better to use the stream interface to fetch SELECT results
const stream = ch.query("SELECT * FROM system.numbers LIMIT 10000000")

stream.on('metadata', (columns) => { /* do something with the column list */ })

let rows = []
stream.on('data', (row) => rows.push(row))

stream.on('error', (err) => { /* handle error */ })

stream.on('end', () => {
  console.log(
    rows.length,
    stream.supplemental.rows,
    stream.supplemental.rows_before_limit_at_least, // at least how many rows there would be without the LIMIT
  )
})
```
```js
const fs = require('fs')
const ch = new ClickHouse(options)

// insert from a file
const tsvStream = fs.createReadStream('data.tsv')
const clickhouseStream = ch.query('INSERT INTO table FORMAT TSV')

tsvStream.pipe(clickhouseStream)
```
Setting ClickHouse options for the whole connection:

```js
const ch = new ClickHouse({
  host: 'clickhouse.msk',
  queryOptions: {
    database: "test",
    profile: "web",
    readonly: 2,
    force_index_by_date: 1,
    max_rows_to_read: 10 * 1e6,
  },
})
```
Setting ClickHouse options for a single query:

```js
const ch = new ClickHouse({ host: 'clickhouse.msk' })

const stream = ch.query('INSERT INTO table FORMAT TSV', {
  queryOptions: {
    database: "test",
    insert_quorum: 2,
  },
})
```
Promise interface:

```js
const ch = new ClickHouse(options)

// Check the connection to the server. Doesn't require authorization.
await ch.pinging()

const { data } = await ch.querying("SELECT 1")
// [ [ 1 ] ]

const { data: columns } = await ch.querying("DESCRIBE TABLE system.numbers", { dataObjects: true })
// [ { name: 'number', type: 'UInt64', default_type: '', default_expression: '' } ]
```
FAQs

Yandex ClickHouse database interface.

The npm package @apla/clickhouse receives a total of 3,496 weekly downloads. As such, @apla/clickhouse's popularity is classified as popular. We found that @apla/clickhouse demonstrates an unhealthy version release cadence and project activity because the last version was released a year ago. It has 2 open source maintainers collaborating on the project.