
Security News
Axios Supply Chain Attack Reaches OpenAI macOS Signing Pipeline, Forces Certificate Rotation
OpenAI rotated macOS signing certificates after a malicious Axios package reached its CI pipeline in a broader software supply chain attack.
elasticsetup
Lightweight library to quickly set up an Elasticsearch index from settings, analyzer, normalizer, tokenizer and mapping. Optionally reindex data from the same or another index of the same cluster.
npm install -g elasticsetup
elasticsetup -h 192.168.0.10 -i products -s settings.json -a analyzer.json -n normalizer.json -t tokenizer.json -m mapping.json
elasticsetup -h 192.168.0.10 -i products -s settings.json -a analyzer.json -n normalizer.json -t tokenizer.json -m mapping.json -o products_old
elasticsetup -h 192.168.0.10 -i products -s settings.json -a analyzer.json -n normalizer.json -t tokenizer.json -m mapping.json -o products
elasticsetup -h 192.168.0.10 -i products -c credentials.json -a analyzer.json -m mapping.json
elasticsetup -h 192.168.0.10 -i products -s settings.json -a analyzer.json -m mapping.json -d ./data
elasticsetup -h 192.168.0.10 -i products -s settings.json -a analyzer.json -m mapping.json -d ./data/products.ndjson
elasticsetup -h 192.168.0.10 -i products -s settings.json -a analyzer.json -m mapping.json -d ./data -o products_old
{
  "username": "user",
  "password": "password"
}
{
  "max_ngram_diff": 10
}
{
  "lowercase_analyzer": {
    "type": "custom",
    "tokenizer": "standard",
    "filter": [
      "lowercase"
    ]
  },
  "lowercase_asciifolding_analyzer": {
    "type": "custom",
    "tokenizer": "standard",
    "filter": [
      "lowercase",
      "asciifolding"
    ]
  }
}
{
  "lowercase_asciifolding_normalizer": {
    "type": "custom",
    "char_filter": [],
    "filter": ["lowercase", "asciifolding"]
  }
}
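To get a feel for what this normalizer does to a keyword value before it is stored, here is a small stand-alone approximation (not elasticsetup or Elasticsearch code; Elasticsearch's asciifolding filter handles more characters than this diacritic-stripping sketch):

```javascript
// Approximate the lowercase + asciifolding normalizer above:
// lowercase the value, then strip diacritical marks.
function normalize(value) {
  return value
    .toLowerCase()
    .normalize('NFD')                 // split accented chars into base + combining mark
    .replace(/[\u0300-\u036f]/g, '')  // drop the combining diacritical marks
}

console.log(normalize('Crème Brûlée')) // 'creme brulee'
```

Because a normalizer emits a single token, the whole value is transformed as one string rather than being split into terms.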
{
  "edge_ngram": {
    "type": "edge_ngram",
    "min_gram": 2,
    "max_gram": 10,
    "token_chars": [
      "letter"
    ]
  },
  "edge_ngram_whitespace": {
    "type": "edge_ngram",
    "min_gram": 2,
    "max_gram": 10,
    "token_chars": [
      "letter",
      "whitespace"
    ]
  }
}
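To illustrate what the edge_ngram tokenizer above emits for a single word, here is a small simulation (not the library's or Elasticsearch's code; it ignores token_chars splitting and only shows the min_gram/max_gram behavior):

```javascript
// Emit leading-edge n-grams of a word, from minGram up to maxGram characters,
// mirroring the "min_gram": 2 / "max_gram": 10 configuration above.
function edgeNgrams(word, minGram = 2, maxGram = 10) {
  const grams = []
  for (let len = minGram; len <= Math.min(maxGram, word.length); len++) {
    grams.push(word.slice(0, len))
  }
  return grams
}

console.log(edgeNgrams('laptop')) // [ 'la', 'lap', 'lapt', 'lapto', 'laptop' ]
```

This is what makes prefix ("search-as-you-type") matching possible: every leading prefix of at least two characters is indexed as its own term.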
{
  "properties": {
    "title": {
      "type": "keyword"
    },
    "description": {
      "type": "text",
      "analyzer": "lowercase_asciifolding_analyzer"
    },
    "price": {
      "type": "float"
    },
    "category": {
      "properties": {
        "title": {
          "type": "keyword"
        },
        "description": {
          "type": "text",
          "analyzer": "lowercase_asciifolding_analyzer"
        }
      }
    },
    "excerpt": {
      "type": "text",
      "fields": {
        "keyword": {
          "type": "keyword"
        }
      }
    }
  }
}
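For reference, a search request against an index created with this mapping might look like the following (the query values are hypothetical; the field names come from the mapping above):

```json
{
  "query": {
    "bool": {
      "must": [
        { "match": { "description": "wooden chair" } }
      ],
      "filter": [
        { "term": { "category.title": "Furniture" } },
        { "range": { "price": { "lte": 200 } } }
      ]
    }
  }
}
```

The match query on description is analyzed with lowercase_asciifolding_analyzer, while the term filter on category.title matches the keyword value exactly.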
npm install --save elasticsetup
const { setup } = require('elasticsetup')
const host = '192.168.0.10'
const index = 'products'
const otherIndex = 'oldProducts'
const dataDirPath = './data'
const dataFilePath = './data/products.ndjson'
// if authorization is required
const username = 'user'
const password = 'password'
process.env.ELASTICSEARCH_AUTHORIZATION_TOKEN = Buffer.from(`${username}:${password}`).toString('base64')
const settings = {
  "max_ngram_diff": 10
}
const analyzer = {
  "lowercase_analyzer": {
    "type": "custom",
    "tokenizer": "standard",
    "filter": [
      "lowercase"
    ]
  },
  "lowercase_asciifolding_analyzer": {
    "type": "custom",
    "tokenizer": "standard",
    "filter": [
      "lowercase",
      "asciifolding"
    ]
  }
}
const normalizer = {
  "lowercase_asciifolding_normalizer": {
    "type": "custom",
    "char_filter": [],
    "filter": ["lowercase", "asciifolding"]
  }
}
const tokenizer = {
  "edge_ngram": {
    "type": "edge_ngram",
    "min_gram": 2,
    "max_gram": 10,
    "token_chars": [
      "letter"
    ]
  },
  "edge_ngram_whitespace": {
    "type": "edge_ngram",
    "min_gram": 2,
    "max_gram": 10,
    "token_chars": [
      "letter",
      "whitespace"
    ]
  }
}
const mapping = {
  "properties": {
    "title": {
      "type": "keyword"
    },
    "description": {
      "type": "text",
      "analyzer": "lowercase_asciifolding_analyzer"
    },
    "price": {
      "type": "float"
    },
    "category": {
      "properties": {
        "title": {
          "type": "keyword"
        },
        "description": {
          "type": "text",
          "analyzer": "lowercase_asciifolding_analyzer"
        }
      }
    },
    "excerpt": {
      "type": "text",
      "fields": {
        "keyword": {
          "type": "keyword"
        }
      }
    }
  }
}
(async function () {
  // setup index without reindexing
  await setup(host, index, settings, analyzer, normalizer, tokenizer, mapping)
  // setup index with data reindexing from distinct index
  await setup(host, index, settings, analyzer, normalizer, tokenizer, mapping, otherIndex)
  // setup index with data reindexing from same index
  await setup(host, index, settings, analyzer, normalizer, tokenizer, mapping, index)
  // setup index with data reindexing from local data file path
  await setup(host, index, settings, analyzer, normalizer, tokenizer, mapping, null, dataFilePath)
  // setup index with data reindexing from local data dir path
  await setup(host, index, settings, analyzer, normalizer, tokenizer, mapping, null, dataDirPath)
  // setup index with data reindexing from distinct index and local data path
  await setup(host, index, settings, analyzer, normalizer, tokenizer, mapping, otherIndex, dataFilePath)
})()
For basic authorization to be properly applied to the library's HTTP calls to the Elasticsearch cluster, the environment variable ELASTICSEARCH_AUTHORIZATION_TOKEN must be set to the base64-encoded value of the "username:password" string.
In case of data reindexing (i.e. when the name of the index containing the origin data is passed to the setup method), one of the following situations may occur:
Distinct origin index: the index to set up will be deleted (if it already exists) and created with the provided settings and mapping. At the end of the process, the data stored in the origin index will be indexed into the newly created index.
Same origin index: a temporary index will first be created and the original data indexed into it. The original index will then be deleted and recreated with the provided settings and mapping. At the end of the process, the data stored in the temporary index will be indexed back into the newly created index, and the temporary index will finally be deleted.
Local data file or directory: local data files are expected to be formatted as ndjson. Either a file path or a directory path can be passed as the data argument to the setup function. When a directory path is given, all files in the directory will be parsed one by one and their records indexed. Local records are indexed through the Elasticsearch _bulk endpoint, in batches of 1,000 records by default.
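A local data file in the expected ndjson format holds one JSON document per line; for example (hypothetical records, with field names matching the mapping example above):

```json
{"title": "Chair", "description": "A wooden chair", "price": 49.99}
{"title": "Table", "description": "A small dining table", "price": 120}
```

Note that unlike the raw _bulk API payload, no interleaved action lines are shown here; the library builds the bulk request from these records.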
ISC
FAQs
We found that elasticsetup demonstrates an unhealthy version release cadence and project activity because its last version was released a year ago. It has 1 open source maintainer collaborating on the project.