algoliasearch
The algoliasearch npm package is a JavaScript client for Algolia, a hosted search API that provides a fast and accurate search experience for websites and mobile applications. The package allows developers to integrate Algolia's search capabilities into their JavaScript applications, enabling features such as full-text search, faceting, filtering, and geolocation queries.
Search
Perform a search query on an Algolia index and retrieve the results.
const algoliasearch = require('algoliasearch');
const client = algoliasearch('YourApplicationID', 'YourAdminAPIKey');
const index = client.initIndex('your_index_name');
index.search('query string').then(({ hits }) => {
console.log(hits);
});
Indexing
Add or update a record in an Algolia index.
const algoliasearch = require('algoliasearch');
const client = algoliasearch('YourApplicationID', 'YourAdminAPIKey');
const index = client.initIndex('your_index_name');
index.saveObject({ objectID: '1', name: 'Foo' }).then(() => {
console.log('Object indexed');
});
Configure Index Settings
Configure settings of an Algolia index to define ranking, attributes for faceting, etc.
const algoliasearch = require('algoliasearch');
const client = algoliasearch('YourApplicationID', 'YourAdminAPIKey');
const index = client.initIndex('your_index_name');
index.setSettings({
searchableAttributes: ['name', 'description'],
customRanking: ['desc(popularity)']
}).then(() => {
console.log('Settings updated');
});
Manage Indices
List all indices in your Algolia application and manage them.
const algoliasearch = require('algoliasearch');
const client = algoliasearch('YourApplicationID', 'YourAdminAPIKey');
client.listIndices().then(({ items }) => {
console.log(items);
});
Elasticsearch is a distributed, RESTful search and analytics engine capable of solving a growing number of use cases. The elasticsearch npm package is the official Elasticsearch client for Node.js. It provides similar search capabilities but is typically self-hosted, unlike Algolia which is a managed service.
Solr-node is a client library for interacting with Apache Solr, which is an open-source search platform. Similar to Algolia, it provides full-text search, but it requires self-hosting and manual scaling, whereas Algolia offers a fully managed service.
Typesense is an open-source, typo-tolerant search engine that provides a similar developer experience to Algolia. The typesense npm package allows you to integrate Typesense into your JavaScript applications. It is designed to be easy to use and deploy, offering an alternative to Algolia with a focus on simplicity and speed.
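As a rough comparison, a minimal Typesense query might look like the sketch below. It assumes a locally running Typesense server, a placeholder API key, and a hypothetical books collection; adapt the connection details to your own deployment.

const Typesense = require('typesense');

const typesenseClient = new Typesense.Client({
  nodes: [{ host: 'localhost', port: 8108, protocol: 'http' }], // assumes a local Typesense node
  apiKey: 'xyz' // placeholder API key
});

// Search the hypothetical "books" collection by title
typesenseClient.collections('books').documents()
  .search({ q: 'harry', query_by: 'title' })
  .then(function (results) {
    console.log(results.hits);
  });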
Algolia Search is a hosted full-text, numerical, and faceted search engine capable of delivering realtime results from the first keystroke.
The JavaScript client lets you use the Algolia Search API on the frontend (browsers) or on the backend (Node.js) with the same API.
The backend (Node.js) API can be used to index your data using your Algolia admin API keys.
Our JavaScript library is UMD compatible, you can use it with any module loader.
When not using any module loader, it will export an algoliasearch method in the window object.
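A minimal sketch of that no-module-loader case, using the jsDelivr script tag shown later in this README and placeholder credentials:

<script src="//cdn.jsdelivr.net/algoliasearch/3/algoliasearch.min.js"></script>
<script>
  // The script tag above exposes window.algoliasearch
  var client = window.algoliasearch('applicationID', 'apiKey');
  var index = client.initIndex('indexName');
  index.search('something', function (err, content) {
    console.log(err, content);
  });
</script>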
Getting Started
Commands Reference
To set up your project, follow these steps:
You can either use a package manager like npm or include a <script> tag.
npm install algoliasearch --save
We are browserifyable and webpack friendly.
bower install algoliasearch -S
jsDelivr is a global CDN for JavaScript libraries.
To include the latest releases and all upcoming features and patches, use this:
<script src="//cdn.jsdelivr.net/algoliasearch/3/algoliasearch.min.js"></script>
npm install algoliasearch --save
var algoliasearch = require('algoliasearch');
var client = algoliasearch('applicationID', 'apiKey');
var index = client.initIndex('indexName');
index.search('something', function searchDone(err, content) {
console.log(err, content);
});
curl https://raw.githubusercontent.com/algolia/algoliasearch-client-js/master/dist/algoliasearch.parse.js -o /your/parse/project/cloud/algoliasearch.parse.js
In cloud/main.js, for example:
var algoliasearch = require('cloud/algoliasearch.parse.js');
var client = algoliasearch('latency', '6be0576ff61c053d5f9a3225e2a90f76');
var index = client.initIndex('contacts');
Parse.Cloud.define("hello", function(request, response) {
index.search('Atlenta', function(err, results) {
if (err) {
throw err;
}
response.success('We got ' + results.nbHits + ' results');
});
});
To build your frontend search experience, also check out our examples and tutorials.
<script src="//cdn.jsdelivr.net/algoliasearch/3/algoliasearch.min.js"></script>
<script>
var client = algoliasearch('ApplicationID', 'apiKey');
var index = client.initIndex('indexName');
index.search('an example', function searchDone(err, content) {
console.log(err, content)
});
index.search('another example')
.then(function searchSuccess(content) {
console.log(content);
})
.catch(function searchFailure(err) {
console.error(err);
});
</script>
We provide a specific jQuery build that will use jQuery.ajax.
It can be used with callbacks or jQuery promises.
<script src="//cdn.jsdelivr.net/jquery/2.1.3/jquery.min.js"></script>
<script src="//cdn.jsdelivr.net/algoliasearch/3/algoliasearch.jquery.min.js"></script>
<script>
var client = $.algolia.Client('ApplicationID', 'apiKey');
var index = client.initIndex('indexName');
index.search('something', function searchDone(err, content) {
console.log(err, content)
});
</script>
We provide a specific AngularJS build that uses the $http service.
It can be used with callbacks or AngularJS promises.
<script src="//cdn.jsdelivr.net/angularjs/1.3.14/angular.min.js"></script>
<script src="//cdn.jsdelivr.net/algoliasearch/3/algoliasearch.angular.min.js"></script>
<script>
angular
.module('myapp', ['algoliasearch'])
.controller('SearchCtrl', ['$scope', 'algolia', function($scope, algolia) {
$scope.query = '';
$scope.hits = [];
var client = algolia.Client('ApplicationID', 'apiKey');
var index = client.initIndex('indexName');
index.search('something')
.then(function searchSuccess(content) {
console.log(content);
}, function searchFailure(err) {
console.log(err);
});
}]);
</script>
In 30 seconds, this quick start tutorial will show you how to index and search objects.
Without any prior configuration, you can start indexing 500 contacts in the contacts index using the following code:
var index = client.initIndex('contacts');
var contactsJSON = require('./contacts.json');
index.addObjects(contactsJSON, function(err, content) {
if (err) {
console.error(err);
}
});
You can now search for contacts using firstname, lastname, company, etc. (even with typos):
// firstname
index.search('jimmie', function(err, content) {
console.log(content.hits);
});
// firstname with typo
index.search('jimie', function(err, content) {
console.log(content.hits);
});
// a company
index.search('california paint', function(err, content) {
console.log(content.hits);
});
// a firstname & company
index.search('jimmie paint', function(err, content) {
console.log(content.hits);
});
Settings can be customized to tune the search behavior. For example, you can add a custom sort by number of followers to the already great built-in relevance:
index.setSettings({
'customRanking': ['desc(followers)']
}, function(err, content) {
console.log(content);
});
You can also configure the list of attributes you want to index by order of importance (first = most important):
index.setSettings({
'attributesToIndex': [
'lastname',
'firstname',
'company',
'email',
'city',
'address'
]
}, function(err, content) {
console.log(content);
});
Since the engine is designed to suggest results as you type, you'll generally search by prefix. In this case the order of attributes is very important to decide which hit is the best:
index.search('or', function(err, content) {
console.log(content.hits);
});
index.search('jim', function(err, content) {
console.log(content.hits);
});
In most situations, there is no need to tune the options. We provide this list to be transparent with our users.
timeout (Number): timeout for requests to our servers, in milliseconds.
protocol (String): protocol to use when communicating with Algolia.
hosts.read ([String]): array of read hosts used to call Algolia servers, computed automatically.
hosts.write ([String]): array of write hosts used to call Algolia servers, computed automatically.
httpAgent (HttpAgent): Node.js only. httpAgent instance to use when communicating with Algolia servers.
To pass an option, use:
var client = algoliasearch(applicationId, apiKey, {
timeout: 4000
})
Every API call takes a callback as its last parameter. This callback is called with two arguments: an error (null on success, otherwise an Error object whose details are in error.message) and the content returned by the server.
If you do not provide a callback, you will get a promise (but never both).
Promises are the native Promise implementation.
We use jakearchibald/es6-promise as a polyfill when needed.
The request strategy used by the JavaScript client: in Node.js, requests go through the http module, and connections are always keep-alive.
Browser only
To avoid performing the same API calls twice, search results are stored in a cache tied to your JavaScript client and index objects.
This is particularly useful when your users are deleting characters or words from the current query, but it can serve outdated results if the page isn't refreshed for some time.
If at any point you want to clear the cache, just do this:
// clear the queries cache
index.clearCache();
// if you're performing multi-queries using the API client instead of the index
// you'll need to use the following code
client.clearCache();
Node.js only
If you are behind a proxy, just set the HTTP_PROXY or HTTPS_PROXY environment variable before starting your Node.js program.
HTTP_PROXY=http://someproxy.com:9320 node main.js
Node.js only
Keep-alive is activated by default.
Because of the nature of keep-alive connections, your process will hang even when you are no longer issuing commands with the client.
To fix this, we expose a client.destroy() method that terminates all remaining alive connections.
You should call this method when you are finished working with the AlgoliaSearch API, so that your process exits cleanly.
Note: keep-alive is still always activated in browsers, this is a native behavior of browsers.
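A minimal Node.js sketch of this clean-shutdown pattern, with placeholder credentials and index name:

var algoliasearch = require('algoliasearch');
var client = algoliasearch('applicationID', 'apiKey');
var index = client.initIndex('indexName');

index.search('something', function (err, content) {
  console.log(err, content);
  // All work is done: close the remaining keep-alive connections
  // so the Node.js process can exit.
  client.destroy();
});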
Check our online documentation:
Check out our tutorials:
In April 2015, we released V3 of our JavaScript client (the one you are looking at), which works in both Node.js and the browser.
If you were using our browser version (V2), read the migration guide.
If you were using our Node.js version (V1, npm algolia-search), read the migration guide.
Each entry in an index has a unique identifier called objectID. There are two ways to add an entry to the index:
1. Using automatic objectID assignment. You will be able to access it in the answer.
2. Supplying your own objectID.
You don't need to explicitly create an index: it will be automatically created the first time you add an object. Objects are schemaless, so you don't need any configuration to start indexing. If you wish to configure things, the settings section provides details about advanced settings.
Example with automatic objectID assignment:
index.addObject({
firstname: 'Jimmie',
lastname: 'Barninger'
}, function(err, content) {
console.log('objectID=' + content.objectID);
});
Example with manual objectID assignment:
index.addObject({
firstname: 'Jimmie',
lastname: 'Barninger'
}, 'myID', function(err, content) {
console.log('objectID=' + content.objectID);
});
You have three options when updating an existing object: replace all its attributes, replace only some of them, or apply an operation to some attributes.
Example on how to replace all attributes of an existing object:
index.saveObject({
firstname: 'Jimmie',
lastname: 'Barninger',
city: 'New York',
objectID: 'myID'
}, function(err, content) {
console.log(content);
});
You have many ways to update an object's attributes:
Example to update only the city attribute of an existing object:
index.partialUpdateObject({
city: 'San Francisco',
objectID: 'myID'
}, function(err, content) {
console.log(content);
});
Example to add a tag:
index.partialUpdateObject({
_tags: {
value: 'MyTag',
_operation: 'Add'
},
objectID: 'myID'
}, function(err, content) {
console.log(content);
});
Example to remove a tag:
index.partialUpdateObject({
_tags: {
value: 'MyTag',
_operation:'Remove'
},
objectID: 'myID'
}, function(err, content) {
console.log(content);
});
Example to add a tag if it doesn't exist:
index.partialUpdateObject({
_tags: {
value: 'MyTag',
_operation: 'AddUnique'
},
objectID: 'myID'
}, function(err, content) {
console.log(content);
});
Example to increment a numeric value:
index.partialUpdateObject({
price: {
value: 42,
_operation: 'Increment'
},
objectID: 'myID'
}, function(err, content) {
console.log(content);
});
Note: Here we are incrementing the value by 42. To increment just by one, put value: 1.
Example to decrement a numeric value:
index.partialUpdateObject({
price: {
value: 42,
_operation: 'Decrement'
},
objectID: 'myID'
}, function(err, content) {
console.log(content);
});
Note: Here we are decrementing the value by 42. To decrement just by one, put value: 1.
To perform a search, you only need to initialize the index and perform a call to the search function.
A search query can only return up to 1,000 hits. If you need to retrieve more than 1,000 hits (for SEO, for example), use Backup / Retrieve all index content.
You can use the following optional arguments:
A list of attributes (must be a subset of the attributesToIndex index setting). Attributes are separated with a comma such as "name,address". You can also use JSON string array encoding such as encodeURIComponent("[\"name\",\"address\"]"). By default, this list is empty.
A list of attributes to use for the textual search (must be a subset of the attributesToIndex index setting). Attributes are separated with a comma such as "name,address". You can also use JSON string array encoding such as encodeURIComponent("[\"name\",\"address\"]"). By default, all attributes specified in the attributesToIndex settings are used to search.
Phrase queries: words surrounded by " are treated as a phrase. For example, "search engine" will retrieve records having search next to engine only. Typo tolerance is disabled on phrase queries.
Prohibit operator: the term after the - symbol is excluded. For example, search -engine will retrieve records containing search but not engine.
Pagination: pages are zero-based, so to retrieve the 10th page, set page=9.
aroundLatLng: Search for entries around a given latitude/longitude (specified as two floats separated by a comma). For example, aroundLatLng=47.316669,5.016670.
By default, the maximum distance is automatically guessed based on the density of the area, but you can specify it manually in meters with the aroundRadius parameter. The precision for ranking can be set with the aroundPrecision parameter. For example, if you set aroundPrecision=100, distances will be considered by ranges of 100m: all distances between 0 and 100m will be considered identical for the "geo" ranking parameter.
When aroundRadius is not set, the radius is computed automatically using the density of the area. You can retrieve the computed radius in the automaticRadius attribute of the answer, and you can use the minimumAroundRadius query parameter to specify a minimum radius in meters for the automatic computation of aroundRadius.
At indexing time, you should specify the geo location of an object with the _geoloc attribute (in the form "_geoloc":{"lat":48.853409, "lng":2.348800}, or "_geoloc":[{"lat":48.853409, "lng":2.348800},{"lat":48.547456, "lng":2.972075}] if you have several geo-locations in your record).
aroundLatLngViaIP: Search for entries around a latitude/longitude automatically computed from the user's IP address. For example, aroundLatLng=47.316669,5.016670.
You can specify the maximum distance in meters with the aroundRadius parameter and the precision for ranking with aroundPrecision. For example, if you set aroundPrecision=100, two objects in the range 0-99m will be considered identical in the ranking for the "geo" ranking parameter (same for the 100-199, 200-299, ... ranges).
At indexing time, you should specify the geo location of an object with the _geoloc attribute in the form {"_geoloc":{"lat":48.853409, "lng":2.348800}}.
insideBoundingBox: Search entries inside a given area defined by the two extreme points of a rectangle (defined by 4 floats: p1Lat,p1Lng,p2Lat,p2Lng). For example, insideBoundingBox=47.3165,4.9665,47.3424,5.0201.
At indexing time, you should specify the geo location of an object with the _geoloc attribute (in the form "_geoloc":{"lat":48.853409, "lng":2.348800}, or "_geoloc":[{"lat":48.853409, "lng":2.348800},{"lat":48.547456, "lng":2.972075}] if you have several geo-locations in your record). You can use several bounding boxes (OR) by passing more than 4 values: for example, instead of 4 values you can pass 8 to use an OR between two bounding boxes.
insidePolygon: Search entries inside a given area defined by a set of points (defined by a minimum of 6 floats: p1Lat,p1Lng,p2Lat,p2Lng,p3Lat,p3Lng). For example, insideBoundingBox=47.3165,4.9665,47.3424,5.0201.
At indexing time, you should specify the geo location of an object with the _geoloc attribute (in the form "_geoloc":{"lat":48.853409, "lng":2.348800}, or "_geoloc":[{"lat":48.853409, "lng":2.348800},{"lat":48.547456, "lng":2.972075}] if you have several geo-locations in your record).
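To illustrate the geo parameters above, here is a hedged sketch that indexes a record with a _geoloc attribute and then searches around a point; the record, index contents, and coordinates are hypothetical:

index.addObject({
  objectID: 'store-1', // hypothetical record
  name: 'Opera store',
  _geoloc: { lat: 48.853409, lng: 2.348800 }
}, function (err) {
  if (err) {
    throw err;
  }
  index.search('store', {
    aroundLatLng: '48.853409,2.348800', // latitude,longitude of the search center
    aroundRadius: 5000                  // optional maximum distance, in meters
  }, function (searchErr, content) {
    console.log(searchErr, content && content.hits);
  });
});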
"name,address"
). You can also use a string array encoding (for example ["name","address"]
). By default, all attributes are retrieved. You can also use *
to retrieve all values when an attributesToRetrieve setting is specified for your index.["name","address"]
). If an attribute has no match for the query, the raw value is returned. By default all indexed text attributes are highlighted. You can use *
if you want to highlight all textual attributes. Numerical attributes are not highlighted. A matchLevel is returned for each highlighted attribute and can contain:attributeName:nbWords
). Attributes are separated by commas (Example: attributesToSnippet=name:10,content:10
). attributesToSnippet: ["name:10","content:10"]
). By default, no snippet is computed.attributeName
followed by operand
followed by value
. Supported operands are <
, <=
, =
, >
and >=
.You can easily perform range queries via the :
operator. This is equivalent to combining a >=
and <=
operand. For example, numericFilters=price:10 to 1000
.
You can also mix OR and AND operators. The OR operator is defined with a parenthesis syntax. For example, (code=1 AND (price:[0-100] OR price:[1000-2000]))
translates to encodeURIComponent("code=1,(price:0 to 10,price:1000 to 2000)")
.
You can also use a string array encoding (for example numericFilters: ["price>100","price<1000"]
).
tags=tag1,(tag2,tag3)
means tag1 AND (tag2 OR tag3). You can also use a string array encoding. For example, tagFilters: ["tag1",["tag2","tag3"]]
means tag1 AND (tag2 OR tag3).{"_tags":["tag1","tag2"]}
.attributeName:value
. To OR facets, you must add parentheses. For example: facetFilters=(category:Book,category:Movie),author:John%20Doe
. You can also use a string array encoding. For example, [["category:Book","category:Movie"],"author:John%20Doe"]
."category,author"
. You can also use JSON string array encoding. For example, ["category","author"]
. Only the attributes that have been added in attributesForFaceting index setting can be used in this parameter. You can also use *
to perform faceting on all attributes specified in attributesForFaceting. If the number of results is important, the count can be approximate, the attribute exhaustiveFacetsCount
in the response is true when the count is exact.maxValuesPerFacet=10
will retrieve a maximum of 10 values per facet.available=1 AND (category:Book OR NOT category:Ebook) AND public
date: 1441745506 TO 1441755506 AND inStock > 0 AND author:"John Doe"
The list of keywords is:
OR: create a disjunctive filter between two filters.
AND: create a conjunctive filter between two filters.
TO: used to specify a range for a numeric filter.
NOT: used to negate a filter. The syntax with the ‘-‘ isn’t allowed.Note: To specify a value with spaces or with a value equal to a keyword, it's possible to add quotes.
Warning:
attributeForDistinct
index setting is set. This feature is similar to the SQL "distinct" keyword. When enabled in a query with the distinct=1
parameter, all hits containing a duplicate value for the attributeForDistinct attribute are removed from results. For example, if the chosen attribute is show_name
and several hits have the same value for show_name
, then only the best one is kept and the others are removed.var client = algoliasearch('ApplicationID', 'Search-Only-API-Key');
var index = client.initIndex('indexName');
// only query string
index.search('query string', function searchDone(err, content) {
if (err) {
console.error(err);
return;
}
for (var h in content.hits) {
console.log('Hit(' + content.hits[h].objectID + '): ' + content.hits[h].toString());
}
});
// with params
index.search('query string', {
attributesToRetrieve: ['firstname', 'lastname'],
hitsPerPage: 50
}, function searchDone(err, content) {
if (err) {
console.error(err);
return;
}
for (var h in content.hits) {
console.log('Hit(' + content.hits[h].objectID + '): ' + content.hits[h].toString());
}
});
The server response will look like:
{
"hits": [
{
"firstname": "Jimmie",
"lastname": "Barninger",
"objectID": "433",
"_highlightResult": {
"firstname": {
"value": "<em>Jimmie</em>",
"matchLevel": "partial"
},
"lastname": {
"value": "Barninger",
"matchLevel": "none"
},
"company": {
"value": "California <em>Paint</em> & Wlpaper Str",
"matchLevel": "partial"
}
}
}
],
"page": 0,
"nbHits": 1,
"nbPages": 1,
"hitsPerPage": 20,
"processingTimeMS": 1,
"query": "jimmie paint",
"params": "query=jimmie+paint&attributesToRetrieve=firstname,lastname&hitsPerPage=50"
}
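As a small illustration of reading that response, the sketch below logs the highlighted firstname of each hit; it reuses the query and attribute names from the example above:

index.search('jimmie paint', {
  attributesToRetrieve: ['firstname', 'lastname'],
  hitsPerPage: 50
}, function (err, content) {
  if (err) {
    console.error(err);
    return;
  }
  content.hits.forEach(function (hit) {
    // _highlightResult mirrors the record attributes, wrapping matches in <em> tags
    var highlighted = hit._highlightResult.firstname;
    console.log(hit.objectID, highlighted.value, highlighted.matchLevel);
  });
});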
You can send multiple queries with a single API call using a batch of queries:
var client = algoliasearch('ApplicationID', 'apiKey');
var queries = [{
indexName: 'categories',
query: 'search in categories index',
params: {
hitsPerPage: 3
}
}, {
indexName: 'products',
query: 'first search in products',
params: {
hitsPerPage: 3,
tagFilters: 'promotion'
}
}, {
indexName: 'products',
query: 'another search in products',
params: {
hitsPerPage: 10
}
}];
function searchCallback(err, content) {
if (err) {
console.error(err);
return;
}
var categories = content.results[0];
for (var i = 0; i < categories.hits.length; ++i) {
console.log(categories.hits[i]);
}
var products_promotion = content.results[1];
for (var i = 0; i < products_promotion.hits.length; ++i) {
console.log(products_promotion.hits[i]);
}
var products = content.results[2];
for (var i = 0; i < products.hits.length; ++i) {
console.log(products.hits[i]);
}
}
// perform 3 queries in a single API call:
// - 1st query targets index `categories`
// - 2nd and 3rd queries target index `products`
client.search(queries, searchCallback);
The resulting JSON answer contains a results array storing the underlying query answers. The answers are returned in the same order as the requests.
You can specify a strategy to optimize your multiple queries:
You can easily retrieve an object using its objectID, optionally specifying a comma-separated list of the attributes you want:
// Retrieves all attributes
index.getObject('myID', function(err, content) {
console.log(content.objectID + ": " + content.toString());
});
// Retrieves firstname and lastname attributes
index.getObject('myID', ['firstname', 'lastname'], function(err, content) {
console.log(content.objectID + ": " + content.toString());
});
You can also retrieve a set of objects:
index.getObjects(['myObj1', 'myObj2'], function(err, content) {
console.log(content);
});
You can delete an object using its objectID:
index.deleteObject('myID', function(err) {
  if (!err) {
    console.log('success');
  }
});
You can delete all objects matching a single query with the following code. Internally, the API client performs the query, deletes all matching hits, and waits until the deletions have been applied.
// no query parameters
index.deleteByQuery('John', function(err) {
  if (!err) {
    console.log('success');
  }
});
// with query parameters
index.deleteByQuery('John', {
  some: 'query',
  param: 'eters'
}, function(err) {
  if (!err) {
    console.log('success');
  }
});
You can retrieve all settings using the getSettings function. The result will contain the following attributes:
attributesToIndex: the list of attributes you want to index, in order of importance. To indicate that the position of a match inside an attribute doesn't matter, use the unordered(AttributeName) modifier. For example, attributesToIndex: ["title", "unordered(text)"]. You can decide to have the same priority for two attributes by passing them in the same string using a comma as a separator. For example, title and alternative_title have the same priority in this example, which is different than the text priority: attributesToIndex: ["title,alternative_title", "text"].
equalOnly(AttributeName): a modifier you can use when you only need equality filtering on an attribute; the other operators will be disabled.
attributeForDistinct: the attribute used for the Distinct feature. This feature is similar to the SQL "distinct" keyword. When enabled in queries with the distinct=1 parameter, all hits containing a duplicate value for this attribute are removed from results. For example, if the chosen attribute is show_name and several hits have the same value for show_name, then only the best one is kept and the others are removed.
You can use the optionalWords query parameter to have results with the most matched words first.
customRanking: lets you add your own ranking criteria, for example "customRanking" => ["desc(population)", "asc(name)"].
You can index additional characters such as +# to be able to search Google+ or C#.
synonyms: an array of arrays of words considered equal. For example, "synonyms": [ [ "black", "dark" ], [ "small", "little", "mini" ], ... ]. The synonym feature also supports multi-word expressions like "synonyms": [ ["NY", "New York"] ].
placeholders: placeholders let a record match a whole set of values. For example, "placeholders": { "<streetnumber>": ["1", "2", "3", ..., "9999"]} would allow matching all street numbers. We use the < > tag syntax to define placeholders in an attribute. For example: { "name" : "Apple Store", "address" : "<streetnumber> Opera street, Paris" } with "placeholders": { "<streetnumber>" : ["1", "2", "3", "4", "5", ... ], ... }.
disableTypoToleranceOnAttributes: a list of attributes (must be a subset of the attributesToIndex index setting). By default the list is empty.
altCorrections: specify alternative corrections that you want to consider. For example, "altCorrections": [ { "word" : "foot", "correction": "feet", "nbTypos": 1 }, { "word": "feet", "correction": "foot", "nbTypos": 1 } ].
minWordSizefor1Typo: (integer) The minimum number of characters needed to accept one typo (default = 4).
minWordSizefor2Typos: (integer) The minimum number of characters needed to accept two typos (default = 8).
hitsPerPage: (integer) The number of hits per page (default = 10).
attributesToRetrieve: (array of strings) Default list of attributes to retrieve in objects. If set to null, all attributes are retrieved.
attributesToHighlight: (array of strings) Default list of attributes to highlight. If set to null, all indexed attributes are highlighted.
attributesToSnippet: (array of strings) Default list of attributes to snippet alongside the number of words to return (syntax is 'attributeName:nbWords').
By default, no snippet is computed. If set to null, no snippet is computed.
highlightPreTag: (string) Specify the string that is inserted before the highlighted parts in the query result (defaults to "<em>").
highlightPostTag: (string) Specify the string that is inserted after the highlighted parts in the query result (defaults to "</em>").
optionalWords: (array of strings) Specify a list of words that should be considered optional when found in the query.
allowTyposOnNumericTokens: (boolean) If set to false, disable typo tolerance on numeric tokens (numbers) in the query. For example, the query "304" will match "30450" but not "40450", which would have matched with typo tolerance enabled. This can be very useful for serial number and zip code searches. Defaults to false.
ignorePlurals: (boolean) If set to true, simple plural forms won't be considered as typos (for example, car and cars will be considered equal). Defaults to false.
advancedSyntax: Enable the advanced query syntax. Defaults to 0 (false).
Phrase query: a phrase query defines a particular sequence of terms. A phrase query is built by Algolia's query parser for words surrounded by ". For example, "search engine" will retrieve records having search next to engine only. Typo tolerance is disabled on phrase queries.
Prohibit operator: the prohibit operator excludes records that contain the term after the - symbol. For example, search -engine will retrieve records containing search but not engine.
replaceSynonymsInHighlight: (boolean) If set to false, words matched via synonyms expansion will not be replaced by the matched synonym in the highlighted result. Default to true.
maxValuesPerFacet: (integer) Limit the number of facet values returned for each facet. For example: maxValuesPerFacet=10
will retrieve max 10 values per facet.
distinct: (integer) Enable the distinct feature (disabled by default) if the attributeForDistinct index setting is set. This feature is similar to the SQL "distinct" keyword: when enabled in a query with the distinct=1 parameter, all hits containing a duplicate value for the attributeForDistinct attribute are removed from results. For example, if the chosen attribute is show_name and several hits have the same value for show_name, then only the best one is kept and the others are removed.
typoTolerance: (string) This setting has four different options:
true: activate typo tolerance (default value).
false: disable typo tolerance.
min: keep only results with the lowest number of typos. For example, if one result matches without typos, then all results with typos will be hidden.
strict: if there is a match without typos, then all results with 2 or more typos will be removed. This option is useful if you want to avoid false positives as much as possible.
removeStopWords: (boolean) Remove stop words from the query before executing it. Defaults to false. Contains stop words for 41 languages (Arabic, Armenian, Basque, Bengali, Brazilian, Bulgarian, Catalan, Chinese, Czech, Danish, Dutch, English, Finnish, French, Galician, German, Greek, Hindi, Hungarian, Indonesian, Irish, Italian, Japanese, Korean, Kurdish, Latvian, Lithuanian, Marathi, Norwegian, Persian, Polish, Portuguese, Romanian, Russian, Slovak, Spanish, Swedish, Thai, Turkish, Ukrainian, Urdu).
You can easily retrieve settings or update them:
index.getSettings(function(err, content) {
console.log(content);
});
index.setSettings({'customRanking': ['desc(followers)']}, function(err) {
if (!err) {
console.log('success');
}
});
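Combining several of the settings described above in a single call might look like the following sketch; the attribute names are hypothetical:

index.setSettings({
  attributesToIndex: ['title,alternative_title', 'text'], // same priority for title and alternative_title
  customRanking: ['desc(popularity)'],
  attributesToSnippet: ['text:10'],
  typoTolerance: 'min',
  removeStopWords: true
}, function (err, content) {
  if (err) {
    console.error(err);
    return;
  }
  console.log('Settings updated', content);
});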
You can list all your indices, along with their associated information (number of entries, disk size, etc.), with the listIndexes method:
client.listIndexes(function(err, content) {
console.log(content);
});
You can delete an index using its name:
client.deleteIndex('contacts', function(err) {
  if (!err) {
    console.log('success');
  }
});
You can delete the index contents without removing settings and index specific API keys by using the clearIndex command:
index.clearIndex(function(err, content) {
console.log(content);
});
All write operations in Algolia are asynchronous by design.
When you add or update an object in your index, our servers reply to your request with a taskID as soon as they have understood the write operation.
The actual insert and indexing is done after replying to your code.
You can wait for a task to complete using the waitTask method on the taskID returned by a write operation.
For example, to wait for indexing of a new object:
var object = {
firstname: 'Jimmie',
lastname: 'Barninger'
};
index.addObject(object, function(err, content) {
index.waitTask(content.taskID, function() {
console.log('object ' + content.objectID + ' indexed');
});
});
If you want to ensure multiple objects have been indexed, you only need to check the biggest taskID.
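For instance, a hedged sketch of that approach with two sequential writes; since the document states that checking the biggest taskID is enough, waiting on the larger one covers both:

index.saveObject({ objectID: 'myID1', firstname: 'Jimmie' }, function (err1, content1) {
  index.saveObject({ objectID: 'myID2', firstname: 'Warren' }, function (err2, content2) {
    // Wait only for the biggest taskID: once it is applied, the earlier one is too.
    var biggestTaskID = Math.max(content1.taskID, content2.taskID);
    index.waitTask(biggestTaskID, function () {
      console.log('both objects indexed');
    });
  });
});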
You may want to perform multiple operations with one API call to reduce latency. We expose four methods to perform batch operations:
addObjects: add an array of objects using automatic objectID assignment.
saveObjects: add or update an array of objects that contains an objectID attribute.
deleteObjects: delete an array of objectIDs.
partialUpdateObjects: partially update an array of objects that contain an objectID attribute (only specified attributes will be updated).
Example using automatic objectID assignment:
var objects = [{
firstname: 'Jimmie',
lastname: 'Barninger'
}, {
firstname: 'Warren',
lastname: 'Speach'
}];
index.addObjects(objects, function(err, content) {
console.log(content);
});
Example with user-defined objectID (add or update):
var objects = [{
firstname: 'Jimmie',
lastname: 'Barninger',
objectID: 'myID1'
}, {
firstname: 'Warren',
lastname: 'Speach',
objectID: 'myID2'
}];
index.saveObjects(objects, function(err, content) {
console.log(content);
});
Example that deletes a set of records:
index.deleteObjects(['myID1', 'myID2'], function(err, content) {
console.log(content);
});
Example that updates only the firstname attribute:
var objects = [{
firstname: 'Jimmie',
objectID: 'myID1'
}, {
firstname: 'Warren',
objectID: 'myID2'
}];
index.partialUpdateObjects(objects, function(err, content) {
console.log(content);
});
If you have one index per user, you may want to perform batch operations across several indices. We expose a method to perform this type of batch:
client.batch([{
  action: 'addObject',
  indexName: 'clients',
  body: {
    name: 'Bill'
  }
}, {
  action: 'updateObject',
  indexName: 'fruits',
  body: {
    objectID: '29138',
    name: 'banana'
  }
}], function(err, content) {
  console.log(err, content);
});
The action attribute specifies the operation to apply for each entry (for example, addObject or updateObject, as used above).
The admin API key provides full control of all your indices. You can also generate user API keys to control security. These API keys can be restricted to a set of operations and/or to a given index.
To list existing keys, you can use the listUserKeys method:
// Lists global API Keys
client.listUserKeys(function(err, content) {
console.log(content);
});
// Lists API Keys that can access only to this index
index.listUserKeys(function(err, content) {
console.log(content);
});
Each key is defined by a set of permissions that specify the authorized actions (for example, search, as used below).
Example of API Key creation:
// Creates a new global API key that can only perform search actions
client.addUserKey(['search'], function(err, content) {
console.log('Key:' + content['key']);
});
// Creates a new API key that can only perform search action on this index
index.addUserKey(['search'], function(err, content) {
console.log('Key:' + content['key']);
});
You can also create an API key with advanced settings such as validity, maxQueriesPerIPPerHour and maxHitsPerQuery:
Note: If you are sending the query through your servers, you must use the enableRateLimitForward("TheAdminAPIKey", "EndUserIP", "APIKeyWithRateLimit")
function to enable rate-limit.
// Creates a new global API key that is valid for 300 seconds
client.addUserKey(['search'], {
validity: 300
}, function(err, content) {
console.log('Key:' + content['key']);
});
// Creates a new index specific API key:
// - valid for 300 seconds
// - with a rate limit of 100 calls per hour per IP
// - and a maximum of 20 hits
index.addUserKey(['search'], {
validity: 300,
maxQueriesPerIPPerHour: 100,
maxHitsPerQuery: 20
}, function(err, content) {
console.log('Key:' + content['key']);
});
Update the permissions of an existing key:
// Update an existing global API key that is valid for 300 seconds
client.updateUserKey(
'myAPIKey',
['search'], {
validity: 300
}, function(err, content) {
console.log('Key:' + content['key']);
}
);
// Update an index specific API key:
// - valid for 300 seconds
// - with a rate limit of 100 calls per hour per IP
// - and a maximum of 20 hits
index.updateUserKey(
'myAPIKey',
['search'], {
validity: 300,
maxQueriesPerIPPerHour: 100,
maxHitsPerQuery: 20
}, function(err, content) {
console.log('Key:' + content['key']);
}
);
Get the permissions of a given key:
// Gets the rights of a global key
client.getUserKeyACL('7f2615414bc619352459e09895d2ebda', function(err, content) {
console.log(content);
});
// Gets the rights of an index specific key
index.getUserKeyACL('9b9335cb7235d43f75b5398c36faabcd', function(err, content) {
console.log(content);
});
Delete an existing key:
// Deletes a global key
client.deleteUserKey('7f2615414bc619352459e09895d2ebda', function(err, content) {
console.log(content);
});
// Deletes an index specific key
index.deleteUserKey('9b9335cb7235d43f75b5398c36faabcd', function(err, content) {
console.log(content);
});
You may have a single index containing per-user data. In that case, all records should be tagged with their associated user_id so that a tagFilters=user_42 filter can be added at query time to retrieve only what a user has access to. If you set this filter from the JavaScript client alone, it results in a security breach, since the user can modify the tagFilters you've set by modifying the code in the browser. To keep using the JavaScript client (recommended for optimal latency) while targeting secured records, you can generate a secured API key from your backend:
// generate a public API key for user 42. Here, records are tagged with:
// - 'user_XXXX' if they are visible by user XXXX
var public_key = client.generateSecuredApiKey('YourSearchOnlyApiKey', {tagFilters: 'user_42'});
This public API key can then be used in your JavaScript code as follows:
var client = algoliasearch('YourApplicationID', '<%= public_api_key %>');
var index = client.initIndex('indexName')
index.search('something', function(err, content) {
if (err) {
console.error(err);
return;
}
console.log(content);
});
You can mix rate limits and secured API keys by setting a userToken
query parameter at API key generation time. When set, a unique user will be identified by her IP + user_token
instead of only by her IP
. This allows you to restrict a single user to performing a maximum of N
API calls per hour, even if she shares her IP
with another user.
// generate a public API key for user 42. Here, records are tagged with:
// - 'user_XXXX' if they are visible by user XXXX
var public_key = client.generateSecuredApiKey('YourRateLimitedApiKey', {tagFilters: 'user_42', userToken: 'user_42'});
This public API key can then be used in your JavaScript code as follows:
var client = algoliasearch('YourApplicationID', '<%= public_api_key %>');
var index = client.initIndex('indexName')
index.search('another query', function(err, content) {
if (err) {
console.error(err);
return;
}
console.log(content);
});
You can easily copy or rename an existing index using the copy and move commands.
Note: Move and copy commands overwrite the destination index.
// Rename MyIndex in MyIndexNewName
client.moveIndex('MyIndex', 'MyIndexNewName', function(err, content) {
console.log(content);
});
// Copy MyIndex in MyIndexCopy
client.copyIndex('MyIndex', 'MyIndexCopy', function(err, content) {
console.log(content);
});
The move command is particularly useful if you want to update a big index atomically from one version to another. For example, if you recreate your index MyIndex each night from a database by batch, you only need to:
1. Import your database into a new index, MyNewIndex.
2. Rename MyNewIndex to MyIndex using the move command. This will automatically override the old index, and new queries will be served on the new one.
// Rename MyNewIndex in MyIndex (and overwrite it)
client.moveIndex('MyNewIndex', 'MyIndex', function(err, content) {
console.log(content);
});
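Putting the two steps together, a hedged sketch of such a nightly rebuild might look like this; allRecordsFromDatabase is a hypothetical array of records loaded from your database:

// allRecordsFromDatabase: hypothetical array of records to index
var tmpIndex = client.initIndex('MyNewIndex');

tmpIndex.addObjects(allRecordsFromDatabase, function (err, content) {
  if (err) {
    throw err;
  }
  // Wait until the batch has been indexed before swapping it in
  tmpIndex.waitTask(content.taskID, function () {
    client.moveIndex('MyNewIndex', 'MyIndex', function (moveErr) {
      if (!moveErr) {
        console.log('MyIndex atomically replaced');
      }
    });
  });
});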
You can retrieve all index content by using the browseAll() method:
// browseAll can take any query and queryParameter like the
// search function. Here we do not provide any because we want all the index's
// objects
var browser = index.browseAll();
var hits = [];
browser.on('result', function onResult(content) {
hits = hits.concat(content.hits);
});
browser.on('end', function onEnd() {
console.log('Finished!');
console.log('We got %d hits', hits.length);
});
browser.on('error', function onError(err) {
throw err;
});
// You can stop the process at any point with
// browser.stop();
// Retrieve the next cursor from the browse method
index.browse('jazz', function browseDone(err, content) {
  if (err) {
    throw err;
  }
  console.log(content.cursor);
});
You can also use the browse(query, queryParameters)
and browseFrom(browseCursor)
methods to programmatically browse your index content:
index.browse('jazz', function browseDone(err, content) {
if (err) {
throw err;
}
console.log('We are at page %d on a total of %d pages, with %d hits.', content.page, content.nbPages, content.hits.length);
if (content.cursor) {
index.browseFrom(content.cursor, function browseFromDone(err, content) {
if (err) {
throw err;
}
console.log('We are at page %d on a total of %d pages, with %d hits.', content.page, content.nbPages, content.hits.length);
});
}
});
You can retrieve the latest logs via this API. Each log entry contains:
You can retrieve the logs of your last 1,000 API calls and browse them using the offset/length parameters:
// Get last 10 log entries (default)
client.getLogs(function(err, content) {
console.log(content);
});
// Get last 100 log entries
client.getLogs(0, 100, function(err, content) {
console.log(content);
});
FAQs
A fully-featured and blazing-fast JavaScript API client to interact with Algolia API.
The npm package algoliasearch receives a total of 1,664,751 weekly downloads; as such, its popularity is classified as popular.
We found that algoliasearch demonstrated a healthy version release cadence and project activity: the last version was released less than a year ago, and 10 open source maintainers collaborate on the project.