scraper-api-datachaser


The scraping SaaS platform provides a RESTful API for developers to perform web scraping tasks. Users can submit scraping tasks, monitor task status, retrieve scraped data, and manage their account through the API.

  • Version: 1.0.1 (latest)
  • Registry: npm
  • Weekly downloads: 12
  • Maintainers: 1

scraper_api

ScraperApi - JavaScript client for scraper_api. The scraping SaaS platform provides a RESTful API for developers to perform web scraping tasks. Users can submit scraping tasks, monitor task status, retrieve scraped data, and manage their account through the API. This SDK is automatically generated by the OpenAPI Generator project:

  • API version: 1.0.0
  • Package version: 1.0.0
  • Generator version: 7.6.0
  • Build package: org.openapitools.codegen.languages.JavascriptClientCodegen

Installation

For Node.js

npm

To publish the library as an npm package, please follow the procedure in "Publishing npm packages".

Then install it via:

npm install scraper_api --save

Finally, you need to build the module:

npm run build
Local development

To use the library locally without publishing to a remote npm registry, first install the dependencies by changing into the directory containing package.json (and this README). Let's call this JAVASCRIPT_CLIENT_DIR. Then run:

npm install

Next, link it globally in npm with the following, also from JAVASCRIPT_CLIENT_DIR:

npm link

To use the link you just defined in your project, switch to the directory from which you want to use scraper_api, and run:

npm link /path/to/<JAVASCRIPT_CLIENT_DIR>

Finally, you need to build the module:

npm run build
git

If the library is hosted at a git repository, e.g. https://github.com/GIT_USER_ID/GIT_REPO_ID, then install it via:

    npm install GIT_USER_ID/GIT_REPO_ID --save

For browser

The library also works in the browser environment via npm and browserify. After following the above steps with Node.js and installing browserify with npm install -g browserify, perform the following (assuming main.js is your entry file):

browserify main.js > bundle.js

Then include bundle.js in the HTML pages.

Webpack Configuration

When using Webpack you may encounter the following error: "Module not found: Error: Cannot resolve module". This most likely means you should disable the AMD loader. Add/merge the following section into your webpack config:

module: {
  rules: [
    {
      parser: {
        amd: false
      }
    }
  ]
}
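In a complete config file, the parser rule above sits inside `module.rules`. A minimal sketch of a full `webpack.config.js`, assuming a `main.js` entry and a `bundle.js` output (both placeholders, matching the browserify example above):

```javascript
// webpack.config.js — sketch showing where the AMD-disabling parser
// rule fits; entry and output names are placeholders.
const config = {
  entry: './main.js',
  output: { filename: 'bundle.js' },
  module: {
    rules: [
      {
        // Disable AMD module parsing so the generated CommonJS
        // client resolves correctly.
        parser: { amd: false },
      },
    ],
  },
};

module.exports = config;
```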

Getting Started

Please follow the installation instructions and execute the following JS code:

var ScraperApi = require('scraper_api');

var defaultClient = ScraperApi.ApiClient.instance;
// Configure API key authorization: key
var key = defaultClient.authentications['key'];
key.apiKey = "YOUR API KEY";
// Uncomment the following line to set a prefix for the API key, e.g. "Token" (defaults to null)
//key.apiKeyPrefix['Key'] = "Token";

var api = new ScraperApi.APIKeysApi();
var body = new ScraperApi.DeleteApiKeyRequest(); // {DeleteApiKeyRequest}
var callback = function(error, data, response) {
  if (error) {
    console.error(error);
  } else {
    console.log('API called successfully. Returned data: ' + data);
  }
};
api.keysDelete(body, callback);
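The generated methods use Node-style `(error, data, response)` callbacks, as in the snippet above. A hedged sketch of a generic Promise wrapper for such methods; the `fakeApi` stand-in below is purely illustrative so the helper can run without the SDK or network access:

```javascript
// Wraps a Node-style (error, data, response) callback method in a Promise.
// Assumes the method takes its arguments followed by a single callback,
// as the generated SDK methods do.
function promisify(apiInstance, apiMethod) {
  return (...args) =>
    new Promise((resolve, reject) => {
      apiMethod.call(apiInstance, ...args, (error, data, response) => {
        if (error) reject(error);
        else resolve({ data, response });
      });
    });
}

// Illustrative stand-in for a generated method (not part of the SDK):
const fakeApi = {
  keysDelete(body, callback) {
    callback(null, { deleted: true, id: body.id }, { status: 200 });
  },
};

const keysDelete = promisify(fakeApi, fakeApi.keysDelete);
keysDelete({ id: 42 }).then(({ data }) => {
  console.log(data.deleted); // true
});
```

The same wrapper would apply to any generated method, e.g. `promisify(api, api.keysDelete)` with the real `APIKeysApi` instance.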

Documentation for API Endpoints

All URIs are relative to https://scraper.datachaser.local/api/v1

| Class | Method | HTTP request | Description |
| ----- | ------ | ------------ | ----------- |
| ScraperApi.APIKeysApi | keysDelete | DELETE /keys | Delete API key by id |
| ScraperApi.APIKeysApi | keysGet | GET /keys | Retrieve API keys list |
| ScraperApi.APIKeysApi | keysPost | POST /keys | Create new API Key |
| ScraperApi.AuthenticationApi | accountPut | PUT /account | Update user account |
| ScraperApi.DataApi | dataGet | GET /data | Get all data |
| ScraperApi.DataApi | dataIdDelete | DELETE /data/{id} | Delete data by id |
| ScraperApi.DataApi | dataJobIdGet | GET /data/job/{id} | Get data by job id |
| ScraperApi.DataApi | dataUserIdPost | POST /data/user/{id} | Webhook to receive scraped data when done, by user id |
| ScraperApi.JobsApi | jobAsyncPost | POST /job/async | Receives a job and processes it asynchronously |
| ScraperApi.JobsApi | jobGet | GET /job | Get jobs |
| ScraperApi.JobsApi | jobIdDelete | DELETE /job/{id} | Delete existing job |
| ScraperApi.JobsApi | jobIdGet | GET /job/{id} | Retrieve specific job status |
| ScraperApi.JobsApi | jobIdPut | PUT /job/{id} | Update existing job |
| ScraperApi.JobsApi | jobPost | POST /job | Create new job |
| ScraperApi.JobsApi | jobSyncPost | POST /job/sync | Receives a job and processes it synchronously |
| ScraperApi.LogsApi | logJobIdGet | GET /log/job/{id} | Get log by job id |
| ScraperApi.LogsApi | logsGet | GET /logs | Get all logs |
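Each path in the table is resolved against the base URI given above, with `{id}`-style path parameters substituted in. A minimal sketch of that resolution in plain JavaScript (no SDK required; the helper name is illustrative, not part of the generated client):

```javascript
const BASE_PATH = 'https://scraper.datachaser.local/api/v1';

// Substitutes path parameters like {id} and appends the path to the base URI.
function buildUrl(basePath, pathTemplate, pathParams = {}) {
  const path = pathTemplate.replace(/\{(\w+)\}/g, (_, name) =>
    encodeURIComponent(pathParams[name]));
  return basePath + path;
}

console.log(buildUrl(BASE_PATH, '/job/{id}', { id: 123 }));
// https://scraper.datachaser.local/api/v1/job/123
```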

Documentation for Models

Documentation for Authorization

Authentication schemes defined for the API:

key

  • Type: API key
  • API key parameter name: Key
  • Location: HTTP header
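With this scheme, every request carries the API key in a `Key` HTTP header, optionally prefixed (as the commented-out `apiKeyPrefix` line in Getting Started suggests). A sketch of the resulting header; the helper is illustrative, not a generated SDK function:

```javascript
// Builds the auth header the `key` scheme describes: a header named "Key",
// with the value optionally prefixed (e.g. "Token <apiKey>").
function buildAuthHeader(apiKey, apiKeyPrefix = null) {
  const value = apiKeyPrefix ? `${apiKeyPrefix} ${apiKey}` : apiKey;
  return { Key: value };
}

console.log(buildAuthHeader('YOUR API KEY'));
// { Key: 'YOUR API KEY' }
console.log(buildAuthHeader('YOUR API KEY', 'Token'));
// { Key: 'Token YOUR API KEY' }
```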


Package last updated on 04 Jun 2024
