ghcrawler

Maintainers: 3 · Versions: 43

ghcrawler - npm package version comparison

Comparing version 0.1.0 to 0.1.1

.npmignore (37)

package.json
{
  "name": "ghcrawler",
-  "version": "0.1.0",
-  "description": "A simple, resilient GitHub API client for bulk retrieval of GitHub resources",
-  "main": "./lib/ghrequestor.js",
+  "version": "0.1.1",
+  "description": "A robust GitHub API crawler that walks a queue of GitHub entities retrieving and storing their contents.",
+  "main": "./lib/crawler.js",
  "bin": {
-    "ghrequestor": "./lib/ghrequestor.js"
+    "ghcrawler": "./lib/crawler.js"
  },

@@ -15,25 +15,22 @@ "scripts": {

    "api",
    "rate-limit",
    "retry",
    "crawler",
    "pagination"
  ],
-  "author": "Jeff McAffer <jmcaffer@microsoft.com>",
+  "author": "William Bartholomew <willbar@microsoft.com>",
  "license": "MIT",
  "files": [
    "index.js",
    "lib/ghrequestor.js"
  ],
  "repository": {
    "type": "git",
    "url": "git+https://github.com/microsoft/ghcrawler.git"
  },
  "homepage": "https://github.com/microsoft/ghcrawler#readme",
  "bugs": {
    "url": "https://github.com/microsoft/ghcrawler"
  },
  "dependencies": {
    "q": "^1.4.1",
-    "requestretry": "^1.12.0"
+    "moment": "2.15.2"
  },
-  "devDependencies": {
-    "chai": "^3.5.0",
-    "grunt": "^1.0.1",
-    "grunt-mocha-test": "^0.13.2",
-    "mocha": "^3.1.2"
-  },
+  "devDependencies": {},
  "jshintConfig": {
    "esversion": "es6"
  }
}

@@ -1,17 +1,13 @@

-![Version](https://img.shields.io/npm/v/ghrequestor.svg)
-![License](https://img.shields.io/github/license/Microsoft/ghrequestor.svg)
-![Downloads](https://img.shields.io/npm/dt/ghrequestor.svg)
+![Version](https://img.shields.io/npm/v/ghcrawler.svg)
+![License](https://img.shields.io/github/license/Microsoft/ghcrawler.svg)
+![Downloads](https://img.shields.io/npm/dt/ghcrawler.svg)
-# GHRequestor
-A simple, resilient GitHub API client that:
+# GHCrawler
+A robust GitHub API crawler that walks a queue of GitHub entities transitively retrieving and storing their contents. GHCrawler is great for:
-* Is great for bulk fetching of resources from the GitHub API.
-* Retries on errors and various forms of rate-limiting.
-* Backs off as you reach your API limit.
-* Automatically fetches all pages of a multi-page resource.
-* Reports comprehensive data on the number of pages fetched, retry attempts, and length of delays.
+* Retrieving all GitHub entities related to an org, repo, or user
+* Efficiently storing the retrieved entities
+* Keeping the stored data up to date when used in conjunction with a GitHub event tracker
-GHRequestor is a relatively low-level facility intended for traversing large graphs of GitHub resources.
-Its primary use case is GHCrawler, an engine that walks subsections of GitHub collecting related resources.
-This is not intended to replace great modules like octonode or github.
+GHCrawler focuses on successively retrieving and walking GitHub resources supplied on a queue. Each resource is fetched, analyzed, stored, and plumbed for more resources to fetch. Discovered resources are themselves queued for further processing. The crawler is careful not to repeatedly fetch the same resource.

@@ -18,0 +14,0 @@ # Examples
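The new README describes the crawl loop only in prose: resources come off a queue, each one is fetched, stored, and mined for further resources, newly discovered resources are queued, and anything already seen is skipped; the old GHRequestor notes add retry and backoff around rate limits. Below is a minimal sketch of that loop in plain Node.js (18+, for the global `fetch`). It is not GHCrawler's actual implementation; the names `fetchWithRetry`, `crawl`, `store`, and `extractLinks` are hypothetical.

```js
// Fetch a GitHub API resource, retrying with exponential backoff on
// rate limits and transient server errors (the resilience the README
// attributes to GHRequestor).
async function fetchWithRetry(url, attempts = 4) {
  for (let i = 0; i < attempts; i++) {
    const response = await fetch(url, {
      headers: { accept: 'application/vnd.github+json' }
    });
    if (response.ok) return response.json();
    // GitHub signals rate limiting with 403/429; also retry 5xx errors.
    if (response.status === 403 || response.status === 429 || response.status >= 500) {
      await new Promise(resolve => setTimeout(resolve, 1000 * 2 ** i)); // back off: 1s, 2s, 4s, ...
      continue;
    }
    throw new Error(`GitHub API returned ${response.status} for ${url}`);
  }
  throw new Error(`giving up on ${url} after ${attempts} attempts`);
}

// Walk a queue of GitHub API URLs: fetch each resource, store it,
// discover linked resources, and queue them for further processing.
async function crawl(seedUrls, store, extractLinks) {
  const queue = [...seedUrls];
  const seen = new Set(queue);            // never fetch the same resource twice
  while (queue.length > 0) {
    const url = queue.shift();
    const resource = await fetchWithRetry(url);
    await store(url, resource);           // persist the fetched document
    for (const linked of extractLinks(resource)) {
      if (!seen.has(linked)) {            // dedup before queueing
        seen.add(linked);
        queue.push(linked);
      }
    }
  }
}
```

A caller supplies the storage and link-extraction strategies; the dedup-before-queueing check is what keeps the crawler from repeatedly fetching the same resource, as the README notes.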
