Version 1.0.5 (latest), published on npm.

🤖 🔥 GitHub Repo GPT Scraper 🔥 🤖

Welcome to the GitHub Repo GPT Scraper! This powerful tool is designed to help you effortlessly scrape GitHub repositories in order to create an OpenAI GPT based on your code! It works with either a public GitHub repository URL or a local directory (defaulting to the cwd if no URL is passed).

Getting Started

Prerequisites

  • Node.js installed.

Usage

  • Scrape a GitHub Repository:

    npx github-repo-gpt-scraper --url=https://github.com/user/repo --out=repo.json
    

    Replace https://github.com/user/repo with the URL of the repository you wish to scrape.

  • Scrape the Current Working Directory:

    npx github-repo-gpt-scraper --out=repo.json
    

    This scrapes every file in your current working directory, excluding files ignored by the .gitignore in the cwd, as well as common lockfiles and binary files.

  • Filter Files with Include and Exclude Options:

Use the --include option to specify a glob pattern for files you want to include, and the --exclude option to specify a glob pattern for files you want to exclude.

Example:

    npx github-repo-gpt-scraper --include="src/**/*.ts" --out=repo.json

Or:

    npx github-repo-gpt-scraper --exclude="tests/**" --out=repo.json

  • Create a GPT Using the Scraped Data:
  1. Visit https://chat.openai.com/create and click the "Configure" tab.

  2. Under "Knowledge," click "Upload files" and select the JSON file output by the scraper.

  3. Add the following basic instructions to the "Instructions" field:

    You are the creator of the codebase documented in the attached file and an expert in all of its code and the dependencies it uses. All of the user's questions will relate to this code, so reference it heavily. Give factual, detailed answers and help the user update the code as efficiently as possible, explaining more complex points along the way.
    

The simple instructions above cover the essentials and seem to work pretty well, but feel free to experiment with your own!
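The filtering described above (built-in lockfile exclusions plus the --include/--exclude globs) can be pictured roughly as follows. This is a hypothetical sketch, not the tool's actual implementation: DEFAULT_EXCLUDES, globToRegExp, and keepFile are invented names, and the real scraper presumably uses proper glob and ignore libraries.

```javascript
// Illustrative only -- the real tool's filtering logic may differ.
const DEFAULT_EXCLUDES = ['package-lock.json', 'yarn.lock', 'pnpm-lock.yaml'];

// Convert a simple glob (supporting '*' and '**') into a RegExp.
function globToRegExp(glob) {
  let src = glob.replace(/[.+^${}()|[\]\\]/g, '\\$&'); // escape regex metachars
  src = src.replace(/\*\*\//g, '\u0000'); // placeholder: any directory depth
  src = src.replace(/\*\*/g, '\u0001');   // placeholder: match anything
  src = src.replace(/\*/g, '[^/]*');      // a single '*' stops at '/'
  src = src.replace(/\u0000/g, '(?:.*/)?').replace(/\u0001/g, '.*');
  return new RegExp(`^${src}$`);
}

// Decide whether a file path survives the filters.
function keepFile(path, { include, exclude } = {}) {
  if (DEFAULT_EXCLUDES.some((name) => path.endsWith(name))) return false;
  if (exclude && globToRegExp(exclude).test(path)) return false;
  if (include && !globToRegExp(include).test(path)) return false;
  return true;
}

console.log(keepFile('src/utils/parse.ts', { include: 'src/**/*.ts' })); // true
console.log(keepFile('tests/parse.spec.ts', { exclude: 'tests/**' }));   // false
console.log(keepFile('package-lock.json'));                              // false
```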

Output

The tool outputs a JSON file (repo.json in the above examples) containing the path, URL, and content of each file scraped. I haven't yet experimented with different ways of formatting the file data (or adding supplemental info) and their impact on GPTs, but I'd be eager to hear about anyone's findings if they do so!

Contribute

Contributions are welcome! Open a PR 😎

License

This project is licensed under the MIT License.


Happy Scraping and GPTs'ing! 🚀🤖

Package last updated on 28 Nov 2023
