# 🕷 makescraper

Create your very own web scraper and crawler using Go and Colly!
## 📚 Table of Contents

- [Project Structure](#project-structure)
- [Getting Started](#getting-started)
- [Deliverables](#deliverables)
- [Resources](#resources)
## Project Structure

```
📂 makescraper
├── README.md
└── scrape.go
```
## Getting Started

1. Visit [github.com/new](https://github.com/new) and create a new repository named `bew25-go_web_scraper`.
2. Run each command line-by-line in your terminal to set up the project:

   ```bash
   $ git clone git@github.com:Make-School-Labs/makescraper.git
   $ cd makescraper
   $ git remote rm origin
   $ git remote add origin git@github.com:sprajjwal/bew25-go_web_scraper.git
   $ go mod download
   ```

3. Open `README.md` in your editor and replace all instances of `YOUR_GITHUB_USERNAME` with your GitHub username to enable the Go Report Card badge.
## Deliverables

Complete the tasks below in the order they appear. Use GitHub Task List syntax to update the task list as you go.
### Requirements

#### Scraping

### Stretch Challenges

#### Serializing & Saving
## Resources

### Lesson Plans

### Example Code

#### Scraping

#### Serializing & Saving
- JSON to Struct: Paste any JSON data and it will be converted into a Go struct definition capable of storing that data.
- GoByExample - JSON: Covers Go's built-in support for encoding and decoding JSON, both to and from built-in and custom data types (structs).
- GoByExample - Writing Files: Covers creating new files and writing to them.