Simple Scraper


Explore the docs »

View Demo · Report Bug · Request Feature

Table of Contents
  1. About The Project
  2. Getting Started
  3. Usage
  4. Roadmap
  5. Contributing
  6. License
  7. Contact

About The Project

This program was written to scrape sitemaps, and the links those sitemaps point to, on multiple servers. To save time for repeated use, it was packaged for pip.

(back to top)

Built With

(back to top)

Getting Started

Follow the installation instructions below. The docstrings contain detailed explanations of how to use each module.

Prerequisites

This program uses Python 3.8.
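
A quick version check before installing can avoid surprises later. This is plain Python, not part of the package, and the assumption that newer versions also work is mine:

import sys

# the README states the program uses Python 3.8; newer versions are assumed to work
assert sys.version_info[:2] >= (3, 8), 'samssimplescraper targets Python 3.8+'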

Installation

Install the pip package from PyPI and use as needed:

pip install samssimplescraper==0.1.3
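
To confirm the installation worked, a plain import check is enough (nothing here is package-specific beyond the name):

# verify the package is importable after installation
import samssimplescraper

print(samssimplescraper.__file__)  # shows where pip installed the package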

(back to top)

Usage

The package has two modules; a minimal end-to-end sketch follows the list.

  1. sitemapscraper is used to scrape sitemaps and can also scrape further levels of sub-sitemaps. Its methods return lists of the scraped links, which can then be used to scrape the pages you want.
  2. scraper is used to scrape the list returned by sitemapscraper, or a user-made list of links. It also has a method that reports how many of the total links have been scraped.
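
As a quick orientation, here is a minimal sketch of how the two pieces fit together, using only the class and method names shown in the Roadmap examples below (the URL is a placeholder):

from samssimplescraper import LinksRetriever, Scraper

# collect the links listed in the sitemap, then hand them to the scraper
retriever = LinksRetriever(url='https://www.example.com/sitemap_index.xml')
links = retriever.get_sitemap_links(tag='loc')
Scraper.get_html(link_list=links, root_url='https://www.example.com/')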

(back to top)

Roadmap

  1. Find the sitemap for the site you are looking to scrape. Some tips can be found here:

How to find your sitemap: https://writemaps.com/blog/how-to-find-your-sitemap/

  2. Scrape the sitemap:
from samssimplescraper import LinksRetriever

# instantiate LinksRetriever with the sitemap you wish to scrape
links_retriever = LinksRetriever(url='https://www.example.com/sitemap_index.xml')
# get a list of the links using the .get_sitemap_links method; a tag filter can be passed
mainpage_links = links_retriever.get_sitemap_links(tag='loc')
# if the website has more sitemap layers, use .get_next_links to collect the links on those pages
final_links = links_retriever.get_next_links(links=mainpage_links, tag='loc')

Note: If you are not going to continue scraping in the same script, be sure to save your list using pickle:

import pickle

# the data folder is automatically created when LinksRetriever is instantiated
with open('./data/pickled_lists/sitemap_links_list.pkl', 'wb') as fp:
    pickle.dump(final_links, fp)
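
If you continue in a separate script later, the list can be loaded back with pickle (standard library; the path matches the one used above):

import pickle

# reload the previously pickled list of links in a new script
with open('./data/pickled_lists/sitemap_links_list.pkl', 'rb') as fp:
    final_links = pickle.load(fp)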
  3. Now you can scrape the list of links that LinksRetriever has produced for you. The files will be saved in the data/scraped_html folder.
from samssimplescraper import Scraper

# pass the list of links, plus the root_url for naming purposes
Scraper.get_html(link_list=final_links, root_url='https://www.example.com/')
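
The status-check method mentioned in the Usage section is not shown by name in this README, but a rough standalone progress check can be done by counting the files in the data/scraped_html folder (a sketch, assuming one saved file per scraped link):

from pathlib import Path

# rough progress check: compare saved files against the number of links to scrape
scraped = len(list(Path('./data/scraped_html').iterdir()))
print(f'{scraped} of {len(final_links)} links scraped')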

See the open issues for a full list of proposed features (and known issues).

(back to top)

Contributing

Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.

If you have a suggestion that would make this better, please fork the repo and create a pull request. You can also simply open an issue with the tag "enhancement". Don't forget to give the project a star! Thanks again!

  1. Fork the Project
  2. Create your Feature Branch (git checkout -b feature/AmazingFeature)
  3. Commit your Changes (git commit -m 'Add some AmazingFeature')
  4. Push to the Branch (git push origin feature/AmazingFeature)
  5. Open a Pull Request

(back to top)

License

Distributed under the MIT License. See LICENSE.txt for more information.

(back to top)

Contact

Samuel Adams McGuire - samuelmcguire@engineer.com

PyPI Link: https://pypi.org/project/samssimplescraper/0.1.3/

Linkedin: LinkedIn

Project Link: https://github.com/SamuelAdamsMcGuire/simplescraper

(back to top)
