This is a Ruby library for scraping web pages and saving data. It is a fork/rewrite of the original scraperwiki-ruby gem, extracting the SQLite utility methods into the sqlite_magic gem.
It is a work in progress (for example, it doesn't yet create indices automatically), but should allow ScraperWiki Classic scripts to be run locally.
Add this line to your application's Gemfile:
gem 'scraperwiki'
And then execute:
$ bundle
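Alternatively, assuming the gem is published to RubyGems (as the Gemfile entry above implies), you can install it directly:

$ gem install scraperwiki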
Returns the downloaded string from the given url. params are sent as a POST if set.
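For example, a minimal sketch assuming the scraping helper is exposed as ScraperWiki.scrape (the URLs are illustrative):

require 'scraperwiki'

# GET request: returns the page body as a string
html = ScraperWiki.scrape('http://example.com/')

# With params set, the same data is sent as a POST request
response = ScraperWiki.scrape('http://example.com/search', { 'q' => 'ruby' })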
Helper functions for saving and querying an SQL database. Updates the schema automatically according to the data you save.
Currently only SQLite is supported; the data is stored in a local SQLite database.
Saves a data record into the datastore, in the table given by table_name.
data is a hash with field names as keys (can be strings or symbols).
unique_keys is a subset of data.keys() which determines when a record is overwritten.
For large numbers of records, data can be an array of hashes.
verbose is kept for smooth migration from ScraperWiki Classic and doesn't do anything yet.
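A sketch of typical use, assuming the method is exposed as ScraperWiki.save_sqlite(unique_keys, data, table_name) and using an illustrative 'people' table:

require 'scraperwiki'

# 'name' is the unique key: saving another record with the same name overwrites that row
ScraperWiki.save_sqlite([:name], { name: 'Alice', city: 'Cardiff', visits: 3 }, 'people')

# For large numbers of records, pass an array of hashes in one call
people = [
  { name: 'Bob', city: 'Leeds', visits: 1 },
  { name: 'Cara', city: 'York', visits: 7 }
]
ScraperWiki.save_sqlite([:name], people, 'people')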
Executes the provided query against the database with the given parameters and returns the results as key-value pairs.
query is a SQL statement.
params, for a prepared statement, is an array of values to fill the ? placeholders in the query.
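For example, assuming the method is exposed as ScraperWiki.sqliteexecute and the illustrative 'people' table from above exists:

# The ? placeholder in the statement is filled from the params array;
# per the description above, results come back as key-value pairs
rows = ScraperWiki.sqliteexecute('SELECT name, visits FROM people WHERE visits > ?', [2])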
Allows the user to save a single variable (at a time) to carry state across runs of the scraper.
name, the variable name
value, the value of the variable
verbose, verbosity level
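A sketch, assuming the method is exposed as ScraperWiki.save_var:

# Remember how far this run got so the next run can resume from the same place
ScraperWiki.save_var('last_page', 42)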
Allows the user to retrieve a previously saved variable.
name, the variable name to fetch
value, the value to use if the variable name is not found
verbose, verbosity level
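A sketch, assuming the method is exposed as ScraperWiki.get_var:

# Falls back to 1 if 'last_page' has never been saved
last_page = ScraperWiki.get_var('last_page', 1)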
Allows for a simplified select statement.
partial_query, a valid SELECT statement without the SELECT keyword
params, any data provided for ? replacements in the query
verbose, verbosity level
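A sketch, assuming the method is exposed as ScraperWiki.select and the illustrative 'people' table from above exists:

# Runs: SELECT * FROM people WHERE city = 'York'
rows = ScraperWiki.select('* FROM people WHERE city = ?', ['York'])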
Run your Ruby scraper and any data saved will be put in an SQLite database called scraperwiki.sqlite in the current directory.
If you're using scrapers from ScraperWiki Classic, remember to add require 'scraperwiki' to your file if it's not already there.
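Putting it together, a minimal scraper might look like the sketch below (method names as assumed in the earlier examples; the URL and table name are illustrative):

require 'scraperwiki'

url = 'http://example.com/'
html = ScraperWiki.scrape(url)

# The first save creates ./scraperwiki.sqlite and the 'pages' table automatically
ScraperWiki.save_sqlite([:url], { url: url, length: html.length }, 'pages')

After running, you can inspect the saved data with, for example, sqlite3 scraperwiki.sqlite 'SELECT * FROM pages;'.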
You need the sqlite3 program installed to run tests. To install it on Ubuntu, run sudo apt-get install sqlite3.
FAQs
Unknown package
We found that scraperwiki demonstrates an unhealthy version release cadence and low project activity because the last version was released a year ago. It has 1 open source maintainer collaborating on the project.