This is a Ruby library for scraping web pages and saving data. It is a fork/rewrite of the original scraperwiki-ruby gem, extracting the SQLite utility methods into the sqlite_magic gem.
It is a work in progress (for example, it doesn't yet create indices automatically), but should allow ScraperWiki classic scripts to be run locally.
Add this line to your application's Gemfile:
gem 'scraperwiki'
And then execute:
$ bundle
Returns the downloaded string from the given url. If params is set, the request is sent as a POST.
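For example, a minimal sketch (assuming the helper is ScraperWiki.scrape, per the scraperwiki-ruby API; the URLs are placeholders):

require 'scraperwiki'

# Plain GET: returns the response body as a string
html = ScraperWiki.scrape('https://example.com/listings')

# With params set, the same helper sends a POST instead
result = ScraperWiki.scrape('https://example.com/search', { 'q' => 'widgets' })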
Helper functions for saving and querying an SQL database. Updates the schema automatically according to the data you save.
Currently only SQLite is supported, and the data is written to a local SQLite database.
Saves a data record to the datastore, in the table given by table_name.
data is a hash with field names as keys (which can be strings or symbols).
unique_keys is a subset of data.keys and determines when a record is overwritten.
For large numbers of records, data can be an array of hashes.
verbose is kept for smooth migration from ScraperWiki Classic and doesn't do anything yet.
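A short sketch of both forms (assuming the method is ScraperWiki.save_sqlite(unique_keys, data, table_name), per the scraperwiki-ruby API; the table and fields are made up):

require 'scraperwiki'

# Single record: :id is the unique key, so saving id 1 again overwrites the row
ScraperWiki.save_sqlite([:id], { id: 1, name: 'Alice', age: 42 }, 'people')

# Bulk form: an array of hashes for large numbers of records
records = [
  { id: 2, name: 'Bob' },
  { id: 3, name: 'Carol' }
]
ScraperWiki.save_sqlite([:id], records, 'people')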
Executes the provided query against the database with the given parameters, and returns the results as key-value pairs.
query is a SQL statement.
params, for a prepared statement, is an array of values to substitute for the ? placeholders.
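For example (assuming the method is ScraperWiki.sqliteexecute, per the scraperwiki-ruby API, and reusing the hypothetical people table from above):

require 'scraperwiki'

# Plain statement
ScraperWiki.sqliteexecute('CREATE TABLE IF NOT EXISTS people (id INTEGER, name TEXT)')

# Prepared statement: params supplies the values for the ? placeholders
ScraperWiki.sqliteexecute('INSERT INTO people VALUES (?, ?)', [4, 'Dave'])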
Allows the user to save a single variable (at a time) to carry state across runs of the scraper.
name, the variable name
value, the value of the variable
verbose, verbosity level
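For example (assuming the helper is ScraperWiki.save_var, per the scraperwiki-ruby API):

require 'scraperwiki'

# Remember how far this run got, so the next run can resume
ScraperWiki.save_var('last_page', 42)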
Allows the user to retrieve a previously saved variable.
name, the variable name to fetch
value, the value to use if the variable name is not found
verbose, verbosity level
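For example (assuming the helper is ScraperWiki.get_var, per the scraperwiki-ruby API):

require 'scraperwiki'

# Resume from the saved position, defaulting to 1 on the first run
last_page = ScraperWiki.get_var('last_page', 1)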
Allows for a simplified select statement.
partial_query, a valid select statement, without the select keyword
params, any data provided for ? replacements in the query
verbose, verbosity level
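For example (assuming the helper is ScraperWiki.select, per the scraperwiki-ruby API, against the hypothetical people table):

require 'scraperwiki'

# Equivalent to: SELECT * FROM people WHERE name = 'Alice'
rows = ScraperWiki.select('* FROM people WHERE name = ?', ['Alice'])
rows.each { |row| puts row['name'] }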
Run your Ruby scraper and any data saved will be put in an SQLite database in the current directory called scraperwiki.sqlite.
If you're using scrapers from ScraperWiki Classic, remember to add require 'scraperwiki' to your file if it's not already there.
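Putting the pieces together, a minimal end-to-end sketch (the URL, selectors, and Nokogiri dependency are illustrative assumptions, not part of this gem):

require 'scraperwiki'
require 'nokogiri'

html = ScraperWiki.scrape('https://example.com/listings')
doc = Nokogiri::HTML(html)

doc.css('li.listing a').each do |link|
  # :url is the unique key, so re-running the scraper updates rather than duplicates
  ScraperWiki.save_sqlite([:url], { url: link['href'], title: link.text.strip }, 'listings')
end

# The saved rows end up in ./scraperwiki.sqlite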
You need the sqlite3 program installed to run tests. To install it on Ubuntu, run sudo apt-get install sqlite3.