@brafdlog/israeli-bank-scrapers-core
Provide scrapers for all major Israeli banks and credit card companies
What you can find here are scrapers for all major Israeli banks and credit card companies. That's the plan, at least. Currently only a subset of banks and credit card companies is supported.
To use this you will need to have Node.js >= 8.10.0 installed.
To use these scrapers you'll need to install the package from npm:
```
npm install israeli-bank-scrapers --save
```
Then you can simply import and use it in your node module:
```node
import { CompanyTypes, createScraper } from 'israeli-bank-scrapers';

(async function() {
  try {
    // read documentation below for available options
    const options = {
      companyId: CompanyTypes.leumi,
      startDate: new Date('2020-05-01'),
      combineInstallments: false,
      showBrowser: true
    };

    // read documentation below for information about credentials
    const credentials = {
      username: 'vr29485',
      password: 'sometingsomething'
    };

    const scraper = createScraper(options);
    const scrapeResult = await scraper.scrape(credentials);

    if (scrapeResult.success) {
      scrapeResult.accounts.forEach((account) => {
        console.log(`found ${account.txns.length} transactions for account number ${account.accountNumber}`);
      });
    } else {
      throw new Error(scrapeResult.errorType);
    }
  } catch (e) {
    console.error(`scraping failed for the following reason: ${e.message}`);
  }
})();
```
Check the options declaration here for available options.
Regarding credentials, you should provide the relevant credentials for the chosen company. See this file for the list of credentials per company.
The structure of the result object is as follows:
```node
{
  success: boolean,
  accounts: [{
    accountNumber: string,
    txns: [{
      type: string,           // can be either 'normal' or 'installments'
      identifier: int,        // only if exists
      date: string,           // ISO date string
      processedDate: string,  // ISO date string
      originalAmount: double,
      originalCurrency: string,
      chargedAmount: double,
      description: string,
      memo: string,           // can be null or empty
      installments: {         // only if exists
        number: int,          // the current installment number
        total: int,           // the total number of installments
      },
      status: string          // can either be 'completed' or 'pending'
    }],
  }],
  errorType: "invalidPassword" | "changePassword" | "timeout" | "generic", // only on success=false
  errorMessage: string, // only on success=false
}
```
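For example, a minimal sketch (not part of the library, just illustrating the shape above) that separates pending from completed transactions across all accounts could look like this:

```node
// Minimal sketch consuming a scrape result with the shape documented above.
// `scrapeResult` is assumed to come from `await scraper.scrape(credentials)`.
function splitTransactions(scrapeResult) {
  if (!scrapeResult.success) {
    // errorType is one of 'invalidPassword' | 'changePassword' | 'timeout' | 'generic'
    throw new Error(`${scrapeResult.errorType}: ${scrapeResult.errorMessage || ''}`);
  }

  const pending = [];
  const completed = [];
  for (const account of scrapeResult.accounts) {
    for (const txn of account.txns) {
      const entry = { accountNumber: account.accountNumber, ...txn };
      (txn.status === 'pending' ? pending : completed).push(entry);
    }
  }
  return { pending, completed };
}
```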
You can also use the SCRAPERS list to get scraper metadata:
```node
import { SCRAPERS } from 'israeli-bank-scrapers';
```
The return value is an object of scraper metadata keyed by company id:
```node
{
  <companyId>: {
    name: string,        // the name of the scraper
    loginFields: [       // a list of login fields required by this scraper
      '<some field>'     // the name of the field
    ]
  }
}
```
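As a small illustration (not from the library's docs), this metadata can be used to list which login fields each scraper expects:

```node
import { SCRAPERS } from 'israeli-bank-scrapers';

// Print the name and required login fields of every available scraper.
Object.keys(SCRAPERS).forEach((companyId) => {
  const { name, loginFields } = SCRAPERS[companyId];
  console.log(`${name} (${companyId}) needs: ${loginFields.join(', ')}`);
});
```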
This library is currently deployed to NPM manually and not as part of an automatic process, so you should expect situations where code has been pushed to master but not yet deployed to NPM.
If you are a developer and want to access the next version, install the library with the next tag as shown below:

```
npm install israeli-bank-scrapers@next --save
```
Keep in mind that although this next version should be stable, since it passed our code review, it is deployed automatically using a GitHub Actions workflow without the usual manual tests we run before deploying the official version.
Israeli-bank-scrapers-core library

TL;DR: this is the same library as the default library. The only difference is that it uses puppeteer-core instead of puppeteer, which is useful if you are using a framework like Electron to package your application. In most cases you will probably want to use the default library (read the Getting Started section).
The Israeli bank scrapers library is published twice: as the default israeli-bank-scrapers package and as the core israeli-bank-scrapers-core package, which is intended for applications that cannot afford a huge node_modules, like Electron applications.

The default variation, israeli-bank-scrapers, uses puppeteer, which handles the installation of a local chromium on its own. This behavior is very handy since it takes care of all the hard work of figuring out which chromium to download and manages the actual download process. As a side effect it increases node_modules by several hundred megabytes.
The core variation, israeli-bank-scrapers-core, uses puppeteer-core, which is exactly the same library as puppeteer except that it doesn't download chromium when installed by npm. It is up to you to make sure that the specific version of chromium is installed locally and to provide a path to that version. This is useful in Electron applications since it doesn't bloat the size of the application, and you can provide a much friendlier experience, like loading the application first and downloading chromium later when needed.
To install israeli-bank-scrapers-core:

```
npm install israeli-bank-scrapers-core --save
```
When using the israeli-bank-scrapers-core package it is up to you to make sure the relevant chromium version exists. You must:

1. Find out which chromium revision is required by the puppeteer-core library being used.
2. Make sure that chromium revision is available locally.
3. Provide the path to that chromium executable to the israeli-bank-scrapers-core scrapers.

Please read the following to learn more about the process:
```node
import { getPuppeteerConfig } from 'israeli-bank-scrapers-core';

const chromiumVersion = getPuppeteerConfig().chromiumRevision;
```
Once you have the chromium revision, you can either download it manually or use other libraries like download-chromium to fetch that version. The mentioned library is very handy as it caches the download and provides useful helpers like download progress information.
Provide the path to chromium to the library using the option key executablePath.
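Putting the three steps together, a hedged sketch might look like the following. It assumes that israeli-bank-scrapers-core mirrors the default package's createScraper and CompanyTypes exports, and the exact options accepted by download-chromium are an assumption, so check that library's documentation before relying on them:

```node
import { getPuppeteerConfig, createScraper, CompanyTypes } from 'israeli-bank-scrapers-core';
import download from 'download-chromium';

(async function () {
  // 1. Ask the library which chromium revision its puppeteer-core expects.
  const { chromiumRevision } = getPuppeteerConfig();

  // 2. Download (or reuse a cached copy of) that revision.
  //    The `revision` option is assumed here; see download-chromium's docs.
  const executablePath = await download({ revision: chromiumRevision });

  // 3. Pass the executable path to the scraper via the executablePath option.
  const scraper = createScraper({
    companyId: CompanyTypes.leumi,
    startDate: new Date('2020-05-01'),
    executablePath
  });

  // ...then scraper.scrape(credentials) as shown in the Getting Started example.
})();
```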
This scraper expects the following credentials object:

```node
const credentials = {
  userCode: <user identification code>,
  password: <user password>
};
```

This scraper supports fetching transactions from up to one year back.
This scraper expects the following credentials object:

```node
const credentials = {
  username: <user name>,
  password: <user password>
};
```

This scraper supports fetching transactions from up to one year back.
This scraper expects the following credentials object:

```node
const credentials = {
  id: <user identification number>,
  password: <user password>,
  num: <user identification code>
};
```

This scraper supports fetching transactions from up to one year back (minus one day).
This scraper expects the following credentials object:

```node
const credentials = {
  username: <user identification number>,
  password: <user password>
};
```

This scraper supports fetching transactions from up to one year back.
This scraper expects the following credentials object:

```node
const credentials = {
  username: <user name>,
  password: <user password>
};
```

This scraper supports fetching transactions from up to one year back.
This scraper expects the following credentials object:

```node
const credentials = {
  username: <user name>,
  password: <user password>
};
```

This scraper supports fetching transactions from up to one year back.
This scraper expects the following credentials object:

```node
const credentials = {
  username: <user name>,
  password: <user password>
};
```

This scraper supports fetching transactions from up to one year back.
This scraper expects the following credentials object:

```node
const credentials = {
  id: <user identification number>,
  card6Digits: <6 last digits of card>,
  password: <user password>
};
```

This scraper supports fetching transactions from up to one year back.
This scraper expects the following credentials object:

```node
const credentials = {
  id: <user identification number>,
  card6Digits: <6 last digits of card>,
  password: <user password>
};
```

This scraper supports fetching transactions from up to one year back.
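Since every scraper follows the same pattern (a credentials object whose keys match its loginFields), a generic, hedged sketch for collecting credentials could look like this; askUser is a hypothetical prompt helper, not part of the library:

```node
import { SCRAPERS, CompanyTypes, createScraper } from 'israeli-bank-scrapers';

// Build a credentials object by prompting for each login field of the chosen company.
// `askUser` is a hypothetical helper; replace it with your own input mechanism.
async function buildCredentials(companyId, askUser) {
  const { name, loginFields } = SCRAPERS[companyId];
  const credentials = {};
  for (const field of loginFields) {
    credentials[field] = await askUser(`Enter ${field} for ${name}:`);
  }
  return credentials;
}

// Usage sketch:
// const credentials = await buildCredentials(CompanyTypes.leumi, askUser);
// const scraper = createScraper({ companyId: CompanyTypes.leumi, startDate: new Date('2020-05-01') });
// const result = await scraper.scrape(credentials);
```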
These are the projects known to be using this module:
Built something interesting you want to share here? Let me know.
The MIT License