@javivelasco/isbot
Detect bots/crawlers/spiders using the user agent string.
import isbot from 'isbot'
// Node.js HTTP server (incoming requests expose headers on request.headers)
isbot(request.headers['user-agent'])
// ExpressJS
isbot(req.get('user-agent'))
// Browser
isbot(navigator.userAgent)
// User Agent string
isbot('Mozilla/5.0 (iPhone; CPU iPhone OS 6_0 like Mac OS X) AppleWebKit/536.26 (KHTML, like Gecko) Version/6.0 Mobile/10A5376e Safari/8536.25 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)') // true
isbot('Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2228.0 Safari/537.36') // false
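Putting the pieces together, here is a minimal sketch of the Express case (the req.isBot flag and the middleware are hypothetical illustrations, not part of this package's API):
import express from 'express'
import isbot from 'isbot'

const app = express()

// Hypothetical middleware: flag bot traffic once, up front
app.use((req, res, next) => {
  req.isBot = isbot(req.get('user-agent') || '')
  next()
})

app.get('/', (req, res) => {
  // Downstream handlers can branch on the flag, e.g. skip analytics for bots
  res.send(req.isBot ? 'Hello, crawler' : 'Hello, human')
})

app.listen(3000)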
Add rules to the user agent match list (pass an array of RegExp pattern strings):
isbot('Mozilla/5.0') // false
isbot.extend([
'istat',
'^mozilla/\\d\\.\\d$'
])
isbot('Mozilla/5.0') // true
Remove rules from the user agent match list (see the existing rules in the src/list.json file):
isbot('Chrome-Lighthouse') // true
isbot.exclude(['chrome-lighthouse']) // pattern is case insensitive
isbot('Chrome-Lighthouse') // false
Return the respective matching rule for a bot user agent:
isbot.find('Mozilla/5.0 (X11; Linux x86_64; rv:52.0) Gecko/20100101 Firefox/52.0 DejaClick/2.9.7.2') // 'DejaClick'
Create new instances of isbot. Each instance is spawned using the spawner's list as its base:
const one = isbot.spawn()
const two = isbot.spawn()
two.exclude(['chrome-lighthouse'])
one('Chrome-Lighthouse') // true
two('Chrome-Lighthouse') // false
Create an isbot instance using a custom list (instead of the maintained one):
const lean = isbot.spawn([ 'bot' ])
lean('Googlebot') // true
lean('Chrome-Lighthouse') // false
This package aims to identify "good bots": those that voluntarily identify themselves by setting a unique, preferably descriptive, user agent, usually via a dedicated request header.
It does not try to recognise malicious bots or programs disguising themselves as real users.
Recognising good bots such as web crawlers is useful for multiple purposes. Although it is not recommended to serve different content to web crawlers like Googlebot, you can still elect to treat their traffic differently, for example by flagging their pageviews in analytics or serving them cached content to relieve service load.
It is not recommended to whitelist requests for any reason based on the user agent header alone. Instead, other methods of identification can be added, such as a reverse DNS lookup (a sketch follows below).
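For instance, a minimal sketch of a forward-confirmed reverse DNS check (a hypothetical helper, not part of this package; the googlebot.com/google.com suffixes follow Google's documented crawler hostnames, and IPv4 is assumed for brevity):
import { reverse, resolve4 } from 'dns/promises'

// Hypothetical helper: confirm an IP claiming to be Googlebot resolves
// to a Google-owned hostname, and that the hostname resolves back to
// the same IP (forward-confirmed reverse DNS)
async function isVerifiedGooglebot (ip) {
  try {
    const [hostname] = await reverse(ip)
    if (!/\.(googlebot|google)\.com$/.test(hostname)) return false
    const addresses = await resolve4(hostname)
    return addresses.includes(ip)
  } catch {
    return false
  }
}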
We use external data sources on top of our own lists to keep up to date.
Missing something? Please open an issue.
Changelog highlights:
- Remove testing for Node 6 and 8.
- Change return value for isbot: true instead of the matched string. Truthiness is preserved, so there is no functional change for boolean checks (see the sketch below).
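A brief sketch of what that change means for callers (assuming code that previously relied on the matched string):
// Before (v2-style): const matched = isbot(ua) yielded the matched string
// After: isbot returns a boolean; use isbot.find for the matching rule
const ua = 'Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)'
if (isbot(ua)) {
  const matched = isbot.find(ua) // same information the old return value carried
}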
Benchmark: execution times in milliseconds.
FAQs
🤖 detect bots/crawlers/spiders via the user agent.
The npm package @javivelasco/isbot receives a total of 3 weekly downloads; as such, its popularity is classified as not popular.
We found that @javivelasco/isbot demonstrated an unhealthy version release cadence and low project activity: the last version was released a year ago. It has one open source maintainer collaborating on the project.