crawler
Crawler is a ready-to-use web spider with support for proxies, asynchronous requests, rate limiting, configurable request pools, server-side jQuery, and HTTP/2.
es6-crawler-detect
An ES6 port of the original PHP library CrawlerDetect; it helps you detect bots/crawlers/spiders via the user agent string.
spider-detector
A tiny Node.js module to detect spiders/crawlers quickly, with optional middleware for ExpressJS.
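User-agent-based detection, as done by es6-crawler-detect and spider-detector, boils down to matching the `User-Agent` header against a list of known bot patterns. A minimal sketch of the technique (the pattern list here is illustrative, not either library's actual list):

```javascript
// Minimal user-agent-based bot detection. Real libraries ship a much
// larger, regularly updated pattern list; this regex is a toy example.
const BOT_PATTERN = /bot|crawler|spider|slurp|curl|wget/i;

function isSpider(userAgent) {
  // Treat a missing user agent as "not a known spider".
  return BOT_PATTERN.test(userAgent || '');
}

console.log(isSpider('Mozilla/5.0 (compatible; Googlebot/2.1)')); // true
console.log(isSpider('Mozilla/5.0 (Windows NT 10.0) Chrome/120')); // false
```

In an Express app, this check is typically wrapped in middleware that sets something like `req.isSpider` before your route handlers run, which is what spider-detector's optional middleware provides.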
robots-parser
A specification-compliant robots.txt parser with wildcard (*) matching support.
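The wildcard matching the entry refers to follows the robots.txt convention (now standardized in RFC 9309): `*` matches any sequence of characters and a trailing `$` anchors the rule at the end of the URL path. robots-parser implements the full specification; the sketch below shows only the core matching idea, with hypothetical helper names:

```javascript
// Compile a robots.txt rule (e.g. "/private*" or "/*.pdf$") into a RegExp.
function ruleToRegExp(rule) {
  let pattern = rule
    .split('*')
    // Escape regex metacharacters in the literal parts of the rule.
    .map(part => part.replace(/[.+?^${}()|[\]\\]/g, '\\$&'))
    // '*' in the rule matches any character sequence.
    .join('.*');
  // A trailing '$' in the rule anchors the match at the end of the path.
  if (pattern.endsWith('\\$')) {
    pattern = pattern.slice(0, -2) + '$';
  }
  // Rules always match from the start of the URL path.
  return new RegExp('^' + pattern);
}

function isDisallowed(path, disallowRules) {
  return disallowRules.some(rule => ruleToRegExp(rule).test(path));
}

console.log(isDisallowed('/private/data', ['/private*']));      // true
console.log(isDisallowed('/files/report.pdf', ['/*.pdf$']));    // true
console.log(isDisallowed('/files/report.pdfx', ['/*.pdf$']));   // false
```

Note this omits parts of the real specification that robots-parser handles, such as per-agent rule groups and longest-match precedence between Allow and Disallow rules.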
x-crawl
x-crawl is a flexible Node.js AI-assisted crawler library.