node.js - How does nodejs crawler control the number of requests?
滿天的星座 2017-06-27 09:19:30

When using Node.js to crawl web pages, firing too many requests at once sometimes throws an exception such as "too many connections". Does Node.js have a keyword or library for locking, like thread locks in other languages? Or is there a better way to handle this? Thanks in advance!
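What the question asks for, capping the number of in-flight requests, can be sketched as a small promise pool. This is one common pattern, not a built-in Node.js API; `createLimiter` and `fetchPage` are illustrative names:

```javascript
// A minimal promise-pool sketch: at most `limit` tasks run at once,
// the rest wait in a queue until a slot frees up.
function createLimiter(limit) {
  let active = 0;
  const queue = [];
  const next = () => {
    if (active >= limit || queue.length === 0) return;
    active++;
    const { task, resolve, reject } = queue.shift();
    // run the task, then release the slot and start the next queued task
    task().then(resolve, reject).then(() => {
      active--;
      next();
    });
  };
  return (task) => new Promise((resolve, reject) => {
    queue.push({ task, resolve, reject });
    next();
  });
}

// Usage: at most 5 requests in flight at any time.
// fetchPage is a hypothetical request function.
const limit = createLimiter(5);
// urls.map((u) => limit(() => fetchPage(u)))
```

Each call to the limiter returns a promise for the wrapped task's result, so the caller can still use `Promise.all` over all URLs while the pool enforces the cap.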


Replies (2)
typecho

Node.js has no built-in sleep function.
I usually pace the requests with an event:

const EventEmitter = require('events').EventEmitter;
const ee = new EventEmitter();

ee.on('next', (data) => {
    // crawl the site
});

// fire once per second
setInterval(() => ee.emit('next', 'data'), 1000);
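Although Node.js has no blocking sleep, a delay can be emulated by wrapping `setTimeout` in a Promise, which lets a crawler pause between sequential requests. This is a common pattern, not part of the answer above; `fetchPage` is a stub added so the sketch is self-contained:

```javascript
// Promise-based delay: resolves after `ms` milliseconds.
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// Crawl URLs one at a time, pausing 1s between requests.
async function crawlSequentially(urls) {
  const pages = [];
  for (const url of urls) {
    pages.push(await fetchPage(url)); // hypothetical request function
    await sleep(1000);                // pause between requests
  }
  return pages;
}

// Stub fetcher for illustration:
async function fetchPage(url) {
  return `<html>${url}</html>`;
}
```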
ringa_lee

async ?
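If "async" here means async/await (available since Node 7.6), one simple way to bound concurrency is to process URLs in fixed-size batches. A sketch under that assumption; `crawlInBatches` and the stub fetcher are illustrative, not a library API:

```javascript
// Process URLs in batches of `batchSize`, so at most that many
// requests run concurrently.
async function crawlInBatches(urls, batchSize, fetchFn) {
  const results = [];
  for (let i = 0; i < urls.length; i += batchSize) {
    const batch = urls.slice(i, i + batchSize);
    // wait for the whole batch before starting the next one
    results.push(...(await Promise.all(batch.map(fetchFn))));
  }
  return results;
}

// Usage with a stub fetcher:
crawlInBatches(['a', 'b', 'c'], 2, async (u) => u + '!')
  .then((pages) => console.log(pages));
```

Batching is simpler than a promise pool but slightly less efficient, since a slow request holds up the whole batch.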
