I have an array of ids, and I want to make an API request for each id, but I want to control how many requests are made per second, or better still, have only 5 open connections at any time, and when a connection completes, fetch the next one.
Currently I have this, which just fires off all the requests at the same time:
_.each([1,2,3,4,5,6,7,8,9,10], function(issueId) {
github.fetchIssue(repo.namespace, repo.id, issueId, filters)
.then(function(response) {
console.log('Writing: ' + issueId);
writeIssueToDisk(fetchIssueCallback(response));
});
});
asked Feb 16, 2016 at 2:01 by Billy Moon; edited Feb 16, 2016 at 2:09
- 1 You want capped-concurrency promises. Take a look at ES6 Promise Pool, and also the list of Alternatives specified on that page. Or, you can implement your own counting semaphore :D – Amadan Commented Feb 16, 2016 at 2:10
- I faced a similar problem: I needed to rate-limit chat messages coming through. There is an algorithm called the "Leaky Bucket" (en.wikipedia.org/wiki/Leaky_bucket); I think this is similar to what you are facing. – Michael Joseph Aubry Commented Feb 16, 2016 at 2:13
- 1 I'm surprised no one has mentioned async. Your "ideal" request sounds like a job for eachLimit – Patrick Roberts Commented Feb 16, 2016 at 3:23
- @PatrickRoberts: I am surprised no-one read my link - the Alternatives at ES6 Promise Pool clearly lists async's queue(). – Amadan Commented Feb 16, 2016 at 3:26
- @Amadan you'll have to redefine "clearly" for me, then. That's like the equivalent of expecting me to read fine-print. – Patrick Roberts Commented Feb 16, 2016 at 3:28
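For reference, the async.eachLimit approach suggested in the comments might look roughly like this (a sketch only, reusing the question's github.fetchIssue, writeIssueToDisk and fetchIssueCallback helpers, which are assumed to exist):
var async = require('async');
// Run at most 5 fetches at a time; eachLimit starts the next one as each finishes.
async.eachLimit([1,2,3,4,5,6,7,8,9,10], 5, function(issueId, callback) {
    github.fetchIssue(repo.namespace, repo.id, issueId, filters)
        .then(function(response) {
            console.log('Writing: ' + issueId);
            writeIssueToDisk(fetchIssueCallback(response));
            callback();
        })
        .catch(callback); // report the error so eachLimit stops scheduling new work
}, function(err) {
    if (err) console.error('A fetch failed:', err);
});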
5 Answers
Personally, I'd use Bluebird's .map() with the concurrency option since I'm already using promises and Bluebird for anything async. But, if you want to see what a hand-coded counter scheme that restricts how many concurrent requests can run at once looks like, here's one:
function limitEach(collection, max, fn, done) {
    var cntr = 0,        // how many requests are currently in flight
        index = 0,       // index of the next item to start
        errFlag = false; // set once any request reports an error

    function runMore() {
        // start new requests until we hit the concurrency cap or run out of items
        while (!errFlag && cntr < max && index < collection.length) {
            ++cntr;
            fn(collection[index++], function(err, data) {
                --cntr;
                if (errFlag) return;   // an earlier request already failed; do nothing
                if (err) {
                    errFlag = true;
                    done(err);
                } else {
                    runMore();         // a slot freed up; start the next item
                }
            });
        }
        // everything has been started and every callback has returned
        if (!errFlag && cntr === 0 && index === collection.length) {
            done();
        }
    }
    runMore();
}
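Adapted to the question's code, a call might look like this (a sketch only; github.fetchIssue, writeIssueToDisk and fetchIssueCallback are the question's own helpers and are assumed to exist):
limitEach([1,2,3,4,5,6,7,8,9,10], 5, function(issueId, callback) {
    github.fetchIssue(repo.namespace, repo.id, issueId, filters)
        .then(function(response) {
            console.log('Writing: ' + issueId);
            writeIssueToDisk(fetchIssueCallback(response));
            callback(null);  // signal this item is done so another can start
        }, function(err) {
            callback(err);   // propagate the failure; limitEach stops scheduling
        });
}, function(err) {
    if (err) console.error('Stopped after error:', err);
    else console.log('All issues written');
});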
With Bluebird:
function fetch(id) {
console.log("Fetching " + id);
return Promise.delay(2000, id)
.then(function(id) {
console.log(" Fetched " + id);
});
}
var ids = [1,2,3,4,5,6,7,8,9,10];
Promise.map(ids, fetch, { concurrency: 3 });
<script src="https://cdnjs.cloudflare./ajax/libs/bluebird/3.3.1/bluebird.min.js"></script>
<!-- results pane console output; see http://meta.stackexchange./a/242491 -->
<script src="http://gh-canon.github.io/stack-snippet-console/console.min.js"></script>
I'd recommend using throat just for this: https://github.com/ForbesLindesay/throat
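A rough sketch of how that could look with the question's helpers (assumed to exist); throat(n) returns a wrapper that caps how many of the wrapped calls run at once:
var throat = require('throat');
var limit = throat(5); // at most 5 fetches in flight at any time

var ids = [1,2,3,4,5,6,7,8,9,10];
Promise.all(ids.map(function(issueId) {
    return limit(function() {
        // the wrapped function isn't invoked until a slot is free
        return github.fetchIssue(repo.namespace, repo.id, issueId, filters)
            .then(function(response) {
                console.log('Writing: ' + issueId);
                writeIssueToDisk(fetchIssueCallback(response));
            });
    });
}));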
Divide your data into as many arrays as you want concurrent connections. Schedule with setTimeout, and have the completion callback handle the rest of the sub-array.
Wrap the setTimeout in a function of its own so that the variable values are frozen to their values at the time of the delayed_fetch() invocation.
function delayed_fetch(delay, namespace, id, issueIds, filters) {
    if (issueIds.length === 0) return; // this sub-array is finished
    setTimeout(
        function() {
            var issueId = issueIds.shift();
            github.fetchIssue(namespace, id, issueId, filters).then(function(response) {
                console.log('Writing: ' + issueId);
                writeIssueToDisk(fetchIssueCallback(response));
                delayed_fetch(0, namespace, id, issueIds, filters); // immediately fetch the next id in this sub-array
            });
        }, delay);
}
var i=0;
_.each([ [1,2] , [3,4], [5,6], [7,8], [9,10] ], function(issueIds) {
var delay=++i*200; // millisecond
delayed_fetch(delay, repo.namespace, repo.id, issueIds, filters);
});
Using Bluebird
function getUsersFunc(response) {
    // get a collection of users
}
function getImagesFunc(id) {
    // get a collection of image profiles based on the user's id
}
function search(response) {
    return getUsersFunc(response).then(response => {
        const ids = response.map(item => item.id);
        const getImage = id => getImagesFunc(id).then(items => items.image);
        return Promise.map(ids, getImage, { concurrency: 5 });
    });
}
Previously I used the native ES6 Promise.all(), but it doesn't work like I was expecting, since Promise.all offers no way to limit concurrency. Then I went with the third-party library bluebird.js, and it works like a charm.