I have a list of data that I am sending to Google Cloud. My current code looks like this:

const teams = ['LFC', 'MUFC', 'CFC'];

teams.forEach(team => {
  fetch({
    url: URL,
    method: 'PUT',
    body: team
  });
});

This works with one team, but it times out when sending multiple, larger files. I am sending images over, not strings. To solve this I need to send the data one file at a time, and wait for the previous request to complete before sending the subsequent one. Can anyone advise the best way of doing this?

Worth noting that I don't have any control over the number of files that are uploaded.
If you have access to npm, you could install bluebird and use Promise.reduce; Promise.reduce executes an array of promises one by one as they resolve and allows you to "reduce" the previous results into one final result. – gabriel.hayes
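For reference, a minimal sketch of that suggestion, assuming bluebird is installed as a dependency and keeping the fetch call in the same shape as the question's code:

const BPromise = require('bluebird');
const teams = ['LFC', 'MUFC', 'CFC'];

// bluebird's Promise.reduce waits for the promise returned by the reducer
// before moving on to the next element, so the uploads run one at a time.
BPromise.reduce(teams, (acc, team) => {
  return fetch({
    url: URL,
    method: 'PUT',
    body: team
  });
}, null)
  .then(() => console.log("All uploads finished"))
  .catch(err => console.error("An upload failed:", err));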
3 Answers

Use a reduce instead of forEach, with .then().
The following stores the promise of the last fetch in acc (the accumulator parameter of reduce) and chains the new fetch inside a then callback, ensuring that the previous fetch has finished:
const teams = ['LFC', 'MUFC', 'CFC'];

teams.reduce((acc, team) => {
  return acc.then(() => {
    return fetch({
      url: URL,
      method: 'PUT',
      body: team
    });
  });
}, Promise.resolve())
  .then(() => console.log("Everything's finished"))
  .catch(err => console.error("Something failed:", err));
You can even write a general helper function for it:
const teams = ['LFC', 'MUFC', 'CFC'];

const promiseSeries = (arr, cb) => arr.reduce((acc, elem) => acc.then(() => cb(elem)), Promise.resolve());

promiseSeries(teams, (team) => {
  return fetch({
    url: URL,
    method: 'PUT',
    body: team
  });
})
  .then(() => console.log("Everything's finished"))
  .catch(err => console.error("Something failed:", err));
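If you also need the responses (for example, to read back the uploaded file URLs), the same helper can collect them; a rough sketch, where the results array is an assumption and not part of the original answer:

const results = [];

promiseSeries(teams, (team) => {
  return fetch({
    url: URL,
    method: 'PUT',
    body: team
  }).then(response => {
    // Store each response in upload order as the requests complete.
    results.push(response);
  });
})
  .then(() => console.log("Uploaded", results.length, "items"))
  .catch(err => console.error("Something failed:", err));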
Or, even better, if you can, use async/await (it's ES2017); it's more readable:
const teams = ['LFC', 'MUFC', 'CFC'];

async function upload(teams) {
  for (const team of teams) {
    await fetch({
      url: URL,
      method: 'PUT',
      body: team
    });
  }
}

upload(teams)
  .then(() => console.log("Everything's finished"))
  .catch(err => console.error("Something failed:", err));
You can use async/await with a for...of loop. Each call will "hold" the loop until it's done, and then the loop will continue with the next call:
const teams = ['LFC', 'MUFC', 'CFC'];

async function send(teams) {
  for (const team of teams) {
    await fetch({
      url: URL,
      method: 'PUT',
      body: team
    });
  }
}
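Since the question is about uploading files rather than strings, here is a sketch of the same loop using the standard browser fetch signature (URL first, options object second) with File or Blob bodies; UPLOAD_URL and the files array are placeholders, not names from the original code:

// files is assumed to be an array of File or Blob objects,
// e.g. taken from an <input type="file"> element.
async function uploadFiles(files) {
  for (const file of files) {
    const response = await fetch(UPLOAD_URL, {
      method: 'PUT',
      body: file
    });
    // Stop on the first failed upload instead of silently continuing.
    if (!response.ok) {
      throw new Error(`Upload failed with status ${response.status}`);
    }
  }
}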
You can make use of async/await, as follows:

const teams = ['LFC', 'MUFC', 'CFC'];

teams.forEach(async (team) => {
  await fetch({
    url: URL,
    method: 'PUT',
    body: team
  });
});

Note that forEach does not wait for the async callback between iterations, so this still starts all the requests at roughly the same time rather than strictly one after another.
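If starting the uploads in parallel is acceptable and you only need to know when they have all finished, a common alternative is map plus Promise.all; a minimal sketch (not from the original answer), keeping the fetch call in the question's shape:

const teams = ['LFC', 'MUFC', 'CFC'];

// All requests start at once; Promise.all resolves when every upload has completed.
Promise.all(teams.map(team => fetch({
  url: URL,
  method: 'PUT',
  body: team
})))
  .then(() => console.log("Everything's finished"))
  .catch(err => console.error("Something failed:", err));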