I am using Papa Parse to parse a big CSV file in chunk mode.
I am validating the CSV data, and I want to stop streaming when validation fails.
But I am unable to stop streaming after some parsing.
I tried to stop by returning false from the chunk callback, but it's not working.
Below is the code.
$("#fileselect").on("change", function(e){
if (this.files.length) {
var file = this.files[0]
count = 0;
Papa.parse(file, {
worker: true,
delimiter: "~~",
skipEmptyLines:true,
chunk: function (result) {
count += result.data.length;
console.clear();
console.log(count);
if (count>60000) {
return false;
}
},
plete: function (result, file) {
console.log(result)
}
});
}
});
2 Answers
Both the chunk and step callbacks have access to the parser too; you can use it to pause, resume, or (as you want here) abort.
step: function (results, parser) {
    console.log("Row data:", results.data);
    console.log("Row errors:", results.errors);
}
So in your case, you would need to do something like this (untested):
$("#fileselect").on("change", function(e){
if (this.files.length) {
var file = this.files[0]
count = 0;
Papa.parse(file, {
worker: true,
delimiter: "~~",
skipEmptyLines:true,
chunk: function (result, parser) {
count += result.data.length;
console.clear();
console.log(count);
if (count>60000) {
//return false;
parser.abort(); // <-- stop streaming
}
},
plete: function (result, file) {
console.log(result)
}
});
}
});
Take a look at the documentation for step and chunk.
https://www.papaparse.com/docs
Hope this helped!
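If you need to pause rather than abort, for example while an asynchronous validation runs, the same parser handle also exposes pause() and resume(). A minimal sketch (not from the original answer, parsing on the main thread; validateRows is a hypothetical helper of your own that resolves to true/false):

Papa.parse(file, {
    delimiter: "~~",
    skipEmptyLines: true,
    chunk: function (result, parser) {
        parser.pause(); // stop reading further chunks while we validate
        validateRows(result.data).then(function (ok) {
            if (ok) {
                parser.resume(); // validation passed, keep streaming
            } else {
                parser.abort();  // validation failed, stop streaming entirely
            }
        });
    },
    complete: function () {
        console.log("parsing finished or aborted");
    }
});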
In my case, I just needed the first 10 rows of data from a file. If anyone needs a solution for this, here's an example of how I got it to work:
To stop streaming after a certain number of rows, simply pass the 'preview' option in the config.
let fileInput = document.getElementById('myFile');
let file = fileInput.files[0];
let parsedData; // variable to store the chunked results

Papa.parse(file, {
    worker: true,
    preview: 10, // this is what does the trick
    chunk: function (results) {
        parsedData = results; // set results to the parsedData variable.
        // I'm doing this because "When streaming, parse results are not
        // available in the 'complete' callback."
    },
    complete: function () {
        console.log(parsedData); // log the results once parsing is completed
        /**
         * Do whatever else you want with parsedData here.
         * In my case, I just created an html table to show a preview of the data.
         */
    }
});
With this, you should be able to parse very large files without crashing the browser. I tested with a .csv file that has over 1 million rows and had no issues.
See the docs: https://www.papaparse.com/docs#config-details
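Note that the chunk callback runs once per chunk, so if your preview ever spans more than one chunk, the assignment above only keeps the last one. A small sketch (my own variation, not from the answer) that accumulates rows across chunks instead:

let allRows = []; // collect rows from every chunk

Papa.parse(file, {
    worker: true,
    preview: 10, // example value; adjust to how many rows you need
    chunk: function (results) {
        allRows = allRows.concat(results.data); // append this chunk's rows
    },
    complete: function () {
        console.log(allRows); // everything collected during streaming
    }
});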