How to optimally render large amounts of DOM elements using JavaScript

On a web page I have a quite large list of items (say, product cards, each containing an image and text) - about 1000 of them. I want to filter this list on the client (only the items that are not filtered away should be shown), but there is a rendering performance problem. When I apply a very narrow filter so that only 10-20 items remain and then cancel it (so all items have to be shown again), the browser (Chrome on a very nice machine) hangs for a second or two.

I re-render the list using the following routine:

for (var i = 0, l = this.entries.length; i < l; i++) {
    $(this.cls_prefix + this.entries[i].id).css("display", this.entries[i].id in dict ? "block" : "none");
}

dict is a hash of the allowed items' ids.

This function itself runs instantly; it's the rendering that hangs. Is there a more optimal re-rendering method than changing the "display" property of DOM elements?

Thanks for your answers in advance.


asked Apr 14, 2012 at 5:45 by Alex Zaretsky
  • 1 You're surprised that re-rendering 1000 elements takes 1-2 seconds? Since I doubt all 1000 elements are visible at any moment, perhaps you should handle the visible items first, and then work in the background to make the rest available (doing 50 per pass, with setTimeout() between each batch to keep the browser alive). Or perhaps you should only re-render items when they would actually be visible due to scrolling. It also isn't helping you to run 1000 separate selector operations, each of which has to search the entire DOM. – jfriend00 Commented Apr 14, 2012 at 5:49
  • Give us a jsFiddle to work on and I'm sure we could improve the switchover performance by a factor 10x. There's a lot of juicy fat in that code. – jfriend00 Commented Apr 14, 2012 at 5:52

3 Answers

Why load 1000 items at once? First, you should consider something like pagination, showing around 30 items per page. That way you are not rendering that much at once.

Then, if you really must loop over a lot of items, consider using timeouts. I once had a demo that illustrates the consequences of looping: it blocks the UI and causes the browser to lag, especially on long loops. But when using timers, you delay each iteration, allowing the browser to breathe once in a while and do something else before the next iteration starts.
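A minimal sketch of that timer-based batching idea (the names `chunk`, `processInBatches`, and the batch size of 50 are illustrative, not from the original answer):

```javascript
// Split a large array into batches of a given size.
function chunk(items, size) {
  var batches = [];
  for (var i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}

// Process the batches one at a time, yielding to the browser between
// batches via setTimeout so it can repaint and handle user input.
function processInBatches(items, size, fn, done) {
  var batches = chunk(items, size);
  var index = 0;
  (function next() {
    if (index >= batches.length) {
      if (done) done();
      return;
    }
    fn(batches[index++]); // e.g. toggle "display" for ~50 items here
    setTimeout(next, 0);  // let the browser breathe before the next batch
  })();
}
```

For the question's case, `fn` would set the "display" style for each entry in the batch instead of touching all 1000 in one synchronous pass.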

Another thing to note is that you should avoid repaints and reflows, which means avoiding moving elements around and changing styles that often when it's not necessary. Also, another tip is to remove from the DOM the nodes that are not actually visible. If you don't need to display something, remove it - why waste memory on something that isn't actually seen?

You can use the setTimeout trick to move the per-item work out of one long synchronous loop and avoid freezing the client. I suspect the total processing - from start to finish - would last about the same amount of time, but at least this way the interface can still be used, and the result is a better user experience:

var self = this; // setTimeout callbacks won't see the original `this`
for (var i = 0, l = this.entries.length; i < l; i++) {
  // Capture the current entry: with a plain var loop, every callback
  // would otherwise see the final value of i once the timers fire.
  (function (entry) {
    setTimeout(function () {
      $(self.cls_prefix + entry.id).css("display", entry.id in dict ? "block" : "none");
    }, 0);
  })(this.entries[i]);
}

Dude - the best way to handle "large amounts of DOM elements" is NOT to do it on the client, and/or NOT to use JavaScript if you can avoid it.

If there's no better solution (and I'm sure there probably is!), then at LEAST partition your working set down to what you actually need to display at that moment (instead of the whole, big, honkin' enchilada!)
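A sketch of that partitioning idea, combined with the question's filter (`visibleEntries`, `page`, and `pageSize` are illustrative names; the answers don't spell out an implementation): filter first, then render only one page's worth of entries, leaving everything else out of the DOM.

```javascript
// Return only the entries that pass the filter AND belong on the
// given 1-based page; everything else never reaches the DOM.
function visibleEntries(entries, dict, page, pageSize) {
  var allowed = entries.filter(function (e) { return e.id in dict; });
  var start = (page - 1) * pageSize;
  return allowed.slice(start, start + pageSize);
}
```

With this approach, cancelling the filter only re-renders one page (e.g. 30 cards) instead of all 1000.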
