What if I had a compilation step for my website that turned all external scripts and styles into a single HTML file with embedded <script> and <style> tags? Would this improve page load times due to not having to send extra GETs for the external files? If so, why isn't this done more often?
- 4 This is kind of what Browserify and webpack do: they turn all the external JS files into a single bundle to reduce the number of GET requests sent. This is pretty much the norm in web app development, where you have multiple JS, HTML and CSS files for modularity and a build step to combine it all together. – derp Commented Jun 22, 2016 at 4:27
- 2 Fewer requests don't always mean faster, because web browsers send requests in parallel. Also, split files make caching possible. – Chef Commented Jun 22, 2016 at 6:45
2 Answers
Impossible to say in general, because it is very situational.
- If you're pulling resources from many different servers, these requests can slow your page loading down (especially with some bad DNS on the visiting side).
- Requesting many different files may also slow down page load even if they're from the same origin/server.
- Keep in mind not everyone has gigabit internet (or even megabit-level speeds). So putting everything directly into your HTML file (inlining or using data URIs) will definitely reduce network overhead at first (fewer requests, fewer headers, etc.).
- In addition (and making the previous point even worse), this will also break many other features often used to reduce page loading times. For example, resources can't be cached - neither locally nor on some proxy - and are always transferred. This can be costly for both the visitor and the hosting party.
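The "fewer requests, fewer headers" trade-off in the list above can be made concrete with rough numbers. The ~500 bytes of per-request overhead is an assumed figure for illustration, not a measurement:

```javascript
// Back-of-envelope: total bytes transferred for N separate files vs. one
// inlined page. HEADER_OVERHEAD is an assumed ~500 bytes of request plus
// response headers per GET (hypothetical, varies widely in practice).
const HEADER_OVERHEAD = 500;

function transferBytes(fileSizes, inlined) {
  const payload = fileSizes.reduce((a, b) => a + b, 0);
  const requests = inlined ? 1 : fileSizes.length;
  return payload + requests * HEADER_OVERHEAD;
}

// Ten 2 KB files: inlining saves nine requests' worth of header overhead.
const separate = transferBytes(Array(10).fill(2048), false); // 20480 + 10 * 500
const inline   = transferBytes(Array(10).fill(2048), true);  // 20480 + 1 * 500
```

Note this only models the first, uncached visit; on repeat visits the separate files may transfer nothing at all thanks to caching, which is exactly the trade-off the bullet points describe.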
So often the best approach is to take the middle ground, if loading times are an issue for you:
If you're using third-party scripts, e.g. jQuery, grab them from a publicly hosted CDN that's used by other pages as well. If you're lucky, your visitor's browser will have a cached copy and won't make the request.
Your own scripts should be condensed and potentially minified into a single script (with tools such as Browserify, webpack, etc.). This must not include frequently changing parts, as those would force you to transfer even more data more often.
If you've got any scripts or resources that are really only part of your current visitor's experience (like logged-in status, colors picked in user preferences, etc.), it's okay to put these directly into the parent HTML file, if that file is customized anyway and delivering them as separate files wouldn't work or would cause more overhead. A perfect example of this would be CSRF tokens. Don't do this if you're able to deliver a static HTML file that's filled/updated by JavaScript, though.
Yes, it will improve page load time, but this method still isn't used often, for these reasons:
- Debugging becomes difficult.
- If we want to update the page later, that won't be easy either.
- Separate .css and .js files avoid these issues.
And for faster page loads, you can use a BUILD SYSTEM like GRUNT, GULP, BRUNCH etc. for better performance.
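A build system like the ones named above typically automates exactly the bundle-and-minify step from the first answer. As a sketch, a minimal gulpfile might look like this, assuming the `gulp-concat` and `gulp-uglify` plugins are installed and a hypothetical `src/js/` layout:

```javascript
// gulpfile.js - minimal Gulp build: concatenate all scripts into one file
// and minify it, so the page makes a single GET for its JavaScript.
const gulp = require('gulp');
const concat = require('gulp-concat');
const uglify = require('gulp-uglify');

function scripts() {
  return gulp.src('src/js/*.js')      // hypothetical source layout
    .pipe(concat('bundle.min.js'))    // join into one file
    .pipe(uglify())                   // minify the result
    .pipe(gulp.dest('dist'));
}

exports.default = scripts;
```

Running `gulp` then rebuilds `dist/bundle.min.js` whenever you change a source file, keeping the modular files for development and the single bundle for delivery.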
javascript - Is it faster to load if all webpage resources are compiled into a single HTML file? - Stack Overflow