Make sure the code is as concise as possible
Don't rely on JavaScript for everything, and don't write repetitive scripts. Treat JavaScript as a finishing touch rather than a foundation: don't pile script onto your website by default, and only reach for it when it genuinely improves the user experience.
Minimize DOM access
Accessing DOM elements from JavaScript is easy and keeps the code readable, but it is slow. A few key points: limit the layout changes you make from JavaScript, and cache references to the elements you access (see the sketch below). If your site relies on extensive DOM manipulation, also consider trimming your markup; this is a good reason to switch to HTML5 and abandon the old XHTML and HTML4 doctypes. You can check how many DOM elements a page has by typing document.getElementsByTagName('*').length into Firebug's console.
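As a minimal sketch of the caching idea (the element id "list" and the item count are made up for illustration), this does one DOM lookup and one DOM write instead of a hundred:

var list = document.getElementById('list');    // look the element up once and cache the reference
var html = '';
for (var i = 0; i < 100; i++) {
    html += '<li>Item ' + i + '</li>';         // build the markup in a plain string
}
list.innerHTML = html;                         // touch the DOM a single time

// The slow variant would call document.getElementById('list') and append a node
// on every pass through the loop, hitting the DOM 100 times.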
Compress your code
The most effective way to serve compressed JavaScript is to first run your code through a JavaScript minifier, which shortens variable and parameter names, and then serve the resulting file with gzip compression.
Yes, I did not compress my main.js, but you should check whether any of your jQuery plugins ship uncompressed and don't forget to compress those too. Below I've listed a few compression options.
◆ YUI Compressor (used by the jQuery development team): beginner's guide (http://www.slideshare.net/nzakas/extreme-JavaScript-compression-with-yui-compressor), a second guide (http://vilimpoc.org/research/js-speedup/) and the official site (http://developer.yahoo.com/yui/compressor/).
◆ Dean Edwards Packer (http://dean.edwards.name/packer/)
◆ JSMin (http://crockford.com/JavaScript/jsmin)
GZip compression: the idea behind it is to shorten the time it takes to transfer data between the browser and the server. The browser advertises support with the request header Accept-Encoding: gzip,deflate, and the server responds with a compressed file. The downside is that compressing and decompressing costs processor time on both the server and the client, as well as some disk space.
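If your server side happens to be Node.js, the handshake can be sketched roughly like this (the file name main.js and the port are placeholders; in practice Apache or nginx usually handles this for you):

var http = require('http');
var fs = require('fs');
var zlib = require('zlib');

http.createServer(function (req, res) {
    var raw = fs.createReadStream('main.js');                  // the script to serve (example file)
    var accept = req.headers['accept-encoding'] || '';
    if (accept.indexOf('gzip') !== -1) {
        // The browser sent Accept-Encoding: gzip, so compress the response on the fly.
        res.writeHead(200, { 'Content-Type': 'application/javascript', 'Content-Encoding': 'gzip' });
        raw.pipe(zlib.createGzip()).pipe(res);
    } else {
        res.writeHead(200, { 'Content-Type': 'application/javascript' });
        raw.pipe(res);
    }
}).listen(8080);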
Avoid eval(): although eval() can occasionally save a little time, using it is definitely the wrong approach. It makes your code look dirty, and code passed to eval() escapes compression by most compression tools.
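Two of the most common uses of eval() have cleaner replacements; a quick sketch (the JSON string and property name are just examples, and older browsers may need a JSON polyfill such as json2.js):

var response = '{"name": "example", "count": 3}';

// Instead of:  var data = eval('(' + response + ')');
var data = JSON.parse(response);          // parse JSON without eval

// Instead of:  var value = eval('data.' + key);
var key = 'count';
var value = data[key];                    // dynamic property access via bracket notation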
Tools to speed up JavaScript loading: LAB.js
There are many great tools to speed up JavaScript loading. One tool worth mentioning is Lab.js.
With LAB.js (Loading And Blocking JavaScript) you can load JavaScript files in parallel, which speeds up the overall loading process. You can also specify an order for scripts that depend on each other, so those dependencies stay intact. On top of that, the developer claims roughly a 2x speed increase on the project's site.
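A rough sketch of the usual LAB.js pattern (the file names and the initPage() function are placeholders):

$LAB
    .script('jquery-1.4.2.min.js').wait()     // wait(): everything below runs only after jQuery has executed
    .script('jquery.plugin.js')               // these two download in parallel
    .script('site.js')
    .wait(function () {
        initPage();                           // all three scripts have executed at this point
    });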
Use an appropriate CDN
Many web pages now load their libraries from a content delivery network (CDN). It improves caching, because many sites reference the same URL and the file may already sit in the visitor's cache, and it saves you some bandwidth. You can ping the servers, or inspect them with Firebug, to figure out which one serves data fastest, and you should take the location of your visitors into account when choosing. Use public repositories whenever possible; a fallback sketch follows the list below.
Several CDN solutions for jQuery:
◆ http://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js - Google Ajax Libraries API; see http://code.google.com/apis/libraries/devguide.html#Libraries for more hosted libraries.
◆ http://ajax.microsoft.com/ajax/jquery/jquery-1.4.2.min.js - Microsoft's CDN
◆ http://code.jquery.com/jquery-1.4.2.min.js - Edgecast (mt)
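A common pattern is to load jQuery from the CDN and fall back to a local copy if the CDN is unreachable; a sketch (the local path is an assumption):

<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js"></script>
<script>
    // If the CDN failed, window.jQuery is undefined, so write in a local copy instead.
    window.jQuery || document.write('<script src="/js/jquery-1.4.2.min.js"><\/script>');
</script>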
Loading JavaScript at the end of the page
This is a very good practice if you care about users on slow connections who might otherwise leave before the page renders. Usability and users come first, JavaScript comes last. It may be painful, but you should also be prepared for the fact that some users disable JavaScript. Scripts that genuinely need to sit in the header can stay there, but only if they are loaded asynchronously.
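As a rough sketch of the markup (file names are placeholders):

<body>
    <!-- page content comes first, so it renders even on a slow connection -->
    ...

    <!-- non-critical scripts go last, just before the closing body tag -->
    <script src="jquery-1.4.2.min.js"></script>
    <script src="main.js"></script>
</body>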
Loading tracking code asynchronously
This is very important. Most of us use Google Analytics to get statistics, which is fine. Now take a look at where you put your tracking code. Is it in the header? Does it use document.write? If you are not using the asynchronous Google Analytics tracking code, you have only yourself to blame.
This is what the Google Analytics asynchronous tracking code looks like. It uses the DOM instead of document.write, which may suit you better, and it can record events that happen before the page has finished loading, which is important. Think about it: a visitor may close the page before it has even finished loading. Asynchronous tracking is the fix for those otherwise missing page views.
var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXXXX-XX']);
_gaq.push(['_trackPageview']);

(function() {
    var ga = document.createElement('script');
    ga.type = 'text/javascript';
    ga.async = true;
    ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
    var s = document.getElementsByTagName('script')[0];
    s.parentNode.insertBefore(ga, s);
})();
Not using Google Analytics? That’s not a problem, most of today’s analytics providers allow you to use asynchronous tracking.
Ajax Optimization
Ajax requests have a significant impact on the performance of your website. Below I introduce several key points about Ajax optimization.
Cache your ajax
Take a look at your code first: are your Ajax requests cacheable? It depends on the data, of course, but most of your Ajax requests should be cacheable. In jQuery, requests are cached by default, except for the 'script' and 'jsonp' data types.
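For example, jQuery turns caching off for 'script' and 'jsonp' requests by appending a cache-busting timestamp; you can opt back in when you know the response is safe to cache (the URL here is a placeholder):

$.ajax({
    url: '/js/widgets.js',
    dataType: 'script',
    cache: true,                      // without this, jQuery appends a "_=<timestamp>" parameter
    success: function () {
        // the script has been fetched (possibly from the browser cache) and executed
    }
});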
Use GET for Ajax requests
A POST request sends two TCP packets (the headers first, then the data), while a GET request sends only one (though this can depend on how many cookies you have). So when your URL is under 2K long and you only want to fetch some data, you might as well use GET.
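A sketch of a small read-only request that fits comfortably in a GET (the URL, parameters and the #results element are made up):

$.ajax({
    type: 'GET',
    url: '/search',
    data: { q: 'jquery', page: 1 },   // serialized into the query string: /search?q=jquery&page=1
    success: function (result) {
        $('#results').html(result);
    }
});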
Using YSlow
When it comes to performance, YSlow is simple and extremely effective. It grades your website, showing which areas need fixing and which deserve your attention.
Another trick: package your JavaScript into a PNG file
Imagine packing your JavaScript and CSS into an image, then reading the data back out on the client, so a single HTTP request fetches everything the application needs.
I recently found this method. It packs your JavaScript/CSS data into a PNG file; on the client you unpack it with the canvas API's getImageData(). It is very efficient: the PNG's lossless compression can squeeze out roughly 35% more than the raw data, without losing anything. I should point out, though, that for larger scripts there is a noticeable loading pause while the image is drawn to the canvas and the pixels are read back.
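A rough sketch of the unpacking step, assuming the packer stored one character code per red-channel byte ('packed.png' and that encoding are assumptions; the details depend on the packing tool you use):

var img = new Image();
img.onload = function () {
    var canvas = document.createElement('canvas');
    canvas.width = img.width;
    canvas.height = img.height;
    var ctx = canvas.getContext('2d');
    ctx.drawImage(img, 0, 0);

    var pixels = ctx.getImageData(0, 0, img.width, img.height).data;   // flat RGBA byte array
    var source = '';
    for (var i = 0; i < pixels.length; i += 4) {                       // step pixel by pixel, read the red channel
        if (pixels[i] > 0) {
            source += String.fromCharCode(pixels[i]);
        }
    }
    (new Function(source))();    // execute the recovered script: the slow step for large files
};
img.src = 'packed.png';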