This is a short article with some tips on working with JavaScript arrays. We will look at several methods for combining/merging two JS arrays, and discuss the advantages and disadvantages of each.
Let us first consider the following situation:
var a = [ 1, 2, 3, 4, 5, 6, 7, 8, 9 ];
var b = [ "foo", "bar", "baz", "bam", "bun", "fun" ];
Obviously the simplest combination result should be:
[ 1, 2, 3, 4, 5, 6, 7, 8, 9, "foo", "bar", "baz", "bam", "bun", "fun" ]
This is the most common approach:
var c = a.concat( b );

a; // [1,2,3,4,5,6,7,8,9]
b; // ["foo","bar","baz","bam","bun","fun"]
c; // [1,2,3,4,5,6,7,8,9,"foo","bar","baz","bam","bun","fun"]
As you can see, c is a brand-new array representing the combination of the two arrays, leaving a and b unchanged. Simple, right?
But what if a has 10,000 elements and b has another 10,000? c will then have 20,000 elements, roughly doubling memory usage, since a and b are still around.
“No problem!”, you say. Just set a and b to null and let them be garbage collected. Problem solved!
a = b = null; // `a` and `b` can now be garbage collected
Haha. For small arrays with only a few elements, this is fine. But for large arrays, or in memory-limited systems where this process must be repeated frequently, there is plenty of room for improvement.
Okay, let’s instead copy the contents of one array into the other, using Array#push(..):
// `b` onto `a`:
for (var i = 0; i < b.length; i++) {
    a.push( b[i] );
}
a; // [1,2,3,4,5,6,7,8,9,"foo","bar","baz","bam","bun","fun"]
b = null;
Now array a also contains the contents of array b, which seems to make better use of memory.
But what if array a is the smaller one? For memory and speed reasons, you may want to put the small a in front of b instead. No problem: just replace push(..) with unshift(..) and loop in reverse order:
// `a` into `b`:
for (var i = a.length - 1; i >= 0; i--) {
    b.unshift( a[i] );
}
b; // [1,2,3,4,5,6,7,8,9,"foo","bar","baz","bam","bun","fun"]
However, the for loop is indeed ugly and difficult to maintain. Can we do better?
This is our first attempt, using Array#reduce:
// `b` onto `a`:
a = b.reduce( function(coll,item){
    coll.push( item );
    return coll;
}, a );
a; // [1,2,3,4,5,6,7,8,9,"foo","bar","baz","bam","bun","fun"]

// or `a` into `b`:
b = a.reduceRight( function(coll,item){
    coll.unshift( item );
    return coll;
}, b );
b; // [1,2,3,4,5,6,7,8,9,"foo","bar","baz","bam","bun","fun"]
Array#reduce(..) and Array#reduceRight(..) are nice, but a little clumsy. ES6 arrow functions will shrink the code somewhat, but each element still requires a function call, which is not ideal.
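For reference, here is a sketch of what the reduce version looks like with an ES6 arrow function; the logic is identical, just terser (the small sample arrays here are for illustration only):

```javascript
var a = [ 1, 2, 3 ];
var b = [ "foo", "bar" ];

// `b` onto `a`: push each item into the accumulator, then return it
// (the comma operator keeps this a one-liner)
a = b.reduce( (coll, item) => (coll.push( item ), coll), a );

console.log( a ); // [ 1, 2, 3, 'foo', 'bar' ]
```

Note that even in this compact form, the callback still runs once per element.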
How about this one:
// `b` onto `a`:
a.push.apply( a, b );
a; // [1,2,3,4,5,6,7,8,9,"foo","bar","baz","bam","bun","fun"]

// or `a` into `b`:
b.unshift.apply( b, a );
b; // [1,2,3,4,5,6,7,8,9,"foo","bar","baz","bam","bun","fun"]
Much better, right? Especially since the unshift(..) version no longer needs the reverse-order loop from before. ES6's spread operator is even nicer: a.push( ...b ) or b.unshift( ...a ).
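A quick sketch of the spread-operator form, using small sample arrays for illustration:

```javascript
var a = [ 1, 2, 3 ];
var b = [ "foo", "bar" ];

// `b` onto `a` -- spread expands `b` into individual arguments to push(..)
a.push( ...b );
console.log( a ); // [ 1, 2, 3, 'foo', 'bar' ]

// `a` into `b` (fresh arrays so the example stands alone)
var c = [ 1, 2, 3 ];
var d = [ "foo", "bar" ];
d.unshift( ...c );
console.log( d ); // [ 1, 2, 3, 'foo', 'bar' ]
```

Like apply(..), spread passes every element as a separate argument, so it shares the length limitation discussed next.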
Maximum array length limit
The first major problem is that memory usage doubles again (only temporarily, of course!), because the appended contents are effectively copied onto the call stack as function arguments. In addition, different JS engines place different limits on how much data can be copied this way.
So, if the array has a million elements, you will almost certainly exceed the call-stack size allowed for push(..) or unshift(..). It will work fine with a few thousand elements, but you have to be careful not to exceed a reasonable length.
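One way around the limit is to apply(..) the merge in fixed-size chunks. The helper name combineInto(..) and the 5000-element chunk size below are illustrative choices, not part of any standard API:

```javascript
// Merge `from` into `into` in chunks, so no single apply(..) call
// receives more arguments than the engine's stack can handle.
function combineInto(from, into) {
    var CHUNK = 5000; // an arbitrary "safe" size, not an engine constant
    for (var i = 0; i < from.length; i += CHUNK) {
        // copy at most CHUNK elements per call
        into.push.apply( into, from.slice( i, i + CHUNK ) );
    }
    return into;
}

// a large source array that would be risky to apply(..) in one shot
var big = [];
for (var i = 0; i < 200000; i++) {
    big.push( i );
}

var target = [ "start" ];
combineInto( big, target );
console.log( target.length ); // 200001
```

The trade-off the article alludes to is clear here: the one-liner is gone, and readability suffers.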
Note: You can try splice(..), which, like push(..) and unshift(..), is subject to this same maximum length limit. There are ways to work around the limit, but each one takes our readability another step backwards. The more you tweak it, the worse it gets, haha.
Those are several methods of merging JavaScript arrays, each with its own trade-offs in memory usage, speed, and readability.