Removing duplicate values from a JavaScript array is a common task, especially when working with data that contains redundant entries. There are several ways to do it, and the best choice depends on the nature of your data and on performance considerations.
The most concise method leverages the built-in Set constructor and the spread syntax: a Set stores only unique values, so spreading it back into an array yields the original values with duplicates removed, in first-seen order:
const uniq = [...new Set(array)];
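For example, with a small sample array:

const array = [1, 1, 2, 'a', 'a', 3];
const uniq = [...new Set(array)];
// uniq is [1, 2, 'a', 3]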
The next approach uses the filter() method together with indexOf(): for each element it checks whether the index of that element's first occurrence matches its current position. If the positions differ, the element is a later duplicate and is filtered out. Because indexOf() scans the array for every element, this runs in O(n²) time:
const uniqueArray = a.filter((item, pos) => a.indexOf(item) === pos);
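A quick illustration with sample data:

const a = ['apple', 'pear', 'apple', 'plum'];
const uniqueArray = a.filter((item, pos) => a.indexOf(item) === pos);
// uniqueArray is ['apple', 'pear', 'plum']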
Hash tables provide an efficient way to detect duplicates by using a plain object as a lookup map: each element is recorded as a key, and membership can then be checked in roughly constant time. The trade-off is that object keys are always strings, so values such as the number 1 and the string '1' collide, and every object ends up under the same '[object Object]' key:
function uniq(a) {
  const seen = {};
  return a.filter((item) => {
    // The key is coerced to a string; the assignment returns true, so the first occurrence is kept
    return seen.hasOwnProperty(item) ? false : (seen[item] = true);
  });
}
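As a quick illustration, including the string-coercion caveat:

uniq([1, '1', 2, 2, 'a']);
// [1, 2, 'a'] (the string '1' is dropped because it shares the key '1' with the number 1)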
To combine the benefits of the previous two approaches, this solution uses hash-table lookups for primitive values and a linear indexOf() search for everything else, so objects are compared by reference instead of being collapsed into a single string key:
function uniq(a) {
  const prims = { boolean: {}, number: {}, string: {} };
  const objs = [];
  return a.filter((item) => {
    const type = typeof item;
    if (type in prims) {
      // Primitives: constant-time lookup in the table for their type
      return prims[type].hasOwnProperty(item) ? false : (prims[type][item] = true);
    }
    // Objects and other types: linear search by reference (push() returns a truthy length)
    return objs.indexOf(item) >= 0 ? false : objs.push(item);
  });
}
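With this version, primitives of different types no longer collide, and objects are compared by reference:

const o = { a: 1 };
uniq([1, '1', o, o, { a: 1 }]);
// [1, '1', o, { a: 1 }]: 1 and '1' stay distinct; only the repeated reference to o is removed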
Sorting the array before removing duplicates also simplifies the process: once equal values are adjacent, each element only has to be compared with its immediate predecessor. Keep in mind that the result comes back sorted, and that the default comparator compares numbers as strings, so pass a numeric comparator such as (x, y) => x - y when deduplicating numbers:
function uniq(a) {
  // Copy the array first so that sort() does not mutate the caller's array
  return [...a].sort().filter((item, pos, ary) => !pos || item != ary[pos - 1]);
}
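For instance, with the default comparator:

uniq(['b', 'a', 'c', 'b']); // ['a', 'b', 'c']
uniq([10, 1, 2, 10]);       // [1, 10, 2], because '10' sorts before '2' when compared as strings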
When you need to remove duplicates based on a specific criterion, such as a property of each object, you can write a small uniqBy() helper that accepts a key-extraction callback:
function uniqBy(a, key) {
  const seen = {};
  return a.filter((item) => {
    const k = key(item);
    return seen.hasOwnProperty(k) ? false : (seen[k] = true);
  });
}
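For example, deduplicating objects by an id property (the sample data here is purely illustrative):

const users = [
  { id: 1, name: 'Ann' },
  { id: 1, name: 'Anna' },
  { id: 2, name: 'Bob' },
];
uniqBy(users, (u) => u.id);
// [{ id: 1, name: 'Ann' }, { id: 2, name: 'Bob' }]: the first entry per id wins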
If you want to control whether the first or the last occurrence of each duplicate is kept, you can use the Set and Map data structures: a Set of already-seen keys keeps the first occurrence, while a Map keyed by the extracted key keeps the last one, because later entries overwrite earlier ones:
function uniqByKeepFirst(a, key) {
  const seen = new Set();
  return a.filter((item) => {
    const k = key(item);
    // Set.prototype.add() returns the Set itself (truthy), so the first occurrence is kept
    return seen.has(k) ? false : seen.add(k);
  });
}

function uniqByKeepLast(a, key) {
  // Entries added later overwrite earlier ones under the same key, so the last occurrence wins
  return [...new Map(a.map((x) => [key(x), x])).values()];
}
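A brief comparison of the two, again with illustrative data:

const rows = [
  { id: 1, value: 'old' },
  { id: 2, value: 'x' },
  { id: 1, value: 'new' },
];
uniqByKeepFirst(rows, (r) => r.id); // [{ id: 1, value: 'old' }, { id: 2, value: 'x' }]
uniqByKeepLast(rows, (r) => r.id);  // [{ id: 1, value: 'new' }, { id: 2, value: 'x' }]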