(Full Explanation, Multiple Solutions, Best Practices, Time Complexity & Real Examples)
📝 Introduction
One of the most frequently asked JavaScript interview coding challenges is:
“Given an array, remove all duplicates and return the array of unique values.”
It sounds simple — and it can be.
But interviewers love this question because it reveals your understanding of:
- Core JavaScript data structures
- ES6 features like Set and spread syntax
- Array iteration methods
- Algorithms and time complexity
- Clean and maintainable coding style
A candidate who can explain WHY a solution is correct — and how it compares to alternatives — instantly stands out in an interview.
This article presents every common solution, from beginner to advanced, with detailed explanations, performance considerations, edge cases, and real-world applications.
✅ What Does “Removing Duplicates From an Array” Mean?
Given an input array like:

```javascript
[1, 2, 3, 3, 2, 5, 1]
```

We want:

```javascript
[1, 2, 3, 5]
```
Key requirements:
- Keep only one instance of each value.
- Maintain order of appearance (in most interview tasks).
- Handle different types of values:
numbers, strings, objects, booleans, mixed arrays.
✅ Why Interviewers Ask This Question
This challenge tests your mastery of:
✅ Understanding array behaviour
✅ Using JavaScript Set
✅ Using filter(), reduce(), forEach()
✅ Handling edge cases
✅ Knowing time complexity
✅ Writing clean, readable, modern JS code
✅ Communicating your reasoning
A skilled developer:
- Discusses multiple approaches
- Explains complexity differences
- Shows awareness of ES6 features
- Knows pitfalls and edge cases
This article will help you master all of that.
🧩 Approach 1 — Removing Duplicates Using a Set (Most Common Modern Solution)
✅ How it works
A JavaScript Set stores unique values only.
You can convert an array into a set → remove duplicates → convert back to array.
✅ Code Example
```javascript
function removeDuplicates(arr) {
  return [...new Set(arr)];
}

console.log(removeDuplicates([1, 2, 2, 3, 3, 3, 4]));
// Output: [1, 2, 3, 4]
```
✅ Why it’s great
- Extremely simple
- Very fast
- Uses modern ES6 syntax
- Clean and easy to discuss in interviews
✅ Time complexity
- O(n) — each element inserted into Set once
- Best option for 90% of use cases
🧩 Approach 2 — Using filter() + indexOf() (Classic Beginner Method)
✅ Code Example
```javascript
function removeDuplicates(arr) {
  return arr.filter((value, index) => arr.indexOf(value) === index);
}

console.log(removeDuplicates(["a", "b", "a", "c", "b"]));
// Output: ["a", "b", "c"]
```
✅ Explanation
- filter() keeps only the first occurrence of each value.
- arr.indexOf(value) returns the index of the value's first appearance.
- If the current index matches that first index → keep the value.
- Otherwise → skip it (it's a duplicate).
⚠️ Drawbacks
- indexOf() is O(n)
- Combined with filter() → O(n²) worst case
- Not optimal for large arrays
However, it’s a good demonstration of understanding array traversal logic.
🧩 Approach 3 — Using an Object or Map to Track Seen Values
A more algorithmic approach that impresses interviewers.
✅ Code Example (Object version)
```javascript
function removeDuplicates(arr) {
  // Object.create(null) gives a prototype-free object, so inherited
  // keys like "toString" cannot shadow real entries
  const seen = Object.create(null);
  const result = [];

  for (const item of arr) {
    if (!seen[item]) {
      seen[item] = true;
      result.push(item);
    }
  }

  return result;
}

console.log(removeDuplicates([4, 4, 5, 6, 6, 7]));
// Output: [4, 5, 6, 7]
```
✅ Why it’s useful
- Works in older browsers
- Avoids nested loops
- Efficient
✅ Time complexity
- O(n)
⚠️ Limitation
Object keys become strings → does not handle objects or arrays well.
🧩 Approach 4 — Using reduce() (Functional Programming Style)
✅ Code Example
```javascript
const removeDuplicates = (arr) =>
  arr.reduce((unique, item) => {
    return unique.includes(item) ? unique : [...unique, item];
  }, []);

console.log(removeDuplicates([1, 1, 2, 2, 3, 4, 4]));
// Output: [1, 2, 3, 4]
```
✅ Advantages
- Elegant
- Great for demonstrating functional programming knowledge
- Clear transformation pipeline
⚠️ Disadvantage
includes() inside reduce = O(n) → total O(n²) complexity
🧩 Approach 5 — Using a Map for Better Key Handling
Unlike objects, Maps keep non-string keys properly.
✅ Code Example
```javascript
function removeDuplicates(arr) {
  const map = new Map();
  const result = [];

  for (const item of arr) {
    if (!map.has(item)) {
      map.set(item, true);
      result.push(item);
    }
  }

  return result;
}

const obj = { a: 1 };
console.log(removeDuplicates([obj, obj, { a: 1 }]));
// Output: [{ a: 1 }, { a: 1 }]
// The repeated reference is deduplicated; the structurally equal but
// distinct object is kept, because Map compares keys by reference.
```
✅ Why use Map?
- Supports object keys properly
- Preserves insertion order
- Efficient lookups
✅ Time complexity
- O(n)
🧩 Approach 6 — Removing Duplicates From an Array of Objects
This is often asked in more advanced interviews.
✅ Example Problem
Given this input:

```javascript
[
  { id: 1, name: "A" },
  { id: 2, name: "B" },
  { id: 1, name: "A" }
]
```

we want this output:

```javascript
[
  { id: 1, name: "A" },
  { id: 2, name: "B" }
]
```
✅ Code Example (Unique by key)
```javascript
function uniqueByKey(arr, key) {
  const seen = new Set();
  return arr.filter(item => {
    const val = item[key];
    if (seen.has(val)) return false;
    seen.add(val);
    return true;
  });
}

console.log(uniqueByKey([{ id: 1 }, { id: 2 }, { id: 1 }], "id"));
```
✅ Why interviewers love this
- Real-world relevance
- Shows understanding of objects
- Demonstrates ability to generalize solutions
🧩 Approach 7 — Sorting + Filtering (Useful For Large Sorted Data)
✅ Code Example
```javascript
function removeDuplicates(arr) {
  // A numeric comparator is required: the default sort is lexicographic,
  // so [10, 2].sort() would give [10, 2]
  const sorted = [...arr].sort((a, b) => a - b);
  return sorted.filter((value, index) => value !== sorted[index - 1]);
}

console.log(removeDuplicates([5, 3, 3, 2, 1, 5]));
// Output: [1, 2, 3, 5]
```

Note that this approach returns values in sorted order, not in order of first appearance.
✅ When this is useful
If the array is already sorted, removing duplicates becomes extremely fast.
✅ Complexity
- Sorting → O(n log n)
- Filtering → O(n)
Total: O(n log n)
🚀 Edge Cases You Must Know (Interview Gold)
✅ 1. Empty array
→ return empty array
✅ 2. Array with one element
→ return itself
✅ 3. Array with strings and numbers
e.g. "1" and 1 are not the same in Set.
✅ 4. NaN handling
Set treats NaN as equal to NaN (SameValueZero equality), so duplicate NaN values are correctly collapsed to one; indexOf-based approaches, by contrast, never find NaN and drop it entirely.
✅ 5. Removing duplicates in nested arrays or objects
Requires deep comparison.
✅ 6. Mixed types
Example:
[1, "1", true, "true"]
Each one is unique.
⚖️ Comparison Table of All Methods
| Method | Complexity | Pros | Cons | Best Use |
|---|---|---|---|---|
| Set | O(n) | Fastest, cleanest | Shallow equality only (objects compared by reference) | All modern JS |
| filter + indexOf | O(n²) | Beginner-friendly | Slow for big arrays | Small arrays |
| Object mapping | O(n) | Fast, reliable | Fails for objects | Numeric/string arrays |
| Map | O(n) | Supports objects | Slightly verbose | Advanced cases |
| reduce | O(n²) | Declarative | Slow | Functional demos |
| Sorted filter | O(n log n) | Great for sorted data | Requires sorting | Sorted datasets |
🧭 Real-World Use Cases
Removing duplicates appears everywhere:
✅ 1. Removing duplicate user IDs
✅ 2. Filtering duplicate API results
✅ 3. Unique tags/categories in a CMS
✅ 4. Avoiding duplicate DOM nodes
✅ 5. Cleaning imported CSV or Excel data
✅ 6. Preparing ML datasets
✅ 7. Avoiding duplicate email addresses, products, or values
This is not just a toy interview question — it’s a daily-use problem.
🧾 Time Complexity Summary
| Approach | Time Complexity |
|---|---|
| Set | ✅ O(n) |
| Map | ✅ O(n) |
| Object hash | ✅ O(n) |
| filter + indexOf | ❌ O(n²) |
| reduce + includes | ❌ O(n²) |
| Sort + filter | ⚠️ O(n log n) |
Using a Set is almost always the best choice.
🧠 Best Practices and Recommendations
✅ Use Set for modern code (fast, simple, universal)
✅ Use Map when working with objects
✅ Add input validation (interviewers LOVE this)
✅ Mention time and space complexity
✅ Write expressive variable names
✅ Handle edge cases
✅ Avoid unnecessary array copies
✅ Final Thoughts
The JavaScript interview coding challenge “Remove duplicates from an array” is one of the most essential algorithm tasks for developers. While it seems simple, it opens the door to discussing:
- ES6 features
- Hash-based algorithms
- Time complexity and optimization
- Functional vs imperative solutions
- Real-world data transformations
After studying all approaches above, you should have complete mastery of this problem — enough to confidently solve it in an interview and explain why your solution is correct.