
Remove Duplicate Elements From an Array

Last Updated on September 22, 2023 by Mayank Dham

Removing duplicate elements from an array is a relatively straightforward task. In this blog post, our focus is on eliminating duplicate elements from an array: we will explore methods to identify duplicates and remove them efficiently, leaving a cleaner array of unique values. Let’s discuss "Remove Duplicate Elements From Array" in detail.

Removing Duplicate Elements from Array

We need to remove duplicate elements from the array so that each element appears only once, i.e., all the values in the array become unique.

Example
Input:

[1,2,2,3,4,4,4,5,5]

Output:

[1,2,3,4,5]
Explanation
The frequency of each element in the input is shown below.

Element   Frequency
1         1
2         2
3         1
4         3
5         2

After removing the duplicates, every element appears exactly once:

Element   Frequency
1         1
2         1
3         1
4         1
5         1

So the output is [1,2,3,4,5].
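
As a quick way to check these frequencies (a small standalone sketch, not part of the approaches below), each element can be counted with a map:

#include<bits/stdc++.h>
using namespace std;

int main(){
    vector<int> a{1,2,2,3,4,4,4,5,5};
    map<int,int> freq;               // element -> how many times it appears
    for(int x : a)
        freq[x]++;
    for(auto &p : freq)              // map iterates in ascending order of element
        cout << p.first << " appears " << p.second << " time(s)\n";
}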

Approach 1 (Using Extra Space) For Remove Duplicate Elements From Array

This is the brute-force way of removing duplicates from the array. It makes use of extra space in the form of a separate result array.

Algorithm:

  1. Sort the array first.
  2. Create a resultant array called res to store the unique elements.
  3. Run a loop from i=0 to i=n-2; if a[i]!=a[i+1], push a[i] into the res array.
  4. Push the last element a[n-1] into the res array.
  5. Finally, return the res array.

Code Implementation

#include<bits/stdc++.h>
using namespace std;

void removeDuplicates(vector<int> a,int n){
    vector<int> res;                 // stores the unique elements
    sort(a.begin(),a.end());         // sort so that duplicates become adjacent
    for(int i=0;i<n-1;i++){
        if(a[i]!=a[i+1])             // keep a[i] only if it differs from its successor
            res.push_back(a[i]);
    }
    res.push_back(a[n-1]);           // the last element is always kept
    for(auto x:res)
        cout<<x<<" ";
}

int main(){
    vector<int> a{1,2,1,2,3,4,5,6,4,4,5,5};
    removeDuplicates(a,a.size());
}
import java.util.*;
import java.lang.*;
import java.io.*;

public class Main {
  
    public static void removeDuplicates(int a[], int n)
    {
        int[] res = new int[n];
        int j = 0;
        Arrays.sort(a);
        for (int i = 0; i < n - 1; i++) {
            if (a[i] != a[i + 1]) {
                res[j++] = a[i];
            }
        }  
        res[j++] = a[n - 1];
        for (int i = 0; i < j; i++) {
            System.out.print(res[i]+" ");
        }
    }
    public static void main(String[] args)
    {
        int a[] = {1,2,1,2,3,4,5,6,4,4,5,5};
        int n = a.length;
        removeDuplicates(a, n);
    }
}
def removeDuplicates(a, n):
    a.sort()                     # sort so that duplicates become adjacent
    res = [0] * n                # resultant array for the unique elements
    j = 0
    for i in range(0, n - 1):
        if a[i] != a[i + 1]:     # keep a[i] only if it differs from its successor
            res[j] = a[i]
            j += 1
    res[j] = a[n - 1]            # the last element is always kept
    j += 1
    for i in range(0, j):        # copy the unique elements back into a
        a[i] = res[i]
    for i in range(0, j):        # print the j unique elements
        print(a[i], end=" ")

a = [1, 2, 1, 2, 3, 4, 5, 6, 4, 4, 5, 5]
n = len(a)
removeDuplicates(a, n)

Output

1 2 3 4 5 6

Time Complexity Analysis For Remove Duplicate Elements From Array

  • We sort the array in the first step, which takes O(NlogN) time.
  • After that, we run the loop from i=0 to i=n-2, which requires O(N) time.
  • Therefore, this method’s worst-case time complexity for eliminating duplicates from the array is O(NlogN)+O(N)=O(NlogN).

Auxiliary Space Analysis For Remove Duplicate Elements From Array
This method utilizes O(N) auxiliary space to eliminate duplicate elements from the array because we will be storing the unique elements in the resultant array.

Time Complexity: O(NlogN)
Auxiliary Space: O(N)

Approach 2 (Using Constant Extra space) For Remove Duplicate Elements From Array

In this method, we won’t use any extra space; instead, we’ll modify the given array so that its first k elements are the unique ones, and then return k. This way, the duplicates are effectively removed from the array.

Algorithm:

  1. Sort the array first.
  2. Keep a pointer called k, initialized to 0, that marks the position where the next unique element should be placed.
  3. Run a loop from i=0 to i=n-2.
  4. If a[i]!=a[i+1], then set a[k]=a[i] and do k++.
  5. Otherwise, continue.
  6. Set a[k]=a[n-1] and do k++, since the last element is always kept.
  7. Print the elements in the index range from i=0 to i=k-1 and return k, which is now the total number of unique elements in the array.

Code Implementation

#include<bits/stdc++.h>
using namespace std;

int removeDuplicates(vector<int> &a,int n){
    int k=0;                         // position of the next unique element
    sort(a.begin(),a.end());         // sort so that duplicates become adjacent
    for(int i=0;i<n-1;i++){
        if(a[i]!=a[i+1]){            // keep a[i] only if it differs from its successor
            a[k]=a[i];
            k++;
        }
    }
    a[k]=a[n-1];                     // the last element is always kept
    k++;
    return k;                        // number of unique elements
}

int main(){
    vector<int> a{1,2,1,2,3,4,5,6,4,4,5,5};
    int k = removeDuplicates(a,a.size());
    for(int i=0;i<k;i++)
        cout<<a[i]<<" ";
}
import java.util.*;
import java.lang.*;
import java.io.*;

class Main {
  
    public static void removeDuplicates(int a[], int n)
    {
        int j = 0;
        Arrays.sort(a);
        for (int i = 0; i < n - 1; i++) {
            if (a[i] != a[i + 1]) {
                a[j++] = a[i];
            }
        }  
        a[j++] = a[n - 1];
        for (int i = 0; i < j; i++) {
            System.out.print(a[i]+" ");
        }
    }
    public static void main(String[] args)
    {
        int a[] = {1,2,1,2,3,4,5,6,4,4,5,5};
        int n = a.length;
        removeDuplicates(a, n);
    }
}
def removeDuplicates(a, n):
    a.sort()                     # sort so that duplicates become adjacent
    j = 0                        # position of the next unique element
    for i in range(0, n - 1):
        if a[i] != a[i + 1]:     # keep a[i] only if it differs from its successor
            a[j] = a[i]
            j += 1
    a[j] = a[n - 1]              # the last element is always kept
    j += 1
    for i in range(0, j):        # print the j unique elements
        print(a[i], end=" ")

a = [1, 2, 1, 2, 3, 4, 5, 6, 4, 4, 5, 5]
n = len(a)
removeDuplicates(a, n)

Time Complexity Analysis For Remove Duplicate Elements From Array

  • The array is sorted in the first step, which takes O(NlogN) time. Next, the loop from 0 to n-2 is run, which takes O(N) time.
  • Therefore, this method’s worst-case time complexity for eliminating duplicates from the array is O(NlogN)+O(N)=O(NlogN).

Auxiliary Space Analysis For Remove Duplicate Elements From Array
We only need one variable, k, in this method, thus the total amount of extra space used is O(1), which is constant.

Time Complexity: O(NlogN)
Auxiliary Space: O(1)

Approach 3 (Using a Set) For Remove Duplicate Elements From Array

The Set Data structure will be used in this approach to eliminate duplicates from the array. In the end, we will change the supplied array so that the first k elements are the unique elements, and we will then return k, the total number of unique elements.

Note: A set is a type of data structure that only keeps one version of each element that is added to it.
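
As a quick illustration of this property (a standalone sketch, separate from the implementation below), inserting the same value into a std::set several times still leaves only one copy:

#include<bits/stdc++.h>
using namespace std;

int main(){
    set<int> s;
    s.insert(4);
    s.insert(4);
    s.insert(4);                     // duplicate inserts are ignored
    cout << s.size() << "\n";        // prints 1: only one copy of 4 is stored
}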

Algorithm:

  1. Create a set called s.
  2. Insert all of the array’s elements into the set.
  3. Maintain a pointer called k and set its initial value to 0.
  4. Iterate through the set; for each element x, set a[k]=x and do k++.
  5. Print the elements in the index range from i=0 to i=k-1 and return k, which is now the total number of unique elements in the array.

Code Implementation

#include<bits/stdc++.h>
using namespace std;

int removeDuplicates(vector<int> &a,int n){
    int k=0;
    set<int> s;                      // stores each element only once, in sorted order
    for(int i=0;i<n;i++)
        s.insert(a[i]);
    for(auto x:s){                   // copy the unique elements back into the array
        a[k]=x;
        k++;
    }
    return k;                        // number of unique elements
}

int main(){
    vector<int> a{1,2,1,2,3,4,5,6,4,4,5,5};
    int k = removeDuplicates(a,a.size());
    for(int i=0;i<k;i++)
        cout<<a[i]<<" ";
}
import java.util.*;
import java.lang.*;
import java.io.*;
public class Main {
  
    public static int removeDuplicates(int a[], int n)
    {
        // TreeSet stores each element only once, in sorted order
        Set<Integer> s = new TreeSet<Integer>();
        for (int i = 0; i < n; i++) {
            s.add(a[i]);
        }
        int k = 0;
        for (int x : s) {            // copy the unique elements back into the array
            a[k++] = x;
        }
        return k;                    // number of unique elements
    }
    public static void main(String[] args)
    {
        int a[] = {1,2,1,2,3,4,5,6,4,4,5,5};
        int n = a.length;
        int k = removeDuplicates(a, n);
        for (int i = 0; i < k; i++) {
            System.out.print(a[i] + " ");
        }
    }
}
def removeDuplicates(a, n):
    s = set()                    # a set keeps only one copy of each element
    for i in range(0, n):
        s.add(a[i])
    k = 0
    for x in sorted(s):          # copy the unique elements back into the array in sorted order
        a[k] = x
        k += 1
    return k                     # number of unique elements

a = [1, 2, 1, 2, 3, 4, 5, 6, 4, 4, 5, 5]
n = len(a)
k = removeDuplicates(a, n)
for i in range(0, k):
    print(a[i], end=" ")

Time Complexity Analysis For Remove Duplicate Elements From Array

  • The first stage involves running a loop from i=0 to i=n-1 to put all of the array’s elements into the set. This will take O(N) time.
  • In C++, the set is implemented as a balanced binary search tree, so inserting N elements while keeping them in sorted order takes O(NlogN) time.
  • Last but not least, we are traversing the set, which, in the worst scenario, will take O(N) time (if every element is unique)
  • As a result, this method’s worst-case time complexity to remove duplicates from an array is O(N)+O(NlogN)+O(N)=O(NlogN).

Auxiliary Space Analysis For Remove Duplicate Elements From Array
This method involves utilizing a set and adding each element of the array to the set. Therefore, in this strategy, we are consuming O(N) auxiliary space.

Time Complexity: O(NlogN)
Auxiliary Space: O(N)

Conclusion
In this brief article, we covered three distinct methods to remove duplicate elements from an array.

  • Approach 1 sorted the array and stored the unique elements in a separate result array, using O(N) extra space.
  • Approach 2 also sorted the array but overwrote it in place, requiring only O(1) extra space.
  • Approach 3 utilized a Set data structure and required O(N) additional space.

FAQs related to Remove Duplicate Elements From Array

Some frequently asked questions about removing duplicate elements from an array are answered below:

1. Why is removing duplicate elements from an array important?
Removing duplicates can help streamline data for efficient processing, reduce memory usage, and enhance the clarity of information within the array.

2. Are there built-in functions to remove duplicates in programming languages?
Many programming languages provide built-in functions or methods to remove duplicates from an array, which can simplify the process.
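
For example (a minimal C++ sketch using only the standard library), std::sort together with std::unique and erase removes duplicates from a vector; other languages offer similar shortcuts, such as Python’s set() or Java streams’ distinct().

#include<bits/stdc++.h>
using namespace std;

int main(){
    vector<int> a{1,2,1,2,3,4,5,6,4,4,5,5};
    sort(a.begin(), a.end());                     // bring duplicates next to each other
    a.erase(unique(a.begin(), a.end()), a.end()); // unique() shifts the unique values to the front
    for(int x : a)
        cout << x << " ";                         // prints 1 2 3 4 5 6
}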

3. What is the time complexity of the discussed approaches?
The time complexity varies with the approach. The sorting-based approaches (Approach 1 and Approach 2) take O(N log N) for the sort plus a linear scan. The set-based approach also takes O(N log N) with an ordered set; with a hash set, the average time drops to O(N). A naive brute force that compares every pair with nested loops would take O(N^2).
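
As an illustration of the hash-set idea mentioned above (a sketch, not one of the approaches covered in this article), std::unordered_set can drop duplicates in O(N) average time while preserving the order of first occurrence:

#include<bits/stdc++.h>
using namespace std;

int main(){
    vector<int> a{1,2,1,2,3,4,5,6,4,4,5,5};
    unordered_set<int> seen;          // remembers elements encountered so far
    vector<int> res;                  // unique elements in order of first occurrence
    for(int x : a){
        if(seen.insert(x).second)     // insert() reports whether x was newly added
            res.push_back(x);
    }
    for(int x : res)
        cout << x << " ";             // prints 1 2 3 4 5 6
}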

4. Is it possible to remove duplicates in-place without using extra space?
Yes. As shown in Approach 2, the array can be sorted and its prefix overwritten with the unique elements, so duplicates are removed in place using only O(1) extra space.
