Sorting algorithms in Java: Everything you need to know

If you are a programmer or developer, Java is one of the essential skills to have. And when it comes to Java, we can't help but mention sorting algorithms. If you aren't familiar with them, we think you will run into many difficulties when working with Java. So, to help you solve this problem, ArrowHiTech will deliver helpful information on sorting algorithms in Java through the article below. Let's explore right now!

What are sorting algorithms in Java?

Firstly, in Java, a sorting algorithm is one that arranges a set of elements in a specified order.

First and foremost, it allows you to utilize a variety of ordering criteria, such as sorting numbers from least to greatest or vice versa, or sorting strings lexicographically. Not only that, you are also able to define your own criteria, as the sketch below shows.
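For instance, here is a minimal sketch (the class name and sample values are ours) that uses the built-in Arrays.sort() to apply three different criteria: natural ascending order, descending order via a Comparator, and lexicographic order for strings.

import java.util.Arrays;
import java.util.Comparator;

public class SortingCriteriaDemo {
    public static void main(String[] args) {
        // numbers from least to greatest (natural ordering)
        Integer[] numbers = {5, 1, 4, 2, 3};
        Arrays.sort(numbers);
        System.out.println(Arrays.toString(numbers)); // [1, 2, 3, 4, 5]

        // the same numbers from greatest to least (custom criterion)
        Arrays.sort(numbers, Comparator.reverseOrder());
        System.out.println(Arrays.toString(numbers)); // [5, 4, 3, 2, 1]

        // strings in lexicographic order
        String[] words = {"pear", "apple", "banana"};
        Arrays.sort(words);
        System.out.println(Arrays.toString(words));   // [apple, banana, pear]
    }
}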


In fact, Java includes a wide range of sorting algorithms, and not all of them are equally effective. Because there are too many to cover, in this blog we will focus on several popular and remarkable ones. Let's start now!

Explore types of sorting algorithms in Java

Bubble Sort in sorting algorithms in Java

The first sorting algorithm in Java you need to know is Bubble Sort. Simply speaking, Bubble Sort works by exchanging adjacent elements when they are not in the intended order. This operation is repeated until all of the elements in the array are in order.

Moreover, once we complete a pass without swapping, we know the array is sorted: every element we compared was in the proper order with its neighbour, and by extension so is the entire array.

Now, the steps for sorting an array of numbers from least to greatest are as follows. Let's take a look right now!

4 2 1 5 3: We swap the first two elements because they are in the wrong order.

2 4 1 5 3: The second and third elements are also out of order, so we swap them.

2 1 4 5 3: We leave these two alone, because 4 and 5 are already in the correct order.

2 1 4 5 3: 5 and 3 are out of order, so we swap again.

2 1 4 3 5: This is the resulting array after one iteration.

How to run Bubble Sort?

For more details, Bubble Sort is implemented exactly as we have described it in words. Our code enters a while loop in which it goes through the entire array, swapping elements as needed.

We assume the array is sorted; if we are proven wrong while sorting (i.e. a swap occurs), we go through the array again. The while loop keeps running until we manage to pass through the entire array without swapping anything.

public static void bubbleSort(int[] array) {
    boolean sorted = false;
    int temp;
    while (!sorted) {
        sorted = true;
        for (int i = 0; i < array.length - 1; i++) {
            if (array[i] > array[i + 1]) {
                // swap adjacent elements that are out of order
                temp = array[i];
                array[i] = array[i + 1];
                array[i + 1] = temp;
                sorted = false;
            }
        }
    }
}

In this case, we must be careful how we formulate our swap condition.

For instance, if I had written array[i] >= array[i+1], the sort could have ended up in an infinite loop, because for equal elements this relation would always be true, and so they would always be swapped.

Time Complexity of Bubble Sort

To determine the time complexity of Bubble Sort, we must consider the worst-case scenario: what is the maximum number of times we need to pass through the whole array before it is sorted? Consider the following example:

5 4 3 2 1: the worst case, an array in reverse order

As you can see, in the first iteration 5 will "bubble up to the surface", while the rest of the elements remain in descending order. We would have to do one iteration for each element except the last, and then one more iteration to check that everything is in order, for a total of five iterations.

If you extend this to an array with n elements, you need to perform n iterations. In terms of the code, this means our while loop can run up to n times.

Furthermore, each of those n times we iterate through the entire array (the for loop in the code), so the worst-case time complexity is O(n²).


Insertion Sort in sorting algorithms in Java

Now, let's move on to the second sorting algorithm in Java: Insertion Sort. To begin, Insertion Sort works by splitting the array into a sorted and an unsorted subarray. The sorted part starts with a length of 1 and corresponds to the first (left-most) element of the array. Then we iterate through the array and, with each iteration, add one element to the sorted portion.

Each time we expand the sorted subarray, we place the new element in its proper place within it. We do this by shifting all of the larger elements one position to the right until we reach the first element that does not need to be shifted.

For example, suppose the first four elements of the following array (3 5 7 8) are the sorted part; here is what happens when we insert the next element, 4:

Example

3 5 7 8 4 2 1 9 6: We take 4 and note that it needs to be inserted. Since 8 > 4, we shift.

3 5 7 x 8 2 1 9 6: The value of x is not important, since it will be overwritten right away (by 4 if that is its correct spot, or by 7 if we shift again). Since 7 > 4, we shift.

3 5 x 7 8 2 1 9 6: Since 5 > 4, we shift again.

3 x 5 7 8 2 1 9 6: Since 3 < 4, we stop and put 4 in position x.

3 4 5 7 8 2 1 9 6: The element 4 is now in its proper place within the sorted part.

Following this procedure, the sorted portion has grown by one element and now contains five elements instead of four. Each iteration does this, and by the end the entire array is sorted.

The way to implement Insertion Sort

Want to know how to implement Insertion Sort? Then all you need to do is follow the code below:

public static void insertionSort(int[] array) {
    for (int i = 1; i < array.length; i++) {
        int current = array[i];
        int j = i - 1;
        // shift larger elements of the sorted portion one place to the right
        while (j >= 0 && current < array[j]) {
            array[j + 1] = array[j];
            j--;
        }
        // at this point we've exited, so j is either -1
        // or it's at the first element where current >= array[j]
        array[j + 1] = current;
    }
}

Time Complexity of Insertion Sort

In reality, to determine the time complexity of Insertion Sort we again consider the worst-case scenario, a reversed array.

In that case, each iteration has to shift the entire sorted portion by one, which is O(n). Since this must be done for every element of the array, the whole sort is bounded by O(n²).


Selection Sort in sorting algorithms in Java

Coming to the third sorting algorithm in Java, Selection Sort also divides the array into a sorted and an unsorted subarray. This time, though, the sorted subarray is formed by swapping the minimum element of the unsorted subarray to the end of the sorted part. Let's have a glance below:

3 5 1 2 4: The minimum of the whole array is 1, so we swap it with the first unsorted element, 3.

1 5 3 2 4: The minimum of the rest is 2, so we swap it with 5.

1 2 3 5 4: The minimum of the rest is 3, which is already in place.

1 2 3 5 4: The minimum of the rest is 4, so we swap it with 5.

1 2 3 4 5: The last remaining element is trivially in place.

1 2 3 4 5: The array is sorted.

How does Selection Sort run? 

To start, in each iteration we assume that the first unsorted element is the minimum and iterate over the remainder to see whether there is a smaller one.

As soon as we have identified the current minimum of the unsorted part of the array, we swap it with the first unsorted element and consider it part of the sorted array. Let's refer to the program below to dig into more detail:

public static void selectionSort(int[] array) {
    for (int i = 0; i < array.length; i++) {
        // assume the first unsorted element is the minimum
        int min = array[i];
        int minId = i;
        // look for a smaller element in the rest of the unsorted part
        for (int j = i + 1; j < array.length; j++) {
            if (array[j] < min) {
                min = array[j];
                minId = j;
            }
        }
        // swap the minimum into place at the end of the sorted part
        int temp = array[i];
        array[i] = min;
        array[minId] = temp;
    }
}

Selection Sort’s Time complexity

Because we must examine all of the remaining elements, finding the minimum takes O(n) for an array of length n. Since we must determine the minimum once for every element of the array, the whole process is bounded by O(n²).

Merge Sort in sorting algorithms in Java

What about Merge Sort in sorting algorithms in Java?

Simply speaking, Merge Sort employs recursion to solve the sorting problem more efficiently than the algorithms presented so far. In particular, it uses a divide-and-conquer approach.

Putting these two ideas together, we will divide the entire array into two subarrays and then:

#1. Sort the left half of the array (recursively)

#2. Sort the right half of the array (recursively)

#3. Merge the solutions

[Tree diagram of Merge Sort's recursive calls and merges]

In this tree diagram you can easily see how the recursive calls work. The arrays marked with a down arrow are the ones we call the method on, while the arrays marked with an up arrow are the ones being merged on the way back up. So you follow the down arrows to the bottom of the tree, then come back up and merge.

In our example above, we have the array 3 5 4 2 1. We divide it into 3 5 4 and 2 1, then divide those further into their individual elements in order to sort them. Once we have reached the bottom, we start merging up and sorting as we go.

Merge Sort's Implementation

Now, let's explore the code below:

public static void mergeSort(int[] array, int left, int right) {
    if (right <= left) return;
    int mid = (left + right) / 2;
    // sort the two halves recursively, then merge them
    mergeSort(array, left, mid);
    mergeSort(array, mid + 1, right);
    merge(array, left, mid, right);
}

First of all, to merge the sorted subarrays into a single array, we need to calculate their lengths and create temporary arrays to copy them into, so that we can freely overwrite our main array.

After copying, we go through the resulting array and fill each position with the current minimum. Since our subarrays are already sorted, we simply choose the lesser of the two elements that haven't been chosen yet and move that subarray's iterator forward. Let's follow the code below:

Code instruction

static void merge(int[] array, int left, int mid, int right) {
    // calculating lengths
    int lengthLeft = mid - left + 1;
    int lengthRight = right - mid;
    // creating temporary subarrays
    int[] leftArray = new int[lengthLeft];
    int[] rightArray = new int[lengthRight];
    // copying our sorted subarrays into temporaries
    for (int i = 0; i < lengthLeft; i++)
        leftArray[i] = array[left + i];
    for (int i = 0; i < lengthRight; i++)
        rightArray[i] = array[mid + i + 1];
    // iterators containing current index of temp subarrays
    int leftIndex = 0;
    int rightIndex = 0;
    // copying from leftArray and rightArray back into array
    for (int i = left; i < right + 1; i++) {
        // if there are still uncopied elements in both, copy the minimum of the two
        if (leftIndex < lengthLeft && rightIndex < lengthRight) {
            if (leftArray[leftIndex] < rightArray[rightIndex]) {
                array[i] = leftArray[leftIndex];
                leftIndex++;
            } else {
                array[i] = rightArray[rightIndex];
                rightIndex++;
            }
        }
        // if all the elements have been copied from rightArray, copy the rest of leftArray
        else if (leftIndex < lengthLeft) {
            array[i] = leftArray[leftIndex];
            leftIndex++;
        }
        // if all the elements have been copied from leftArray, copy the rest of rightArray
        else if (rightIndex < lengthRight) {
            array[i] = rightArray[rightIndex];
            rightIndex++;
        }
    }
}
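Since mergeSort() takes the left and right bounds as arguments, a typical call passes 0 and array.length - 1. A minimal usage sketch, assuming the two methods above are defined in the same class and the sample values are ours:

public static void main(String[] args) {
    int[] data = {3, 5, 4, 2, 1};
    mergeSort(data, 0, data.length - 1);                 // inclusive bounds
    System.out.println(java.util.Arrays.toString(data)); // prints [1, 2, 3, 4, 5]
}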

Time Complexity of Merge Sort

In fact, we have to get a little mathematical to figure out how complex recursive algorithms are.

The time complexity of recursive algorithms is determined using the Master Theorem. For non-recursive algorithms, we can usually write the precise time complexity as an equation and then use Big-O notation to group it into a class of similarly behaving algorithms.

The issue with recursive algorithms is that the same kind of equation ends up being recursive itself, and looks like this:

T(n) = a * T(n/b) + O(n^k)

In this equation, a indicates how many recursive calls we make, and b indicates into how many parts the problem is divided. This may appear to be an unimportant distinction here, because the two are equal for Merge Sort, but for some problems they may not be.

The remaining part of the equation, O(n^k), is the cost of combining all of those partial solutions into a single solution at the end.

Solution

T(n) = O(n^(log_b a))   if a > b^k

T(n) = O(n^k * log n)   if a = b^k

T(n) = O(n^k)           if a < b^k

If T(n) is the runtime of the algorithm when sorting an array of length n, Merge Sort runs twice for arrays that are half the length of the original array.

So we have a = 2 and b = 2. The merge step takes O(n) time, so k = 1. This means the equation for Merge Sort looks like this:

T(n) = 2 * T(n/2) + O(n)

Because 2 = 2^1, we are in the Master Theorem case where a = b^k, so the complexity is O(n^k * log n), i.e. O(n log n). This is an excellent time complexity for a sorting algorithm, as it has been proven that comparison-based sorting cannot be done faster than O(n log n).

Although the version we've shown here uses quite a bit of extra memory, there are more intricate variants of Merge Sort that take up only O(1) additional space.

Furthermore, the algorithm is extremely easy to parallelize, since recursive calls on separate branches can run completely independently of each other. Although we won't go into the how and why here, it's important to keep the benefits of this algorithm in mind.

Heapsort in sorting algorithms in Java

The next sorting algorithm in Java is Heapsort. Simply speaking, a heap is a tree that satisfies the heap property: every node stands in the same ordering relation to all of its children. A heap must also be almost complete: every level except possibly the deepest is completely filled, and the deepest level is filled from left to right. To put it another way, when adding a node we always take the leftmost available position in the deepest incomplete level.

If the heap is a max-heap, all of the children must be smaller than their parent; in a min-heap, they must all be larger.

In other words, as you move down the tree you get smaller and smaller numbers (max-heap) or larger and larger numbers (min-heap). Now, let's look at an example of a max-heap:

[Tree diagram of an example max-heap]

On the other hand, this max-heap can also be represented in memory as an array, like below:

[Array representation of the same max-heap]

Explanation

In this case, it's as if you're reading the tree level by level, from left to right. This means that if we pick the element at index k of the array, the positions of its children are 2*k+1 and 2*k+2 (assuming the indexing starts at 0).

Conversely, the parent of the element at index k is always at position (k-1)/2, using integer division.
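To make the index arithmetic concrete, here is a tiny sketch (the helper names are our own) that encodes these formulas; note that (k - 1) / 2 relies on Java's integer division rounding down:

// children of the node stored at index k (0-based array representation)
static int leftChild(int k) { return 2 * k + 1; }
static int rightChild(int k) { return 2 * k + 2; }
// parent of the node stored at index k; integer division rounds down
static int parent(int k) { return (k - 1) / 2; }
// e.g. for the element at index 4: children at indices 9 and 10, parent at index 1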

With this knowledge, you can easily "max-heapify" any array: for each element, check whether any of its children are larger than it. If so, swap the parent with the larger child and repeat the process for the position the parent was swapped down into.

Leaves have no children, so they are already valid max-heaps on their own. Let's heapify the array 6 1 8 3 5 2 4, working from the last non-leaf element back to the root:

6 1 8 3 5 2 4: We start with 8; both of its children (2 and 4) are smaller, so nothing changes.

6 1 8 3 5 2 4: Next we look at 1; since 5 > 1, we swap them, then heapify recursively for 1's new position.

6 5 8 3 1 2 4: Nothing happens in the recursive call, because 1 is now a leaf.

6 5 8 3 1 2 4: Finally we look at the root, 6; since 8 > 6, we swap them.

8 5 6 3 1 2 4: We end up with the heap depicted above.

Once we know how to heapify an array, the rest is very straightforward: we swap the root of the heap with the last element of the array and shrink the heap by one.

Then we heapify the shrunk heap again, and repeat:

8 5 6 3 1 2 4

4 5 6 3 1 2 8: swapped

6 5 4 3 1 2 8: heapified

2 5 4 3 1 6 8: swapped

5 3 4 2 1 6 8: heapified

1 3 4 2 5 6 8: swapped

How to run Heapsort?

static void heapify(int[] array, int length, int i) {
    int leftChild = 2*i+1;
    int rightChild = 2*i+2;
    int largest = i;
    // if the left child is larger than parent
    if (leftChild < length && array[leftChild] > array[largest]) {
        largest = leftChild;
    }
    // if the right child is larger than parent
    if (rightChild < length && array[rightChild] > array[largest]) {
        largest = rightChild;
    }
    // if a swap needs to occur
    if (largest != i) {
        int temp = array[i];
        array[i] = array[largest];
        array[largest] = temp;
        heapify(array, length, largest);
    }
}

public static void heapSort(int[] array) {
    if (array.length == 0) return;
    // Building the heap
    int length = array.length;
    // we're going from the first non-leaf to the root
    for (int i = length / 2 - 1; i >= 0; i--)
        heapify(array, length, i);
    // repeatedly move the current maximum to the end and re-heapify the rest
    for (int i = length - 1; i >= 0; i--) {
        int temp = array[0];
        array[0] = array[i];
        array[i] = temp;
        heapify(array, i, 0);
    }
}
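As a quick check, here is a minimal usage sketch (assuming the two methods above live in the same class) applied to the example array from the walkthrough:

public static void main(String[] args) {
    int[] a = {6, 1, 8, 3, 5, 2, 4};
    heapSort(a);
    System.out.println(java.util.Arrays.toString(a)); // prints [1, 2, 3, 4, 5, 6, 8]
}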

Time Complexity

In reality, everything in the heapify() code appears to be done in O(1), but then there's that pesky recursive call.

In the worst case, the call propagates all the way down to the bottom of the heap, jumping from a node to one of its children each time. Since the heap has about log n levels, heapify() makes at most log n jumps, giving it a complexity of O(log n).

heapSort() itself is clearly O(n) on account of its for loops iterating through the entire array, but each iteration also calls heapify(), so Heapsort's total complexity is O(n log n).

QuickSort in sorting algorithms in Java

The final sorting algorithm in Java we'll cover is QuickSort, another divide-and-conquer algorithm. Simply speaking, QuickSort chooses one element of the array as the pivot and arranges all the other elements around it: smaller elements to its left and larger ones to its right.

This ensures that the pivot ends up in its correct position at the end of the process. The method then recursively repeats the same process for the left and right parts of the array.

Implementation of QuickSort

static int partition(int[] array, int begin, int end) {
    int pivot = end;      // use the last element as the pivot
    int counter = begin;  // boundary of the "smaller than pivot" region
    for (int i = begin; i < end; i++) {
        if (array[i] < array[pivot]) {
            int temp = array[counter];
            array[counter] = array[i];
            array[i] = temp;
            counter++;
        }
    }
    // place the pivot right after the smaller elements
    int temp = array[pivot];
    array[pivot] = array[counter];
    array[counter] = temp;
    return counter;
}

public static void quickSort(int[] array, int begin, int end) {
    if (end <= begin) return;
    int pivot = partition(array, begin, end);
    quickSort(array, begin, pivot - 1);
    quickSort(array, pivot + 1, end);
}
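As with Merge Sort, the initial call passes the inclusive bounds of the whole array. A minimal usage sketch, assuming the two methods above are defined in the same class and the sample values are ours:

public static void main(String[] args) {
    int[] data = {7, 3, 9, 1, 5};
    quickSort(data, 0, data.length - 1);                 // inclusive bounds
    System.out.println(java.util.Arrays.toString(data)); // prints [1, 3, 5, 7, 9]
}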

QuickSort’s Time Complexity

First and foremost, the time complexity of QuickSort can be expressed with the following recurrence, where k is the number of elements smaller than the pivot:

T(n) = T(k) + T(n - k - 1) + O(n)

Now let's examine the worst-case scenario. Simply speaking, it occurs when the pivot is always the largest or smallest element, so one of the recursive calls receives almost the entire array. That gives the following equation:

T(n) = T(n - 1) + O(n)

Expanding this out, it turns out to be O(n²).

This may seem terrible, given that we've already seen several algorithms that run in O(n log n) time even in the worst case, yet QuickSort is extremely widely used.

That's because it has a very good average runtime, also bounded by O(n log n), and it is very efficient for a large fraction of possible inputs.

One outstanding advantage is that it takes up no extra space: all sorting is done in place, and there are no costly allocation or deallocation calls.

Sorting in Java

Comparable Interface

First of all, if you have your own types, it can be cumbersome to develop a separate sorting algorithm for each one. That's why Java provides an interface that lets you use Collections.sort() (and Arrays.sort()) on your own classes. To do this, your class must implement the Comparable<T> interface, where T is your type, and override the compareTo() method.

This method returns a negative integer if this is smaller than the argument element, 0 if they are equal, and a positive integer if this is greater. In our example we've created a class called Student, and each Student is identified by an id and the year they began their studies.

We want to sort them primarily by generation, but also by id. Let's take a look at the code below:

public static class Student implements Comparable<Student> {
    int studentId;
    int studentGeneration;

    public Student(int studentId, int studentGeneration) {
        this.studentId = studentId;
        this.studentGeneration = studentGeneration;
    }

    @Override
    public String toString() {
        return studentId + "/" + studentGeneration % 100;
    }

    @Override
    public int compareTo(Student student) {
        int result = this.studentGeneration - student.studentGeneration;
        if (result != 0)
            return result;
        else
            return this.studentId - student.studentId;
    }
}

The program below illustrates how to put it to work:

public static void main(String[] args) {
    Student[] a = new SortingAlgorithms.Student[5];
    a[0] = new Student(75, 2016);
    a[1] = new Student(52, 2019);
    a[2] = new Student(57, 2016);
    a[3] = new Student(220, 2014);
    a[4] = new Student(16, 2018);
    Arrays.sort(a);
    System.out.println(Arrays.toString(a));
}

Finally, the output will look like this (sorted by generation first, then by id):

[220/14, 57/16, 75/16, 16/18, 52/19]

Comparator Interface

Sometimes we may wish to sort our objects in an unusual way for a specific purpose, but not as the default behavior of our class, or we may want to sort a collection of a built-in type in a non-default way.

We can do this by using the Comparator interface. For example, let's take our Student class and sort only by id:

public static class SortByID implements Comparator<Student> {
    public int compare(Student a, Student b) {
        return a.studentId - b.studentId;
    }
}

On the other hand, if we substitute the following for the sort call in main:

Arrays.sort(a, new SortByID());

then we get this output instead, sorted by id only:

[16/18, 52/19, 57/16, 75/16, 220/14]
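On Java 8 and later, the same one-off ordering can also be written without a named class, using a lambda or the Comparator.comparingInt() factory method. A small sketch, assuming the Student[] a from the main() example above and imports of java.util.Arrays and java.util.Comparator:

// sort by id with a lambda...
Arrays.sort(a, (s1, s2) -> s1.studentId - s2.studentId);
// ...or, equivalently, with a comparator factory method
Arrays.sort(a, Comparator.comparingInt((Student s) -> s.studentId));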

How Does Everything Work?

To begin, Collections.sort() delegates to the underlying Arrays.sort() method. For arrays of primitives, Arrays.sort() uses a dual-pivot Quicksort, switching to Insertion Sort for ranges shorter than 47 elements; according to the JDK 10 documentation, this dual-pivot implementation avoids most of the typical causes of quadratic performance degradation. Arrays of objects (and therefore collections) are sorted with a stable merge-sort variant instead.
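For completeness, here is a minimal sketch of the Collections.sort() path on a List, assuming the Student class above is in scope and java.util.* is imported (the sample values are ours):

List<Student> students = new ArrayList<>(Arrays.asList(
        new Student(75, 2016),
        new Student(52, 2019),
        new Student(57, 2016)));
Collections.sort(students);   // uses Student.compareTo(), so generation first, then id
System.out.println(students); // prints [57/16, 75/16, 52/19]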

In conclusion

In short, through this article, ArrowHiTech hopes you have gained a lot of useful information about Java and sorting algorithms in Java, and that you will master Java before long. Of course, if you have any problem or inquiry, CONTACT US right away. With extensive experience in Kotlin and Java Android App Development Services, we will help you solve it effectively.
