Quick Sort
In the quick sort algorithm, we pick one element of the array as the pivot and partition the array into two subarrays around it: one subarray holds the values that are smaller than the pivot, and the other holds the values that are greater than the pivot.
While performing quicksort, we partition the array and then call quicksort recursively on the two subarrays so that each of them is sorted in turn. The algorithm is considered effective for large data sets because its average-case time complexity is O(n*logn); its worst-case complexity, however, is O(n^2).
Pseudocode for partitioning:
/* This function takes the last element as the pivot. The pivot element is placed at its correct position in the sorted array: all elements smaller than the pivot are moved to its left, and all greater elements to its right. */
/* l -> starting index, h -> ending index */
partition(array[], l, h) {
    pivot = array[h];
    i = l - 1;
    for (j = l; j <= h - 1; j++) {
        // If the current element is less than the pivot element
        if (array[j] < pivot) {
            i++;
            swap array[i] and array[j];
        }
    }
    swap array[i + 1] and array[h];
    return i + 1;
}
Recursive Quick Sort Pseudocode:
quickSort(array[], l, h) {
    if (l < h) {
        /* partin is the partitioning index; array[partin] is now at its right place */
        partin = partition(array, l, h);
        quickSort(array, l, partin - 1);   // Before partin
        quickSort(array, partin + 1, h);   // After partin
    }
}
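The pseudocode above translates directly into a working program. The following is a minimal C sketch of the same last-element (Lomuto-style) partition and the recursive quick sort; the helper swap function and the small driver in main are additions made here purely for illustration.

#include <stdio.h>

// Exchange two array elements.
static void swap(int *a, int *b) {
    int t = *a;
    *a = *b;
    *b = t;
}

// Partition around the last element; returns the final index of the pivot.
static int partition(int array[], int l, int h) {
    int pivot = array[h];
    int i = l - 1;
    for (int j = l; j <= h - 1; j++) {
        // If the current element is less than the pivot, grow the "smaller" region.
        if (array[j] < pivot) {
            i++;
            swap(&array[i], &array[j]);
        }
    }
    swap(&array[i + 1], &array[h]);           // place the pivot between the two regions
    return i + 1;
}

// Recursively sort array[l..h].
static void quickSort(int array[], int l, int h) {
    if (l < h) {
        int partin = partition(array, l, h);  // array[partin] is now in its final place
        quickSort(array, l, partin - 1);      // sort the part before the pivot
        quickSort(array, partin + 1, h);      // sort the part after the pivot
    }
}

int main(void) {
    int array[] = {24, 9, 29, 14, 19, 27};    // sample input, same as the example below
    int n = sizeof(array) / sizeof(array[0]);
    quickSort(array, 0, n - 1);
    for (int i = 0; i < n; i++)
        printf("%d ", array[i]);              // prints: 9 14 19 24 27 29
    printf("\n");
    return 0;
}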
Working of the Quick Sort Algorithm
To understand the working of quick sort, let us consider the following example.
Let the elements of the array be in unsorted order so that the steps of the algorithm are easy to follow.
The elements of the array are as follows: 24, 9, 29, 14, 19, 27.
In the given array, we take the leftmost element as the pivot. The left pointer starts at the first element and the right pointer at the last element, so arr[left] = 24, arr[right] = 27, and arr[pivot] = 24.
Since the pivot element is on the left side of the array, the scan starts from the right end of the array and moves towards the left.

Now, arr[pivot] < arr[right] (24 < 27), so the right pointer moves one position towards the left and stops at 19.
Here, arr[left] = 24, arr[right] = 19, and arr[pivot] = 24.
Now, as arr[pivot] > arr[right] (24 > 19), the algorithm swaps arr[pivot] with arr[right], and the pivot moves to the right pointer. The array becomes 19, 9, 29, 14, 24, 27.
Now, arr[left] = 19, arr[right] = 24, and arr[pivot] = 24. Since the pivot is now on the right, the scan starts from the left pointer and moves towards the right.
Here, as arr[pivot] > arr[left] (24 > 19), the left pointer moves one position to the right and stops at 9.
Now, arr[left] = 9, arr[right] = 24, and arr[pivot] = 24. As arr[pivot] > arr[left], the left pointer again moves one position to the right and stops at 29.
Here, arr[left] = 29, arr[right] = 24, and arr[pivot] = 24. As arr[pivot] < arr[left], we swap arr[pivot] and arr[left], so the pivot is now at the left pointer and the array becomes 19, 9, 24, 14, 29, 27.
Now, as the pivot is on the left, the scan again starts from the right pointer and moves towards the left. Here, arr[left] = 24, arr[right] = 29, and arr[pivot] = 24. As arr[pivot] < arr[right], the right pointer moves one position to the left and stops at 14.
Now, arr[pivot] = 24, arr[left] = 24, and arr[right] = 14. As arr[pivot] > arr[right], we swap arr[pivot] and arr[right], so the pivot is now at the right pointer and the array becomes 19, 9, 14, 24, 29, 27.
Here, arr[pivot] = 24, arr[left] = 14, and arr[right] = 24. Since the pivot is on the right, the scan now starts from the left pointer and moves towards the right.
Now, arr[pivot] = 24, arr[left] = 24, and arr[right] = 24. The left and right pointers point to the same element, which is the pivot element; this marks the end of the partitioning procedure.
Now the element 24, which is the pivot element, is at its final position.
At this point, all the elements to the right of 24 are greater than it, and all the elements to its left are smaller than it.
Similarly, the quick sort algorithm is applied separately to the left subarray (19, 9, 14) and the right subarray (29, 27). After sorting, the resultant array is 9, 14, 19, 24, 27, 29.
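Note that the walkthrough above uses a different partitioning scheme from the earlier pseudocode: instead of always keeping the pivot at the end, the pivot itself is swapped back and forth while the left and right pointers close in on it. The following C sketch is one way this scheme could be written; the names partitionByPivotSwapping and pivotAtLeft are chosen here only for illustration, and the code assumes, as in the example, that the leftmost element is picked as the pivot.

#include <stdio.h>

// Pivot-swapping partition as illustrated in the walkthrough: the pointer on the
// side opposite the pivot scans inward, and whenever it finds an element lying on
// the wrong side, that element is swapped with the pivot and the scan direction flips.
static int partitionByPivotSwapping(int arr[], int left, int right) {
    int pivotAtLeft = 1;                      // the pivot currently sits at arr[left]
    while (left < right) {
        if (pivotAtLeft) {
            if (arr[right] < arr[left]) {     // smaller element found to the pivot's right
                int t = arr[left]; arr[left] = arr[right]; arr[right] = t;
                pivotAtLeft = 0;              // the pivot has moved to the right pointer
            } else {
                right--;                      // keep scanning towards the left
            }
        } else {
            if (arr[left] > arr[right]) {     // greater element found to the pivot's left
                int t = arr[left]; arr[left] = arr[right]; arr[right] = t;
                pivotAtLeft = 1;              // the pivot has moved back to the left pointer
            } else {
                left++;                       // keep scanning towards the right
            }
        }
    }
    return left;                              // left == right == final position of the pivot
}

static void quickSortByPivotSwapping(int arr[], int left, int right) {
    if (left < right) {
        int p = partitionByPivotSwapping(arr, left, right);
        quickSortByPivotSwapping(arr, left, p - 1);
        quickSortByPivotSwapping(arr, p + 1, right);
    }
}

int main(void) {
    int arr[] = {24, 9, 29, 14, 19, 27};      // the array used in the example above
    int n = sizeof(arr) / sizeof(arr[0]);
    quickSortByPivotSwapping(arr, 0, n - 1);
    for (int i = 0; i < n; i++)
        printf("%d ", arr[i]);                // prints: 9 14 19 24 27 29
    printf("\n");
    return 0;
}

With the example array, the first call to partitionByPivotSwapping places 24 at index 3, exactly as in the step-by-step trace above.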
Complexity of the Quick Sort Algorithm
Now, let us have a look at the time complexity of quicksort in the best, average, and worst cases. We will also examine its space complexity.
1. Time Complexity
- Best Case Time Complexity – In quicksort, the best case occurs when the pivot element is the middle element, or close to the middle element, of each partition. The time complexity of quicksort in the best case is O(n*logn).
- Average Case Time Complexity – The average case occurs when the elements of the array are in jumbled order, neither properly ascending nor properly descending. In this case, the time complexity of the quicksort algorithm is also O(n*logn).
- Worst Case Time Complexity – In quicksort, the worst case occurs when the pivot element is always the greatest or the smallest element of the current subarray. For example, if the pivot is always chosen as the last element, the worst case occurs when the given array is already sorted in ascending or descending order. The time complexity of quicksort in the worst case is O(n^2), as the short recurrence sketch after this list shows.
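These bounds can be sketched informally, under the usual analysis of quicksort, with two recurrences. In the worst case each partition splits off only one element, so the comparison count satisfies T(n) = T(n-1) + (n-1) = (n-1) + (n-2) + ... + 1 = n*(n-1)/2, which is O(n^2). In the best case the pivot lands near the middle and each level roughly halves the problem, giving T(n) = 2*T(n/2) + O(n), which resolves to O(n*logn).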
2. Space Complexity
The space complexity of the quicksort algorithm is O(logn) on average, for the recursion stack, and O(n) in the worst case; the array itself is sorted in place.
This is all about the quick sort algorithm; read the next article to learn about the merge sort algorithm.