Given the following 2d array:
6 8 11 17
9 11 14 20
18 20 23 29
24 26 29 35
Each row and column is sorted, and the diagonals (top left to bottom right) are sorted too. Assuming we have n² elements in the array (n = 4
in this case), it is trivial to use quicksort, which takes O(n² log(n²)) = O(n² log(n))
time to sort the 2d array. My question is: can we sort this in O(n²)?
The goal is to use the given semi-sorted 2d array and come up with a clever solution.
The target output is:
6 8 9 11
11 14 17 18
20 20 23 24
26 29 29 35
Yes, we can sort this in O(n^2) time.
Reduction to sorting a 1D array
Let us first show that this new problem of sorting a 2D array (such that each row, column, and top-left-to-bottom-right diagonal is sorted) can be reduced to the problem of sorting a 1D array of n^2 elements.
Suppose we have a sorted 1D array of n^2 elements. We can trivially rearrange this into a sorted n x n array by setting the first n numbers as the first row, then the next n numbers as the second row, and so on until we exhaust the array.
Hence, given a 2D array of n^2 numbers, we can transform it into a 1D array in O(n^2) time, sort this array, then transform it back to the desired 2D array in O(n^2) time. Thus, if we can find a sorting algorithm for a 1D array in O(n^2), we can equivalently solve this new problem in O(n^2) time.
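The reduction above can be sketched in a few lines of Python (the question doesn't name a language, so this is just an illustration; the function name sort_2d and the use of the built-in sort as a stand-in for "any 1D sort" are my own choices):

```python
def sort_2d(grid):
    """Flatten the n x n grid, sort the 1D list, then rebuild rows of n."""
    n = len(grid)
    flat = [x for row in grid for x in row]   # O(n^2) flatten
    flat.sort()                               # stand-in for any 1D sort
    return [flat[i * n:(i + 1) * n] for i in range(n)]

grid = [[6, 8, 11, 17],
        [9, 11, 14, 20],
        [18, 20, 23, 29],
        [24, 26, 29, 35]]
# sort_2d(grid) produces the target output shown in the question
```

Both transformations (flatten and rebuild) are O(n^2), so the overall cost is dominated by whatever 1D sort is plugged into the middle step.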
Sorting a 1D array in linear time
Given this, we simply need a linear-time sort: that is, given n^2 elements, sort them in O(n^2) time. Conveniently, there are multiple algorithms you can use to accomplish this, such as counting sort or radix sort, although they do come with various caveats. However, assuming a reasonable range of numerical values given the number of items to be sorted, these sorts will run in linear time.
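As one concrete option, here is a minimal counting sort sketch, under the caveat mentioned above: it assumes the values are non-negative integers bounded by some known max_value that is O(n^2) (the function name and signature are illustrative, not from the question):

```python
def counting_sort(values, max_value):
    """Counting sort: O(len(values) + max_value) time, O(max_value) space.
    Assumes every value is an integer in the range [0, max_value]."""
    counts = [0] * (max_value + 1)
    for v in values:
        counts[v] += 1          # tally occurrences of each value
    result = []
    for v, c in enumerate(counts):
        result.extend([v] * c)  # emit each value as many times as it appeared
    return result
```

If the value range were much larger than n^2, this would no longer be linear in the input size, which is exactly the kind of caveat that motivates the comparison-sort discussion below.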
Thus given n^2 elements in an n x n array, this 2D sorting problem can be reduced in O(n^2) time to a 1D sorting problem, which can then be solved with various linear time sorting algorithms in O(n^2) time. Hence, overall, this problem can be solved in O(n^2) time.
Sorting with a comparison sort
Following the discussion in the comments, the next step is to ask: what about comparison sorts? Comparison sorts are appealing because they would allow us to avoid the previously mentioned caveats of counting and radix sorts.
However, even with this additional information, a linear time comparison sort is unlikely in practice, because this would require us to compute the final position of each number in O(1) time. We know this isn't possible using a comparison sort.
Let's consider a small example: what should be the final sorted position of the number originally in row 1, column 2? We know that it has to be the first of the numbers in columns 2...n. However, we don't know where it belongs relative to the numbers in column 1 (other than the number in row 1, column 1).
In general, for any number in the original square, we are uncertain of its final sorted position relative to all numbers to its lower left and the numbers to its upper right. It would take O(log_2(n)) comparisons to find the relative position of each number, and there are O(n^2) numbers to position. This uncertainty prevents us from achieving a linear time sort in practice.
But the additional information that we have should allow us to achieve some speedups. For example, we could adapt merge sort to this problem. In a standard merge sort we start by splitting our original array in half and repeat until we have arrays of size 1, which are trivially sorted; we then repeatedly merge these subarrays until we have one single array. For n^2 elements, this gives a binary tree with log_2(n^2) layers, and each layer takes O(n^2) time to merge.
Using the additional information in your problem setup, we don't have to split the arrays until they are of size 1. Instead, we can start off with n sorted arrays of length n and start merging from there. This halves the number of layers we have to merge, and gives us a final runtime of O(n^2 log_2(n)).
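This adapted merge sort can be sketched as repeated pairwise merging of the n already-sorted rows (a hypothetical helper; heapq.merge is used here as a convenient two-way merge of sorted iterables):

```python
from heapq import merge  # merges already-sorted iterables lazily

def merge_sorted_rows(rows):
    """Start from n sorted runs (the rows) and merge pairs of runs per round.
    log_2(n) rounds of O(n^2) total merging work -> O(n^2 log n) overall,
    half the rounds a from-scratch merge sort would need."""
    runs = [list(r) for r in rows]
    while len(runs) > 1:
        merged = []
        for i in range(0, len(runs) - 1, 2):
            merged.append(list(merge(runs[i], runs[i + 1])))
        if len(runs) % 2:            # odd run out: carry it to the next round
            merged.append(runs[-1])
        runs = merged
    return runs[0]
```

The sorted 1D result can then be reshaped into rows of n as described in the reduction above.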
Conclusion
In practice, this additional information allows some speedups for comparison sorts, allowing us to achieve O(n^2 log_2(n)) run times.
But in order to achieve a linear time sort that runs in O(n^2) time, we have to rely on algorithms such as counting or radix sort.