In our discussion of Linear-Time Sorting, we talked about an algorithm called radix sort, which does successive iterations of another algorithm called counting sort, with each iteration focused on one digit (e.g., sorting by the last digit and ignoring all the others, then sorting by the second-to-last digit and ignoring all the others, and so on). Suppose that you replaced counting sort, at each step, with a hypothetical algorithm that was faster than counting sort but was not stable. If you did, would you expect radix sort to still come up with a correct result?

by Klevison (3.8k points)

1 Answer


Answer:

Step-by-step explanation:

If we replace counting sort with a faster but unstable algorithm as the per-digit subroutine, radix sort is no longer guaranteed to produce a correct result. An unstable sort does not preserve the relative order of elements with equal keys, so the ordering established by earlier passes on less significant digits can be destroyed by a later pass on a more significant digit whenever two elements have the same value at that digit.
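
For comparison, here is a minimal sketch (hypothetical Python, not from the original discussion; names are illustrative) of LSD radix sort with a stable counting sort on each digit. The right-to-left placement loop is what makes each per-digit pass stable, and that stability is exactly what the argument above relies on.

# Minimal sketch: LSD radix sort for non-negative integers (hypothetical example code).
def counting_sort_by_digit(a, exp):
    """Stable counting sort of a by the digit at position exp (1, 10, 100, ...)."""
    count = [0] * 10
    for x in a:
        count[(x // exp) % 10] += 1
    for d in range(1, 10):              # prefix sums: end position of each digit's bucket
        count[d] += count[d - 1]
    out = [0] * len(a)
    for x in reversed(a):               # walk right-to-left so equal digits keep their previous order
        d = (x // exp) % 10
        count[d] -= 1
        out[count[d]] = x
    return out

def radix_sort(a):
    exp = 1
    while a and max(a) // exp > 0:      # one stable pass per digit, least significant first
        a = counting_sort_by_digit(a, exp)
        exp *= 10
    return a

print(radix_sort([431, 135, 132]))      # [132, 135, 431]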

Consider the following scenario:

[431, 135, 132]

⇒ [431, 132, 135] (first pass: sorted by the least significant digit)

⇒ [132, 431, 135] (second pass: all three numbers have 3 as their middle digit, so an unstable sort is free to reorder them arbitrarily, destroying the order established by the first pass)

In the final pass on the most significant digit, 132 and 135 are tied again (both begin with 1), so an unstable pass may place 135 before 132 and finish with [135, 132, 431], which is not sorted. With an unstable subroutine, we therefore cannot count on getting a correct result.
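
To make the failure mode concrete, here is a hedged sketch (hypothetical Python; the tie-breaking rule is invented just to make the instability deterministic) in which each pass still sorts correctly by its digit but emits tied elements in descending numeric order instead of preserving their previous order:

def unstable_digit_pass(a, exp):
    """Sort a by the digit at position exp, but break ties arbitrarily (NOT stable)."""
    buckets = [[] for _ in range(10)]
    for x in a:
        buckets[(x // exp) % 10].append(x)
    out = []
    for b in buckets:
        out.extend(sorted(b, reverse=True))   # ties emitted in descending order, ignoring the previous pass
    return out

def radix_sort_with_unstable_passes(a):
    exp = 1
    while a and max(a) // exp > 0:
        a = unstable_digit_pass(a, exp)
        exp *= 10
    return a

print(radix_sort_with_unstable_passes([431, 135, 132]))   # [135, 132, 431] -- not sorted

Every pass handles the digit keys correctly; only the tie handling differs from the stable sketch above, and that alone is enough to produce an unsorted final order.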

by BryanJ (4.2k points)