Final answer:
When applying quickselect N^(1/2) times, the worst-case complexity is θ(N^(5/2)) and the average case is θ(N^(3/2)). Quicksort once followed by selecting N^(1/2) elements has an average complexity of θ(N log N), dominated by the sort; its worst case is θ(N^2), inherited from quicksort itself.
Step-by-step explanation:
The quickselect algorithm is an efficient way of finding the k-th smallest element in an unordered list, and is particularly useful when such an element needs to be found quickly without sorting the entire array. It operates by partitioning the array around a pivot element, then recursing only into the part of the array that may contain the k-th smallest element. This differs from quicksort, which recurses into both parts of the array to sort them fully.
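As a minimal sketch of this idea (0-indexed rank, extra lists used for clarity instead of in-place partitioning; the function name is illustrative, not from the source):

```python
import random

def quickselect(a, k):
    """Return the k-th smallest element (0-indexed) of list a.

    Partitions around a random pivot and recurses only into the side
    that can contain the k-th smallest element, unlike quicksort,
    which would recurse into both sides.
    """
    pivot = random.choice(a)
    lo = [x for x in a if x < pivot]   # elements smaller than the pivot
    eq = [x for x in a if x == pivot]  # elements equal to the pivot
    hi = [x for x in a if x > pivot]   # elements larger than the pivot
    if k < len(lo):
        return quickselect(lo, k)
    if k < len(lo) + len(eq):
        return pivot
    return quickselect(hi, k - len(lo) - len(eq))
```

With a random pivot, each call does θ(N) partitioning work on average, giving the θ(N) expected running time used in the analysis below; an adversarial pivot sequence degrades a single call to θ(N^2).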
Applying quickselect N^(1/2) times to find different order statistics changes the complexity analysis. Each application has a worst-case time complexity of θ(N^2), so applying it N^(1/2) times gives a total worst-case complexity of θ(N^(5/2)). On average, however, quickselect runs in θ(N) time, so N^(1/2) independent applications cost θ(N^(3/2)) on average.
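The repeated-quickselect approach can be sketched as follows (a toy illustration with assumed helper names; `quickselect` is any θ(N)-average selection routine as described above):

```python
import random

def quickselect(a, k):
    # Standard randomized quickselect: k-th smallest (0-indexed) of a.
    pivot = random.choice(a)
    lo = [x for x in a if x < pivot]
    eq = [x for x in a if x == pivot]
    if k < len(lo):
        return quickselect(lo, k)
    if k < len(lo) + len(eq):
        return pivot
    return quickselect([x for x in a if x > pivot], k - len(lo) - len(eq))

def repeated_select(a, ks):
    # One independent quickselect per requested rank:
    # sqrt(N) ranks x Theta(N) average each = Theta(N^(3/2)) on average.
    return [quickselect(a, k) for k in ks]

data = list(range(100))
random.shuffle(data)
ranks = [i * 10 for i in range(10)]  # sqrt(100) = 10 order statistics
print(repeated_select(data, ranks))  # [0, 10, 20, ..., 90]
```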
If quicksort were instead used to sort the entire array once, the cost of obtaining the N^(1/2) order statistics changes substantially. Quicksort's average time complexity is θ(N log N), and the sorted array then allows immediate lookup of any order statistic. After sorting, each of the N^(1/2) elements can be selected in constant time, for a total of θ(N log N) for the sorting plus θ(N^(1/2)) for the selections.
However, the sorting term dominates, so the total average-case complexity remains θ(N log N), while the worst case is quicksort's θ(N^2). Consequently, if sorting the array is viable and N^(1/2) selections are needed, it is computationally more efficient to sort once with quicksort and then perform the selections.
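The sort-once strategy reduces to a single sort plus constant-time indexing (a sketch; the function name is illustrative, and Python's built-in `sorted` stands in for quicksort since both are θ(N log N) on average):

```python
import random

def select_many_by_sorting(a, ks):
    # Theta(N log N) sort once; afterwards every order statistic
    # (0-indexed rank k) is a single O(1) index into the sorted array.
    s = sorted(a)
    return [s[k] for k in ks]

data = list(range(100))
random.shuffle(data)
ranks = [i * 10 for i in range(10)]  # sqrt(100) = 10 order statistics
print(select_many_by_sorting(data, ranks))  # [0, 10, 20, ..., 90]
```

For N^(1/2) queries the total is θ(N log N) + θ(N^(1/2)) = θ(N log N), which beats the θ(N^(3/2)) average of repeated quickselect.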