Runtime of tree sort algorithm confusion

Can anyone explain to me why the average-case runtime complexity of the program here – https://www.geeksforgeeks.org/tree-sort/ – is $O(n \log n)$ and not $O(n^2 \log n)$? Similarly, why is the worst-case time complexity $O(n^2)$ and not $O(n^3)$?

The explanations for both the average and worst-case runtimes seem to consider only inserting the elements from the array into the tree. The runtime of an inorder tree traversal is $O(n)$, so shouldn’t the runtimes in the link be multiplied by $n$?

Is it because the elements are simply being printed out and not added to a new array?
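For reference, a minimal tree-sort sketch (my own, assuming a plain unbalanced BST as in the linked article) makes the two phases explicit: $n$ insertions of $O(\text{height})$ each, followed by a single $O(n)$ inorder traversal. The two costs add rather than multiply, which is why the traversal does not contribute an extra factor of $n$.

```python
class Node:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

def insert(root, key):
    # One insertion walks a single root-to-leaf path: O(height).
    if root is None:
        return Node(key)
    if key < root.key:
        root.left = insert(root.left, key)
    else:
        root.right = insert(root.right, key)
    return root

def tree_sort(arr):
    # Phase 1: n insertions, O(n * height) total.
    root = None
    for x in arr:
        root = insert(root, x)
    # Phase 2: one inorder traversal visiting each node once, O(n).
    out = []
    def inorder(node):
        if node:
            inorder(node.left)
            out.append(node.key)
            inorder(node.right)
    inorder(root)
    return out
```
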

Combining merge sort with insertion sort – Time complexity

I am learning algorithms from the CLRS book on my own, without any help. It has an exercise that combines merge sort ($O(n \log n)$) with insertion sort ($O(n^2)$). It says that when the sub-arrays in the merge sort reach a certain size “k”, it is better to use insertion sort on those sub-arrays instead of merge sort. The reason given is that the constant factors in insertion sort make it fast for small n. Can someone please explain this?

It asks us to show that $n/k$ sublists, each of length $k$, can be sorted by insertion sort in $O(nk)$ worst-case time. I found from somewhere that the solution for this is $O((n/k) \cdot k^2) = O(nk)$. How do we get this $O((n/k) \cdot k^2)$ part?

Thanks!
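A sketch of the exercise’s idea (names and the default cutoff are my own choices, not from CLRS): each length-$k$ sublist costs $O(k^2)$ with insertion sort, and there are $n/k$ of them, giving $O((n/k) \cdot k^2) = O(nk)$ for the base level in total.

```python
def insertion_sort(a, lo, hi):
    # Sorts a[lo:hi] in place: O((hi - lo)^2) worst case.
    for i in range(lo + 1, hi):
        key = a[i]
        j = i - 1
        while j >= lo and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key

def merge(left, right):
    # Standard O(n) two-way merge of two sorted lists.
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]

def hybrid_merge_sort(a, lo=0, hi=None, k=16):
    # Once a sub-array is of size <= k, fall back to insertion sort;
    # above the cutoff, recurse and merge as in ordinary merge sort.
    if hi is None:
        hi = len(a)
    if hi - lo <= k:
        insertion_sort(a, lo, hi)
        return
    mid = (lo + hi) // 2
    hybrid_merge_sort(a, lo, mid, k)
    hybrid_merge_sort(a, mid, hi, k)
    a[lo:hi] = merge(a[lo:mid], a[mid:hi])
```
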

Time complexity of a hybrid merge and selection sort algorithm

I’m trying to analyse the time and space complexity of the following algorithm, which is essentially a hybrid of a merge and selection sort. The algorithm is defined as follows:

    def hybrid_merge_selection(L, k = 0):
        N = len(L)
        if N == 1:
            return L
        elif N <= k:
            return selection_sort(L)
        else:
            # k must be passed down, otherwise the recursive calls
            # fall back to the default k = 0
            left_sublist = hybrid_merge_selection(L[:N // 2], k)
            right_sublist = hybrid_merge_selection(L[N // 2:], k)
            return merge(left_sublist, right_sublist)

My thinking is that the worst-case scenario occurs when $k$ is extremely large, so the selection sort algorithm is always applied, resulting in a time complexity of $O(n^2)$, where $n$ is the length of the list; the best-case scenario occurs when $k == 0$, so only the merge sort algorithm is applied, resulting in a time complexity of $O(n\log_{2}n)$. However, could somebody give me a more detailed and mathematical explanation of the time complexity for all scenarios, namely worst, best and average?
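One way to make the analysis precise (my own working, assuming selection_sort and merge are the standard $O(m^2)$ and $O(m)$ routines): the recursion halves the list until the sublists have length at most $k$, producing about $n/k$ sublists. Selection sort costs $O(k^2)$ per sublist, so $O((n/k) \cdot k^2) = O(nk)$ in total at the base level. Above the cutoff there are about $\log_2(n/k)$ levels of merging, each costing $O(n)$, for $O(n \log(n/k))$. Hence

$$T(n) = O\!\left(nk + n \log\frac{n}{k}\right),$$

which gives $O(n \log n)$ when $k$ is a constant and $O(n^2)$ when $k \geq n$, matching the two extremes above. Since selection sort performs the same comparisons on every input, the best, worst and average cases of the hybrid differ only in the constant factors, not in this bound.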

Analysing worst case time complexity of quick sort for various cases

I am trying to understand worst case time complexity of quick sort for various pivots. Here is what I came across:

  1. When the array is already sorted in either ascending or descending order and we select either the leftmost or rightmost element as the pivot, it results in worst-case $O(n^2)$ time complexity.

  2. When the array is not already sorted and we select a random element as the pivot, the worst-case “expected” time complexity is $O(n \log n)$, but the worst-case time complexity is still $O(n^2)$. [1]

  3. When we select the median [2] of the first, last and middle elements as the pivot, it results in a worst-case time complexity of $O(n \log n)$. [1]

I have the following doubts:

D1. Link 2 says that if all elements in the array are the same, then both the random pivot and the median pivot lead to $O(n^2)$ time complexity. However, link 1 says the median pivot yields $O(n \log n)$ worst-case time complexity. Which is correct?

D2. How can the median of the first, last and middle elements be the median of all the elements?

D3. What do we do when the random pivot is the $i$th element? Do we always have to swap it with either the leftmost or rightmost element before partitioning? Or is there an algorithm that does not require such a swap?
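Regarding the mechanics in D3, a common approach looks like this (a sketch, assuming Lomuto partitioning; the names are mine): pick a random index, swap that element to the rightmost position, then partition exactly as if the rightmost element had been the pivot all along. No separate algorithm is needed, just the one swap. (Hoare-style partitioning can also work with the pivot left in place, since it only uses the pivot’s value, not its position.)

```python
import random

def partition_random(a, lo, hi):
    # Pick a random pivot index and swap that element to the end,
    # then run the standard Lomuto partition on a[lo..hi].
    p = random.randint(lo, hi)
    a[p], a[hi] = a[hi], a[p]
    pivot = a[hi]
    i = lo - 1
    for j in range(lo, hi):
        if a[j] <= pivot:
            i += 1
            a[i], a[j] = a[j], a[i]
    a[i + 1], a[hi] = a[hi], a[i + 1]
    return i + 1

def quicksort(a, lo=0, hi=None):
    if hi is None:
        hi = len(a) - 1
    if lo < hi:
        m = partition_random(a, lo, hi)
        quicksort(a, lo, m - 1)
        quicksort(a, m + 1, hi)
```
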

Selection Sort vs Merge Sort

I recently sat a written test for the recruitment of Scientists/Engineers at ISRO (Indian Space Research Organisation), and the following question appeared in the test.
Of the following algorithms, which has execution time that is least dependent on initial ordering of the input?

  1. Insertion Sort
  2. Quick Sort
  3. Merge Sort
  4. Selection Sort

Well, if the array is sorted, then in a two-way merge the left subarray will always be exhausted first, and we simply copy in the right subarray; so the number of comparisons each time equals the length of the left subarray. In selection sort, if the array is already sorted, the number of comparisons is the same as in the worst case, though the index of the minimum element changes only after one full pass.

In the worst case, the number of comparisons in a two-way merge is (length of left subarray + length of right subarray − 1). In selection sort’s worst case, the minimum element’s index keeps changing after every comparison.
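The claim about selection sort can be checked empirically (a quick sketch of my own; it counts element comparisons only, not swaps): selection sort always performs exactly $n(n-1)/2$ comparisons, regardless of the input’s initial ordering.

```python
def selection_sort_comparisons(a):
    # Runs selection sort on a copy and returns the number of
    # element comparisons performed.
    a = list(a)
    n = len(a)
    comparisons = 0
    for i in range(n - 1):
        min_idx = i
        for j in range(i + 1, n):
            comparisons += 1
            if a[j] < a[min_idx]:
                min_idx = j
        a[i], a[min_idx] = a[min_idx], a[i]
    return comparisons
```

Sorted, reversed and shuffled inputs of the same length all yield the same count, which is the sense in which selection sort’s execution time is least dependent on the initial ordering.
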

Only 1 option can be correct. So, what’s the best answer?

How to sort ints with user input?

I am trying to sort in ascending or descending order based on user input, but it’s giving me a syntax error that I have no idea about. This is the code that I have.

    N = [10, 2, 1, 8, 5, 7, 6, 4, 3]
    A = int(input("Ascending or Desending order, press 1 for acsending or 2 for descending order")
    if (number == 1):
        N.sort()
        print(N)
    elif (number == 2):
        N.sort(reverse = True)
        print(N)
    else:
        print("Sorry, but there is no sorting pattern")
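For reference, a corrected sketch (my own fix, restructured as a function so it runs without stdin): the `int(input(...))` call above is missing its closing parenthesis, which is the syntax error, and the result is stored in `A` but then tested as `number`; closing the call and using one consistent name fixes both problems.

```python
def sort_numbers(numbers, choice):
    # choice: 1 for ascending, 2 for descending; None otherwise.
    if choice == 1:
        return sorted(numbers)
    elif choice == 2:
        return sorted(numbers, reverse=True)
    return None

N = [10, 2, 1, 8, 5, 7, 6, 4, 3]
# In the original script the choice would come from user input, e.g.:
# choice = int(input("Press 1 for ascending or 2 for descending order: "))
print(sort_numbers(N, 1))
```
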

A sort of job scheduling problem

I have been thinking about the following problem:

Let $J$ be a set of jobs that need to be performed. Each job comprises some number ($>1$) of tasks, and a job is considered finished when all of its tasks have been completed. Find an ordering in which to perform the tasks such that the sum of the times needed to finish each job is minimised. Assume that each task takes one unit of time, and that the time to finish a job is the time that passes from when the first of the job’s tasks starts until its last task finishes.

For example:

Suppose we have 3 jobs: $J_1 = \{a,b,c\}$, $J_2 = \{b,d\}$ and $J_3 = \{a,c\}$, where $a,b,c,d$ are tasks. An optimal ordering for this set of tasks is $a-c-b-d$, because the time needed to finish $J_1$ is 2, the time needed to finish $J_2$ is 1, and the time needed to finish $J_3$ is also 1. In total, the time $t$ to finish all jobs is 2+1+1 = 4.

An example of a bad ordering is $b-a-c-d$, which would result in $t = 2+3+1 = 6$.

I was thinking that maybe I could use dynamic programming to find an optimal ordering, but ultimately I cannot do better than checking every possible order. Is there any trick to finding optimal orderings, or are there any related problems I could study?
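For small instances, the brute force the question mentions can at least be stated precisely (a sketch of my own; it uses the convention from the example, where a job’s time is the position of its last task minus the position of its first):

```python
from itertools import permutations

def cost(order, jobs):
    # Sum over jobs of (position of last task - position of first task)
    # in the given ordering, matching the example's accounting.
    pos = {task: i for i, task in enumerate(order)}
    return sum(max(pos[t] for t in job) - min(pos[t] for t in job)
               for job in jobs)

def best_ordering(tasks, jobs):
    # Exhaustive search over all task orderings: O(n!) but exact,
    # useful as a baseline for checking any cleverer method.
    return min(permutations(tasks), key=lambda order: cost(order, jobs))
```

For the example above, `cost(('a','c','b','d'), jobs)` gives 4 and `cost(('b','a','c','d'), jobs)` gives 6, and the exhaustive search confirms 4 is optimal (each job’s time is at least its size minus one, so the sum cannot drop below 2+1+1).
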