Eight sorting algorithms, time complexity, and space complexity

Posted by Mikemcs on Mon, 03 Jan 2022 08:07:31 +0100

1. Complexity

1.1 Time complexity

Total number of statement executions: T(n) = O(f(n))
f(n) is a function of the problem size n.

This means that as the problem size n grows, the growth rate of the algorithm's running time is the same as that of f(n).
Generally, we only care about the term that dominates the function as n tends to infinity, that is, the highest-order term.

Constant order O(1)

int i = 1;
int j = 2;
++i;
j++;
int m = i + j;

The cost of executing the code above does not grow with any variable. Therefore, no matter how long such code is, even tens of thousands of lines, its time complexity is O(1).

Linear order O(n)

for(i=1; i<=n; ++i){
   j = i;
   j++;
}

The code inside the for loop executes n times, so its running time grows with n; the time complexity of such code is O(n).

Logarithmic order O(log n)

int i = 1;
while(i<n){
    i = i * 2;
}

After about log2(n) iterations, i reaches n and the loop ends, so the time complexity is O(log n).

Linear logarithmic order O(n log n)

for(m=1; m<n; m++){
    i = 1;
    while(i<n)    {
        i = i * 2;
    }
}

If code with time complexity O(log n) is run in a loop n times, the total time complexity is n * O(log n), that is, O(n log n).

Square order O(n^2)

for(x=1; x<=n; x++){
   for(i=1; i<=n; i++)    {
       j = i;
       j++;
    }
}

The two nested loops each run n times, giving a time complexity of O(n^2). If the bound of one of the loops is changed from n to m, the time complexity becomes O(m*n).

The commonly used time complexity is as follows:
O(1) < O(log n) < O(n) < O(n log n) < O(n^2) < O(n^3) < O(2^n) < O(n!) < O(n^n)

1.2 Space complexity

S(n) = O(f(n)): the storage space consumed by the algorithm. Space complexity measures the storage space an algorithm temporarily occupies while it runs; like time complexity, it describes a growth trend.
f(n) is a function of the storage space occupied with respect to n.

Space complexity O(1)

If the temporary space required by the algorithm does not change with the size of the variable n, the space complexity is a constant, written O(1).
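For instance, summing n numbers with a single accumulator uses constant extra space even though its running time is O(n). This is an illustrative sketch; the function name sum_to_n is my own choice:

```python
def sum_to_n(n):
    # Only two scalar variables (total and i) are used regardless of n,
    # so the extra space is constant: S(n) = O(1)
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

print(sum_to_n(100))  # 5050
```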

Space complexity O(n)

int[] m = new int[n];
for(i=1; i<=n; ++i){
   j = i;
   j++;
}

The first line allocates a new array of size n. Although the following lines contain a loop, they allocate no new space, so the space complexity of this code is determined by the first line: S(n) = O(n).

2. Eight sorting algorithms

2.1 Quick sort

Basic idea: pick a pivot and, with one partitioning pass, split the data into two independent parts so that every element of one part is smaller than every element of the other; then recursively quick-sort the two parts in the same way.

def quicksort(lst):
    if len(lst) < 2:
        return lst
    else:
        temp = lst[0]  # Use the first element as the pivot
        less = [i for i in lst[1:] if i <= temp]  # Elements no larger than the pivot (keeps duplicates)
        more = [i for i in lst[1:] if i > temp]  # Elements larger than the pivot
        return quicksort(less) + [temp] + quicksort(more)  # Concatenate the three sorted parts

testArr = [11, 23, 45, 23, 55, 99, 22]
print(quicksort(testArr))

2.2 Bubble sort

Basic idea: repeatedly pass over the sequence to be sorted, comparing adjacent elements and swapping them so that the smaller one is on the left, until a pass requires no exchanges.
The larger elements slowly "bubble" toward the end of the sequence through these exchanges, hence the name "bubble sort".

def bubblesort(lst):
    for i in range(len(lst) - 1):  # At most n-1 passes: the last element has nothing left to compare with
        flag = 0
        for j in range(len(lst) - 1 - i):  # The last i elements are already in place, so compare len(lst) - 1 - i pairs
            if lst[j] > lst[j + 1]:
                lst[j], lst[j + 1] = lst[j + 1], lst[j]
                flag = 1
        if flag == 0:  # No swap happened in this pass, so the list is already sorted; exit early
            break
        print(lst)

testArr = [1, 12, 32, 24, 5, 8, 7]
bubblesort(testArr)

Operation results:

[1, 12, 24, 5, 8, 7, 32]
[1, 12, 5, 8, 7, 24, 32]
[1, 5, 8, 7, 12, 24, 32]
[1, 5, 7, 8, 12, 24, 32]

2.3 Insertion sort

Basic idea: keep the front part of the array sorted; take the next unsorted element and insert it into its proper position in the sorted part, shifting larger elements back one place.

def insertionSort(arr):
    for i in range(1, len(arr)):
        preIndex = i - 1  # Index of the last element of the sorted prefix
        current = arr[i]  # Element to be inserted
        while preIndex >= 0 and arr[preIndex] > current:  # While the prefix element is larger than the element to insert
            arr[preIndex + 1] = arr[preIndex]  # Shift it one position to the right
            preIndex -= 1  # Move forward for the next comparison
        arr[preIndex + 1] = current  # Insert the element just after the first smaller element
    return arr


testArr = [12, 234, 22, 123, 33, 78, 95, 30]
print(insertionSort(testArr))

2.4 Shell sort

Basic idea: perform insertion sorts on the array at several decreasing intervals (gaps).
First split the whole sequence into several subsequences (each formed by elements a fixed "increment" apart) and insertion-sort each of them; then reduce the increment and repeat. When the increment reaches 1 the elements are already nearly in order, and one final direct insertion sort finishes the job, because direct insertion sort is very efficient when the elements are almost sorted (close to its best case).

import math

def shellSort(arr):
    gap = 1
    while gap < len(arr) / 3:  # Build the largest Knuth gap (1, 4, 13, 40, ...) below n/3
        gap = gap * 3 + 1
    while gap > 0:
        for i in range(gap, len(arr)):  # Insertion sort over elements gap apart
            temp = arr[i]
            j = i - gap
            while j >= 0 and arr[j] > temp:
                arr[j + gap] = arr[j]
                j -= gap
            arr[j + gap] = temp
        gap = math.floor(gap / 3)  # Shrink the gap; the final pass uses gap 1
    return arr

testArr = [12, 234, 22, 123, 33, 78, 95, 30]
print(shellSort(testArr))

2.5 Selection sort

Basic idea: find the smallest (or largest) element in the unsorted part and place it at the start of the sorted part; then find the smallest element among the remaining unsorted elements and append it to the end of the sorted part. Repeat until all elements are sorted.

def selectionSort(arr):
    for i in range(len(arr) - 1):
        # Record the index of the smallest number
        minIndex = i
        for j in range(i + 1, len(arr)):
            if arr[j] < arr[minIndex]:
                minIndex = j
        # If position i does not already hold the minimum, swap it with the minimum
        if i != minIndex:
            arr[i], arr[minIndex] = arr[minIndex], arr[i]
    return arr


testArr = [12, 54, 77, 46, 33, 41, 29, 75]
print(selectionSort(testArr))

2.6 Heap sort

Basic idea: build a max-heap from the array, then repeatedly swap the heap's root (the current maximum) with the last element of the heap, shrink the heap by one, and sift the new root down to restore the heap property.
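A minimal heap-sort sketch in the same style as the earlier examples (the function names heapSort/siftDown and the test array are illustrative choices, not from the original):

```python
def heapSort(arr):
    def siftDown(start, end):
        # Push arr[start] down until the max-heap property holds in arr[start..end]
        root = start
        while 2 * root + 1 <= end:
            child = 2 * root + 1
            # Pick the larger of the two children
            if child + 1 <= end and arr[child] < arr[child + 1]:
                child += 1
            if arr[root] < arr[child]:
                arr[root], arr[child] = arr[child], arr[root]
                root = child
            else:
                return

    # Build a max-heap from the unsorted array, bottom-up
    for start in range(len(arr) // 2 - 1, -1, -1):
        siftDown(start, len(arr) - 1)
    # Repeatedly move the maximum to the end and restore the heap
    for end in range(len(arr) - 1, 0, -1):
        arr[0], arr[end] = arr[end], arr[0]
        siftDown(0, end - 1)
    return arr

testArr = [12, 54, 77, 46, 33, 41, 29, 75]
print(heapSort(testArr))  # [12, 29, 33, 41, 46, 54, 75, 77]
```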

2.7 Merge sort

Basic idea: a divide-and-conquer sort. Recursively split the sequence in half until each part contains at most one element, then merge pairs of sorted subsequences into larger sorted sequences until the whole array is merged.
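A minimal top-down merge-sort sketch in the same style (the function name mergeSort and the test array are illustrative):

```python
def mergeSort(lst):
    if len(lst) < 2:
        return lst
    mid = len(lst) // 2
    left = mergeSort(lst[:mid])   # Sort the left half
    right = mergeSort(lst[mid:])  # Sort the right half
    merged = []
    i = j = 0
    # Merge the two sorted halves by repeatedly taking the smaller head
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    # One of the halves may still have leftover elements
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

testArr = [11, 23, 45, 23, 55, 99, 22]
print(mergeSort(testArr))  # [11, 22, 23, 23, 45, 55, 99]
```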

2.8 Radix sort

Basic idea: distribute the numbers into buckets digit by digit, starting from the least significant digit and collecting the buckets in order after each pass; after the pass over the most significant digit, the sequence is sorted.
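A minimal least-significant-digit (LSD) radix-sort sketch for non-negative integers (the function name radixSort and the test array are illustrative):

```python
def radixSort(arr):
    # LSD radix sort for non-negative integers, bucketing by each decimal digit
    if not arr:
        return arr
    digits = len(str(max(arr)))  # Number of passes = digits of the largest value
    for d in range(digits):
        buckets = [[] for _ in range(10)]  # One bucket per decimal digit 0-9
        for num in arr:
            buckets[(num // 10 ** d) % 10].append(num)  # d-th digit from the right
        # Collect the buckets in order; earlier passes' order is preserved (stable)
        arr = [num for bucket in buckets for num in bucket]
    return arr

testArr = [12, 234, 22, 123, 33, 78, 95, 30]
print(radixSort(testArr))  # [12, 22, 30, 33, 78, 95, 123, 234]
```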

Topics: Algorithms, data structures