Interview essentials: bubble sort, selection sort, insertion sort, merge sort and quick sort

Posted by programming.name on Sun, 21 Nov 2021 03:42:01 +0100

For back-end development positions, sorting is an unavoidable topic. Bubble sort, selection sort, insertion sort, merge sort, quick sort and heap sort are the technical points I have been asked about most often since the autumn recruitment season.

Sorting algorithms have two important properties to know:

  1. Memory consumption: the memory usage of a sorting algorithm is measured by its space complexity, and here the notion of in-place sorting matters. An in-place sorting algorithm is one whose space complexity is O(1). Bubble sort, insertion sort and selection sort are all in-place sorting algorithms.
  2. Stability: if the sequence to be sorted contains elements with equal values, a stable sort keeps those equal elements in their original relative order after sorting.

For example, take the data 2 9 3 4 8 3. Sorted from small to large it becomes 2 3 3 4 8 9. If, after sorting, the relative order of the two 3's is unchanged, the algorithm is a stable sort; otherwise it is an unstable sort.
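
To make stability concrete, here is a small illustration of my own (not from the original post) using the standard library: each value is tagged with its original index, and std::stable_sort is guaranteed to keep the two 3's in their original relative order.

#include <algorithm>
#include <iostream>
#include <utility>
#include <vector>

int main(){
	//(value, original index) pairs for the data 2 9 3 4 8 3
	std::vector<std::pair<int,int>> data = {{2,0},{9,1},{3,2},{4,3},{8,4},{3,5}};
	//Compare by value only, so the two 3's count as equal elements
	std::stable_sort(data.begin(), data.end(),
		[](const std::pair<int,int> &a, const std::pair<int,int> &b){
			return a.first < b.first;
		});
	for(const auto &p : data)
		std::cout << p.first << "(from index " << p.second << ") ";
	std::cout << '\n'; //The 3 from index 2 still comes before the 3 from index 5
	return 0;
}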

Algorithm name    Time complexity   Stable sort   In-place sort
Bubble sort       O(N^2)            yes           yes
Insertion sort    O(N^2)            yes           yes
Selection sort    O(N^2)            no            yes
Merge sort        O(N log N)        yes           no
Quick sort        O(N log N)        no            yes
Heap sort         O(N log N)        no            yes

Bubble sort

  1. The average time complexity is O(N^2)
  2. The best case is O(N), when the array is already ordered (this needs the early-exit optimization sketched after the code block below)
  3. The worst case is O(N^2), when the array is in reverse order
  4. The space complexity is O(1)

Bubble sort only operates on adjacent pairs of data. Each bubbling pass compares adjacent elements and checks whether they satisfy the required order; if not, they are swapped. One pass moves at least one element to its final position, so repeating the pass n times sorts all n elements.

 

#include <vector>
using std::vector;

class Sort{
public:
	void MaoPao_Sort(vector<int> &arr){
		//1. Nothing to sort for fewer than two elements
		if(arr.size() < 2) return;
		int length = arr.size();
		for(int i = 0; i < length; i++){
			//After pass i, the last i elements are already in their final place
			for(int j = 0; j < length - i - 1; j++){
				if(arr[j] > arr[j+1]){
					//Swap adjacent elements that are out of order
					int temp = arr[j];
					arr[j] = arr[j+1];
					arr[j+1] = temp;
				}
			}
		}
	}
};
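
The O(N) best case mentioned above relies on stopping early once a full pass makes no swaps. A minimal sketch of that optimization (the flag and the function name MaoPao_Sort_Flag are my own, not from the original post):

#include <vector>
using std::vector;

//Bubble sort with an early-exit flag: if a pass performs no swaps,
//the array is already ordered and we can stop, which gives the O(N) best case
void MaoPao_Sort_Flag(vector<int> &arr){
	if(arr.size() < 2) return;
	int length = arr.size();
	for(int i = 0; i < length; i++){
		bool swapped = false;
		for(int j = 0; j < length - i - 1; j++){
			if(arr[j] > arr[j+1]){
				int temp = arr[j];
				arr[j] = arr[j+1];
				arr[j+1] = temp;
				swapped = true;
			}
		}
		if(!swapped) break; //No swaps in this pass: already sorted
	}
}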

Insertion sort

The idea behind insertion sort comes from inserting an element into an already ordered array: find the appropriate position for the element and shift the elements behind it backwards to make room.

First, divide the data in the array into two intervals: a sorted interval and an unsorted interval. Initially the sorted interval contains only one element, the first element of the array. The core idea of insertion sort is to take an element from the unsorted interval, find the appropriate position for it in the sorted interval and insert it there, keeping the sorted interval ordered at all times. Repeat this process until the unsorted interval is empty, at which point the algorithm ends.

 

#include <vector>
using std::vector;

class Sort{
public:
	void Insert_Sort(vector<int> &arr){
		//1. Nothing to sort for fewer than two elements
		if(arr.size() < 2) return;
		int length = arr.size();
		for(int i = 1; i < length; i++){ //Take elements from the unsorted interval
			int temp = arr[i];
			int j = i - 1; //End of the sorted interval
			while(j >= 0 && temp < arr[j]){
				//Shift larger elements back to make room for the insertion
				arr[j+1] = arr[j];
				j--;
			}
			arr[j+1] = temp; //Insert the element
		}
	}
};

Selection sort

The idea of selection sort is somewhat similar to insertion sort: the array is again divided into a sorted interval and an unsorted interval. However, selection sort finds the smallest element in the unsorted interval each time and puts it at the end of the sorted interval. Because this swap can move an element past other elements equal to it, selection sort is not stable.

 

#include <vector>
using std::vector;

class Sort{
public:
	void Select_Sort(vector<int> &arr, int length){
		for(int i = 0; i < length - 1; i++){
			int min_number = arr[i]; //Current minimum of the unsorted interval
			int flag = i;            //Index of the current minimum
			for(int j = i; j < length; j++){
				if(min_number > arr[j]){
					min_number = arr[j];
					flag = j;
				}
			}
			//Swap the minimum into the end of the sorted interval
			arr[flag] = arr[i];
			arr[i] = min_number;
		}
	}
};

Merge sort

Merge sort uses the divide-and-conquer idea, working from the top down: the data is first split recursively and then merged, with the merged data written into a temporary array so that the relative order of equal elements is preserved. It is a stable sort but not an in-place sort. The time complexity is O(N log N) and the space complexity is O(N).

#include <vector>
using std::vector;

class Sort{
public:
	//Merge sort entry point
	void MergeSort(vector<int> &arr){
		if(arr.size() < 2){
			return;
		}
		//Recursively split, then merge
		Merge_Process(arr, 0, (int)arr.size() - 1);
	}
	//Split first: this is the split function
	void Merge_Process(vector<int> &arr, int start, int end){
		//Termination condition of the recursion: a single element is already sorted
		if(start >= end) return;
		int mid = ((end - start) / 2) + start;
		Merge_Process(arr, start, mid);
		Merge_Process(arr, mid + 1, end);
		//Merge the two sorted halves
		Merge(arr, start, mid, end);
	}
	//Merge function
	void Merge(vector<int> &arr, int start, int mid, int end){
		vector<int> temp(end - start + 1, 0); //Temporary array for the merge
		int tempIndex = 0;        //Index into the temporary array
		int leftIndex = start;    //Index into the left half
		int rightIndex = mid + 1; //Index into the right half
		while(leftIndex <= mid && rightIndex <= end){
			//Take from the left half on ties to keep the sort stable
			if(arr[leftIndex] <= arr[rightIndex]){
				temp[tempIndex++] = arr[leftIndex++];
			}else{
				temp[tempIndex++] = arr[rightIndex++];
			}
		}
		while(leftIndex <= mid){
			temp[tempIndex++] = arr[leftIndex++];
		}
		while(rightIndex <= end){
			temp[tempIndex++] = arr[rightIndex++];
		}
		//Copy the merged result back into the original array
		for(int i = 0; i < (int)temp.size(); i++){
			arr[start + i] = temp[i];
		}
	}
};

Quick sort

Quick sort partitions first and then deals with the subproblems: a pivot is chosen from the interval, elements smaller than the pivot end up on its left and larger ones on its right, and the two sides are then processed recursively. The time complexity is O(N log N); apart from the recursion stack no extra space is used, so it counts as an in-place sort, but it is not a stable sort.

To optimize quick sort there are the median-of-three method and the random pivot method; both are meant to avoid degenerate partitions, for example when the array to be sorted is already (nearly) ordered. What I demonstrate here is the random method.

#include <cstdlib>   //rand
#include <utility>   //std::swap
#include <vector>
using std::vector;

class Sort{
public:
	void quickSort(vector<int> &arr, int begin, int end){
		if(begin < end){
			//Pick a random pivot index inside [begin, end]
			int index = rand() % (end - begin + 1) + begin;
			//Move the randomly chosen pivot to the front of the interval
			std::swap(arr[begin], arr[index]);
			int i = begin;
			int j = end;
			int base = arr[i]; //Pivot value
			while(i < j){
				//Scan from the right for an element smaller than the pivot
				while(i < j && arr[j] >= base){
					j--;
				}
				arr[i] = arr[j];
				//Scan from the left for an element not smaller than the pivot
				while(i < j && arr[i] < base){
					i++;
				}
				arr[j] = arr[i];
			}
			//Put the pivot back into its final position
			arr[i] = base;
			//Recurse on the two subproblems
			quickSort(arr, begin, i - 1);
			quickSort(arr, i + 1, end);
		}
	}
};
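
A small usage sketch of my own (not part of the original post), assuming the Sort class above is in scope; rand() should be seeded once, for example with srand, so the random pivot actually varies between runs:

#include <cstdlib>
#include <ctime>
#include <iostream>
#include <vector>

int main(){
	srand((unsigned)time(nullptr)); //Seed the random pivot selection once
	std::vector<int> data = {2, 9, 3, 4, 8, 3};
	Sort s;
	s.quickSort(data, 0, (int)data.size() - 1);
	for(int x : data) std::cout << x << ' ';
	std::cout << '\n'; //Expected output: 2 3 3 4 8 9
	return 0;
}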

Topics: Algorithms, data structures