# Summary of time and space complexity

Posted by Bac on Thu, 18 Jun 2020 11:23:13 +0200

```c
// Introduction: recursive code, using the Fibonacci sequence
int Fib(int N)
{
	if (N < 3)
		return 1;
	return Fib(N - 1) + Fib(N - 2);
}
```

How do we measure the quality of an algorithm? (How fast is it, and how little memory does it use?)
We measure an algorithm's complexity along two axes:
Time complexity
Space complexity
In essence, both are mathematical expressions: functions that count the executions of a basic statement in terms of the problem size N.

## 1. Algorithm efficiency

There are two kinds of algorithm efficiency analysis: time efficiency and space efficiency. Time efficiency is measured by time complexity, and space efficiency by space complexity. Time complexity mainly measures how fast an algorithm runs, while space complexity mainly measures how much extra space it needs. In the early days of computing, storage capacity was very small, so space complexity was a real concern. With the rapid development of the computer industry, storage capacity has reached a very high level, so today we usually do not need to pay special attention to the space complexity of an algorithm.

## 2. Time complexity

### 2.1 The concept of time complexity

Definition of time complexity: in computer science, the time complexity of an algorithm is a function that quantitatively describes its running time. Do we need to test every algorithm on a computer? We could, but it would be very troublesome, which is why the method of time complexity analysis exists. The time an algorithm takes is proportional to the number of statements it executes, so the number of times its basic operation is executed is taken as the algorithm's time complexity.

Understanding: it is a mathematical expression in the problem size N that counts the total number of executions of a basic statement in the algorithm.

### 2.2 Asymptotic big-O notation

Example: how many times is the basic operation of Func1 executed? (How many times does ++count run?)

```c
#include <stdio.h>

void Func1(int N)
{
	int count = 0;
	for (int i = 0; i < N; ++i)
	{
		for (int j = 0; j < N; ++j)
		{
			++count;
		}
	}
	for (int k = 0; k < 2 * N; ++k)
	{
		++count;
	}
	int M = 10;
	while (M--)
	{
		++count;
	}
	printf("%d\n", count);
}
```

F(N) = N^2 + 2N + 10
N = 10: F(N) = 130
N = 100: F(N) = 10210
N = 1000: F(N) = 1002010

In practice, when we calculate time complexity we do not need the exact number of executions, only its approximate order of growth. For this we use the asymptotic big-O notation.

Big-O notation: a mathematical symbol used to describe the asymptotic behavior of a function.

Rules for deriving the big-O order (first write down the exact count, then apply these rules):
1. Replace every additive constant in the running-time function with the constant 1. For example, if the code runs F(N) = 100 times, then O(F(N)) = O(100) -> O(1).
2. In the resulting function, keep only the highest-order term. For Func1 we know F(N) = N^2 + 2N + 10, so O(F(N)) -> O(N^2 + 2N + 10) -> O(N^2).
3. If the highest-order term has a constant coefficient other than 1, remove that coefficient. The result is the big-O order.

After applying big-O notation, the time complexity of Func1 is O(N^2):
N = 10: about 100 operations
N = 100: about 10000 operations
N = 1000: about 1000000 operations

As the comparison shows, big-O notation removes the terms that have little impact on the result and concisely expresses the growth of the execution count.

In addition, some algorithms have best-case, average-case and worst-case time complexities:
Worst case: the maximum number of operations for any input of size N (an upper bound)
Average case: the expected number of operations over inputs of size N
Best case: the minimum number of operations for any input of size N (a lower bound)
For example, when searching for a value x in an array of length N:
Best case: found after 1 comparison
Worst case: found after N comparisons
Average case: found after about N / 2 comparisons
In practice we generally care about the worst case, so the time complexity of searching an array is O(N).

Why is time complexity stated as a total count of operations rather than as elapsed time?
Algorithms must be comparable across machines: different machines take different amounts of time, so wall-clock time has no absolute reference value (the hardware environment differs, so time is not a meaningful yardstick).
Even on a single machine, our code is not the only thing running; other programs share the processor and distort any timing.
On any machine, however, our program executes the same number of basic operations for a given input.

Common examples:

1. Calculate the time complexity of Func2.

```c
#include <stdio.h>

void Func2(int N)
{
	int count = 0;
	for (int k = 0; k < 2 * N; ++k)
	{
		++count;
	}
	int M = 10;
	while (M--)
	{
		++count;
	}
	printf("%d\n", count);
}
```

The basic operation is executed 2N + 10 times; by the big-O derivation rules, the time complexity is O(N).
2. Calculate the time complexity of Func3.

```c
#include <stdio.h>

void Func3(int N, int M)
{
	int count = 0;
	for (int k = 0; k < M; ++k)
	{
		++count;
	}
	for (int k = 0; k < N; ++k)
	{
		++count;
	}
	printf("%d\n", count);
}
```

The basic operation is executed M + N times. Since there are two unknowns, M and N, the time complexity is O(M + N).

3. Calculate the time complexity of Func4.

```c
#include <stdio.h>

void Func4(int N)
{
	int count = 0;
	for (int k = 0; k < 100; ++k)
	{
		++count;
	}
	printf("%d\n", count);
}
```

The basic operation is executed 100 times, a constant independent of N, so by the big-O derivation rules the time complexity is O(1).

4. Calculate the time complexity of strchr.

```c
const char* strchr(const char* str, int character);
```

strchr takes a character and searches for it in the string str. The basic operation is executed once in the best case and N times in the worst case. Since we generally take the worst case, the time complexity is O(N).

5. Bubble sort: calculate the time complexity of BubbleSort.

```c
#include <assert.h>
#include <stddef.h>

// Helper added so the snippet compiles.
void Swap(int* p, int* q)
{
	int t = *p;
	*p = *q;
	*q = t;
}

void BubbleSort(int* a, int n)
{
	assert(a);
	for (size_t end = n; end > 0; --end)
	{
		int exchange = 0;
		for (size_t i = 1; i < end; ++i)
		{
			if (a[i - 1] > a[i])
			{
				Swap(&a[i - 1], &a[i]);
				exchange = 1;
			}
		}
		if (exchange == 0)
			break;
	}
}
```

In the best case (the array is already sorted, and the exchange flag ends the sort after one pass) the basic operation runs N - 1 times; in the worst case it runs N * (N - 1) / 2 times. Taking the worst case by the big-O derivation rules, the time complexity is O(N^2).

6. Calculate the time complexity of BinarySearch.

```c
#include <assert.h>

int BinarySearch(int* a, int n, int x)
{
	assert(a);
	int begin = 0;
	int end = n - 1;

	// Search the closed interval [begin, end]; the loop runs while
	// begin <= end, and the bounds move to mid + 1 / mid - 1.
	while (begin <= end)
	{
		int mid = begin + ((end - begin) >> 1);
		if (a[mid] < x)
			begin = mid + 1;
		else if (a[mid] > x)
			end = mid - 1;
		else
			return mid;
	}
	return -1;
}
```

In the best case the basic operation is executed once; in the worst case the interval is halved until it is empty, about log N times, so the time complexity is O(logN). ps: in algorithm analysis, logN means the logarithm of N to base 2; some texts write lgN. (A good way to see where logN comes from is the paper-folding argument: how many times can N be halved before it reaches 1?)

7. Calculate the time complexity of the recursive Factorial.

```c
#include <stddef.h>

long long Factorial(size_t N)
{
	return N < 2 ? 1 : Factorial(N - 1) * N;
}
```

Analysis shows the basic operation recurs N times, so the time complexity is O(N).

8. Calculate the time complexity of Fibonacci.

```c
#include <stddef.h>

long long Fibonacci(size_t N)
{
	return N < 2 ? N : Fibonacci(N - 1) + Fibonacci(N - 2);
}
```

Analysis shows the recursion forms a binary tree of calls, so the basic operation recurs about 2^N times and the time complexity is O(2^N). (Drawing the tree of recursive stack frames makes this clear.)

This can be optimized to O(N) by iterating from the bottom up:

```c
#include <stdio.h>

unsigned long long Fib(int N)
{
	unsigned long long first = 1;
	unsigned long long second = 1;
	unsigned long long ret = 1;	// holds the result
	for (int i = 2; i < N; ++i)
	{
		ret = first + second;
		first = second;
		second = ret;
	}
	return ret;
}

int main()
{
	printf("%llu ", Fib(10));
	return 0;
}
```

## 3. Space complexity

Understanding: just like time complexity, it is a mathematical expression: a function that counts the number of variables (objects) the algorithm creates in terms of the problem size N.

Example 1:

```c
#include <stdio.h>

unsigned long long Fib(int N)
{
	unsigned long long first = 1;
	unsigned long long second = 1;
	unsigned long long ret = 1;	// holds the result
	for (int i = 2; i < N; ++i)
	{
		ret = first + second;
		first = second;
		second = ret;
	}
	return ret;
}

int main()
{
	printf("%llu ", Fib(10));
	return 0;
}
```

int N, unsigned long long first, unsigned long long second, unsigned long long ret and int i: five variables are created in total. The number of variables is fixed, independent of N, so:
Space complexity: O(1)

Example 2: there are two sorted sequences; merge them into one sorted sequence.
First sequence: 2 5 6 8
Second sequence: 1 3 5 7 8 9
Method:
1. Create a third array large enough to hold both sequences.
2. Copy the first and second sequences into it:
(1) Create three indices, index1, index2 and index, pointing at the current position of the first, second and third sequence respectively.
(2) Compare the elements at index1 and index2; the smaller one is copied into the third array and its index advances by one, while the larger one stays put.
(3) Repeat the comparison until one sequence is exhausted, then copy over whatever remains of the other.

```c
#include <stdio.h>
#include <stdlib.h>

int* MergeData(int array1[], int size1, int array2[], int size2)
{
	int index1 = 0, index2 = 0, index = 0;
	int* array = (int*)malloc((size1 + size2) * sizeof(int));
	if (NULL == array)
		return NULL;
	while (index1 < size1 && index2 < size2)
	{
		if (array1[index1] <= array2[index2])
			array[index++] = array1[index1++];
		else
			array[index++] = array2[index2++];
	}
	while (index1 < size1)
		array[index++] = array1[index1++];
	while (index2 < size2)
		array[index++] = array2[index2++];
	return array;
}

int main()
{
	int array1[] = { 2, 5, 6, 8 };
	int array2[] = { 1, 3, 5, 7, 8, 9 };
	int* array = MergeData(array1, sizeof(array1) / sizeof(array1[0]),
	                       array2, sizeof(array2) / sizeof(array2[0]));
	for (int i = 0; i < 10; ++i)
		printf("%d ", array[i]);
	printf("\n");
	free(array);
	array = NULL;
	return 0;
}
```

With M and N the lengths of the two input sequences:
Time complexity: O(M + N)
Space complexity: O(M + N) (the merged array is allocated with malloc, and its size grows with the input)

Recursive algorithms.

Example 1:

```c
int Fac(int N)
{
	if (0 == N)
		return 1;
	return Fac(N - 1) * N;
}
```

Whenever a function runs, the system must set aside a stack frame for it on the stack. The stack frame stores local variables, register information, and so on. Once the function is fixed, the stack space the compiler reserves for each call to it is the same: a constant. Fac(N) makes N + 1 nested calls, each using constant space, so:
Space complexity: O(N)
Time complexity: O(N)
Note: recursion must not go too deep. Every recursive step is a function call, and every call needs its own stack frame on the stack. Stack space is limited, so if the recursion goes too deep the next stack frame cannot be allocated and the program crashes (a stack overflow).

Example 2: the Fibonacci sequence.

```c
long long Fib(int N)
{
	if (N < 3) return 1;
	return Fib(N - 1) + Fib(N - 2);
}
```

Time complexity: O(2^N). For example, the call tree of f5:

```
        f5
        /\
     f4    f3
     /\      /\
   f3  f2  f2  f1
   /\      1    1
  f2 f1
  1   1
```

The left branch is called first, then the right. When the left branch finishes, its stack space is returned to the system, and the right branch then reuses the space the left branch just released (the benefit: space is saved). At any moment only one root-to-leaf path, at most N frames, is live, so:
Space complexity: O(N)

I have summarized the related knowledge of time and space complexity through many examples. Once it is understood in depth, complexity analysis is essentially a matter of mathematics. We should also be careful about the details: forgetting one of these points can be a serious flaw in our results.