# PyTorch tensor basics

Posted by boardy on Fri, 04 Mar 2022 20:16:55 +0100

## PyTorch data operations

### Getting started and building data

1. Import the package
```
import torch
```
1. Create a row vector (a special tensor)
What is a tensor: a tensor represents a (possibly multi-dimensional) array of values. A tensor with one axis corresponds to a vector in mathematics; a tensor with two axes corresponds to a matrix; tensors with more than two axes have no special mathematical name.
A row vector is a kind of tensor.
```
x = torch.arange(12)       # contains the 12 integers 0-11
```
1. Access the tensor's shape
```
x.shape
```
1. Get the total number of elements in the tensor
```
x.numel()
```
1. Change the tensor's shape
`x.reshape(rows, cols)`
```
X = x.reshape(3, 4)     # the target shape must match the original total number of elements
X = x.reshape(-1, 4)    # give the width and compute the height automatically
X = x.reshape(3, -1)    # give the height and compute the width automatically
```
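As a quick check (a minimal sketch using the same `x` as above), reshaping never changes the element count reported by `numel()`:

```python
import torch

x = torch.arange(12)
X = x.reshape(-1, 4)         # -1 lets PyTorch infer the number of rows
print(X.shape)               # torch.Size([3, 4])
print(x.numel(), X.numel())  # 12 12 -- reshape preserves the element count
```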
1. Common construction methods (all zeros, all ones, and random numbers)

Construct a matrix tensor with all elements being 0:

```
torch.zeros(3, 4)      # build a 3 × 4 matrix
```

Output:

```
tensor([[0., 0., 0., 0.],
        [0., 0., 0., 0.],
        [0., 0., 0., 0.]])
```
```
torch.zeros(2, 3, 4)   # can be seen as two 3 × 4 matrices stacked together
```

Output:

```
tensor([[[0., 0., 0., 0.],
         [0., 0., 0., 0.],
         [0., 0., 0., 0.]],

        [[0., 0., 0., 0.],
         [0., 0., 0., 0.],
         [0., 0., 0., 0.]]])
```

And so on: `torch.zeros(3, 2, 3, 4)` can be seen as three 2 × 3 × 4 tensors, analogous to multidimensional arrays.
Build a matrix of all ones: `torch.ones((2, 3, 4))`
Construct a matrix whose elements are random numbers: `torch.randn(3, 4)`
**Note:** each element of the random matrix above is sampled from the standard Gaussian (normal) distribution with mean 0 and standard deviation 1.
Example output:

```
tensor([[ 0.4315, -0.8804, -0.1730, -1.2925],
        [ 0.3317, -1.1386, -0.6625,  0.3001],
        [ 0.0371, -0.4246,  0.0326,  0.1565]])
```
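A quick sanity check of these constructors (a sketch; the `randn` values differ on every run):

```python
import torch

ones = torch.ones((2, 3, 4))
print(ones.sum())        # tensor(24.) -- 2 * 3 * 4 elements, all equal to 1

r = torch.randn(3, 4)
print(r.shape)           # torch.Size([3, 4]); the values vary per run
```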
1. Build from a Python list
```
torch.tensor([[2, 1, 4, 3], [1, 2, 3, 4], [4, 3, 2, 1]])
```

Passing tuples instead of lists produces the same result.
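To verify the claim about tuples, a minimal sketch comparing the two constructions with `torch.equal`:

```python
import torch

from_lists  = torch.tensor([[2, 1, 4, 3], [1, 2, 3, 4], [4, 3, 2, 1]])
from_tuples = torch.tensor(((2, 1, 4, 3), (1, 2, 3, 4), (4, 3, 2, 1)))
print(torch.equal(from_lists, from_tuples))   # True -- same shape and values
```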

### Tensor operators

Common operators apply elementwise. The precondition is that the two tensors have the same shape (same height and width); the operation is then applied to each pair of corresponding elements.

```
x = torch.tensor([1.0, 2, 4, 8])
y = torch.tensor([2, 2, 2, 2])
x + y, x - y, x * y, x / y, x ** y  # the ** operator is exponentiation
```

result:

```
(tensor([ 3.,  4.,  6., 10.]),
 tensor([-1.,  0.,  2.,  6.]),
 tensor([ 2.,  4.,  8., 16.]),
 tensor([0.5000, 1.0000, 2.0000, 4.0000]),
 tensor([ 1.,  4., 16., 64.]))
```
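Besides the binary operators above, unary operators are also applied elementwise; one common example (not shown above) is the exponential `torch.exp`:

```python
import torch

x = torch.tensor([1.0, 2, 4, 8])
print(torch.exp(x))   # e**1, e**2, e**4, e**8, computed per element
```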

Logical operations
Logical operations also apply between two tensors of the same shape, elementwise: the result is True where the condition holds and False otherwise (here X and Y are as defined in the concatenation example below).

```
X == Y
tensor([[False,  True, False,  True],
        [False, False, False, False],
        [False, False, False, False]])
```

Summing tensor elements
Summing all elements of a tensor produces a single-element tensor. Note that the result is a single-element tensor, not a plain value.

```
X.sum()
result: tensor(66.)
```
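To turn that single-element tensor into a plain Python number, `item()` can be used (a minimal sketch with the same `X` as above):

```python
import torch

X = torch.arange(12, dtype=torch.float32).reshape(3, 4)
s = X.sum()          # single-element tensor: tensor(66.)
print(s.item())      # 66.0 -- a plain Python float
```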

Tensor concatenation
Two tensors can be concatenated vertically or horizontally (their sizes must match along every axis except the one being concatenated).

```
X = torch.arange(12, dtype=torch.float32).reshape((3, 4))
Y = torch.tensor([[2.0, 1, 4, 3], [1, 2, 3, 4], [4, 3, 2, 1]])
torch.cat((X, Y), dim=0), torch.cat((X, Y), dim=1)
```

Concatenate using the cat function:

```
# vertical concatenation
torch.cat((X, Y), dim=0)
tensor([[ 0.,  1.,  2.,  3.],
        [ 4.,  5.,  6.,  7.],
        [ 8.,  9., 10., 11.],
        [ 2.,  1.,  4.,  3.],
        [ 1.,  2.,  3.,  4.],
        [ 4.,  3.,  2.,  1.]])

# horizontal concatenation
torch.cat((X, Y), dim=1)
tensor([[ 0.,  1.,  2.,  3.,  2.,  1.,  4.,  3.],
        [ 4.,  5.,  6.,  7.,  1.,  2.,  3.,  4.],
        [ 8.,  9., 10., 11.,  4.,  3.,  2.,  1.]])
```
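The shapes of the two results can be checked directly: `dim=0` adds up the row counts and `dim=1` the column counts (a sketch using the same X and Y):

```python
import torch

X = torch.arange(12, dtype=torch.float32).reshape((3, 4))
Y = torch.tensor([[2.0, 1, 4, 3], [1, 2, 3, 4], [4, 3, 2, 1]])
print(torch.cat((X, Y), dim=0).shape)   # torch.Size([6, 4])
print(torch.cat((X, Y), dim=1).shape)   # torch.Size([3, 8])
```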

Broadcasting: a tensor can expand an axis of length 1 (a row or a column) to match another tensor. The precondition is that the axis being expanded has length 1 — for example a column vector of shape (3, 1) or (5, 1), or a row vector of shape (1, 3) or (1, 5). If two tensors of different shapes can be made the same shape by broadcasting, then elementwise operations can be performed on them.

```
t2 = torch.arange(4).reshape(1, 4) + torch.arange(6).reshape(6, 1)   # broadcastable: (1, 4) and (6, 1) expand to (6, 4)
t2 = torch.arange(4).reshape(1, 4) + torch.arange(6).reshape(1, 6)   # error: 4 vs 6 on the same axis, neither is 1
t2 = torch.arange(4).reshape(1, 4) + torch.arange(8).reshape(2, 4)   # broadcastable: (1, 4) expands to (2, 4)
t2 = torch.arange(12).reshape(3, 4) + torch.arange(8).reshape(2, 4)  # error: shapes (3, 4) and (2, 4) cannot be broadcast
```
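A worked example of the broadcastable case, with small shapes so the result is easy to check by hand:

```python
import torch

a = torch.arange(3).reshape(3, 1)   # column vector, shape (3, 1)
b = torch.arange(2).reshape(1, 2)   # row vector, shape (1, 2)
# each length-1 axis stretches to match the other tensor,
# so both behave as if they had shape (3, 2)
print(a + b)
# tensor([[0, 1],
#         [1, 2],
#         [2, 3]])
```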

### Indexing and slicing

As in any other Python array, elements of a tensor can be accessed by index. As usual: the first element has index 0 and the last has index -1, and a range includes its first element but excludes its last. Think of tensors as multidimensional lists.

```
X[-1], X[1:3]
(tensor([ 8.,  9., 10., 11.]),
 tensor([[ 4.,  5.,  6.,  7.],
         [ 8.,  9., 10., 11.]]))

X[1, 2] = 9     # write a value to an element
tensor([[ 0.,  1.,  2.,  3.],
        [ 4.,  5.,  9.,  7.],
        [ 8.,  9., 10., 11.]])
# usage is the same as for a multidimensional list
```

To assign the same value to multiple elements, index all of them and assign to the result. For example, X[0:2, :] accesses the first and second rows, where ':' selects all elements along axis 1 (the columns). Although we discuss matrix indexing here, this also applies to vectors and to tensors with more than 2 dimensions.

```
X[0:2, :] = 12
X:
tensor([[12., 12., 12., 12.],
        [12., 12., 12., 12.],
        [ 8.,  9., 10., 11.]])
```
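The same pattern works along the other axis; a minimal sketch that assigns to an entire column:

```python
import torch

X = torch.arange(12, dtype=torch.float32).reshape(3, 4)
X[:, 1] = 0          # ':' along axis 0 selects every row
print(X)
# tensor([[ 0.,  0.,  2.,  3.],
#         [ 4.,  0.,  6.,  7.],
#         [ 8.,  0., 10., 11.]])
```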

Source: Mu Li's "Dive into Deep Learning" (personal study notes; please contact me for removal in case of infringement).