[Eat That PyTorch in 20 Days, Day 13: nn.functional and nn.Module]

1. nn.functional and nn.Module. Earlier, we introduced the structural operations on PyTorch tensors and some of the common APIs for mathematical operations. Using these tensor APIs, we can build the components of a neural network (such as activation functions, model layers, and loss functions). Most of the functional components in PyTorch related to neu ...
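A minimal sketch of the two styles being contrasted, assuming PyTorch is installed (the tensor shapes here are made up purely for illustration):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

x = torch.randn(2, 5)  # a made-up batch: 2 samples, 5 features

# Functional style: stateless functions from torch.nn.functional
y1 = F.relu(x)

# Module style: stateful objects from torch.nn that hold parameters
relu = nn.ReLU()
linear = nn.Linear(5, 3)  # carries a learnable weight and bias
y2 = relu(linear(x))

print(y1.shape, y2.shape)  # torch.Size([2, 5]) torch.Size([2, 3])
```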

Posted by godsent on Sun, 27 Feb 2022 10:39:22 +0100

[Machine learning] Basic learning notes DS 4: neural networks and the backpropagation algorithm

Neural networks and the backpropagation algorithm. 4.1 Cost function. In the previous section, we learned the basic structure of a neural network and the forward propagation algorithm. In the accompanying exercises, the weights of the neural network were already given, but how can we train suitable weights ourselves? To do that, we need to ...
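For context, notes like these typically build toward the standard regularized cost function for a K-class network with m training examples and L layers (the notation below follows that common formulation, not the truncated text):

```latex
J(\Theta) = -\frac{1}{m}\sum_{i=1}^{m}\sum_{k=1}^{K}
\left[ y_k^{(i)} \log\big(h_\Theta(x^{(i)})\big)_k
     + \big(1 - y_k^{(i)}\big)\log\Big(1 - \big(h_\Theta(x^{(i)})\big)_k\Big) \right]
+ \frac{\lambda}{2m}\sum_{l=1}^{L-1}\sum_{i=1}^{s_l}\sum_{j=1}^{s_{l+1}}
\big(\Theta_{ji}^{(l)}\big)^2
```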

Posted by garethdown on Sat, 26 Feb 2022 16:30:27 +0100

Common activation functions

Activation functions: an activation layer applies a nonlinear activation function to the output of another layer, which increases the network's nonlinear capacity and improves its ability to fit complex functions. The choice of activation function is very important for training a neural network. The common ...
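A quick NumPy sketch of the activations such articles typically cover (this particular list is assumed, since the excerpt is truncated):

```python
import numpy as np

def sigmoid(x):
    # squashes inputs into (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # squashes inputs into (-1, 1), zero-centered
    return np.tanh(x)

def relu(x):
    # passes positive inputs through, zeroes out negatives
    return np.maximum(0.0, x)

x = np.linspace(-5, 5, 11)
print(sigmoid(x), tanh(x), relu(x), sep="\n")
```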

Posted by saronoff on Fri, 25 Feb 2022 09:19:44 +0100

Model fine-tuning techniques

1. Common techniques in transfer learning: fine-tuning. 1.1 Concept. Take the weights trained on a large dataset as the initialization weights for a specific task (with a small dataset), and retrain the network (modifying the fully connected output layer as needed). The training options are: 1. fine-tune all layers; 2. fix the weights of the front l ...
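A hedged PyTorch sketch of option 2 (freeze the early weights, retrain a replaced head); torchvision's resnet18 and the 10-class output are placeholders for illustration, not taken from the article:

```python
import torch.nn as nn
from torchvision import models

# Use ImageNet-pretrained weights as the initialization
model = models.resnet18(pretrained=True)

# Option 2: freeze all existing weights...
for param in model.parameters():
    param.requires_grad = False

# ...then replace the fully connected layer for the new task
model.fc = nn.Linear(model.fc.in_features, 10)  # 10 classes assumed

# Only the new head receives gradients; pass just those parameters
# to the optimizer, e.g. torch.optim.SGD(model.fc.parameters(), lr=0.01)
```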

Posted by daanoz on Mon, 21 Feb 2022 10:51:38 +0100

Neural network case study

Neural network case study. Learning objectives: be able to use tf.keras to obtain a dataset, construct a multilayer neural network, and complete network training and evaluation. The MNIST dataset of handwritten digits is shown in the figure above. It contains 60,000 training samples and 10,000 test samples. Each image is a fixed ...
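A minimal sketch of the pipeline those objectives describe, assuming TensorFlow 2.x (the layer sizes and epoch count are placeholders):

```python
import tensorflow as tf

# Load MNIST: 60,000 training and 10,000 test images, each 28x28
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0  # scale pixels to [0, 1]

# A multilayer network: flatten, one hidden layer, softmax output
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=5)
model.evaluate(x_test, y_test)
```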

Posted by dabas on Sun, 20 Feb 2022 10:10:24 +0100

Deep learning based on Keras: building and training LeNet

Deep learning based on Keras (II): building and training LeNet. LeNet is a very efficient convolutional neural network for handwritten character recognition. Although the network is small, it contains the basic modules of deep learning: convolutional layers, pooling layers, and fully connected layers. It is also the basis of other deep l ...
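A sketch of a LeNet-5-style stack in Keras; the filter counts follow the classic architecture, and the 32x32 grayscale input shape is an assumption:

```python
from tensorflow import keras
from tensorflow.keras import layers

# LeNet-5-style stack: conv, pool, conv, pool, then fully connected layers
model = keras.Sequential([
    layers.Conv2D(6, kernel_size=5, activation="tanh", input_shape=(32, 32, 1)),
    layers.AveragePooling2D(pool_size=2),
    layers.Conv2D(16, kernel_size=5, activation="tanh"),
    layers.AveragePooling2D(pool_size=2),
    layers.Flatten(),
    layers.Dense(120, activation="tanh"),
    layers.Dense(84, activation="tanh"),
    layers.Dense(10, activation="softmax"),
])
model.summary()
```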

Posted by tracivia on Sat, 19 Feb 2022 16:48:25 +0100

Deep learning: planar data classification with a single hidden layer

In this exercise, Professor Ng (Andrew Ng) builds a neural network with one hidden layer. 1. Importing the dataset and plotting. Two files are required before importing the dataset; please refer to [data]. import numpy as np import pandas as pd from matplotlib import pyplot as plt from testCases import * import sklearn from sklearn import datasets from sklearn import linear_model from ...
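A NumPy sketch of the forward pass for such a single-hidden-layer network, assuming the tanh hidden activation and sigmoid output used in that course (the layer sizes below are made up):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(X, W1, b1, W2, b2):
    # X: (n_x, m) inputs; tanh hidden layer, sigmoid output
    Z1 = W1 @ X + b1
    A1 = np.tanh(Z1)
    Z2 = W2 @ A1 + b2
    return sigmoid(Z2)          # predicted probabilities, shape (1, m)

n_x, n_h, m = 2, 4, 5           # feature, hidden, and sample counts (assumed)
rng = np.random.default_rng(0)
X = rng.standard_normal((n_x, m))
W1 = rng.standard_normal((n_h, n_x)) * 0.01
b1 = np.zeros((n_h, 1))
W2 = rng.standard_normal((1, n_h)) * 0.01
b2 = np.zeros((1, 1))
print(forward(X, W1, b1, W2, b2))
```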

Posted by jlgray48 on Sat, 19 Feb 2022 05:58:59 +0100

Introduction to deep learning: initial values of weights (Xavier and He initialization, and the distribution of hidden-layer activation values)

Initial values of weights (the rabbit hole here is very deep; this article is only an introduction to deep learning and will not pursue the theory in detail). In neural network training, the initial values of the weights are particularly important. In fact, the choice of initial weight values often determines whether neural network lea ...
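A sketch of the two initializations the title names, applied through a stack of layers so the spread of the activation values can be inspected (NumPy; the layer width and depth are assumed):

```python
import numpy as np

n, layers = 100, 5                 # layer width and depth (assumed)
x0 = np.random.randn(1000, n)

# Xavier/Glorot: std = 1/sqrt(fan_in), suited to tanh/sigmoid
a = x0
for _ in range(layers):
    w = np.random.randn(n, n) / np.sqrt(n)
    a = np.tanh(a @ w)
    print("xavier layer std:", round(a.std(), 3))

# He: std = sqrt(2/fan_in), suited to ReLU
a = x0
for _ in range(layers):
    w = np.random.randn(n, n) * np.sqrt(2.0 / n)
    a = np.maximum(0.0, a @ w)
    print("he layer std:", round(a.std(), 3))
```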

Posted by mikesta707 on Mon, 14 Feb 2022 12:57:56 +0100

Reading notes on Python deep learning: Chapter 2, the mathematical foundations of neural networks

Contents. Chapter 2: the mathematical foundations of neural networks. 2.1 A first look at neural networks; 2.2 data representations for neural networks; 2.2.1 scalars (0D tensors); 2.2.2 vectors (1D tensors); 2.2.3 matrices (2D tensors); 2.2.4 3D and higher-dimensional tensors; 2.2.5 key attributes; 2.2.6 manipulating tensors in NumPy; 2.2.7 the concept of da ...
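A NumPy sketch of the tensor ranks and key attributes that sections 2.2.1 through 2.2.5 cover (the example arrays are made up):

```python
import numpy as np

scalar = np.array(12)                # 0D tensor
vector = np.array([1, 2, 3])         # 1D tensor
matrix = np.array([[1, 2], [3, 4]])  # 2D tensor
tensor3d = np.zeros((2, 3, 4))       # 3D tensor

for t in (scalar, vector, matrix, tensor3d):
    # the key attributes: number of axes, shape, and data type
    print(t.ndim, t.shape, t.dtype)
```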

Posted by hcdarkmage on Fri, 11 Feb 2022 16:36:04 +0100

ML (machine learning): neural networks, the step function, and the sigmoid activation function

In the previous article, we showed that a single-layer perceptron implements simple logic gates and a multilayer perceptron implements the XOR gate, introducing the threshold θ and the bias b. Now, following the expressions from the previous article, we realize the activation function. The expression of our previous perceptron is as follows: y = { x1w1 ...
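A NumPy sketch contrasting the step function with sigmoid (these vectorized forms are the standard ones, not taken from the truncated excerpt):

```python
import numpy as np

def step(x):
    # fires 1 when the weighted input exceeds 0, else 0
    return (x > 0).astype(int)

def sigmoid(x):
    # a smooth, differentiable alternative to the step function
    return 1.0 / (1.0 + np.exp(-x))

x = np.array([-1.0, 0.5, 2.0])
print(step(x))     # [0 1 1]
print(sigmoid(x))  # [0.269 0.622 0.881] (approximately)
```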

Posted by ashu.khetan on Fri, 11 Feb 2022 13:34:17 +0100