Tag: Neuron

Time: 2021-10-13
Whenever we train our own neural network, we need to pay attention to the problem of neural network generalization. In essence, this means how well our model can learn from the given data and apply what it has learned elsewhere. When training a neural network, some of the data will be used for training, and some data will also […]

Time: 2021-09-06
Preface: I wonder if you have watched Kingdom, the Korean drama that was popular in April? I've watched all the episodes. When Prince Yuanzi was born, he was bitten by zombies in the palace. However, because his brain neurons had not yet formed and the parasites could not control neurons, the medical woman judged that it would not affect […]

Time: 2021-08-20
Recently, while learning machine learning, I came across this set of notes. It is introduced in great detail, so I am recording it here as I study. Author: Jim Liang, from SAP (the world's largest business software company). Features of the book: clear organization, graphical representations that are easier to understand, detailed notes on formulas, etc. Content summary: it is mainly divided into basic […]

Time: 2021-08-09
By Angel Das, compiled by VK. Source: Towards Data Science. Introduction: Artificial neural networks (ANNs) are an advanced form of machine learning technology and the core of deep learning. Artificial neural networks involve the following concepts: input and output layers, hidden layers, neurons in the hidden layers, forward propagation, and back propagation. Simply put, the input layer is a set of […]

Time: 2021-06-16
First, the code implementation of the LeNet-5 network (based on PyTorch) is given:

class LeNet(nn.Module):
    def __init__(self):
        super(LeNet, self).__init__()
        # input_size=(1*28*28)
        self.conv1 = nn.Sequential(
            # in_channels, out_channels, kernel_size
            # padding=2 keeps the input and output sizes the same
            nn.Conv2d(1, 6, 5, padding=2),          # output_size=(6*28*28)
            nn.ReLU(),
            nn.MaxPool2d(kernel_size=2, stride=2),  # output_size=(6*14*14)
        )
        self.conv2 = nn.Sequential(
            nn.Conv2d(6, 16, […]
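The excerpt above is cut off after the first layers. As a hedged completion, the following is a minimal, self-contained sketch of the standard LeNet-5 layout for 28×28 single-channel input (the layers after the truncation follow the conventional architecture, not necessarily the original post):

```python
import torch
import torch.nn as nn

class LeNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Sequential(
            nn.Conv2d(1, 6, 5, padding=2),          # (1,28,28) -> (6,28,28)
            nn.ReLU(),
            nn.MaxPool2d(kernel_size=2, stride=2),  # -> (6,14,14)
        )
        self.conv2 = nn.Sequential(
            nn.Conv2d(6, 16, 5),                    # -> (16,10,10)
            nn.ReLU(),
            nn.MaxPool2d(kernel_size=2, stride=2),  # -> (16,5,5)
        )
        self.fc = nn.Sequential(
            nn.Linear(16 * 5 * 5, 120),
            nn.ReLU(),
            nn.Linear(120, 84),
            nn.ReLU(),
            nn.Linear(84, 10),                      # 10 digit classes
        )

    def forward(self, x):
        x = self.conv1(x)
        x = self.conv2(x)
        x = x.flatten(1)                            # keep the batch dimension
        return self.fc(x)

model = LeNet()
out = model(torch.randn(2, 1, 28, 28))
print(out.shape)  # torch.Size([2, 10])
```

A forward pass on a batch of two 28×28 images yields a (2, 10) tensor of class scores.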

Time: 2021-06-02
This article is from http://ufldl.stanford.edu/wiki/index.php… There were some difficult points in the original, which I have rewritten in my own words. Supervised neural networks require our data to be labeled. However, neural networks are not limited to labeled data; they can also handle unlabeled data, such as: $\{x^{(1)}, […]

Time: 2021-06-01
This article concerns a very interesting project I saw last year. I tried to imitate its code and write a similar project, but it has not been completed. Here is a translation of a related blog post by the original author; perhaps more people will be interested in it. Project demonstration: a demo implemented by the original author in JavaScript […]

Time: 2021-05-29
Neural networks are very cute! 0. Classification. The most important use of neural networks is classification. To give you an intuitive understanding of classification, let's look at a few examples. Spam identification: given an email, we extract all the words that appear in it and feed them to a machine. The […]

Time: 2021-03-18
The goal of an activation function is to make the neural network nonlinear. An activation function should be continuous and differentiable. Continuous: when the input value changes slightly, the output value also changes slightly. Differentiable: a derivative exists everywhere in the domain of definition. Common activation functions: sigmoid, tanh, ReLU. Sigmoid is a smooth step function […]
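The three activations mentioned can be sketched in pure Python (a minimal illustration with my own function names; note ReLU is continuous but not differentiable at exactly 0):

```python
import math

def sigmoid(x):
    # smooth "step" function: maps any real x into (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def tanh(x):
    # like sigmoid, but centered at 0 with range (-1, 1)
    return math.tanh(x)

def relu(x):
    # piecewise linear: 0 for negative inputs, identity for positive
    return max(0.0, x)

# Differentiability check: sigmoid'(x) = sigmoid(x) * (1 - sigmoid(x)).
# Compare the analytic derivative with a central finite difference at x = 0.5.
h = 1e-6
numeric = (sigmoid(0.5 + h) - sigmoid(0.5 - h)) / (2 * h)
analytic = sigmoid(0.5) * (1 - sigmoid(0.5))
print(abs(numeric - analytic) < 1e-6)  # True
```

The finite-difference check makes the "differentiable" property concrete: a tiny change in the input produces a proportional, predictable change in the output.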

Time: 2021-03-11
The number of neurons for a 3×3 convolution kernel and a 2×5 convolution kernel.

# Here kernel_size = 2*5
class CONV_NET(torch.nn.Module):  # the CONV_NET class inherits from nn.Module
    def __init__(self):
        super(CONV_NET, self).__init__()  # gives CONV_NET all the properties of the parent class nn.Module
        # super() requires two arguments, […]
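As a hedged illustration of the two kernel sizes being compared (the input shape and channel counts are my own example, not from the truncated post): a non-square 2×5 kernel is passed to `nn.Conv2d` as a tuple, and the two kernels differ in both output spatial size and weight count.

```python
import torch
import torch.nn as nn

# A square 3x3 kernel can be given as an int; a 2x5 kernel needs a tuple.
conv_3x3 = nn.Conv2d(in_channels=1, out_channels=1, kernel_size=3)
conv_2x5 = nn.Conv2d(in_channels=1, out_channels=1, kernel_size=(2, 5))

x = torch.randn(1, 1, 28, 28)
# Output spatial size (no padding, stride 1): H - kH + 1, W - kW + 1
print(conv_3x3(x).shape)  # torch.Size([1, 1, 26, 26])
print(conv_2x5(x).shape)  # torch.Size([1, 1, 27, 24])

# Weights per kernel (ignoring bias): kH * kW
print(conv_3x3.weight.numel())  # 9
print(conv_2x5.weight.numel())  # 10
```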

Time: 2021-02-14
Why study the neural network algorithm? Neural networks are an old algorithm, yet they remain the preferred technique for much of machine learning. We already have linear regression and logistic regression; why should we study neural networks? Consider a logistic regression problem with enough nonlinear terms. For complex machine learning problems, there […]

Time: 2021-01-28
Compared with computers, I prefer mathematical derivation. I find mathematical derivation tangible and more three-dimensional; you can even trace it back to its roots. As the first article on neural network algorithms, I decided to start with the softmax output layer. I think this article is good and very professional. As shown in […]
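To make the softmax output layer concrete, here is a minimal, numerically stable sketch in pure Python (the input values are my own example): the max logit is subtracted before exponentiating, which leaves the result unchanged because softmax is invariant to adding a constant to all logits.

```python
import math

def softmax(logits):
    # Subtract the max for numerical stability (avoids overflow in exp);
    # softmax is invariant to shifting all logits by a constant.
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax([2.0, 1.0, 0.1])
print(probs)                  # largest logit -> largest probability
print(abs(sum(probs) - 1.0))  # the outputs sum to 1
```

The stability trick matters in practice: without the shift, `softmax([1000.0, 1000.0])` would overflow, while the shifted version correctly returns two probabilities of 0.5.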