Neural network: Using genetic algorithms to train and deploy neural networks: batch and mini-batch


Data, example, sample and pattern
The training data of a neural network consists of examples. These examples can be called samples because the neural network learns from them. However, examples are raw data and are not called patterns - patterns are concepts that contain knowledge. Examples of the pattern concept are regular expression patterns, wool patterns, and metallic patterns.

Data Mining
The concept of mining here matches its literal meaning in [mineral] mining: patterns and knowledge are extracted from raw data consisting of a large number of examples. Data mining differs from data analysis, which in general only describes data. Because data mining is concerned with knowledge, it shares the notion of knowledge with machine learning.

Batch and mini-batch
The batch method puts all samples into every training step. This method has the advantage of simplicity and high accuracy. However, it carries the risk of driving the whole example set into a local minimum. In addition, it is only suitable for small example sets (a few dozen examples): with a large example set the network must learn all the examples at the same time, which takes a lot of time.
Large example sets need data mining for training. A small group of examples of size batch_size is extracted to train the neural network at each training step. This method is called mini-batch. The group must contain enough knowledge to model the sample set; in other words, it must play the role of a pattern. In particular, it must ensure statistical balance across the classes the neural network learns.
Interestingly, what matters is the statistical nature of the pattern, not its size. For example, a 1 m piece of fabric pattern and a 10 m piece are similar. Therefore we use a small batch_size, which greatly reduces computational cost.
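The extraction step can be sketched as follows. This is a minimal illustration, not the author's implementation: `minibatches` is a hypothetical helper that slices a shuffled example list into groups of `batch_size`.

```python
import random

def minibatches(examples, batch_size=32, shuffle=True):
    """Yield successive mini-batches from the example list.

    Shuffling before each pass is an assumption here: it keeps each
    batch statistically varied so that, over many steps, every class
    is seen roughly in proportion to its frequency.
    """
    indices = list(range(len(examples)))
    if shuffle:
        random.shuffle(indices)
    for start in range(0, len(indices), batch_size):
        yield [examples[i] for i in indices[start:start + batch_size]]

# 3000 examples with batch_size 32 give 94 groups per pass;
# each training step works on one group instead of all 3000.
data = list(range(3000))
batches = list(minibatches(data, batch_size=32))
print(len(batches))  # 94
```

Each step therefore touches only 32 examples, which is where the reduction in computational cost per step comes from.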

More precisely, the training pattern of a neural network is not a physical pattern. A specific group of examples used as the pattern in one training step does not have to be statistically balanced and does not need to contain all classes. Instead the pattern changes from step to step, and over thousands of steps it meets these requirements on average.
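This averaging effect can be checked numerically. The sketch below uses a hypothetical labelled set with three classes of unequal frequency; any single 32-example batch may be unbalanced, but the class proportions accumulated over thousands of random batches converge to the proportions of the whole set.

```python
import random
from collections import Counter

random.seed(0)
# Hypothetical labelled set: three classes with unequal frequencies
# (1500 'a', 1000 'b', 500 'c', i.e. proportions 1/2, 1/3, 1/6).
labels = ['a'] * 1500 + ['b'] * 1000 + ['c'] * 500
batch_size = 32

seen = Counter()
for step in range(2000):
    batch = random.sample(labels, batch_size)  # one random mini-batch
    seen.update(batch)

total = sum(seen.values())
for cls in 'abc':
    print(cls, round(seen[cls] / total, 2))
```

No single batch is required to be balanced; it is the long-run statistics that match the sample set.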

The video below shows a neural network with 3000 examples completing training in a few minutes with a test accuracy of over 99%. The batch_size is 32. If it had to learn all 3000 examples at the same time with the batch method, training would last all day.


When to use batch method?
The mini-batch method is only intended to reduce training time for big sample sets. It is not necessary for a small sample set with only a few dozen examples. On the other hand, the training pattern never achieves the ideal conditions of the theory. This is more evident when the sample set is small: a single example may carry a lot of information, and two examples may differ greatly. It may then happen that while learning a new pattern the neural network forgets the old one, and training accuracy therefore drops. In this case we use the batch method, and the network learns all the examples at the same time. Local minima can be handled by many other measures. In the Tennis problem below, all 14 examples are learned at once.
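The batch idea combined with genetic-algorithm training (as in the title) can be sketched as follows. The source does not include the Tennis data itself, so this uses a hypothetical 14-example binary dataset and a one-neuron network; the key point is that fitness is scored over all 14 examples in every generation, i.e. the batch method.

```python
import random

random.seed(1)

# Hypothetical stand-in for the 14-example Tennis set: binary feature
# vectors labelled by a hidden rule (x0 == 1 and x2 == 0).
features = [[random.randint(0, 1) for _ in range(4)] for _ in range(14)]
examples = [(x, 1 if x[0] == 1 and x[2] == 0 else 0) for x in features]

def predict(weights, x):
    # Tiny one-neuron network: threshold of a weighted sum plus bias.
    s = weights[-1] + sum(w * xi for w, xi in zip(weights, x))
    return 1 if s > 0 else 0

def fitness(weights):
    # Batch method: every one of the 14 examples is scored in each step.
    return sum(predict(weights, x) == y for x, y in examples)

def mutate(weights, scale=0.5):
    return [w + random.uniform(-scale, scale) for w in weights]

# Genetic algorithm: keep the fittest weight vectors, mutate them.
population = [[random.uniform(-1, 1) for _ in range(5)] for _ in range(30)]
for generation in range(200):
    population.sort(key=fitness, reverse=True)
    elites = population[:10]
    population = elites + [mutate(random.choice(elites)) for _ in range(20)]

print(fitness(population[0]), "/", len(examples))
```

Because the whole set is evaluated at once, there is no forgetting between steps; the genetic algorithm's population-based search is also one of the "other measures" that mitigates local minima.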
