Documentation


Viewing posts for the category Omarine User's Manual

Neural network: Using genetic algorithms to train and deploy neural networks: Multithreading

Neural network: Using genetic algorithms to train and deploy neural networks: Embedding

Unlike normal data mining processes, embedded operation lies in the overlap between data mining and machine learning. It feeds the knowledge it has exploited into machine learning, while the network's training in turn shapes the data exploitation. The mining is like a pipe through which knowledge flows over time during training. That means the network has to learn the knowledge and, at the same time, adjust the source of that knowledge.

Embedded work has two effects:

Neural network: Using genetic algorithms to train and deploy neural networks: Selecting the class

I would like to say that the AI era is just beginning, and that there is plenty of room for you to be creative, especially young people. This is for the two reasons below:

Neural network: Using genetic algorithms to train and deploy neural networks: The probability distributions

The uniform distribution is fundamental and is used in most cases, such as assigning crossover rates and mutation rates in genetic algorithms. The distribution can be applied directly like that, or serve as the basis for a subsequent process, such as browsing a tree starting from the root. One specific application is in the embedding techniques of neural networks.
Besides that, the standard normal distribution, and especially the truncated standard normal distribution, is useful for initializing neural networks.
We will learn about each of them in turn, with full source code.

Uniform distribution

The uniform distribution is the most common: every value within a given range is equally likely. Fortunately, we already have many functions available that produce this distribution, including erand48(). This function returns nonnegative double-precision floating-point values uniformly distributed over the interval [0.0, 1.0). Transforming to an arbitrary range is very simple, for example creating a uniform distribution within [-1.0, 1.0):

Neural network: Using genetic algorithms to train and deploy neural networks: The derivatives of the transfer functions

The two most common transfer functions (activation functions) are the logistic sigmoid function (or simply sigmoid function) and the ReLU function. Their derivative formulas are as follows

1) sigmoid function
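The excerpt cuts off here; for reference, the standard derivative of the sigmoid (which any full derivation in the post would have to agree with) follows from the chain rule:

```latex
\sigma(x) = \frac{1}{1 + e^{-x}}
\qquad
\sigma'(x) = \frac{e^{-x}}{(1 + e^{-x})^2} = \sigma(x)\,\bigl(1 - \sigma(x)\bigr)
```

The factored form σ(x)(1 − σ(x)) is the one used in practice, since the forward pass has already computed σ(x).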