Hello!

I am using Matlab and developed a neural network for several pairs, but I am having trouble porting the NN from Matlab to MQL4!

For a test, I created a small neural network predicting the USDJPY price from the prices at bars i+10 and i+20. It has 2 inputs, 3 hidden neurons, and 1 output. The hidden layer activation function in Matlab is tansig (the hyperbolic tangent sigmoid); the output activation is linear.

If I plot the NN output against the real price in Matlab, the NN clearly has some predictive power, but with the code I wrote it is definitely not working.

The calculated weights of the hidden layer (one row per hidden neuron, one column per input) are:

[ 13.8525  -43.4534;
 -11.2084   18.4331;
 -0.30603    0.01022]

The weights from the hidden layer to the output are:

[0.0020021  0.0047956  -3.4143]

Bias of the hidden layer:

[13.876;
 2.644;
 0.083215]

Bias of the output:

[0.27514]

The problem must be in the activation function, which should be tansig. Since the price is above 100, the weighted sums are huge in magnitude, so MathExp(-Sum) returns either something vanishingly small or an enormous value, and the sigmoid saturates...
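As a sanity check of the math (in Python, just to compare the formulas, not the MQL4 code): MATLAB documents tansig(n) as 2/(1+exp(-2n))-1, which is exactly tanh(n), while the formula in my code below, (1-exp(-x))/(1+exp(-x)), simplifies to tanh(x/2), a flatter curve:

```python
import math

# MATLAB's tansig, per its documented formula: tansig(n) = 2/(1+exp(-2n)) - 1
def tansig(n):
    return 2.0 / (1.0 + math.exp(-2.0 * n)) - 1.0

# The formula used in my MQL4 code: (1 - exp(-x)) / (1 + exp(-x))
def my_sigmoid(x):
    return (1.0 - math.exp(-x)) / (1.0 + math.exp(-x))

# tansig(n) is exactly tanh(n)...
print(tansig(1.0), math.tanh(1.0))      # both ~0.76159
# ...while my formula is tanh(x/2), a different curve
print(my_sigmoid(1.0), math.tanh(0.5))  # both ~0.46212
```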

Here is the interesting part of the code:

>>
double a1=iClose("USDJPY",0,i+10);
double a2=iClose("USDJPY",0,i+20);

//Node (1,1)
double Sum_node_1_1=13.8525*a1-43.4534*a2+13.876;
double Sigmoide_node_1_1=(1-MathExp(-Sum_node_1_1))/(1+MathExp(-Sum_node_1_1));

//Node (1,2)
double Sum_node_1_2=-11.2084*a1+18.4331*a2+2.644;
double Sigmoide_node_1_2=(1-MathExp(-Sum_node_1_2))/(1+MathExp(-Sum_node_1_2));

//Node (1,3)
double Sum_node_1_3=-0.30603*a1+0.01022*a2+0.083215;
double Sigmoide_node_1_3=(1-MathExp(-Sum_node_1_3))/(1+MathExp(-Sum_node_1_3));

//---- Output value ----
double Sum_node_2_1=0.0020021*Sigmoide_node_1_1+0.0047956*Sigmoide_node_1_2-3.4143*Sigmoide_node_1_3+0.27514;
<<
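To see what happens numerically, here is the same forward pass sketched in Python, using true tanh (= MATLAB's tansig) as the activation and a hypothetical price of 100.0 for both inputs (an assumption for illustration; actual closes vary):

```python
import math

# Hypothetical unnormalized inputs: two USDJPY closes around 100 (assumption)
a1, a2 = 100.0, 100.0

# Hidden layer with the weights/biases listed above, activation tanh (= tansig)
h1 = math.tanh(13.8525 * a1 - 43.4534 * a2 + 13.876)   # sum ~ -2946
h2 = math.tanh(-11.2084 * a1 + 18.4331 * a2 + 2.644)   # sum ~ +725
h3 = math.tanh(-0.30603 * a1 + 0.01022 * a2 + 0.083215)  # sum ~ -29.5

# With unnormalized inputs this large, every hidden unit is fully
# saturated at +/-1, so the output barely reacts to price changes
print(h1, h2, h3)  # -1.0 1.0 -1.0

# Linear output layer
out = 0.0020021 * h1 + 0.0047956 * h2 - 3.4143 * h3 + 0.27514
print(out)  # ~3.692, essentially constant
```

Note also that for a sum like -2946, MathExp(-Sum) in MQL4 means exp(2946), which overflows; Python's math.tanh handles the extreme argument, but the saturation problem remains.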

I admit that the NN was trained on non-normalized data (not best practice), but the plot of the NN output vs. the real value in Matlab shows it is working, so I really wonder about the activation function...
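If normalization turns out to be the culprit, the usual fix is to apply in MQL4 the same input scaling Matlab used during training (the Matlab NN toolbox applies mapminmax to [-1, 1] by default). A minimal sketch of that scaling in Python, with placeholder min/max values (80 and 120 are hypothetical, not my actual training range):

```python
# mapminmax-style scaling: map x from [xmin, xmax] onto [ymin, ymax]
def mapminmax_apply(x, xmin, xmax, ymin=-1.0, ymax=1.0):
    return (ymax - ymin) * (x - xmin) / (xmax - xmin) + ymin

# Hypothetical training range for USDJPY closes (placeholder values)
print(mapminmax_apply(100.0, 80.0, 120.0))  # 0.0 (midpoint maps to 0)
print(mapminmax_apply(120.0, 80.0, 120.0))  # 1.0 (max maps to 1)
```

The network output would then need the inverse mapping, using the target variable's min/max from training, to get back to a price.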

Thanks for your help!
