Traditionally, the term neural network had been used to refer to a network or circuit of biological neurons. The modern usage of the term often refers to artificial neural networks, which are composed of artificial neurons or nodes. Thus the term has two distinct usages:
1. Biological neural networks are made up of real biological neurons that are connected or functionally related in the peripheral nervous system or the central nervous system. In the field of neuroscience, they are often identified as groups of neurons that perform a specific physiological function in laboratory analysis.
2. Artificial neural networks are made up of interconnecting artificial neurons (programming constructs that mimic the properties of biological neurons). Artificial neural networks may either be used to gain an understanding of biological neural networks, or for solving artificial intelligence problems without necessarily creating a model of a real biological system. The real, biological nervous system is highly complex and includes some features that may seem superfluous based on an understanding of artificial networks.
This article focuses on the relationship between the two concepts; for detailed coverage of the two different concepts refer to the separate articles: Biological neural network and Artificial neural network.
I had been meaning to build this decision-making algorithm for a while. At first I wanted to use MetaTrader 5, but since its backtest mode is not yet available I switched to the brand-new Algodeal, which offers a Java API, so I could use any object-oriented feature.
The idea was to create one node per indicator; the first node I implemented was the RSI one. A node contains a few properties:
- spectrum: the volatility of the weighting normal law, which we'll see later
- min: the value of the smallest subnode
- max: the value of the highest subnode
- step: the step between two subnodes
- vsn: the subnode Vector
- name: the node name
And a set of methods:
- Node: a constructor which also creates the vector of subnodes
- new_weight: takes as parameters the pips variation between the previous bar and the new one, plus the specific node value (here the RSI), and sets the new weights on the subnodes
- normal_law: computes the normal (Gaussian) law used for the weighting
- dump: a debugging method which displays all the subnodes' weights
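The abstract Node described above might be sketched in Java roughly like this. Field and method names follow the post; the constructor signature, the unnormalised Gaussian, and the inline SubNode stand-in are my assumptions, not the actual implementation:

```java
import java.util.Vector;

// Sketch of the abstract Node described in the post (assumed details marked).
abstract class Node {
    // minimal stand-in for the SubNode class, described later in the post
    static class SubNode {
        final double value;
        double weight; // starts at zero
        SubNode(double value) { this.value = value; }
    }

    protected final double spectrum; // sigma of the weighting normal law
    protected final double min, max, step;
    protected final Vector<SubNode> vsn = new Vector<SubNode>();
    protected final String name;

    Node(String name, double min, double max, double step, double spectrum) {
        this.name = name; this.min = min; this.max = max;
        this.step = step; this.spectrum = spectrum;
        // one subnode every `step` from min to max, weight initially zero
        for (double v = min; v <= max; v += step) vsn.add(new SubNode(v));
    }

    // unnormalised Gaussian centred on `center` with std-dev `spectrum`
    protected double normal_law(double center, double value) {
        double d = (value - center) / spectrum;
        return Math.exp(-0.5 * d * d);
    }

    // spread the pips change over the subnodes around `indicatorValue`,
    // using the update formula given later in the post
    void new_weight(double pipsChange, double indicatorValue) {
        double sign = Math.signum(pipsChange);
        double mag = Math.log10(1 + Math.abs(pipsChange));
        for (SubNode sn : vsn)
            sn.weight += sign * mag * normal_law(indicatorValue, sn.value);
    }

    abstract double get_weight(); // weight of the subnode matching the indicator

    void dump() { // debugging: print every subnode's weight
        for (SubNode sn : vsn)
            System.out.println(name + "(" + sn.value + ") = " + sn.weight);
    }
}
```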
This Node class is abstract, so I created an RSINode class to implement the RSI node. This class contains a few specific properties:
- rsi: the RSI indicator
- rsiLength: the RSI length (14 bars in the example)
And implements a single method in addition to its constructor :
- get_weight (declared but not implemented in the abstract class): retrieves the weight of the right subnode. Together with new_weight, it is the only place where you really need the RSI value. I use a programming trick to avoid re-coding new_weight for each new node type: the old value is stored in the abstract part when you call get_weight.
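One possible shape for RSINode and the "old value" trick is sketched below. The Rsi class is a hypothetical stand-in (Algodeal's real indicator API is not reproduced here), and the weight-lookup arithmetic is an assumption:

```java
// Hypothetical stand-in for Algodeal's RSI indicator; `set` is a test hook.
class Rsi {
    private final int length;
    private double current; // would come from price data in the real system
    Rsi(int length) { this.length = length; }
    void set(double v) { current = v; }
    double value() { return current; }
}

// Trimmed abstract part: the "trick" is that weightAt() records the value
// it was called with, so new_weight never needs re-coding per node type.
abstract class AbstractNode {
    protected double lastValue;      // stored when get_weight() runs
    protected final double[] weights; // one slot per subnode, initially zero
    protected final double min, step;
    AbstractNode(double min, double max, double step) {
        this.min = min; this.step = step;
        weights = new double[(int) ((max - min) / step) + 1];
    }
    protected double weightAt(double value) {
        lastValue = value; // remembered for the next weighting pass
        return weights[(int) Math.round((value - min) / step)];
    }
    abstract double get_weight();
}

class RSINode extends AbstractNode {
    private final Rsi rsi;       // the RSI indicator
    private final int rsiLength; // 14 bars in the post's example
    RSINode(int rsiLength) {
        super(0, 100, 1.0);
        this.rsiLength = rsiLength;
        this.rsi = new Rsi(rsiLength);
    }
    Rsi rsi() { return rsi; }
    @Override
    double get_weight() {
        // the only place the raw RSI value is read; the abstract part
        // stores it in lastValue as a side effect
        return weightAt(rsi.value());
    }
}
```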
The subnode is the second level of the neural network (which has two levels at the moment). For each node, for example RSINode14, there are (max - min) / step + 1 SubNodes. For example, with min = 0, max = 100 and step = 1.0, we have a subnode at 0, one at 1, one at 2, and so on up to 100.
The SubNodes have 4 properties:
- value: the SubNode value (in the case of RSI: 1, 2, 3, ..., 100)
- weight: the weight of the SubNode, the most important thing; a negative weight is a selling signal and a positive weight a buying signal
- wu: a multiplier applied when you increase the weight
- wd: a multiplier applied when you decrease the weight
There are also 3 methods:
- weight: retrieves the SubNode weight
- weight_up: increases the SubNode weight
- weight_down: decreases the SubNode weight
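A minimal sketch of this SubNode class, assuming wu and wd simply scale the increment passed in (the defaults of 1.0 are my assumption):

```java
// Sketch of the SubNode described above; wu/wd defaults are assumptions.
class SubNode {
    private final double value; // e.g. an RSI level: 1, 2, ..., 100
    private double weight;      // > 0: buy signal, < 0: sell signal
    private final double wu;    // multiplier applied when raising the weight
    private final double wd;    // multiplier applied when lowering the weight

    SubNode(double value) { this(value, 1.0, 1.0); }
    SubNode(double value, double wu, double wd) {
        this.value = value; this.wu = wu; this.wd = wd;
    }

    double value()  { return value; }
    double weight() { return weight; }
    void weight_up(double delta)   { weight += wu * delta; }
    void weight_down(double delta) { weight -= wd * delta; }
}
```

With wu > wd, for instance, the node learns rises faster than it forgets them, which is one of the coefficient choices listed in the enhancements below.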
At initialisation the nodes are created and stored in a Vector; each node also creates its subnodes and gives them a weight (either zero or the result of previous experience).
At the beginning the nodes have a zero weight, so we watch the market without taking a position. Let's imagine the NASDAQ is at 4,000 points with RSI(14) = 60 on March 1st, and rises to 4,020 points on March 2nd. I want to tell my RSI(14).SubNode(60) that the market tends to rise when the RSI is at 60. So I raise the weight of all the subnodes with the following formula:
NewWeight = OldWeight + WUMultiplier * Log10(1+PipsChange) * NormalLaw(SubNode)
The normal law is used to raise not only SubNode(60) but also, at lower rates, the surrounding subnodes (59, 61, ...):
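As a worked example of the formula above: the NASDAQ moves +20 points while RSI(14) = 60, so the subnodes around 60 get their weights raised, falling off with the Gaussian. Here wu = 1.0 and spectrum (the Gaussian sigma) = 2.0 are assumed values:

```java
// Worked example of: NewWeight = OldWeight + WU * log10(1+Pips) * NormalLaw
class WeightUpdateDemo {
    // unnormalised Gaussian centred on `center` with std-dev `sigma`
    static double normalLaw(double center, double value, double sigma) {
        double d = (value - center) / sigma;
        return Math.exp(-0.5 * d * d);
    }
    static double newWeight(double oldW, double wu, double pips,
                            double center, double value, double sigma) {
        return oldW + wu * Math.log10(1 + pips) * normalLaw(center, value, sigma);
    }
    public static void main(String[] args) {
        // +20 pips with RSI at 60: SubNode(60) gets the full log10(21),
        // neighbours 59/61, 58/62, ... get Gaussian-discounted shares
        for (int v = 57; v <= 63; v++)
            System.out.printf("SubNode(%d): %.4f%n",
                v, newWeight(0.0, 1.0, 20.0, 60.0, v, 2.0));
    }
}
```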
Now the NASDAQ quotes 4,020 and my RSI is 65, so I retrieve the value of SubNode(65): if it's positive I buy, if it's negative I sell. To check that my learning system worked, I displayed the values of the subnodes at the end of the experiment (a 10-year backtest on the FCE).
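The decision step described above reduces to a subnode lookup plus a sign test. A sketch, assuming min = 0, step = 1 for the RSI node and a zero threshold:

```java
// Decision step: map the current indicator reading to a subnode and
// trade on the sign of its weight (zero threshold is an assumption).
class Decision {
    // index of the subnode matching `value` in a node's subnode vector
    static int subnodeIndex(double value, double min, double step) {
        return (int) Math.round((value - min) / step);
    }
    static String decide(double subnodeWeight) {
        if (subnodeWeight > 0) return "BUY";
        if (subnodeWeight < 0) return "SELL";
        return "HOLD"; // untrained subnode: stay out of the market
    }
}
```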
We can see that when the RSI is higher than 50 the market statistically rises the next day... but be careful: the formula is far from perfect, and statistical anomalies are really "powerful".
The list of possible enhancements is long, even endless: you just have to add nodes for the thousands of existing indicators. If an indicator is not efficient (i.e. it is random), its weight should stay around zero (representing the bias of the experiment). Among other possible enhancements:
- SubNodes with several variables (e.g. yesterday's and the day before's RSI, or the MACD)
- Searching for efficient coefficients (wu, wd, normal-law volatility)
- More efficient weighting, to quickly eliminate the worst indicators and increase the weight of the most efficient ones
- Updating weights over the last X bars, not only the single previous bar
Results don't matter at this stage of development, but here they are:
A nice trend-following behaviour around RSI 50…
…with the classic P&L profile for this kind of strategy.