I'm developing a statistical indicator that gathers information about how many up/down bars occur in a particular period, etc., but I've run into a strange problem with arrays.
Basically, I have two arrays declared globally, up[] and down[], which I intend to use to count the number of bars that are up or down using the increment operator ++.
That is, whenever an up bar is spotted, I increase the corresponding element with up[i]++ (where i runs from 0 to 20, for example), and likewise down[i]++ for down bars.
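Here is a stripped-down sketch of what I mean (assuming MQL4 with the standard Open[]/Close[] series and the old start() handler; the fixed size of 21 and the simple open/close test are just placeholders for my real logic):

// Global counters, one slot per bar offset 0..20 (sketch only)
int up[21];
int down[21];

int start()
{
   for(int i = 0; i <= 20; i++)
   {
      if(Close[i] > Open[i])      up[i]++;    // up bar: close above open
      else if(Close[i] < Open[i]) down[i]++;  // down bar: close below open
   }
   return(0);
}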
This should be simple enough; however, the end result is that every element in both arrays is always 0.
If I replace the arrays with an ordinary variable, such as an integer named count, it increments as it should.
I'm quite puzzled by this. Could somebody please shed some light on it? What am I doing wrong?