GSB Forums


Futures and forex trading contains substantial risk and is not for every investor. An investor could
potentially lose all of, or more than, the initial investment. Risk capital is money that can be lost without
jeopardizing one's financial security or lifestyle. Only risk capital should be used for trading, and only
those with sufficient risk capital should consider trading. Past performance is not necessarily indicative of
future results.

Subject: Missing function?
parrdo101
Junior Member
**




Posts: 71
Registered: 18-11-2017
Member Is Offline

Mood: No Mood

[*] posted on 8-12-2017 at 08:43 AM


Need an explanation:
The 195 min file is simply being skipped and ignored. I don't get this one, as I found the following in nearly every TS script/system produced:
"""
// Settings
// ID: 20171208-052424-684327-jCKiv
// Price Data: Data1: ES.30.Minute.20yback.endJan29.2010.txt, Data2: ES.390.Minute.20yback.endJan29.2010.txt
// MaxBarsBack: 500
......
"""
No 195 min in the above, and no Data3 either. Aren't all 3 datastreams supposed to be at least recognized here? (The saved 195 min file looks fine in Price Data - I opened it and peeked.)



dontseedata3.jpg - 317kB


parrdo101
Junior Member
**




Posts: 71
Registered: 18-11-2017
Member Is Offline

Mood: No Mood

[*] posted on 8-12-2017 at 11:02 AM


Quick further notes on the fly:

I set all the TS charts from which I save data meticulously to the regular session: 1 to 30 m; 1 to 690 m; 1 to 1380 m.

All 3 datastreams are now consistently being at least recognized:
"""
// Settings
// ID: 20171208-084951-944757-yN0d9
// Price Data: Data1: ES.30.Minute.20yback.endJan29.2010.txt, Data2: ES.690.Minute.20yback.endJan29.2010.txt, Data3: ES.1380.Minute.20yback.endJan29.2010.txt
// MaxBarsBack: 500
"""

BUT, and this is possibly an even bigger issue than the vanishing datastream: in the several scripts I plunked into MS Word and searched for "Data2," not one of them actually uses it in the code. That raises a serious question for any setup that uses anything other than the same bar size for the 3 datastreams slated for the GSB run/optimize: can the program even use them? Or is something else going on here?

Basically, just need to be aware of what's really going on....
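As a side note, a literal Word search for "Data2," can miss indirect references: EasyLanguage also allows the Data(n) form with an expression argument, e.g. Data(i1Data), as comes up later in this thread. A small Python sketch (a hypothetical helper, not part of GSB or TradeStation) that scans a script for either form:

```python
import re

# A literal search for "Data2," misses indexed references such as
# Data(i1Data) or Data(2), which EasyLanguage also allows.
# This pattern catches both the literal and the indexed form.
DATASTREAM_RE = re.compile(r"\bdata\s*(?:\d+|\()", re.IGNORECASE)

def find_datastream_refs(script_text):
    """Return every datastream token found in a script."""
    return [m.group(0) for m in DATASTREAM_RE.finditer(script_text)]

print(find_datastream_refs(
    "value1 = Close of Data2; value2 = Average(Close, Len) of Data(i1Data);"))
# ['Data2', 'Data(']
```

Running this over each saved script would show whether a secondary stream is referenced anywhere, directly or via Data(...).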

"The science is yours my friend; I am but a menial janitor of some miniscules.":)


parrdo101
Junior Member
**




Posts: 71
Registered: 18-11-2017
Member Is Offline

Mood: No Mood

[*] posted on 8-12-2017 at 11:35 AM


Now using regular session: 30m; 120m; 390m; these 3 datastreams, all ES. Not an instance of using Data2 in the code in several Trade Station scripts checked.

I'm not going to conclude from these few samples that using larger bars in the secondary datastreams in the GSB optimize will always result in them never being used, but something does seem to be going on here....

Apologies for the significant, recognized disorderliness in my previous 3 posts - if only I had more time....


parrdo101
Junior Member
**




Posts: 71
Registered: 18-11-2017
Member Is Offline

Mood: No Mood

[*] posted on 8-12-2017 at 11:52 AM


One change from this:
"Now using regular session: 30m; 120m; 390m; these 3 datastreams, all ES. Not an instance of using Data2 in the code in several Trade Station scripts checked."

to this:
"Now using regular session: 30m; 120m; 390m; these 3 datastreams, all ES except 120m using YM. Not an instance of using Data2 in the code in several Trade Station scripts checked."

It may not be that the program avoids "symbol redundancy," e.g. when they're all ES, but that it simply doesn't like bigger bars in the secondary datastreams and won't use them at all. Is it frivolous to even try any of that? I'm really wondering now, because I had real plans for setups like 30m, 120m, 390m....


parrdo101
Junior Member
**




Posts: 71
Registered: 18-11-2017
Member Is Offline

Mood: No Mood

[*] posted on 8-12-2017 at 04:30 PM


Quote: Originally posted by curt999  
you would likely have more luck trying multiple timeframes of the same instrument..ie nq30 nq 120 nq 480 or closer spaced like 30 60 90


The 120, 480 and 60, 90 sets in your quote are not being picked up by the GSB run.

Experimenting now (everything that follows uses ES): with those 120, 480 or 60, 90 sets you mention as secondaries, I mostly see GSB fail to recognize them in the Settings area of the TS script. And with 30 as primary to either of these sets, I have yet to see A SINGLE INSTANCE where, recognized or not, it used any of the secondaries above in the code.

Another one: the session used when saving the data out of TradeStation was the full regular session. Bars for Data1: 1000 min; Data2: 1001 min; Data3: 1003 min. No systems. You would think 1000 min "is sufficiently contained" within a regular session's data that you'd see some systems.

Now repeat everything in the preceding, changing it all to fit 500m, 501m, 502m, and you get systems; BUT the Settings block in the TS script only recognizes Data1 at 500 min (it acts as if 501 and 502 don't exist).

Just to be aware, I completely repeated the foregoing 2 paragraphs, but in the TS data-saving charts I used a custom session that coincided exactly with the regular session. Exact same result.

This appears to be some dissimilar, unanticipated behavior, but I'm sure the programmers can explain the current state.

Data2 and Data3: so far, sets that are not saved out at the exact same chart bar minutes are dead in the road - NONE are ever used in the code, and there are only a few instances of partial-set recognition in the TradeStation scripts.

Please note that some of these non-working cases fall right in the area I had logically planned to test.



curt999
Junior Member
**




Posts: 51
Registered: 24-7-2017
Member Is Offline

Mood: No Mood

[*] posted on 9-12-2017 at 07:58 AM


You have to use some common sense when selecting the secondary timeframes. You are using the regular session, so even for a 60 min timeframe there are only six bars per day; anything higher than this and the likelihood of it being used is nil, simply because there aren't enough bars. If you are making an intraday system with MOC then you need smaller timeframes: 5, 10, 15, 30 minute combinations. 500 min / 501 min / 502 min won't do anything - there is one bar per day.

parrdo101
Junior Member
**




Posts: 71
Registered: 18-11-2017
Member Is Offline

Mood: No Mood

[*] posted on 9-12-2017 at 04:29 PM


Whoops, just noticed this kind of thing:
Data(i1Data)
So some of the secondary datastreams are getting used in the script after all. When I first came on board I specifically saw "Data1," "Data2," in the code; I forgot it might reference a stream as Data(input or variable), and wasn't looking for any of this.

So it appears some unknown percentage of my recent posts (6 or so?) claiming datastreams were not being used in the TS scripts were wrong - those scripts WERE using the secondary datastreams. However, all the other problems noted remain, including inserted datastreams not being recognized in Settings at all. I'd still like some understanding of why GSB abandons an inserted bar-minute data file for the optimize, and why it appears to abandon inserted data files that are "unusual," e.g. ES 30m, 500m, 1000 min inserted into a regular-session setup for the GSB optimize. Bunches are getting discarded or ignored just for wanting to look at some of this, perhaps "unusual," stuff.


admin
Super Administrator
*********




Posts: 5069
Registered: 7-4-2017
Member Is Offline

Mood: No Mood

[*] posted on 10-12-2017 at 04:02 PM


Quote: Originally posted by parrdo101  
Whoops, just noticed this kind of thing:
Data(i1Data)
So some of the secondary datastreams are getting used in the script after all. When I first came on board I specifically saw "Data1," "Data2," in the code; I forgot it might reference a stream as Data(input or variable), and wasn't looking for any of this.

So it appears some unknown percentage of my recent posts (6 or so?) claiming datastreams were not being used in the TS scripts were wrong - those scripts WERE using the secondary datastreams. However, all the other problems noted remain, including inserted datastreams not being recognized in Settings at all. I'd still like some understanding of why GSB abandons an inserted bar-minute data file for the optimize, and why it appears to abandon inserted data files that are "unusual," e.g. ES 30m, 500m, 1000 min inserted into a regular-session setup for the GSB optimize. Bunches are getting discarded or ignored just for wanting to look at some of this, perhaps "unusual," stuff.

I would start by using ES 30 as Data1 and ES 60 as Data2, and see if that works. I'm fairly sure it does work, as I have multiple-timeframe systems. I wouldn't expect 500 minute to work, as it's the wrong session length. The session is 405 minutes, unless you want to use the last 15 minutes, and then it's 420 min. (Using the last 15 min is going to work more poorly and will cause problems with cash indices, which don't have the last 15 min.)


parrdo101
Junior Member
**




Posts: 71
Registered: 18-11-2017
Member Is Offline

Mood: No Mood

[*] posted on 12-12-2017 at 07:01 AM


Potentially very powerful Builder you've made.... Quality choices by a pro everywhere to be seen.

Super Admin said,

"I wouldn't expect 500 minute to work as its wrong session length."

Not clear what makes that a wrong session length.

Super Admin said,

"session its 405 minutes, unless you want to use the last 15 minute and then its 420 min"

I absolutely can't decipher the sense of that sentence, nor begin to guess what it references.





Carl
Member
***




Posts: 343
Registered: 10-5-2017
Member Is Offline

Mood: No Mood

[*] posted on 12-12-2017 at 09:16 AM


Hello Parrdo101,

I think Peter means the following.

Suppose you want to use 8:30 - 15:00 as your session. This corresponds with 390 minutes.

As a primary datastream you could use 15 minutes.
As secondary datastreams you could use e.g. 30, 45, 65, 130, 195, or 390 minutes.
Not 400 minutes nor 500 minutes, because these don't correspond to the total session time of 390 minutes.
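This rule can be checked mechanically. A small Python sketch (an illustration, not part of GSB) that lists every bar size dividing a 390-minute session evenly:

```python
# Bar sizes that fit a session evenly: a secondary datastream's
# interval should divide the session length exactly, so each
# session holds a whole number of bars.
SESSION_MINUTES = 390  # 8:30 - 15:00

def valid_intervals(session_minutes):
    """All bar sizes (in minutes) that divide the session evenly."""
    return [m for m in range(1, session_minutes + 1)
            if session_minutes % m == 0]

print(valid_intervals(SESSION_MINUTES))
# [1, 2, 3, 5, 6, 10, 13, 15, 26, 30, 39, 65, 78, 130, 195, 390]
```

400 and 500 fail this check, matching the point above. Note that 45 also does not divide 390 exactly (390 / 45 = 8.67), which may relate to why 45 min is questioned later in the thread.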




admin
Super Administrator
*********




Posts: 5069
Registered: 7-4-2017
Member Is Offline

Mood: No Mood

[*] posted on 12-12-2017 at 03:27 PM


Excellent reply, Carl. I am thinking of allowing GSB to create other timeframes from smaller-timeframe bars,
i.e. if data = 15 min, calculate 30, 60, 120, 240, 390, etc.
I don't think 45 min, 75 min, etc. are going to be a good idea though.
I foresee it's possible that TS <> GSB if we do this. Will have to test and see.

However we still have more pressing features ahead of this.


parrdo101
Junior Member
**




Posts: 71
Registered: 18-11-2017
Member Is Offline

Mood: No Mood

[*] posted on 13-12-2017 at 06:40 AM


Word on the "Genetic Process."

Check this now and then yourself; if I did this right, we're really all lost in a math-mucker.

I ran 3 datastreams, 30 min bars each, a very carefully tended custom session's worth of data files, through the GSB effort (custom sessions are sometimes very tricky in TradeStation). GSB optimizes Sept 5 2006 to Jan 1 2010 on BO, S, SM, 30 min each; the custom session is a "state secret," but very carefully watched to make sure all is in place. Train, test, validation was 50, 25, 25.

GSB run done: no eyeballing to select a graph and script, no app settings changed whatsoever, commissions the same in both runs. Script selection for TS was merely to take the top full-period Pearson's and record the net profit in the new TradeStation script chart with the custom session, looking at the OOS in TS charts set to Nov 1 2009 (allowing about 2 months for the 500-bar lookback, and catching the full window from the start of Jan 1 2010 out to the current date). The OOS is therefore a good 7 years or so on this GSB optimize. All of this is simple and precise.

Doing the same thing in both instances, simply and carefully: the first run records in the OOS window as -$54,539.10. After the second GSB run/optimize it records as -$5,773.54. Nearly a $50,000 difference!

Sing with me now on this, and assume I did it correctly, and that the genetic process is the culprit. With mathematically precise optimizer software (exhaustive search-and-optimize, as opposed to genetic), you WOULD get the same net profit every time, doing everything the same way in each repeat.

Gentlemen, drop your PCs and learn to Fear Him instead of hope on trading, I guess.

This won't do at all, if this is what genetic optimization amounts to, and if I did this absolutely correctly, which would reveal it.

Can one really count on precision indicators, math, where a Genetic optimizer program just mocks it in toto like this? Can one really feel secure in moving into live trading off results which produce exhibits like this? Impossible to me.


parrdo101
Junior Member
**




Posts: 71
Registered: 18-11-2017
Member Is Offline

Mood: No Mood

[*] posted on 13-12-2017 at 06:48 AM


I've always abhorred genetic. Who sold us on that coin? Why not get those geniuses' heads out of their cryptos and in here to solve it, and age-old problems like this, instead of this "new" bitcoin distraction, probably deliberate? The distraction merely adds to the problems, for it will bring its own.

curt999
Junior Member
**




Posts: 51
Registered: 24-7-2017
Member Is Offline

Mood: No Mood

[*] posted on 13-12-2017 at 07:47 AM


Genetic is supposed to produce different results each time - that's the reason for the random seed. Why would you want the same results and strategies every time? Each time you run GSB you will get different strategies, and that's a good thing.
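This behaviour can be illustrated with a toy example (pure Python, no relation to GSB's actual optimizer): a seeded genetic-style search returns the identical answer when re-run with the same seed, and a different but usually similar-quality answer with a different seed.

```python
import random

def toy_genetic_search(seed, generations=50, population=20):
    """Toy stand-in for a genetic optimizer: evolve one parameter
    toward a made-up fitness peak at x = 7."""
    rng = random.Random(seed)
    fitness = lambda x: -(x - 7) ** 2
    pop = [rng.uniform(0, 10) for _ in range(population)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: population // 2]                      # selection
        children = [p + rng.gauss(0, 0.5) for p in parents]   # mutation
        pop = parents + children                              # elitism keeps the best
    return max(pop, key=fitness)

print(toy_genetic_search(1) == toy_genetic_search(1))  # True: same seed, same answer
# Different seeds give different answers, though both land near the peak at 7.
```

The run-to-run variation described above is therefore expected of any seeded genetic search, not a defect unique to one program.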

parrdo101
Junior Member
**




Posts: 71
Registered: 18-11-2017
Member Is Offline

Mood: No Mood

[*] posted on 13-12-2017 at 09:09 AM


When the data is x and the question is y, the answer is z (THE optimum in that case would only be z - weren't you looking for THE optimum?); math likes specific and precise. You could sort the columns below for "strategy differences." You could also get your "good thing" by using a price data set that differs by the inclusion of merely one day: the number of ways to "get strategies different" approaches infinity - AND THAT WOULD HAPPEN EVEN WITH ALL CALCULATIONS BEING EXHAUSTIVE AS TO THE SEARCH AND OPTIMIZE. Maybe a truly exhaustive search is light years away, not going to happen soon on earth with current software and hardware, I guess.

Give me precise, and please be quick about it. OK, I just saw it: 600 gen, 300 pop in one of Peter's YouTube videos. Please fill in my blank on that:

"600 gen, 300 pop will always get you within ______ (please fill in) ~X% of an exhaustive search." Maybe then I can be more comfortable with it... I have to see this X% from The Pro.

And I'll try to learn to speak with less ignorance, but please know that's already hard enough, given the starting point!

In reply, please also clarify what you said here:

"If you increase indicators to 5, you need to up Gens and Pops in wf."...

...Just what would 600 gen and 300 pop become, approximately, now (to attain approximately the same X% within exhaustive)?

The attached shows the 2 gen, pop areas I'm actually more concerned about at the moment (the 1st optimize, not the wf):






moreconcern.JPG - 55kB

PS: I'm just noticing that those huge differences I was seeing in net profit were apparently using 500 and 200 in the gen/pop setup, as in the snippet. I hope SOMETHING - maybe 600/300 - is going to clean that up...? (I am routinely doing my runs with 5 indicators too, FYI.)


admin
Super Administrator
*********




Posts: 5069
Registered: 7-4-2017
Member Is Offline

Mood: No Mood

[*] posted on 13-12-2017 at 03:12 PM


Your comments confuse me a little, as you've circled the GA generations/populations but I think you are talking about the WF gen/populations.
In the example using standard ES, there is 1.5 with 21 zeros after it. These values are the number of possible systems, but have little to do with WF.
With 5 indicators it's 2 with 32 zeros after it.
That's close enough to infinite possibilities.
The much more critical thing is the WF settings, especially as you say you have 5 indicators.
Some markets work better with 5, and others are fine with 3. You need to experiment with this.
In your screenshot you have WF generation & population set to 120, 120.
This is OK for a quick WF. It will typically give you close to brute-force final parameters.
The WF curve, however, will be reasonably close but not exact, and may not give such good parameter stability. If you really like the system, use 300 x 300 on your final check (assuming 3 indicators - and no way is that high enough in your case using 5 indicators).
If you have the adaptive moving average (it has 3 parameters), or have optimize data streams set to true,
or have 4 or 5 indicators instead of the default 3, use something like 600 x 300, or even 900 x 300.
The bottom line is that too few generations/population will give a random amount of difference in the WF curve compared to a high amount. Just keep increasing until it consistently gives similar results.
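The "keep increasing until it consistently gives similar results" procedure can be sketched as a loop. The noise model below is hypothetical (it only assumes run-to-run spread shrinks as gen x pop grows, which is the premise of the advice, not GSB's internals):

```python
import random

def wf_result(generations, population, seed):
    """Hypothetical stand-in for one walk-forward run: run-to-run
    noise shrinks as generations * population grows. This models the
    advice above, not GSB's internals."""
    rng = random.Random(seed)
    noise = rng.gauss(0, 100000.0 / (generations * population) ** 0.5)
    return 25000.0 + noise  # pretend the "true" net profit is 25k

def is_stable(generations, population, runs=5, tolerance=500.0):
    """Repeat the run; accept the settings once the spread of
    results falls within tolerance."""
    results = [wf_result(generations, population, s) for s in range(runs)]
    return max(results) - min(results) <= tolerance

# "Just keep on increasing until it consistently gives similar results":
gen, pop = 120, 120
while not is_stable(gen, pop):
    gen *= 2
print(gen, pop)
```

The same loop, run against real GSB results instead of the toy model, is one practical way to decide when gen/pop is high enough.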



apc-21.png - 8kB


parrdo101
Junior Member
**




Posts: 71
Registered: 18-11-2017
Member Is Offline

Mood: No Mood

[*] posted on 17-12-2017 at 12:39 PM


Can large TradeStation caches (say 400+ MB) significantly slow several copies of GSB running, because of memory? This assumes TS is usually open as a program while GSB is running. I have a feeling I'm seeing something of this sort. (Good 6-core i7, Crucial SSD, 16 GB RAM here.)

admin
Super Administrator
*********




Posts: 5069
Registered: 7-4-2017
Member Is Offline

Mood: No Mood

[*] posted on 17-12-2017 at 03:00 PM


Go to Task Manager and see how much is free. Also see how much RAM GSB is using.
It's very possible that what you're saying is true.
Best to run 1 copy of GSB if you think this is the case, or get 64 GB of RAM. RAM is cheap.
I think 2 GSBs and other apps etc. on 16 GB is pushing it.
It depends on many things, i.e. 5 min bars vs 30 min.
See this post:
http://www.trademaid.info/forum/viewthread.php?tid=42#pid606


parrdo101
Junior Member
**




Posts: 71
Registered: 18-11-2017
Member Is Offline

Mood: No Mood

[*] posted on 25-12-2017 at 12:30 PM


I've continuously had it on the back burner to check out the possibility of running 3 particular datastreams in GSB - data saved out of TradeStation from Daily, Weekly, and Monthly charts - but ran out of time.

Can price bars for the above charts be saved and run in GSB?

Using something like this, as a working format, which worked for me on other GSB runs I did:
ES.30.Minute.20yback.endJan1.2010ShrtSess.txt

What would the above .txt look like, if one can work with these, for Daily, Weekly, Monthly .txts?


admin
Super Administrator
*********




Posts: 5069
Registered: 7-4-2017
Member Is Offline

Mood: No Mood

[*] posted on 25-12-2017 at 03:10 PM


Quote: Originally posted by parrdo101  
I've continuously had it on the back burner to check out the possibility of running 3 particular datastreams in GSB - data saved out of TradeStation from Daily, Weekly, and Monthly charts - but ran out of time.

Can price bars for the above charts be saved and run in GSB?

Using something like this, as a working format, which worked for me on other GSB runs I did:
ES.30.Minute.20yback.endJan1.2010ShrtSess.txt

What would the above .txt look like, if one can work with these, for Daily, Weekly, Monthly .txts?

Daily: ES.390.Minute.20yback.endJan1.2010ShrtSess.txt
Weekly would be 390*5 = 1950: ES.1950.Minute.20yback.endJan1.2010ShrtSess.txt
Monthly is not likely to work in GSB, even if we could input it; the problem is that the days in a month vary.
I intend to refine how GSB works in the months to come,
i.e. allow GSB to build higher timeframes from lower timeframes,
and genetically choose the best timeframe.
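Following the convention above (session = 390 minutes, weekly = 390 x 5), the minute counts for these file names can be tabulated. A sketch with hypothetical helper names (monthly is omitted for the reason stated above):

```python
# Minute equivalents for higher timeframes in the Price Data file
# names, per the convention in this thread (session = 390 minutes).
SESSION_MINUTES = 390

def interval_minutes(timeframe):
    """Map a chart timeframe to the minute count used in the file
    name. Monthly is deliberately omitted: days per month vary."""
    table = {"daily": SESSION_MINUTES,
             "weekly": SESSION_MINUTES * 5}
    return table[timeframe]

name = "ES.{}.Minute.20yback.endJan1.2010ShrtSess.txt".format(
    interval_minutes("weekly"))
print(name)  # ES.1950.Minute.20yback.endJan1.2010ShrtSess.txt
```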


parrdo101
Junior Member
**




Posts: 71
Registered: 18-11-2017
Member Is Offline

Mood: No Mood

[*] posted on 26-12-2017 at 05:47 AM


I'm not getting where the 405 is coming from; the custom session you have early evaluators make is 390 min, e.g. 8:30 am to 3 pm U.S. Central Time. Pardon me, too - I forgot to specifically say I want Daily as the primary data, with Weekly and Monthly as secondaries, in the GSB load and run/optimize.

Let me playback so you know I'm understanding:

Am I understanding you mean the 3 to be named and used in the Price Data folder exactly as
follows: (?)
ES.405.Minute.20yback.endJan1.2010ShrtSess.txt < ES.405*5.Minute.20yback.endJan1.2010ShrtSess.txt < ES.2025.Minute.20yback.endJan1.2010ShrtSess.txt << would be the Monthly(Secondary) config.?
(but this latter has an acknowledged problem).

Finally, in your experience, are the Daily and Weekly at least (as configured above, or otherwise) good to go - able to work in GSB?

It looks like one "Yes" could cover several questions, at your convenience ("Yes" = all as I wrote; "Yes" = Daily and Weekly will work as I wrote). If there's a "No" anywhere, please correct me. Thanks.


admin
Super Administrator
*********




Posts: 5069
Registered: 7-4-2017
Member Is Offline

Mood: No Mood

[*] posted on 26-12-2017 at 07:02 AM


Quote: Originally posted by parrdo101  
I'm not getting where the 405 is coming from; the custom session you have early evaluators make is 390 min. e.g.

My mistake - it should have been 390. Sorry.
Daily bars might work, but they won't work as well as smaller timeframes. More on this tomorrow.
There is a post on the forums somewhere about this.
Off to sleep now.


admin
Super Administrator
*********




Posts: 5069
Registered: 7-4-2017
Member Is Offline

Mood: No Mood

[*] posted on 26-12-2017 at 10:44 PM


Quote: Originally posted by parrdo101  
Pardon me, too, I forgot to specifically say I want Daily as Primary Data, Weekly and Monthly as Secondary in the GSB load and run-optimize.

Let me playback so you know I'm understanding:

Am I understanding you mean the 3 to be named and used in the Price Data folder exactly as
follows: (?)
ES.390.Minute.20yback.endJan1.2010ShrtSess.txt < ES.405*5.Minute.20yback.endJan1.2010ShrtSess.txt < ES.2025.Minute.20yback.endJan1.2010ShrtSess.txt << would be the Monthly(Secondary) config.?
(but this latter has an acknowledged problem).

Finally, in your experience, are the Daily and Weekly at least (as configured above, or otherwise) good to go - able to work in GSB?

It looks like one "Yes" could cover several questions, at your convenience ("Yes" = all as I wrote; "Yes" = Daily and Weekly will work as I wrote). If there's a "No" anywhere, please correct me. Thanks.

If you're going to use daily bars (390), then Market On Close must be off and Secondary filter most likely false, though it might work on GA.
I don't know how weekly would work. I'm not a great fan of daily bars, let alone weekly/monthly. Fine to try.
1950 for weekly, etc.
Monthly is too problematic, I feel.
However, I intend to get large-timeframe bars built from smaller bars in the next few months. This will mean the interval will not be in the file name any more.


parrdo101
Junior Member
**




Posts: 71
Registered: 18-11-2017
Member Is Offline

Mood: No Mood

[*] posted on 27-12-2017 at 05:35 AM



Quote:

... I don't know how weekly would work. I'm not a great fan of daily bars, let alone weekly/monthly. Fine to try.
1950 for weekly, etc.
Monthly is too problematic, I feel.
However, I intend to get large-timeframe bars built from smaller bars in the next few months. This will mean the interval will not be in the file name any more.


I think a really good guidepost for how GSB could work would be to mimic TradeStation in how versatile its datastream usage can be:

E.g., in TS, I think you can easily have (a completely random example) datastream 1 as daily and trade from it, while datastreams 2 and 3 are weekly and monthly - old hat. Somewhere, you can be assured, someone NEEDS to look at exactly this; here I was, for example.

Any datastream combination in TS can be created, if I recall: e.g., data1 as 31 min, data2 as 853 min, data3 as monthly - ANY combination is OK, if I recall, and then you just trade off data1. GSB could stand to take an example from this level of versatility with its primary/secondary file loads, then run the GSB. (But emulate or do a take on only this from TS at the moment; for most of the rest, they are a 32-bit snail with no real software muscle that can't really crank with RAM.)


admin
Super Administrator
*********




Posts: 5069
Registered: 7-4-2017
Member Is Offline

Mood: No Mood

[*] posted on 27-12-2017 at 05:42 AM


Quote: Originally posted by parrdo101  

I think a really good guidepost for how GSB could work would be to mimic TradeStation in how versatile its datastream usage can be:


Agreed, but we also need to keep compatibility with MultiCharts too.
TS has its own agendas. For example, they needed platform support for Japanese, while we would want other things like 64-bit.
They took many years even to go to multithreading.
The bottom line, however, is that it's a great platform regardless of these issues.





Trademaid forum. Software tools for TradeStation, MultiCharts & NinjaTrader