GSB Forums


Author: Subject: General support questions.
getty002
Junior Member
**




Posts: 30
Registered: 10-7-2020
Member Is Offline


[*] posted on 16-7-2020 at 12:44 PM


Thank you Carl and Peter

getty002
Junior Member
**




Posts: 30
Registered: 10-7-2020
Member Is Offline


[*] posted on 16-7-2020 at 01:02 PM


Quote: Originally posted by Daniel UK1  
Quote: Originally posted by bizgozcd  
Greetings once again.

So, I’ve had a few internet issues halfway through an optimization. Is there a way to stop the process so that I can reboot my computer and then continue where I left off?

The only thing I was able to find was the auto save function, but if that's the best way to recover, how can I move those systems back to the Unique Systems tab to perform macros?

Thank you.


Hi There,

Probably best to just mark all systems, then right-click and save systems, then reboot, then right-click again in a new manager and load all the saved systems into the same opt setting and continue your process. Good luck.

As a side note, a good tip is to make at least 4-5 builds of 25-50k systems on the same opt setting you are testing and take the averages of the stats, so you make sure your decision is not based on random data.



Hi Daniel,

Is there a reason I wouldn't simply run 200K systems then? My assumption is that would be simpler than running 50K systems 4 times. Thanks for your input.


bizgozcd
Junior Member
**




Posts: 53
Registered: 27-5-2020
Member Is Offline


[*] posted on 16-7-2020 at 02:19 PM


Quote: Originally posted by admin  



Great that you made progress.
If you're not trading at a TS brokerage account, adding setexitonclose; anywhere in the code will fix it.
Check that your code times match, i.e. if the last bar is 1500, then use if time >= 1500 then {exit trades}.
This time might be different. If they are different, you need to check all the times in the code.
Check that your last bar of the day is the same time as the data in GSB.


Hi Peter,

I added the line you suggested; however, TS did not enter or execute a Market on Close order today. (It did correctly enter the trade and simultaneously place a stop.)

Also, I noticed that the GSB ES System had this code in it:

// Added March 2020 to fix market-on-close failures on live TS brokerage accounts.
If Time = SessionEndTime(0, 1) then begin
    Sell ("LX_EOD") this bar at close;
    Buy to cover ("SX_EOD") this bar at close;
end;

Do you think this could fix my issue as well?


Daniel UK1
Member
***




Posts: 470
Registered: 4-6-2019
Member Is Offline


[*] posted on 16-7-2020 at 03:14 PM


Quote: Originally posted by getty002  
Quote: Originally posted by Daniel UK1  
Quote: Originally posted by bizgozcd  
Greetings once again.

So, I’ve had a few internet issues halfway through an optimization. Is there a way to stop the process so that I can reboot my computer and then continue where I left off?

The only thing I was able to find was the auto save function, but if that's the best way to recover, how can I move those systems back to the Unique Systems tab to perform macros?

Thank you.


Hi There,

Probably best to just mark all systems, then right-click and save systems, then reboot, then right-click again in a new manager and load all the saved systems into the same opt setting and continue your process. Good luck.

As a side note, a good tip is to make at least 4-5 builds of 25-50k systems on the same opt setting you are testing and take the averages of the stats, so you make sure your decision is not based on random data.



Hi Daniel,

Is there a reason I wouldn't simply run 200K systems then? My assumption is that would be simpler than running 50K systems 4 times. Thanks for your input.


Hi there, I assume that would be an option if you feel that's easier for you, but I do not think it would be the same.

I prefer not to do it like that; instead I would start 4 different managers, run 4 separate builds, and take the average of all the stats I use.
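As a rough illustration of that averaging step (done outside GSB; the CSV exports and column names here are hypothetical, not GSB features), a few lines of Python could combine the stats from several separate builds:

# Sketch: average key stats across several exported build result files,
# so a decision is based on the average rather than on one random build.
import csv
import statistics

def build_averages(path, columns=("NetProfit", "ProfitFactor")):
    # Mean of the chosen stat columns for one exported build (one row per system).
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))
    return {c: statistics.mean(float(r[c]) for r in rows) for c in columns}

builds = [f"build_{i}.csv" for i in range(1, 5)]   # e.g. 4 separate 50k builds (hypothetical files)
per_build = [build_averages(p) for p in builds]
overall = {c: statistics.mean(b[c] for b in per_build) for c in per_build[0]}
print(per_build)
print("average across builds:", overall)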





admin
Super Administrator
*********




Posts: 5060
Registered: 7-4-2017
Member Is Offline

Mood: No Mood

[*] posted on 16-7-2020 at 06:56 PM


Quote: Originally posted by bizgozcd  
Quote: Originally posted by admin  



Great that you made progress.
If you're not trading at a TS brokerage account, adding setexitonclose; anywhere in the code will fix it.
Check that your code times match, i.e. if the last bar is 1500, then use if time >= 1500 then {exit trades}.
This time might be different. If they are different, you need to check all the times in the code.
Check that your last bar of the day is the same time as the data in GSB.


Hi Peter,

I added the line you suggested; however, TS did not enter or execute a Market on Close order today. (It did correctly enter the trade and simultaneously place a stop.)

Also, I noticed that the GSB ES System had this code in it:

// Added March 2020 to fix market-on-close failures on live TS brokerage accounts.
If Time = SessionEndTime(0, 1) then begin
    Sell ("LX_EOD") this bar at close;
    Buy to cover ("SX_EOD") this bar at close;
end;

Do you think this could fix my issue as well?

Yes, this code should fix the issue.


admin
Super Administrator
*********




Posts: 5060
Registered: 7-4-2017
Member Is Offline

Mood: No Mood

[*] posted on 16-7-2020 at 06:58 PM


Quote: Originally posted by Daniel UK1  
Quote: Originally posted by getty002  
Quote: Originally posted by Daniel UK1  
Quote: Originally posted by bizgozcd  
Greetings once again.

So, I’ve had a few internet issues halfway through an optimization. Is there a way to stop the process so that I can reboot my computer and then continue where I left off?

The only thing I was able to find was the auto save function, but if that's the best way to recover, how can I move those systems back to the Unique Systems tab to perform macros?

Thank you.


Hi There,

Probably best to just mark all systems, then right-click and save systems, then reboot, then right-click again in a new manager and load all the saved systems into the same opt setting and continue your process. Good luck.

As a side note, a good tip is to make at least 4-5 builds of 25-50k systems on the same opt setting you are testing and take the averages of the stats, so you make sure your decision is not based on random data.



Hi Daniel,

Is there a reason I wouldn't simply run 200K systems then? My assumption is that would be simpler than running 50K systems 4 times. Thanks for your input.


Hi there, I assume that would be an option if you feel that's easier for you, but I do not think it would be the same.

I prefer not to do it like that; instead I would start 4 different managers, run 4 separate builds, and take the average of all the stats I use.




I agree with Daniel. 200k will not give the same results at all:
it will have 1 set of indicators, while 4 tests will have 4 different sets of indicators.


getty002
Junior Member
**




Posts: 30
Registered: 10-7-2020
Member Is Offline


[*] posted on 16-7-2020 at 10:40 PM


Quote: Originally posted by admin  
Quote: Originally posted by Daniel UK1  


Hi there, I assume that would be an option if you feel that's easier for you, but I do not think it would be the same.

I prefer not to do it like that; instead I would start 4 different managers, run 4 separate builds, and take the average of all the stats I use.


I agree with Daniel. 200k will not give the same results at all:
it will have 1 set of indicators, while 4 tests will have 4 different sets of indicators.


Hi guys,

Why would I have 4 different sets of indicators if the opt settings are the same each time?

Thanks...


admin
Super Administrator
*********




Posts: 5060
Registered: 7-4-2017
Member Is Offline

Mood: No Mood

[*] posted on 16-7-2020 at 10:47 PM


Because every time you run indicator testing, the results normally vary.
See the enclosed example:

vary.png - 66kB


RandyT
Member
***


Avatar


Posts: 123
Registered: 5-12-2019
Location: Colorado, USA
Member Is Offline


[*] posted on 17-7-2020 at 07:26 AM


Quote: Originally posted by getty002  

Hi guys,

Why would I have 4 different sets of indicators if the opt settings are the same each time?

Thanks...


@getty002,

To answer your question in the broader sense, GSB uses a genetic algorithm to arrive at its results. By nature, a GA, being an evolutionary algorithm, will typically not arrive at the same results every time.

More details here: https://en.wikipedia.org/wiki/Genetic_algorithm
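For what it's worth, a toy Python sketch (not GSB's actual algorithm) shows the effect: the same evolutionary search run with two different random seeds usually returns slightly different winners, even though the objective and settings are identical.

# Toy evolutionary search: identical settings, different seeds, different results.
import random

def evolve(seed, generations=200, pop_size=30):
    rng = random.Random(seed)
    fitness = lambda x: -(x - 3.7) ** 2          # toy objective, optimum at 3.7
    pop = [rng.uniform(-10, 10) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        pop = parents + [p + rng.gauss(0, 0.5) for p in parents]   # mutate the fitter half
    return max(pop, key=fitness)

print(evolve(seed=1))   # close to 3.7, but...
print(evolve(seed=2))   # ...not exactly the same value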


getty002
Junior Member
**




Posts: 30
Registered: 10-7-2020
Member Is Offline


[*] posted on 18-7-2020 at 03:37 AM


Thanks Randy. I'm familiar with genetic optimization, having used it previously for other applications. Ultimately, an exhaustive search will find all possibilities, and genetic optimization attempts to reduce the search space, as do other optimization schemes. Having said this, restarting an entirely new 50K search from scratch will inevitably cover portions of the net search space already covered by the 50K samples before it. I'm simply wondering about the inefficiency of performing 4 separate 50K searches that are each unaware of the search space covered by the others.



getty002
Junior Member
**




Posts: 30
Registered: 10-7-2020
Member Is Offline


[*] posted on 18-7-2020 at 03:43 AM



I could use some help with some basic problems I'm having reading in data. The primary problem is related to the "Public Data Directory". My install is not to C:\GSB but to Google Drive (which should be similar to the Dropbox setup others here use). The confusion is around which directory to select to get access to the pre-installed files. I get a "FileNotFoundException" whether I use either of these two locations:
C:\<>\Google Drive\GSB\Data\
C:\<>\Google Drive\GSB\Data\Price Data

Both result in the same error above and do not run the basic CL or ES pre-installed optimization settings. I CAN manually change every single OPT and TPD file location individually to work around the error, but this entirely defeats the point of specifying the "Public Data Directory" location if I have to specify every single file location independently.

The handling of the Public Data Directory is peculiar: when I select a directory, it often puts another sub-directory, "Price Data", below it. Very confusing for me. File location handling has been the biggest challenge I've encountered so far with GSB. Thanks for your help.


getty002
Junior Member
**




Posts: 30
Registered: 10-7-2020
Member Is Offline


[*] posted on 18-7-2020 at 03:58 AM


Perhaps to clarify and simplify my question above: I'm using the following optimization file:

CL30-AdvancedModeOff.gsboptset

When I use this file, it doesn't run; it gives the "FileNotFoundException" error. I have then tried changing the "Public Data Directory" to:
C:\<>\Google Drive\GSB\
C:\<>\Google Drive\GSB\Data\
C:\<>\Google Drive\GSB\Data\Price Data\

None of these 3 work to find the CL data pre-loaded with GSB. Hopefully the problem I'm encountering is now clearer. I am, however, able to get this optimization file to run fine using my own data and specifying the exact location under the TPD and OPD price data. It can become frustrating, though, because as soon as I change the optimization file but want to use the same TPD file, I have to open the price data and locate each individual file again. Sorry for the really long post - hopefully there's something really simple I'm missing.


Daniel UK1
Member
***




Posts: 470
Registered: 4-6-2019
Member Is Offline


[*] posted on 18-7-2020 at 06:19 AM


Quote: Originally posted by getty002  
Thanks Randy. I'm familiar with genetic optimization, having used it previously for other applications. Ultimately, an exhaustive search will find all possibilities, and genetic optimization attempts to reduce the search space, as do other optimization schemes. Having said this, restarting an entirely new 50K search from scratch will inevitably cover portions of the net search space already covered by the 50K samples before it. I'm simply wondering about the inefficiency of performing 4 separate 50K searches that are each unaware of the search space covered by the others.



Hi Getty, my very personal reasoning for using 4 or more separate builds is that results can vary quite a lot between builds... and a decision based on the outcome of just one build, whose results may randomly sit at the very top of the distribution or at the very bottom, can make the reasoning behind my decision useless. Hence I like to take the average of several builds in order to increase the chance that my decision-making has merit.


Many others, I assume, do things differently; we all have our own way of doing things. No right or wrong, just different outcomes perhaps.


bizgozcd
Junior Member
**




Posts: 53
Registered: 27-5-2020
Member Is Offline


[*] posted on 18-7-2020 at 12:40 PM


Question about Walk Forwards.

The biggest bottleneck for me, with my 2-machine setup plus GSB cloud when available, has been WFs. I'm wondering if cutting the number down to the top 100 or even 50, instead of 250, would sacrifice much in the big picture.

Any thoughts would be appreciated.


getty002
Junior Member
**




Posts: 30
Registered: 10-7-2020
Member Is Offline


[*] posted on 18-7-2020 at 08:31 PM


Thanks Daniel, Randy, and Peter. I'll use the 4 or 5, 50K sets as you suggest.

I could use some help from the experienced users here regarding creating a portfolio of GSB strategies:

1) I'm looking to generate 12-14 strategies for my first GSB portfolio. Although it would be far preferable to have 14 strategies on 6 or more instruments, based on my current long run-times and the challenges of entering each new market, I'm considering 14 strategies on a single instrument (with, hopefully, sufficient decorrelation). Based on user experience with GSB, is it reasonable to put 14 strategies on ES? I'd be shooting for less than 30% correlation between any 2 strategies.

2) I'd like to set the minimum trades per month (historically) across the strategies. I know I can set a minimum # of trades for the entire test period, but is there a way to set minimum trades over shorter intervals (i.e. per month) in GSB? I'm trying to avoid long periods of an idle portfolio.

3) Is there any metric earlier in the GSB pipeline to assess strategy correlation (so I can remove correlated strategies)? The family function does this to some degree, but I'm trying to avoid spending days creating a single new strategy, only to find out in PA Pro that the strategy is 50% correlated with another in the portfolio and I have to toss it. Do the different families generally tend to provide sufficient decorrelation?

Thanks for your inputs!


DocBober55
Junior Member
**




Posts: 17
Registered: 18-6-2020
Member Is Offline


[*] posted on 19-7-2020 at 12:29 AM


I often find that when I add a new data folder containing data for a new symbol to my C:\GSB\Data\Price Data folder and then try to change the Opt. Price Data in my Manager to use the new data, unless I am very lucky I get very frustrating error messages. It seems that you must be very careful (or lucky): remove a data path in the right-side panel, hit buttons like "Deselect", type a symbol name in the correct area of the left-side panel, etc. If you make an error in this process (and I find it very easy, and very likely, to make such an error), you get error messages and are unable to change the Opt. Price Data to the new desired data file. It would be excellent if this process could be clearly demonstrated, step by step, in the instructions.


RandyT
Member
***


Avatar


Posts: 123
Registered: 5-12-2019
Location: Colorado, USA
Member Is Offline


[*] posted on 19-7-2020 at 02:44 PM


Quote: Originally posted by getty002  
Thanks Daniel, Randy, and Peter. I'll use the 4 or 5, 50K sets as you suggest.

I could use some help from the experienced users here regarding creating a portfolio of GSB strategies:

1) I'm looking to generate 12-14 strategies for my first GSB portfolio. Although it would be far preferable to have 14 strategies on 6 or more instruments, based on my current long run-times and the challenges of entering each new market, I'm considering 14 strategies on a single instrument (with, hopefully, sufficient decorrelation). Based on user experience with GSB, is it reasonable to put 14 strategies on ES? I'd be shooting for less than 30% correlation between any 2 strategies.

2) I'd like to set the minimum trades per month (historically) across the strategies. I know I can set a minimum # of trades for the entire test period, but is there a way to set minimum trades over shorter intervals (i.e. per month) in GSB? I'm trying to avoid long periods of an idle portfolio.

3) Is there any metric earlier in the GSB pipeline to assess strategy correlation (so I can remove correlated strategies)? The family function does this to some degree, but I'm trying to avoid spending days creating a single new strategy, only to find out in PA Pro that the strategy is 50% correlated with another in the portfolio and I have to toss it. Do the different families generally tend to provide sufficient decorrelation?

Thanks for your inputs!


@getty, a few comments to your questions.

1. I would be very surprised if you could find 12 uncorrelated systems to trade on the same market. I would love to see you prove me wrong (and share how you achieve that) but I think this is a difficult task. GSB tends to find breakout systems in my experience. Mean reversion for example is not something that can be produced in GSB currently.

2. There is no way to do this currently in GSB. Also, in my experience, it is difficult to find systems over the last 10 years that are not flat for some part of it. It is all shades of gray as to how much better some systems do over those market regime changes than others.

3. A GSB run has no way of knowing what the performance is of another system you have in your portfolio. Interesting idea, but seems a challenging task. I think this will need to be part of your workflow with PA assessing your results after runs.

FWIW

Edit: Just to add some context to perhaps help with expectations. As an example, the work that Peter has been sharing on CL system development has been months of work. It takes a ton of system resources and lots of time to try the huge number of possible configurations to reach the best system performance. Not meant to be discouraging, but rather to help you prepare for a lot of work ahead. I would highly recommend absorbing Peter's latest work on CL, as that is the current "state of the art" in system development.


getty002
Junior Member
**




Posts: 30
Registered: 10-7-2020
Member Is Offline


[*] posted on 19-7-2020 at 04:48 PM


Quote: Originally posted by RandyT  

@getty, a few comments to your questions.

1. I would be very surprised if you could find 12 uncorrelated systems to trade on the same market. I would love to see you prove me wrong (and share how you achieve that) but I think this is a difficult task. GSB tends to find breakout systems in my experience. Mean reversion for example is not something that can be produced in GSB currently.

2. There is no way to do this currently in GSB. Also, in my experience, it is difficult to find systems over the last 10 years that are not flat for some part of it. It is all shades of gray as to how much better some systems do over those market regime changes than others.

3. A GSB run has no way of knowing what the performance is of another system you have in your portfolio. Interesting idea, but seems a challenging task. I think this will need to be part of your workflow with PA assessing your results after runs.

FWIW

Edit: Just to add some context to perhaps help with expectations. As an example, the work that Peter has been sharing on CL system development has been months of work. It takes a ton of system resources and lots of time to try the huge number of possible configurations to reach the best system performance. Not meant to be discouraging, but rather to help you prepare for a lot of work ahead. I would highly recommend absorbing Peter's latest work on CL, as that is the current "state of the art" in system development.


Thanks very much for the reply, Randy. It's great to share information on such things, as my experience has been that many strategy developers don't want to discuss *anything* for fear of losing alpha. I much prefer this GSB community's approach of openness, collectively conquering these really challenging problems as a group.

I realized after posting just how unlikely it will be to find 12 decorrelated strategies on a single symbol. Having said this, finding perhaps up to 6 strategies on a symbol isn't unreasonable by simply exploiting different market regimes - trending, bear market, momentum, volatile, etc. By employing a regime approach I could expect very little correlation between strategies. This would also address my question 2, by identifying strategies that work during low volatility. I've already asked Peter about adding the ability to perform regime switching.

In a sense, GSB is focused on the volatility regime, which is why long periods of the ES system, for example, are flat. The search capabilities of GSB are very powerful. If I were to speculate, part of the reason low volatility challenges GSB is the infrequent use of daily data, where signal-to-noise is high compared to intraday. Yes, volatility creates high SNR, which generates better alpha for us. But in lower volatility (and lower SNR), longer timeframes are generally needed to be successful (hence the request to include daily data).

I've been watching Peter's last 2 videos on the continuous loop and reading the community comments in the private area. Perhaps the easiest approach for my future GSB portfolio to achieve the decorrelated returns I'm looking for is to do what I believe the rest of you are doing - invest months searching the equity and energy markets for my 6 symbols and find my 12 decorrelated strategies that way, rather than through regimes.

Regarding 3, one approach I've used before for a correlation metric is to average all values by row in the correlation matrix: the lower the value, the lower the average correlation to all other strategies. It would be possible to put the average correlation into one of the GSB columns; then I could sort all strategies from least to most correlated. It's possible this may be best done family-to-family. I would see doing this inside GSB rather than in PA Pro as an advantage, because I could sort and remove strategies that are too correlated with each other using a macro (or manually).
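To make that concrete, here is a minimal sketch of the row-average idea, done outside GSB on exported daily P/L series (the file name and layout are hypothetical):

# Rank strategies by their average absolute correlation to all the others.
import pandas as pd

pnl = pd.read_csv("strategy_pnl.csv", index_col=0)   # one column of daily P/L per strategy
corr = pnl.corr()                                    # pairwise correlation matrix

n = len(corr)
avg_corr = (corr.abs().sum() - 1) / (n - 1)          # drop each row's self-correlation of 1

print(avg_corr.sort_values())                        # least correlated strategies first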


admin
Super Administrator
*********




Posts: 5060
Registered: 7-4-2017
Member Is Offline

Mood: No Mood

[*] posted on 19-7-2020 at 04:54 PM


Quote: Originally posted by bizgozcd  
Question about Walk Forwards.

The biggest bottleneck for me, with my 2-machine setup plus GSB cloud when available, has been WFs. I'm wondering if cutting the number down to the top 100 or even 50, instead of 250, would sacrifice much in the big picture.

Any thoughts would be appreciated.

Let's say you build families from your 250 systems.
You might have, say, 10 families. You can then WF just the 10 families.
If you did all 250, each WF would have slightly different results due to the random seed in the genetic WF, but that's no big deal.


RandyT
Member
***


Avatar


Posts: 123
Registered: 5-12-2019
Location: Colorado, USA
Member Is Offline


[*] posted on 19-7-2020 at 05:23 PM


Quote: Originally posted by getty002  

Thanks very much for the reply, Randy. It's great to share information on such things, as my experience has been that many strategy developers don't want to discuss *anything* for fear of losing alpha. I much prefer this GSB community's approach of openness, collectively conquering these really challenging problems as a group.

I realized after posting just how unlikely it will be to find 12 decorrelated strategies on a single symbol. Having said this, finding perhaps up to 6 strategies on a symbol isn't unreasonable by simply exploiting different market regimes - trending, bear market, momentum, volatile, etc. By employing a regime approach I could expect very little correlation between strategies. This would also address my question 2, by identifying strategies that work during low volatility. I've already asked Peter about adding the ability to perform regime switching.

In a sense, GSB is focused on the volatility regime, which is why long periods of the ES system, for example, are flat. The search capabilities of GSB are very powerful. If I were to speculate, part of the reason low volatility challenges GSB is the infrequent use of daily data, where signal-to-noise is high compared to intraday. Yes, volatility creates high SNR, which generates better alpha for us. But in lower volatility (and lower SNR), longer timeframes are generally needed to be successful (hence the request to include daily data).

I've been watching Peter's last 2 videos on the continuous loop and reading the community comments in the private area. Perhaps the easiest approach for my future GSB portfolio to achieve the decorrelated returns I'm looking for is to do what I believe the rest of you are doing - invest months searching the equity and energy markets for my 6 symbols and find my 12 decorrelated strategies that way, rather than through regimes.

Regarding 3, one approach I've used before for a correlation metric is to average all values by row in the correlation matrix: the lower the value, the lower the average correlation to all other strategies. It would be possible to put the average correlation into one of the GSB columns; then I could sort all strategies from least to most correlated. It's possible this may be best done family-to-family. I would see doing this inside GSB rather than in PA Pro as an advantage, because I could sort and remove strategies that are too correlated with each other using a macro (or manually).


@getty, I think that is a reasonable set of expectations and approach from my experience. I've been at it a little less than a year with GSB.

In my limited GSB experience, I find that GSB does quite well at producing systems that perform well in volatility. That was demonstrated in the months of Feb, March, and April this year. May, June, and so far July have been a bit disappointing given the drop in vol.


Daniel UK1
Member
***




Posts: 470
Registered: 4-6-2019
Member Is Offline


[*] posted on 22-7-2020 at 12:52 PM


Quote: Originally posted by getty002  
Quote: Originally posted by RandyT  

@getty, a few comments to your questions.

1. I would be very surprised if you could find 12 uncorrelated systems to trade on the same market. I would love to see you prove me wrong (and share how you achieve that) but I think this is a difficult task. GSB tends to find breakout systems in my experience. Mean reversion for example is not something that can be produced in GSB currently.

2. There is no way to do this currently in GSB. Also, in my experience, it is difficult to find systems over the last 10 years that are not flat for some part of it. It is all shades of gray as to how much better some systems do over those market regime changes than others.

3. A GSB run has no way of knowing what the performance is of another system you have in your portfolio. Interesting idea, but seems a challenging task. I think this will need to be part of your workflow with PA assessing your results after runs.

FWIW

Edit: Just to add some context to perhaps help with expectations. As an example, the work that Peter has been sharing on CL system development has been months of work. It takes a ton of system resources and lots of time to try the huge number of possible configurations to reach the best system performance. Not meant to be discouraging, but rather to help you prepare for a lot of work ahead. I would highly recommend absorbing Peter's latest work on CL, as that is the current "state of the art" in system development.


Thanks very much for the reply, Randy. It's great to share information on such things, as my experience has been that many strategy developers don't want to discuss *anything* for fear of losing alpha. I much prefer this GSB community's approach of openness, collectively conquering these really challenging problems as a group.

I realized after posting just how unlikely it will be to find 12 decorrelated strategies on a single symbol. Having said this, finding perhaps up to 6 strategies on a symbol isn't unreasonable by simply exploiting different market regimes - trending, bear market, momentum, volatile, etc. By employing a regime approach I could expect very little correlation between strategies. This would also address my question 2, by identifying strategies that work during low volatility. I've already asked Peter about adding the ability to perform regime switching.

In a sense, GSB is focused on the volatility regime, which is why long periods of the ES system, for example, are flat. The search capabilities of GSB are very powerful. If I were to speculate, part of the reason low volatility challenges GSB is the infrequent use of daily data, where signal-to-noise is high compared to intraday. Yes, volatility creates high SNR, which generates better alpha for us. But in lower volatility (and lower SNR), longer timeframes are generally needed to be successful (hence the request to include daily data).

I've been watching Peter's last 2 videos on the continuous loop and reading the community comments in the private area. Perhaps the easiest approach for my future GSB portfolio to achieve the decorrelated returns I'm looking for is to do what I believe the rest of you are doing - invest months searching the equity and energy markets for my 6 symbols and find my 12 decorrelated strategies that way, rather than through regimes.

Regarding 3, one approach I've used before for a correlation metric is to average all values by row in the correlation matrix: the lower the value, the lower the average correlation to all other strategies. It would be possible to put the average correlation into one of the GSB columns; then I could sort all strategies from least to most correlated. It's possible this may be best done family-to-family. I would see doing this inside GSB rather than in PA Pro as an advantage, because I could sort and remove strategies that are too correlated with each other using a macro (or manually).


Getty, just a few thoughts; I think Randy and Peter have covered most of your questions.

But if I were you, I would think of a regime filter as another variable that makes things more complicated and could break your system. Regime filters are quite difficult: often you only become aware of and establish a regime once it has already happened, i.e. too late.

I see filters as something you can perhaps add (in my own case, very unlikely) AFTER you have established that your strategy actually has an edge and is very good on its own, which is in itself the most important and often overlooked part. Messing with filters and regime filters while developing the basic system is not good, since the edge needs to be established on its own. The same way you would go about developing a system logically without GSB, you need to establish the edge of the entry and exit on their own, and the edge in the system, before trying to improve it with filters (in my very humble opinion).

There are a ton of things one could do to get better results - filters, doing long-only or short-only systems, different indicator values for long and short, using specific targets, using more than 4 indicators, etc. - each time adding variables that complicate things and make the system more prone to breaking. Many people become too greedy here and crash :). My approach is simplicity, and average is always better than perfect.

Also, I would say it's better to develop one really good system and get to know one market than to develop 45 systems in 6 markets in a reckless and speedy way. I believe Peter and I started this CL research run around the same time, October 2019; I stopped in June 2020, and Peter is still at it. So one market can easily take over 6-8 months, and this was a full-time job...

Last thing: sometimes a system not taking trades during a period in the markets is actually a very good thing. Perhaps there is no good edge to be had in the current market; then be happy that the system did not force any trades and saved you some cash.

Some good tips for getting more diversification are different SFs, other timeframes, and using other markets as data... however, this is difficult because many markets do not easily provide many options for SF and timeframes.

Sorry for the ranting; this is just my very personal approach and views.




admin
Super Administrator
*********




Posts: 5060
Registered: 7-4-2017
Member Is Offline

Mood: No Mood

[*] posted on 22-7-2020 at 05:23 PM


I very much agree with all of Randy's comments, though I do not claim to be an expert in regimes.

Carl
Member
***




Posts: 342
Registered: 10-5-2017
Member Is Offline

Mood: No Mood

[*] posted on 23-7-2020 at 01:50 AM


Adding a filter based on price action can improve performance.
But you have to use in-sample and out-of-sample testing to see if the filter is robust.
Furthermore, the filter has to impact a certain number of trades to be statistically significant.
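As a rough sketch of that last check (the function and numbers below are hypothetical, not part of GSB or PA Pro), one way is to compare the trades a filter would veto against the ones it keeps, in sample, and refuse to draw a conclusion if the filter touches too few trades:

# Require a minimum number of filtered trades before trusting any difference.
from scipy import stats

def filter_check(kept_pnl, removed_pnl, min_removed=30):
    # kept_pnl / removed_pnl: per-trade P&L with and without the filter's veto.
    if len(removed_pnl) < min_removed:
        return "filter affects too few trades to be statistically meaningful"
    t, p = stats.ttest_ind(kept_pnl, removed_pnl, equal_var=False)   # Welch t-test
    return (f"kept mean={sum(kept_pnl)/len(kept_pnl):.1f}, "
            f"removed mean={sum(removed_pnl)/len(removed_pnl):.1f}, p-value={p:.3f}")

# Example with made-up trade lists:
print(filter_check(kept_pnl=[120, -40, 80, 15, 60] * 20, removed_pnl=[-30, -10, 5, -25] * 10))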




tornado
Junior Member
**




Posts: 9
Registered: 15-6-2020
Member Is Offline


[*] posted on 23-7-2020 at 06:56 AM


Hi Peter,

I backed up my personal data and settings, including the folders Price Data and Settings and the files Contracts.txt, Price Data.txt, and Sessions.txt.

I would like to know: if I reinstall GSB, is it right to copy what I just backed up into the Data folder?

If I do so, will GSB work well?

Thanks.


admin
Super Administrator
*********




Posts: 5060
Registered: 7-4-2017
Member Is Offline

Mood: No Mood

[*] posted on 23-7-2020 at 05:18 PM


Quote: Originally posted by tornado  
Hi Peter,

I backed up my personal data and settings, including the folders Price Data and Settings and the files Contracts.txt, Price Data.txt, and Sessions.txt.

I would like to know: if I reinstall GSB, is it right to copy what I just backed up into the Data folder?

If I do so, will GSB work well?

Thanks.

It should, but why do you need to reinstall?
To be safe, back up the entire GSB folder using 7-Zip or plain zip.


