I am working on an algorithm that has multiple fixed parameters. The algorithm analyzes time series data and spits out a number. The fixed parameters need to be such that this number is as small as possible.
What I found is that parameters optimized for one specific time period don’t necessarily work well when used on another time period.
The way I see it, there are two possible solutions to this problem:
- use a longer time period when optimizing the parameters
- find a method of combining the optimal parameters for different time periods, such that these “averaged” parameters work well on all time periods
Option 1 would be incredibly expensive in terms of computation time. And although it makes intuitive sense that a longer optimization period should fix the problem, I am not sure that it actually would.
Option 2 reminds me of training neural networks, where one feeds in a large number of “data points” and somehow takes a (weighted) average of the results to find a set of parameters that works well for all data points. Unfortunately, I know little to nothing about the algorithms used for this kind of optimization/learning.
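To make the contrast concrete, here is a minimal sketch of what I am comparing, using a toy stand-in for my algorithm (simple exponential smoothing scored by forecast error) and synthetic data, since I can’t share the real one. It contrasts naively averaging the per-period optimal parameters with directly picking the one parameter that minimizes the average score across all periods:

```python
import numpy as np

rng = np.random.default_rng(0)

def score(alpha, series):
    # Toy stand-in for "the algorithm": one-step exponential smoothing,
    # scored by mean absolute forecast error (smaller is better).
    pred = series[0]
    err = 0.0
    for x in series[1:]:
        err += abs(x - pred)
        pred = alpha * x + (1 - alpha) * pred
    return err / (len(series) - 1)

# Three hypothetical time periods (windows) of the same series.
series = np.cumsum(rng.normal(size=300))
windows = [series[0:100], series[100:200], series[200:300]]

grid = np.linspace(0.01, 1.0, 100)

def avg_score(alpha):
    # Average score of one parameter value over all periods.
    return np.mean([score(alpha, w) for w in windows])

# Per-period optima: the best alpha for each period in isolation.
per_window = [grid[np.argmin([score(a, w) for a in grid])] for w in windows]

# "Averaged" parameter in the naive sense: mean of per-period optima.
naive_avg = float(np.mean(per_window))

# Joint optimum: one alpha minimizing the average score over all periods.
joint = grid[np.argmin([avg_score(a) for a in grid])]

print("per-period optima:", per_window)
print("naive average:", naive_avg, "-> avg score", avg_score(naive_avg))
print("joint optimum:", joint, "-> avg score", avg_score(joint))
```

By construction the joint optimum can never do worse (on the grid) than any single per-period optimum reused across all periods, whereas the naive average of optima carries no such guarantee. Is this joint-optimization framing what option 2 should look like, or is there a smarter way to combine the per-period results?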
Any help or suggestions are greatly appreciated. Please let me know if there is anything you’d like me to expand upon.