Opened 7 years ago
Closed 7 years ago
#2852 closed enhancement (done)
Improve Constants Optimization for Symbolic Regression
Reported by: | mkommend | Owned by: | mkommend |
---|---|---|---|
Priority: | medium | Milestone: | HeuristicLab 3.3.15 |
Component: | Problems.DataAnalysis.Symbolic.Regression | Version: | trunk |
Keywords: | | Cc: |
Description
- Constants optimization should work without linear scaling
- Constants optimization should report the number of function and gradient evaluations
Change History (16)
comment:1 Changed 7 years ago by mkommend
- Status changed from new to accepted
comment:2 Changed 7 years ago by mkommend
- Version set to trunk
comment:3 Changed 7 years ago by mkommend
- Milestone changed from HeuristicLab 3.3.16 to HeuristicLab 3.3.15
- Owner changed from mkommend to gkronber
- Status changed from accepted to reviewing
r15448: Added counts of function and gradient evaluations performed by constants optimization as results.
Maximum gradient evaluations = maximum iterations + 1
Maximum function evaluations = maximum iterations * 2 + 1
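For reference, a minimal sketch of these bounds as a helper, assuming the counters are limited only by the configured iteration limit; the names (maxIterations etc.) are placeholders and not the evaluator's actual fields:
```csharp
// Illustrative only: upper bounds of the reported counters, derived from the
// iteration limit as stated above. Not the actual trunk code.
static class EvaluationBudgetSketch {
  public static (int maxGradientEvaluations, int maxFunctionEvaluations) EvaluationBudget(int maxIterations) {
    int maxGradientEvaluations = maxIterations + 1;     // one gradient per iteration plus the initial evaluation
    int maxFunctionEvaluations = 2 * maxIterations + 1; // at most two function evaluations per iteration plus the initial one
    return (maxGradientEvaluations, maxFunctionEvaluations);
  }
}
```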
comment:4 follow-up: ↓ 5 Changed 7 years ago by gkronber
Reviewed r15448.
Are 'result parameters' already available in the trunk?
Reading the code, it is odd to find a 'ResultParameter'. I think at least the properties should be renamed (FunctionEvaluationsResultParameter -> FunctionEvaluationsResult, FunctionEvaluationsResultParameterName -> FunctionEvaluationsResultName).
comment:5 in reply to: ↑ 4 Changed 7 years ago by mkommend
Replying to gkronber:
> Are 'result parameters' already available in the trunk?

Result parameters were introduced with #2281 over a year ago!

> Reading the code, it is odd to find a 'ResultParameter'. I think at least the properties should be renamed (FunctionEvaluationsResultParameter -> FunctionEvaluationsResult, FunctionEvaluationsResultParameterName -> FunctionEvaluationsResultName).

I have already implemented the proposed changes, but during a final review and test I decided against committing them. The reason is that an execution context must be present (and manually set in line 169) to allow access to the result, and the 'Parameter' postfix indicates that.
comment:6 Changed 7 years ago by gkronber
r15480: Moved the scaling parameters to the end of the parameter vector to be able to remove the c.Skip(2).ToArray() call, and removed unnecessary .ToArray() calls.
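A minimal sketch of the layout change behind r15480, assuming the first two entries of the parameter vector previously held the scaling parameters; names are illustrative and not the actual trunk code:
```csharp
using System.Linq;

// Illustrative sketch only: why appending the scaling parameters (alpha, beta)
// to the end of the parameter vector makes the c.Skip(2).ToArray() copy unnecessary.
static class ParameterLayoutSketch {
  // old layout: [alpha, beta, c1, ..., cn] -> the tree constants had to be copied out
  public static double[] TreeConstantsOldLayout(double[] c) {
    return c.Skip(2).ToArray();
  }

  // new layout: [c1, ..., cn, alpha, beta] -> the leading entries already are the
  // tree constants and can be used in place; only the two scaling values are read by index
  public static (double alpha, double beta) ScalingNewLayout(double[] c) {
    return (c[c.Length - 2], c[c.Length - 1]);
  }
}
```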
comment:7 Changed 7 years ago by gkronber
- Owner changed from gkronber to mkommend
- Status changed from reviewing to readytorelease
comment:8 Changed 7 years ago by mkommend
- Owner changed from mkommend to gkronber
- Status changed from readytorelease to assigned
comment:9 Changed 7 years ago by gkronber
comment:10 Changed 7 years ago by gkronber
After some tests I conclude that the order of parameters in lsfit() has an influence on the optimization results. Therefore, we cannot put the scaling parameters at the end of the parameter vector without breaking backwards compatibility.
Please add a parameter to make reporting of the number of gradient and function evaluations optional.
comment:11 Changed 7 years ago by gkronber
- Owner changed from gkronber to mkommend
comment:12 Changed 7 years ago by mkommend
- Owner changed from mkommend to gkronber
- Status changed from assigned to reviewing
r15483: Added an option for reporting function and gradient evaluations in the constants optimization evaluator.
Unfortunately, you were right: reporting the evaluations has a substantial impact on performance. Quick tests on my workstation for the OSGP SymReg sample show execution times of ~5.1 min with and ~3.02 min without reporting (fixed seed of 0).
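A minimal sketch of how such an opt-in option can guard the counting so that the default path avoids the bookkeeping; the flag and class names are placeholders, not the evaluator's actual parameters:
```csharp
using System.Threading;

// Illustrative sketch only: counting function/gradient evaluations behind an
// opt-in flag so that runs without reporting pay (almost) no extra cost.
sealed class EvaluationCounter {
  private readonly bool countEvaluations;  // stands in for the new evaluator option
  private long functionEvaluations;
  private long gradientEvaluations;

  public EvaluationCounter(bool countEvaluations) { this.countEvaluations = countEvaluations; }

  public void OnFunctionEvaluated() {
    if (countEvaluations) Interlocked.Increment(ref functionEvaluations);
  }
  public void OnGradientEvaluated() {
    if (countEvaluations) Interlocked.Increment(ref gradientEvaluations);
  }

  public long FunctionEvaluations => Interlocked.Read(ref functionEvaluations);
  public long GradientEvaluations => Interlocked.Read(ref gradientEvaluations);
}
```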
comment:13 Changed 7 years ago by gkronber
Reviewed r15483. Thanks for testing this.
comment:14 Changed 7 years ago by gkronber
- Owner changed from gkronber to mkommend
- Status changed from reviewing to readytorelease
comment:15 Changed 7 years ago by mkommend
comment:16 Changed 7 years ago by mkommend
- Resolution set to done
- Status changed from readytorelease to closed
r15447: Adapted constants optimization and auto diff converter to not add linear scaling terms.
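For context, "linear scaling terms" wrap the raw model output f(x) as alpha + beta * f(x) before the error is computed; a minimal sketch of the two residual variants (purely illustrative, not the converter code):
```csharp
// Illustrative sketch only: constants optimization on the raw model output vs.
// on a linearly scaled output. r15447 allows the first variant; previously the
// scaling parameters alpha and beta were always added as extra optimized terms.
static class LinearScalingSketch {
  public static double ResidualWithoutScaling(double fx, double target) {
    return fx - target;
  }

  public static double ResidualWithScaling(double fx, double target, double alpha, double beta) {
    return (alpha + beta * fx) - target;
  }
}
```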