# OPTANO Algorithm Tuner Change-Log
OPTANO Algorithm Tuner (OAT) is a Genetic Algorithm implementation. It finds near-optimal parameters (a configuration) for any given target algorithm.
## Version 2.1.0 (2021-11-12)
Changes:
- Feature: Introduce the `addFinalIncumbentGeneration` parameter to control an optional final incumbent generation.
  - When using a random subset of instances per generation, the current incumbent genome does not need to be strictly better than its predecessor. For example, the current incumbent genome may outperform its predecessor on the current instance subset, but perform worse on all provided training instances.
  - While using a random subset of instances per generation is still recommended to avoid overfitting, adding a final incumbent generation ensures that OPTANO Algorithm Tuner always returns the fittest incumbent genome.
  - If the final incumbent generation is enabled, the last regular generation is followed by a generation in which all incumbent genomes are evaluated on all provided training instances within a single mini tournament. The number of additional evaluations is kept low by caching previous results and, if enabled, applying racing.
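The parameter above could be passed on the command line roughly as follows. This is a sketch: the binary name `Tuner.Application` and the `--master` flag are assumptions; only `--addFinalIncumbentGeneration` comes from this entry.

```shell
# Sketch only: "Tuner.Application" and "--master" are assumed placeholders;
# --addFinalIncumbentGeneration is the parameter introduced in this release.
FINAL_GEN_ARG="--addFinalIncumbentGeneration=true"
CMD="Tuner.Application --master $FINAL_GEN_ARG"
echo "$CMD"
```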
- Feature: Rename the `addDefaultGenome` parameter to `addDefaultGenomeToFirstGeneration` and introduce the `addDefaultGenomeToFinalIncumbentGeneration` parameter to differentiate the potential insertion of a default genome into the corresponding generation.
- Feature: Introduce `RacingRunEvaluatorBase` to support the implementation of custom racing strategies.
- Feature: Introduce the `--tertiaryTuneCriterion` parameter in the exemplary Gurobi adapter to flexibly control the adapter's tuning metric. Valid criteria are `MipGap`, `BestObjective`, `BestObjectiveBound` and `None`, the latter disabling the tertiary tune criterion.
- Documentation: Add a prepared Excel file to the documentation for estimating a rough upper bound on the overall tuning runtime.
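The new Gurobi adapter option might be used as sketched below; the adapter binary name is an assumption, while the flag name and its valid values are taken from this entry.

```shell
# Sketch: "Gurobi.Adapter" is an assumed binary name; --tertiaryTuneCriterion
# and its valid values (MipGap, BestObjective, BestObjectiveBound, None)
# are taken from this change-log entry.
TUNE_CRITERION="BestObjectiveBound"
CMD="Gurobi.Adapter --master --tertiaryTuneCriterion=$TUNE_CRITERION"
echo "$CMD"
```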
- Improvement: Extend the logging output to include the date in addition to the time.
## Version 2.0.0 (2021-05-04)
Changes:
- Feature: Introduce a gray box extension to detect timing-out target algorithm runs at runtime. By enabling this extension, OAT can find good configurations faster than its black box counterpart. Detailed information can be found in the corresponding documentation.
- Feature: Add ACLib adapter to advanced examples of OAT.
- Feature: Introduce the `allowLocalEvaluations` parameter to control whether target algorithm evaluations may be executed on the master node.
- Feature: Introduce the `tuningRandomSeed` parameter to control the initial population and the subset of instances used per generation.
- Improvement: Tighten the racing strategies of the Generic OAT Application and some advanced examples of OAT (i.e. Lingeling, Gurobi and SAPS). Again, this leads to a strong reduction in required compute time, in addition to the reduction achieved in version 1.0.0.
- Improvement: Rework usage of Akka.net in OAT: More stability, better logging.
- Improvement: The new memory limitation page now describes two ways to limit the memory of each target algorithm evaluation in your custom OAT adapter.
- Improvement: Further code clean ups + additional unit tests in OAT and its advanced examples.
## Version 1.0.0 (2021-03-25)
Changes:
- Improvement: Reworked parallelization of genome/instance evaluations.
- Feature: Racing rules can be customized by the end user.
  - With customized rules, racing can now cancel more evaluations, leading to a reduction in required compute time while not altering any tournament outcome (when compared to a tuning without racing).
  - Overall, a speedup by a factor of 2 up to 5 over version 0.9.1 is achieved (in combination with the new parallelization).
- Improvement: Further code clean ups + additional unit tests.
## Version 0.9.1 (2020-10-14)
Changes:
- Feature: Add an option for specifying the default configuration of the target algorithm, in order to include it in the initial population when a new tuning is started.
  - See the notes on usage in the documentation, or refer to Gurobi's `parameterTree.xml` for an example of how to use the default value feature.
- Improvement: The new download page now provides a self-contained version of the Tuner.Application for `win-x64`, `linux-x64` and `osx-x64`.
- Improvement: Export the generation history after each generation.
  - Previously, the generation history was only exported after the tuning was finished (and the optional evaluation of the test set had been completed).
- Improvement: (More) Clean ups and refactoring for the example projects. E.g.:
- Gurobi now targets Gurobi 9.0 and supports parsing of compressed instance- and start solution files.
- Fix: The logged age of the incumbent genome is now updated properly in the case that the incumbent did not change.
- This issue did not affect the behavior/performance of OAT and was simply related to logging.
## Version 0.9.0 (2020-05-29)
Changes:
- Feature: Introduce .NET Standard 2.1 compatibility
- Feature: Add out-of-the-box algorithm tuning for common optimization functions
- Improvement: Reorganize exemplary adapters for BBOB, Gurobi, Lingeling and SAPS
- Improvement: Remove unnecessary NuGet packages
- Improvement: Update referenced NuGet packages
- Akka.Cluster 1.3.14 -> 1.4.3
- Akka.Logger.NLog 1.3.3 -> 1.3.5
- Akka.Serialization.Hyperion 1.3.14-beta -> 1.4.3
- MathNet.Numerics 4.8.1 -> 4.9.0
- Improvement: Revise default parameters to improve overall tuning behaviour
- Feature: The `tunerLog.txt` logging file is now part of the zip archive created in case of `--zipOldStatus=true`
- Improvement: Improved default parameters
- Default usage of standard random forest in model-based crossover
- Improvement: Update of referenced NuGet packages
- Akka 1.3.2 -> 1.3.14
- Akka.Cluster 1.3.2 -> 1.3.14
- Akka.Logger.NLog 1.3.0-beta -> 1.3.3
- Akka.Remote 1.3.2 -> 1.3.14
- Akka.Serialization.Hyperion 1.3.2-beta54 -> 1.3.14-beta
- DotNetty.Buffers 0.4.7 -> 0.6.0
- DotNetty.Codecs 0.4.7 -> 0.6.0
- DotNetty.Common 0.4.7 -> 0.6.0
- DotNetty.Handlers 0.4.7 -> 0.6.0
- DotNetty.Transport 0.4.7 -> 0.6.0
- Google.Protobuf 3.4.1 -> 3.9.1
- Hyperion 0.9.6 -> 0.9.8
- MathNet.Numerics 4.0.0-beta06 -> 4.8.1
- Microsoft.NETCore.Platforms 2.0.1 -> 2.2.3
- NETStandard.Library 2.0.1 -> 2.0.3
- Newtonsoft.Json 9.0.1 -> 12.0.2
- NLog 5.0.0-beta09 -> 4.6.7
- System.Collections.Immutable 1.3.1 -> 1.5.0
- System.Runtime.CompilerServices.Unsafe 4.4 -> 4.5.2
- Improvement: Better stability during remote execution (e.g. on clusters) due to several bug fixes in referenced libraries
- Fix: Elapsed time is now written to status file, resulting in correct logging of the sum of elapsed time in the case of tuning in multiple sessions
- Extended the package until April 1st.
- Feature: Added additional ways to hybridize GGA(++) with JADE or CMA-ES which focus on improving the current incumbent.
  - Useful if tuning a mix of categorical and numerical parameters
  - Can be activated via the command line: `--focusOnIncumbent=true`
- Feature: Added an additional adaptive termination criterion for GGA(++) phases when hybridizing.
  - Supports changing the tuning algorithm dependent on tuning progress
  - Can be controlled via the `--maxGgaGenerationsWithSameIncumbent` argument
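The two hybridization switches above could be combined as in the following sketch. The binary name, the `--master` flag, and the value `10` are assumptions; the two parameter names come from these entries.

```shell
# Sketch: "Tuner.Application" and "--master" are assumed; --focusOnIncumbent and
# --maxGgaGenerationsWithSameIncumbent are from this change-log (value 10 is illustrative).
HYBRID_ARGS="--focusOnIncumbent=true --maxGgaGenerationsWithSameIncumbent=10"
CMD="Tuner.Application --master $HYBRID_ARGS"
echo "$CMD"
```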
- Improvement: Improved default parameters for the number of generations per strategy
  - Set to the maximum integer value, since the number of generations is usually limited by the overall parameter `numGens`
- Improvement: Parameter `generationsPerGgaPhase` renamed to `maxGenerationsPerGgaPhase`
- Improvement: Internal code cleanups.
- Improved namespace logic comes with changes in namespaces for some classes.
- Extended the package until September 1st.
- Feature: Added additional tuning algorithms specialized for continuous parameters: JADE and CMA-ES.
- Feature: New option `--scoreGenerationHistory` to evaluate a tuner run on the complete training and test set if it optimizes a numerical evaluation value
  - Average scores are logged in two new logging files, `generationHistory.csv` and `scores.csv`
  - Calling `Master.Run` now requires the `AlgorithmTuner` factory method to take both a training and a test instance folder
  - Using the test set by calling `AlgorithmTuner.SetTestInstances` is optional
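Enabling the scoring option might look as follows on the command line. The binary name, the `--master` flag, and the `=true` syntax are assumptions; only the option name `--scoreGenerationHistory` comes from this entry.

```shell
# Sketch: "Tuner.Application", "--master" and the "=true" syntax are assumed;
# --scoreGenerationHistory is the new option described above. Per the entry,
# average scores land in generationHistory.csv and scores.csv.
CMD="Tuner.Application --master --scoreGenerationHistory=true"
echo "$CMD"
```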
- Improvement: More information is logged, especially on debug level
  - Complete OPTANO Algorithm Tuner configuration
    - All types inheriting from `ConfigurationBase` must now implement `ToString`
  - Information about the total number of evaluations
  - Information about repair operations
- Improvement: Increased stability for distributed execution by employing smaller messages
- Improvement: Metric compare values are now defined by `IMetricRunEvaluator` instead of `IMetricResult`
- Improvement: Simplified support of status files. This should lead to fewer breaking changes in the future.
- Segmented status files. We now create multiple status files which are saved in a single directory. In console arguments, you can specify the directory instead of the status file name.
  - All types inheriting from `ConfigurationBase` must now implement the method `IsTechnicallyCompatible(ConfigurationBase other)`
  - All types implementing `IConfigBuilder` must implement `BuildWithFallback(ConfigurationBase fallback)`
- Improvement: New parameter `--zipOldStatus` to determine whether old status files should be zipped or overwritten. Default is false.
- Feature: The number of evaluations can be bounded via `--evaluationLimit`.
- Improvement: Additional information is written to the output file, and verbosity options can be used to specify which log types become visible on the console
- Improvement: Internal code cleanups.
  - Comes with an interface change for subclasses of `HelpSupportingArgumentParser{T}`. Use `this.InternalConfigurationBuilder` instead of `this.configurationBuilder` to fix any errors.
  - Improved namespace logic comes with changes in namespaces for many classes.
- Fix: By default, the status file now is written to the working directory instead of the directory that contains the tuner.exe.
- Fix: Fixed several minor issues regarding the console output. (E.g. Frequencies of output, formats, etc.)
- Improvement: Speedups in machine-learning based tuning.
- Improvement: Reduced amount of traffic sent over TCP between Master and Workers.
- Improvement: Clarified the role of the `IRunEvaluator` interface, both in the user documentation and the API.
- Fix: Evaluation settings were not properly updated on all external workers.
- This led to unfair comparisons between configurations when OPTANO Algorithm Tuner was used in a distributed fashion.
- Fix: Target sampling was not performed for some reachable leaves during model based crossover.
- Fix: Console output is now written to working directory instead of *.exe directory.
- .NET Core 2.0 Compatibility
- Running OPTANO Algorithm Tuner via .NET Core comes with improved stability. We now recommend this setup for all projects, but especially if you are tuning dozens of parameters.
- Cleaner API
- Clear naming
- Complete API documentation
- Merged `IComparer<IInstanceFile>` and `IInstanceFile` into `InstanceBase`
- Simplified status dumps
- Updated user documentation
- especially with respect to machine-learning based tuning
- `parameterTree.xsd` and `SharpLearningCustom` are now automatically added to builds
- Added native PAR-k evaluations.
- Logging
  - All console output generated by OPTANO Algorithm Tuner is also written to `consoleOutput.log`. This can be changed using `LoggingHelper.Write`.
  - Export information about the current incumbent genome to `tunerLog.txt` after each generation.
- Faulty target algorithm evaluations on a distributed run will only cause the affected worker to stop, but the run will continue.
- Added the `--ownHostName` parameter to explicitly tell Akka.Cluster via which channel it shall communicate with other cluster members.
  - If this option is omitted, the Fully Qualified Domain Name will be used.
  - Make sure to pass the FQDN of your master node as `--seedHostName` when you start the workers.
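A distributed start-up following the advice above might look like this sketch. The binary name and the `--master`/`--worker` flags are assumptions; `--ownHostName` and `--seedHostName` come from this entry.

```shell
# Sketch: "Tuner.Application", "--master" and "--worker" are assumed placeholders.
# Per the entry, the master announces its own FQDN via --ownHostName, and each
# worker receives the master's FQDN via --seedHostName.
MASTER_FQDN="master.example.org"
MASTER_CMD="Tuner.Application --master --ownHostName=$MASTER_FQDN"
WORKER_CMD="Tuner.Application --worker --seedHostName=$MASTER_FQDN"
echo "$MASTER_CMD"
echo "$WORKER_CMD"
```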
- Added beta-support for machine learning based tuning
- Based on: Ansótegui, Carlos, et al. "Model-Based Genetic Algorithms for Algorithm Configuration." IJCAI. 2015.
- Documentation will follow shortly.
- Update Akka.Cluster and dependencies from 1.2.3 to 1.3.2.
- Update Akka.Cluster and dependencies from 1.1.3 to 1.2.3.
- The new version uses DotNetty instead of the deprecated Helios and comes with several bug fixes. Check the official Akka release notes for detailed information.
- Initial Commit.
Thanks for reading thoroughly!