evolvepy.callbacks package
Submodules
evolvepy.callbacks.callback module
- class evolvepy.callbacks.callback.Callback(run: bool = True, parameters: Dict[str, object] | None = None, dynamic_parameters: Dict[str, bool] | None = None)[source]
Bases:
Configurable
Base Callback class.
Callbacks are objects that can be called upon during evolution to change its behavior.
- __init__(run: bool = True, parameters: Dict[str, object] | None = None, dynamic_parameters: Dict[str, bool] | None = None)[source]
Callback constructor.
- Parameters:
run (bool, optional) – Whether the object should run. Defaults to True.
parameters (Dict[str, object], optional) – Other callback parameters. Defaults to None.
dynamic_parameters (Dict[str,bool], optional) – Other callback dynamic parameters description. Defaults to None.
- property callbacks: List[Callback]
Other callbacks associated with evolution.
Must be set correctly for the Callback to work.
- property evaluator: Evaluator
The Evaluator associated with evolution.
Must be set correctly for the Callback to work.
- property generator: Generator
The Generator associated with evolution.
Must be set correctly for the Callback to work.
- on_evaluator_end(fitness: ndarray) → None [source]
Called after the evaluator runs.
- Parameters:
fitness (np.ndarray) – The population fitness.
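The hook-based design above can be illustrated with a small sketch. The `Callback` base class and the loop below are simplified stand-ins for illustration, not evolvepy's actual implementation; only the hook name `on_evaluator_end` and the constructor parameters come from the API documented above, and `BestFitnessTracker` is a hypothetical subclass.

```python
import numpy as np

class Callback:
    """Simplified stand-in for evolvepy.callbacks.Callback."""
    def __init__(self, run=True, parameters=None, dynamic_parameters=None):
        self.run = run
        self.parameters = dict(parameters or {})
        self.dynamic_parameters = dict(dynamic_parameters or {})

    def on_evaluator_end(self, fitness):
        pass  # hook: called after each population evaluation

class BestFitnessTracker(Callback):
    """Hypothetical callback that records the best fitness of each generation."""
    def __init__(self):
        super().__init__(parameters={"history": []})

    def on_evaluator_end(self, fitness):
        if self.run:  # honor the `run` flag from the base class
            self.parameters["history"].append(float(np.max(fitness)))

# A simulated evolution loop calling the hook once per generation
tracker = BestFitnessTracker()
for gen_fitness in ([1.0, 2.0], [0.5, 3.0], [2.5, 2.0]):
    tracker.on_evaluator_end(np.asarray(gen_fitness))

print(tracker.parameters["history"])  # [2.0, 3.0, 2.5]
```

The pattern is the same for any custom callback: subclass, override the hooks you need, and check `self.run` before acting.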
evolvepy.callbacks.dynamic_mutation module
- class evolvepy.callbacks.dynamic_mutation.DynamicMutation(layer_names: List[str], patience: int = 10, refinement_patience: int = 2, exploration_patience: int = 2, refinement_steps: int = 2, exploration_steps: int = 5, refinement_divider: int = 2, exploration_multiplier: int = 2, stop_refinement: bool = False, run: bool = True)[source]
Bases:
Callback
Callback that implements dynamic mutation behavior.
Dynamic mutation is the process of changing mutation ranges during evolution, to prevent the population from stagnating at a local maximum, or from never adjusting correctly to that maximum.
It works in three stages, in order:
Normal: mutation occurs without changes.
Refinement: mutation gradually decreases, over several steps, to adjust correctly to a maximum.
Exploration: mutation gradually increases, over several steps, to look for other maxima.
A stage is stopped early if fitness improves (it is understood that another local maximum has been found).
A stage transition occurs when the best fitness does not change after a few generations.
- EXPLORATION = 2
- NORMAL = 0
- REFINEMENT = 1
- __init__(layer_names: List[str], patience: int = 10, refinement_patience: int = 2, exploration_patience: int = 2, refinement_steps: int = 2, exploration_steps: int = 5, refinement_divider: int = 2, exploration_multiplier: int = 2, stop_refinement: bool = False, run: bool = True)[source]
DynamicMutation constructor
- Parameters:
layer_names (List[str]) – Names of the NumericMutation layers that will be affected by this callback.
patience (int, optional) – How many generations in normal mode to wait before starting a transition (to refinement). Defaults to 10.
refinement_patience (int, optional) – How many generations in refinement mode to wait before starting a transition (to refinement or exploration). Defaults to 2.
exploration_patience (int, optional) – How many generations in exploration mode to wait before starting a transition (to exploration or normal). Defaults to 2.
refinement_steps (int, optional) – How many refinement steps will be performed. Defaults to 2.
exploration_steps (int, optional) – How many exploration steps will be performed. Defaults to 5.
refinement_divider (int, optional) – How much to divide the mutation rates at each refinement step. Defaults to 2.
exploration_multiplier (int, optional) – How much to multiply the mutation rates at each exploration step. Defaults to 2.
stop_refinement (bool, optional) – Whether to stop refining if an improvement in fitness is found. Defaults to False.
run (bool, optional) – Whether this callback should be executed. Defaults to True.
- Raises:
ValueError – raised if layer_names is not a list.
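The divider/multiplier arithmetic of the stages can be sketched as follows. This is a hypothetical simulation of one full cycle with the default parameters, assuming the best fitness never improves (so no stage is stopped early); it is not the library's implementation, and `trace_stages` is an illustrative name.

```python
def trace_stages(mutation_range=1.0, refinement_steps=2, exploration_steps=5,
                 refinement_divider=2, exploration_multiplier=2):
    """Trace the mutation range through one Normal -> Refinement -> Exploration
    cycle, assuming fitness never improves (no early stop)."""
    trace = [("normal", mutation_range)]
    for _ in range(refinement_steps):       # each refinement step divides the range
        mutation_range /= refinement_divider
        trace.append(("refinement", mutation_range))
    for _ in range(exploration_steps):      # each exploration step multiplies it
        mutation_range *= exploration_multiplier
        trace.append(("exploration", mutation_range))
    return trace

print(trace_stages())
# [('normal', 1.0), ('refinement', 0.5), ('refinement', 0.25),
#  ('exploration', 0.5), ('exploration', 1.0), ('exploration', 2.0),
#  ('exploration', 4.0), ('exploration', 8.0)]
```

With the defaults, the range shrinks to 1/2² of its starting value during refinement, then grows by 2⁵ during exploration, ending at 8× the original before the callback returns to normal mode.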
evolvepy.callbacks.incremental_evolution module
- class evolvepy.callbacks.incremental_evolution.IncrementalEvolution(generation_to_start: int, block_layer: Block, first_gen_layer: FirstGenLayer, callbacks: List[Callback] | None = None)[source]
Bases:
Callback
Callback that implements incremental evolution behavior.
Incremental evolution is the process of evolving some aspects of the sought solution at each moment.
It works by preventing some pieces of the individuals from being altered, allowing the rest, usually the part most essential to the problem, to be adjusted first (e.g. learning to process a robot's sensor readings before learning to move around).
The piece of the individuals that will be blocked must be on its own chromosome.
- This callback works in conjunction with two layers:
Block layer: prevents the chromosome from being changed. It must come before the layers in the generator that can alter the chromosome.
FirstGenLayer: generates the random distribution of the chromosome after it is unlocked.
See the incremental evolution example to better understand how to use this callback: https://github.com/EltonCN/evolvepy/blob/main/examples/Incremental%20Evolution.ipynb
- __init__(generation_to_start: int, block_layer: Block, first_gen_layer: FirstGenLayer, callbacks: List[Callback] | None = None)[source]
IncrementalEvolution constructor.
- Parameters:
generation_to_start (int) – In which generation to unlock the chromosome.
block_layer (Block) – Layer that will prevent the chromosome to be changed.
first_gen_layer (FirstGenLayer) – Layer that will generate the random distribution of the chromosome after it is unlocked.
callbacks (List[Callback], optional) – Callbacks that will be disabled while the chromosome is blocked. Defaults to None.
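The block-then-unlock mechanism can be sketched as a toy loop. Everything here is a hypothetical stand-in (the chromosome names `sensors` and `motion`, the mutation step, and `evolve_with_unlock` are all illustrative); it only mirrors the idea that one chromosome is held constant until `generation_to_start`, when it receives its random first-generation values.

```python
import numpy as np

def evolve_with_unlock(generations, generation_to_start, rng):
    """Toy incremental evolution over two chromosomes: 'sensors' evolves from
    the start, 'motion' stays blocked (zeros) until generation_to_start."""
    sensors = rng.random(4)
    motion = np.zeros(4)                              # blocked chromosome
    for gen in range(generations):
        sensors += 0.1 * rng.standard_normal(4)       # always evolving
        if gen == generation_to_start:
            motion = rng.random(4)                    # random init on unlock
        elif gen > generation_to_start:
            motion += 0.1 * rng.standard_normal(4)    # now evolving too
    return sensors, motion

rng = np.random.default_rng(0)
sensors, motion = evolve_with_unlock(generations=5, generation_to_start=10, rng=rng)
print(motion)  # still all zeros: the unlock generation was never reached
```

In evolvepy itself this gating is done by the Block and FirstGenLayer layers inside the generator, with the callback toggling them at the right generation; the linked notebook shows the real wiring.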
evolvepy.callbacks.logger module
- class evolvepy.callbacks.logger.FileStoreLogger(log_fitness: bool = True, log_population: bool = False, log_generator: bool = True, log_evaluator: bool = True, log_scores: bool = False, log_best_individual: bool = True)[source]
Bases:
Logger
Logger that saves all logs in a file.
- __init__(log_fitness: bool = True, log_population: bool = False, log_generator: bool = True, log_evaluator: bool = True, log_scores: bool = False, log_best_individual: bool = True)[source]
FileStoreLogger constructor.
- Parameters:
log_fitness (bool, optional) – Whether it should log the fitness of all individuals of each generation. Defaults to True.
log_population (bool, optional) – Whether it should log the populations. Defaults to False.
log_generator (bool, optional) – Whether it should log the generator dynamic parameters. Defaults to True.
log_evaluator (bool, optional) – Whether it should log the evaluator dynamic parameters. Defaults to True.
log_scores (bool, optional) – Whether it should log all the evaluator individual scores. Defaults to False.
log_best_individual (bool, optional) – Whether it should log the best individual of each generation. Defaults to True.
- property log_name: str
Name of the file where the log is saved
- class evolvepy.callbacks.logger.Logger(log_fitness: bool = True, log_population: bool = False, log_generator: bool = True, log_evaluator: bool = True, log_scores: bool = False, log_best_individual: bool = True)[source]
Bases:
Callback, ABC
Basic Logger callback class.
Allows logging data from the evolutionary process.
It needs to be inherited by a concrete storage implementation to be used, such as MemoryStoreLogger, FileStoreLogger or WandbLogger.
- __init__(log_fitness: bool = True, log_population: bool = False, log_generator: bool = True, log_evaluator: bool = True, log_scores: bool = False, log_best_individual: bool = True)[source]
Logger constructor.
- Parameters:
log_fitness (bool, optional) – Whether it should log the fitness of all individuals of each generation. Defaults to True.
log_population (bool, optional) – Whether it should log the populations. Defaults to False.
log_generator (bool, optional) – Whether it should log the generator dynamic parameters. Defaults to True.
log_evaluator (bool, optional) – Whether it should log the evaluator dynamic parameters. Defaults to True.
log_scores (bool, optional) – Whether it should log all the evaluator individual scores. Defaults to False.
log_best_individual (bool, optional) – Whether it should log the best individual of each generation. Defaults to True.
- on_evaluator_end(fitness: ndarray) → None [source]
Called after population evaluation.
Adds the fitness, scores, generator and evaluator dynamic parameters, and best individual to the log, if configured to do so. Also adds the best fitness.
Saves the dynamic log.
- Parameters:
fitness (np.ndarray) – Population fitness.
- on_generator_end(population: ndarray) → None [source]
Called after the generator runs.
Adds the generated population to the log (if configured to do so), and the generation counter.
- Parameters:
population (np.ndarray) – Generated population that may be logged.
- class evolvepy.callbacks.logger.MemoryStoreLogger(log_fitness: bool = True, log_population: bool = False, log_generator: bool = True, log_evaluator: bool = True, log_scores: bool = False, log_best_individual: bool = True)[source]
Bases:
Logger
Logger that keeps all logged data in memory.
Simple to use but can cause high memory usage.
- __init__(log_fitness: bool = True, log_population: bool = False, log_generator: bool = True, log_evaluator: bool = True, log_scores: bool = False, log_best_individual: bool = True)[source]
MemoryStoreLogger constructor.
- Parameters:
log_fitness (bool, optional) – Whether it should log the fitness of all individuals of each generation. Defaults to True.
log_population (bool, optional) – Whether it should log the populations. Defaults to False.
log_generator (bool, optional) – Whether it should log the generator dynamic parameters. Defaults to True.
log_evaluator (bool, optional) – Whether it should log the evaluator dynamic parameters. Defaults to True.
log_scores (bool, optional) – Whether it should log all the evaluator individual scores. Defaults to False.
log_best_individual (bool, optional) – Whether it should log the best individual of each generation. Defaults to True.
- property config_log: Dict[str, Dict]
Allows access to the static log.
- Returns:
The static log.
- Return type:
Dict[str, Dict]
- property log: List[Dict[str, Dict]]
Allows access to the dynamic log.
- Returns:
The dynamic log.
- Return type:
List[Dict[str, Dict]]
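The dynamic log's shape, a list with one dict per generation, can be illustrated with a minimal in-memory stand-in. The entry keys below (`"fitness"`, `"best_fitness"`) are illustrative assumptions, not MemoryStoreLogger's actual keys, and `MemoryLog` is not the real class.

```python
import numpy as np

class MemoryLog:
    """Stand-in sketch of an in-memory dynamic log: one dict per generation."""
    def __init__(self):
        self._log = []

    def on_evaluator_end(self, fitness):
        # Append this generation's entry; keys here are illustrative only.
        self._log.append({"fitness": fitness.tolist(),
                          "best_fitness": float(np.max(fitness))})

    @property
    def log(self):
        return self._log  # List[Dict], indexed by generation

logger = MemoryLog()
for fitness in ([1.0, 4.0], [3.0, 2.0]):
    logger.on_evaluator_end(np.asarray(fitness))

print(logger.log[1]["best_fitness"])  # 3.0
```

Keeping every generation in a Python list is what makes this logger simple to query after a run, and also why memory usage grows linearly with generations (and with population size, if `log_population` is enabled).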
Module contents
EvolvePy’s callbacks. Objects that can be called upon during evolution to change its behavior.
The package re-exports the classes documented above: Callback, DynamicMutation, FileStoreLogger, IncrementalEvolution, Logger, MemoryStoreLogger.