bfgn.experiments package¶
Submodules¶
bfgn.experiments.callbacks module¶
-
class bfgn.experiments.callbacks.HistoryCheckpoint(config, existing_history=None, period=1, verbose=0)[source]¶
Bases: keras.callbacks.Callback
A custom Keras callback for checkpointing model training history and associated information.
-
config= None¶
-
existing_history= None¶
-
period= None¶
-
verbose= None¶
-
epochs_since_last_save= None¶
-
epoch_begin= None¶
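As a framework-free illustration of the period and epochs_since_last_save bookkeeping documented above, the sketch below accumulates epoch metrics and saves every `period` epochs. It is not the actual bfgn implementation; the class name, the `filepath` argument, the `saves` counter, and the pickle format are assumptions for illustration only.

```python
import pickle


class HistoryCheckpointSketch:
    """Minimal sketch of periodic history checkpointing (not bfgn's class).

    Saving is skipped until `period` epochs have elapsed since the last save,
    mirroring the documented epochs_since_last_save attribute.
    """

    def __init__(self, filepath, existing_history=None, period=1, verbose=0):
        self.filepath = filepath
        self.history = dict(existing_history or {})
        self.period = period
        self.verbose = verbose
        self.epochs_since_last_save = 0
        self.saves = 0  # illustration only, not a documented attribute

    def on_epoch_end(self, epoch, logs=None):
        # Accumulate this epoch's metrics into the running history dict.
        for key, value in (logs or {}).items():
            self.history.setdefault(key, []).append(value)
        self.epochs_since_last_save += 1
        if self.epochs_since_last_save >= self.period:
            # Serialize the accumulated history, then reset the counter.
            with open(self.filepath, "wb") as f:
                pickle.dump(self.history, f)
            self.epochs_since_last_save = 0
            self.saves += 1
```

In a real Keras training loop the equivalent logic would run inside a keras.callbacks.Callback subclass, as the class signature above documents.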
-
bfgn.experiments.experiments module¶
-
class bfgn.experiments.experiments.Experiment(config)[source]¶
Bases: object
-
model= None¶ Keras model object.
- Type
keras.models.Model
-
loaded_existing_history= None¶ Whether an existing history object was loaded from the model training directory.
- Type
bool
-
loaded_existing_model= None¶ Whether an existing model object was loaded from the model training directory.
- Type
bool
-
is_model_trained= None¶ Whether the model is trained to its stopping criteria. If False, the model either was not trained at all or training stopped before the stopping criteria were met.
- Type
bool
-
config= None¶ bfgn configuration object.
-
logger= None¶ Root logger for the Experiment. Available if the user wants to directly modify the log formatting, handling, or other behavior.
- Type
logging.Logger
-
fit_model_with_sequences(training_sequence, validation_sequence=None, resume_training=False)[source]¶
- Return type
None
-
calculate_model_memory_footprint(batch_size)[source]¶ Calculate model memory footprint. Shamelessly copied from (but not tested rigorously): https://stackoverflow.com/questions/43137288/how-to-determine-needed-memory-of-keras-model
- Parameters
batch_size (int) – Batch size for training.
- Return type
float
- Returns
Model memory footprint in gigabytes.
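The linked Stack Overflow recipe approximates memory as the per-sample activation counts scaled by batch size, plus the parameter count, times the bytes per number. A hedged, framework-free sketch of that arithmetic (the function name and arguments are hypothetical, not bfgn's API, and real Keras models also carry gradient and optimizer state not counted here):

```python
def estimate_memory_footprint_gb(batch_size, num_parameters,
                                 activation_counts, bytes_per_number=4):
    """Rough memory estimate in gigabytes (a sketch, not bfgn's method).

    activation_counts: per-layer output element counts for a single sample,
    standing in for the layer output shapes a Keras model would report.
    bytes_per_number: 4 for float32, the Keras default dtype.
    """
    # Activations are held once per sample in the batch; parameters are
    # stored once regardless of batch size.
    activation_memory = batch_size * sum(activation_counts)
    parameter_memory = num_parameters
    total_bytes = bytes_per_number * (activation_memory + parameter_memory)
    return total_bytes / 1024 ** 3
```

For example, a single sample whose activations total 268,435,456 float32 values occupies exactly 1 GB under this estimate.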
-
-
bfgn.experiments.experiments.get_config_filepath(config)[source]¶ Get the default config path for experiments.
-
bfgn.experiments.experiments.get_history_filepath(config)[source]¶ Get the default model training history path for experiments.
bfgn.experiments.histories module¶
-
bfgn.experiments.histories.load_history(filepath)[source]¶ Loads model training history from a serialized file.
-
bfgn.experiments.histories.save_history(history, filepath)[source]¶ Saves model training history to a serialized file.
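A minimal round-trip sketch of these two helpers, assuming a pickle-based serialization (bfgn's actual on-disk format may differ):

```python
import pickle


def save_history(history, filepath):
    # Serialize the history dict (e.g., {"loss": [...], "val_loss": [...]})
    # to disk; pickle is an assumption standing in for bfgn's format.
    with open(filepath, "wb") as f:
        pickle.dump(history, f)


def load_history(filepath):
    # Deserialize a previously saved history dict.
    with open(filepath, "rb") as f:
        return pickle.load(f)
```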
bfgn.experiments.losses module¶
-
bfgn.experiments.losses.get_available_loss_methods()[source]¶ Gets list of available loss methods.
- Returns
List of available loss methods.
-
bfgn.experiments.losses.get_cropped_loss_function(loss_method, outer_width, inner_width, weighted=True)[source]¶ Creates a loss function callable with optional per-pixel weighting and edge-trimming.
- Parameters
loss_method (str) – The loss calculation to implement. Currently supports categorical_crossentropy or cc, mean_absolute_error or mae, mean_squared_error or mse, and root_mean_squared_error or rmse.
outer_width (int) – The full dimension (height or width) of the input image; e.g., 128 for a 128x128 image.
inner_width (int) – The full dimension (height or width) of the loss window to use. Must not be greater than outer_width, with generally better results at 25% to 50% of the full image size; i.e., 32 to 64 for a 128x128 image.
weighted (bool) – Whether the training response array has weights appended to the last axis to use in loss calculations. Keras does not have a simple way to pass weight values to loss functions, so a common work-around is to append weight values to the sample responses and reference those weights in the loss functions directly. The bfgn package will automatically build and append weights if the configuration specifies that the loss function should be weighted.
- Return type
Callable
- Returns
Loss function callable to be passed to a Keras model.
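To clarify the edge-trimming and weighting described above, here is a framework-free sketch (a hypothetical function, not the Keras-compatible callable this function returns): a buffer of (outer_width - inner_width) // 2 pixels on each side is excluded from the loss, and each remaining pixel's absolute error is scaled by its appended weight.

```python
def cropped_weighted_mae(y_true, y_pred, weights, outer_width, inner_width):
    """Sketch of a cropped, weighted mean absolute error (not bfgn's code).

    y_true, y_pred, and weights are outer_width x outer_width nested lists;
    only the centered inner_width x inner_width window contributes to the
    loss, mirroring the edge-trimming this factory's callables perform.
    """
    # Edge-trim: skip the buffer of pixels surrounding the inner window.
    buffer = (outer_width - inner_width) // 2
    total, count = 0.0, 0
    for i in range(buffer, buffer + inner_width):
        for j in range(buffer, buffer + inner_width):
            # Per-pixel weighting: scale each error by its appended weight.
            total += weights[i][j] * abs(y_true[i][j] - y_pred[i][j])
            count += 1
    return total / count
```

Trimming the edges this way lets the loss ignore the outer pixels of each tile, where convolutional predictions are least reliable.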