python 3.x - Naming of Keras Tuner Trials directory for TensorBoard

I am using Keras Tuner's BayesianOptimization to search for a model's optimal hyperparameters, and I am also using the TensorBoard callback with it to visualise the performance of each model/trial.

However, the trials from the Tuner are named with an opaque hex hash (e.g. trial_1dc4838863f2e4e8a84f0e415ee1db33). Is there a way to have the Tuner name the trials simply "trial_1", "trial_2", etc., instead of the long string of letters and digits?

I couldn't find anything in the Keras Tuner documentation on how to do this, or whether there is an argument for it when creating the Tuner instance.
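For context, here is a minimal sketch of the setup described; the build_model function, the data variables, and the directory names are illustrative placeholders, not part of the original question:

from tensorflow import keras
from kerastuner.tuners import BayesianOptimization

def build_model(hp):
    # Hypothetical hypermodel; any build function shows the same naming.
    model = keras.Sequential([
        keras.layers.Dense(hp.Int('units', 32, 256, step=32),
                           activation='relu'),
        keras.layers.Dense(1),
    ])
    model.compile(optimizer='adam', loss='mse')
    return model

tuner = BayesianOptimization(
    build_model,
    objective='val_loss',
    max_trials=10,
    directory='tuner_logs',
    project_name='example')

# x_train, y_train, x_val, y_val stand in for real data.
tuner.search(x_train, y_train,
             validation_data=(x_val, y_val),
             callbacks=[keras.callbacks.TensorBoard('tb_logs')])

# Each trial is saved under tuner_logs/example/trial_<32-char hex id>,
# which is where the unwieldy TensorBoard run names come from.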

Question from: https://stackoverflow.com/questions/65832395/naming-of-keras-tuner-trials-directory-for-tensorboard


1 Answer


I was able to solve this by overriding the BayesianOptimization and BayesianOptimizationOracle classes. The custom oracle simply names each trial "0", "1", "2", and so on.

It would be nice if this were more flexible, though, because I will probably end up doing the same for the other tuner classes as well.

from kerastuner.engine import trial as trial_lib
from kerastuner.tuners.bayesian import (
    BayesianOptimization, BayesianOptimizationOracle)


class CustomBayesianOptimizationOracle(BayesianOptimizationOracle):

    def __init__(self,
                 objective,
                 max_trials,
                 num_initial_points=None,
                 alpha=1e-4,
                 beta=2.6,
                 seed=None,
                 hyperparameters=None,
                 allow_new_entries=True,
                 tune_new_entries=True):
        super(CustomBayesianOptimizationOracle, self).__init__(
            objective=objective,
            max_trials=max_trials,
            num_initial_points=num_initial_points,
            alpha=alpha,
            beta=beta,
            seed=seed,
            hyperparameters=hyperparameters,
            tune_new_entries=tune_new_entries,
            allow_new_entries=allow_new_entries)

        # Sequential counter used in place of the default uuid-style trial IDs.
        self.trial_id = '0'

    def create_trial(self, tuner_id):
        """Create a new `Trial` to be run by the `Tuner`.

        A `Trial` corresponds to a unique set of hyperparameters to be run
        by `Tuner.run_trial`.

        Args:
          tuner_id: An ID that identifies the `Tuner` requesting a
            `Trial`. `Tuners` that should run the same trial (for instance,
            when running a multi-worker model) should have the same ID.

        Returns:
          A `Trial` object containing a set of hyperparameter values to run
          in a `Tuner`.
        """
        # Allow for multi-worker DistributionStrategy within a Trial.
        if tuner_id in self.ongoing_trials:
            return self.ongoing_trials[tuner_id]

        if self.max_trials and len(self.trials) >= self.max_trials:
            status = trial_lib.TrialStatus.STOPPED
            values = None
        else:
            response = self._populate_space(self.trial_id)
            status = response['status']
            values = response['values'] if 'values' in response else None

        hyperparameters = self.hyperparameters.copy()
        hyperparameters.values = values or {}
        trial = trial_lib.Trial(
            hyperparameters=hyperparameters,
            trial_id=self.trial_id,
            status=status)

        if status == trial_lib.TrialStatus.RUNNING:
            self.ongoing_trials[tuner_id] = trial
            self.trials[self.trial_id] = trial
            self._save_trial(trial)
            self.save()

        # Advance the counter so the next trial gets the next integer ID.
        self.trial_id = str(int(self.trial_id) + 1)

        return trial


class CustomBayesianOptimization(BayesianOptimization):

    def __init__(self,
                 hypermodel,
                 objective,
                 max_trials,
                 num_initial_points=2,
                 seed=None,
                 hyperparameters=None,
                 tune_new_entries=True,
                 allow_new_entries=True,
                 **kwargs):
        oracle = CustomBayesianOptimizationOracle(
            objective=objective,
            max_trials=max_trials,
            num_initial_points=num_initial_points,
            seed=seed,
            hyperparameters=hyperparameters,
            tune_new_entries=tune_new_entries,
            allow_new_entries=allow_new_entries)
        # Deliberately skip BayesianOptimization.__init__ (which would build
        # its own oracle) and initialise the underlying tuner directly with
        # the custom oracle.
        super(BayesianOptimization, self).__init__(
            oracle=oracle,
            hypermodel=hypermodel,
            **kwargs)
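
With these overrides in place, the custom tuner is used exactly like the stock one. A usage sketch, reusing the hypothetical build_model and placeholder data from the sketch in the question above:

tuner = CustomBayesianOptimization(
    build_model,
    objective='val_loss',
    max_trials=10,
    directory='tuner_logs',
    project_name='example')

tuner.search(x_train, y_train,
             validation_data=(x_val, y_val),
             callbacks=[keras.callbacks.TensorBoard('tb_logs')])

# Trial directories (and the TensorBoard runs derived from them) are now
# named tuner_logs/example/trial_0, trial_1, trial_2, ...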
