
Hi everyone. I'm trying to write hyperparameter optimization code using Keras Tuner.

First, I wrote code that optimizes the number of layers, the number of nodes, and the activation functions:

for i in range(hp.Int('num_layers', 2, 9)):
    model.add(keras.layers.Dense(
        units=hp.Int('units_' + str(i), 32, 512, step=16),
        activation=hp.Choice('a_' + str(i),
                             values=['softmax', 'relu', 'tanh', 'sigmoid'])))

like this. (Of course, I created the model object first.)
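For context, here is roughly the full model-building function I'm using with the tuner. This is only a sketch: the 10-feature input shape, the single regression output, and the RandomSearch settings are placeholders for my real setup.

import keras
import keras_tuner

def build_model(hp):
    model = keras.Sequential()
    # Placeholder input: 10 features (my real data is different)
    model.add(keras.layers.Input(shape=(10,)))
    # Tune the number of layers, the units per layer, and each activation
    for i in range(hp.Int('num_layers', 2, 9)):
        model.add(keras.layers.Dense(
            units=hp.Int('units_' + str(i), 32, 512, step=16),
            activation=hp.Choice('a_' + str(i),
                                 values=['softmax', 'relu', 'tanh', 'sigmoid'])))
    # Single regression output
    model.add(keras.layers.Dense(1))
    model.compile(optimizer='adam',
                  loss=keras.losses.MeanSquaredError(),
                  metrics=['mae'])
    return model

tuner = keras_tuner.RandomSearch(
    build_model,
    objective=keras_tuner.Objective('val_mae', direction='min'),
    max_trials=10)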

Next, I want to optimize the optimizer and the learning_rate.

But I can't work out how to write code that optimizes both parameters at the same time.

model.compile(optimizer=hp.Choice('optimizer', ['Adam', 'sgd', 'rmsprop']),
              loss=keras.losses.MeanSquaredError(reduction="auto"),
              metrics=['mae'])

This is my optimizer optimization code (the learning rate is not included yet).

I want to add learning-rate optimization so that the optimizer and the learning_rate are tuned simultaneously.

learning_rate = hp.Choice('learning_rate', values=[1e-1, 1e-2])

model.compile(optimizer=keras.optimizers.Adam(learning_rate=learning_rate),
              loss=keras.losses.MeanSquaredError(reduction="auto"),
              metrics=['mae'])

As shown above, I can optimize each parameter on its own.
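What I have in mind is something roughly like the sketch below, placed inside the same build_model(hp) function, where the chosen optimizer name is mapped to its class by hand so that both hyperparameters feed into one compile call. I'm not sure whether this is valid or the cleanest way to do it:

# Tentative sketch: pick both the optimizer and the learning rate,
# then build the optimizer object from the two choices.
optimizer_name = hp.Choice('optimizer', ['adam', 'sgd', 'rmsprop'])
learning_rate = hp.Choice('learning_rate', values=[1e-1, 1e-2])

if optimizer_name == 'adam':
    optimizer = keras.optimizers.Adam(learning_rate=learning_rate)
elif optimizer_name == 'sgd':
    optimizer = keras.optimizers.SGD(learning_rate=learning_rate)
else:
    optimizer = keras.optimizers.RMSprop(learning_rate=learning_rate)

model.compile(optimizer=optimizer,
              loss=keras.losses.MeanSquaredError(reduction="auto"),
              metrics=['mae'])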

Could you help me get this two-parameter optimization right?

Thank you so much!
