I'm trying to save and load weights from the model I have trained.
The code I'm using to save the model is:
TensorBoard(log_dir='/output')
model.fit_generator(image_a_b_gen(batch_size), steps_per_epoch=1, epochs=1)
model.save_weights('model.hdf5')
model.save_weights('myModel.h5')
Let me know if this is an incorrect way to do it, or if there is a better way.
But when I try to load them using this:
from keras.models import load_model
model = load_model('myModel.h5')
I get this error:
ValueError                                Traceback (most recent call last)
<ipython-input-7-27d58dc8bb48> in <module>()
      1 from keras.models import load_model
----> 2 model = load_model('myModel.h5')

/home/decentmakeover2/anaconda3/lib/python3.5/site-packages/keras/models.py in load_model(filepath, custom_objects, compile)
    235         model_config = f.attrs.get('model_config')
    236         if model_config is None:
--> 237             raise ValueError('No model found in config file.')
    238         model_config = json.loads(model_config.decode('utf-8'))
    239         model = model_from_config(model_config, custom_objects=custom_objects)

ValueError: No model found in config file.
Any suggestions on what I may be doing wrong?
Thank you in advance.
Here is a YouTube video that explains exactly what you want to do: Save and load a Keras model
There are three different saving methods that Keras makes available. These are described in the video link above (with examples), as well as below.
First, the reason you're receiving the error is that you're calling load_model on a file that contains only weights (saved with save_weights), not a full model.
To save and load the weights of the model, you would first use
model.save_weights('my_model_weights.h5')
to save the weights, as you've done. To load the weights, you first need to build a model with the same architecture, and then call load_weights on it, as in
model.load_weights('my_model_weights.h5')
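For example, a minimal sketch; the Dense layers here are just placeholders, since load_weights requires the exact architecture you originally trained:

from keras.models import Sequential
from keras.layers import Dense

# Rebuild the same architecture that produced the weights file.
model = Sequential()
model.add(Dense(64, activation='relu', input_shape=(100,)))
model.add(Dense(10, activation='softmax'))

# Now the weights can be loaded into the matching layers.
model.load_weights('my_model_weights.h5')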
Another saving technique is model.save(filepath). This save function saves:
The architecture of the model, allowing you to re-create the model.
The weights of the model.
The training configuration (loss, optimizer).
The state of the optimizer, allowing you to resume training exactly where you left off.
To load this saved model, you would use the following:
from keras.models import load_model
new_model = load_model(filepath)
Lastly, model.to_json() saves only the architecture of the model. To load the architecture, you would use
from keras.models import model_from_json
model = model_from_json(json_string)
For loading weights, you need to have a model first. The pair of calls looks like this:
existingModel.save_weights('weightsfile.h5')
existingModel.load_weights('weightsfile.h5')
If you want to save and load the entire model (this includes the model's configuration, its weights and the optimizer state for further training):
model.save('filename')
model = load_model('filename')
Since this question is quite old but still comes up in Google searches, I thought it would be good to point out the newer (and recommended) way to save Keras models.
Instead of saving them using the older H5 format as has been shown above, it is now advised to use the SavedModel format, which is actually a directory that contains both the model configuration and the weights.
More information can be found here: https://www.tensorflow.org/guide/keras/save_and_serialize
The snippets to save & load can be found below:
model.fit(test_input, test_target)
# Calling save('my_model') creates a SavedModel folder 'my_model'.
model.save('my_model')
# It can be used to reconstruct the model identically.
reconstructed_model = keras.models.load_model('my_model')
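The snippet above assumes a compiled model and some data already exist; a self-contained sketch might look like this (the toy model and the random data are only placeholders):

import numpy as np
from tensorflow import keras

# Hypothetical toy model and data, just to make the snippet runnable.
model = keras.Sequential([keras.layers.Dense(1, input_shape=(3,))])
model.compile(optimizer='adam', loss='mse')

test_input = np.random.random((8, 3))
test_target = np.random.random((8, 1))
model.fit(test_input, test_target)

# Calling save('my_model') creates a SavedModel folder 'my_model'.
model.save('my_model')
reconstructed_model = keras.models.load_model('my_model')

# The reconstructed model gives identical predictions.
np.testing.assert_allclose(model.predict(test_input),
                           reconstructed_model.predict(test_input))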
Rebuilding a model from scratch requires its architecture, so you can start by saving your model architecture using model.to_json():
model_architecture = model.to_json()
Save the model weights using:
model.save_weights('model_weights.h5')
To load the weights, you first need to reconstruct your model from the saved JSON file:
from tensorflow.keras.models import model_from_json
model = model_from_json(model_architecture)
Then load the weights using
model.load_weights('model_weights.h5')
You can now compile and evaluate the model; there is no need to retrain. For example:
model.compile(
    loss=keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    optimizer=keras.optimizers.Adam(lr=0.001),
    metrics=["accuracy"])
model.evaluate(x_test, y_test, batch_size=32, verbose=2)
Related
def get_model():
    return load_model("model.h5")
model = KerasClassifier(build_fn = get_model)
# model.fit(X_train,y_train)
plt.figure(figsize=(10,8))
display = plot_partial_dependence(
    model, TrainX, features, fig=fig
)
I don't want to retrain the model, as that would change the model I was trying to evaluate.
As I understand it:
If you are trying to import the .h5 file of a pre-trained model and use it for prediction, then yes, you can:
from tensorflow.keras.models import load_model
model = load_model("path to .h5 files")
## if there is a separate weight file
model.load_weights("path to weight .h5 file")
After that, you can use the model variable for predictions.
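For instance, a quick sketch; the input shape here is hypothetical and must match whatever the loaded model expects:

import numpy as np

# Hypothetical input; use the shape your model was trained on.
sample = np.random.random((1, 224, 224, 3))
predictions = model.predict(sample)
print(predictions)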
But if you are trying to see how the model was trained, then you need the log files from that particular training run, and you can use TensorBoard to visualize them easily.
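A sketch of that workflow; the log directory name is just a placeholder:

from tensorflow.keras.callbacks import TensorBoard

# Write training logs during fit, then inspect them with:
#   tensorboard --logdir ./logs
tb_callback = TensorBoard(log_dir='./logs')
# model.fit(x_train, y_train, epochs=5, callbacks=[tb_callback])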
I tried to import VGG16, which I downloaded from Google Storage:
import keras
import cv2
from keras.models import Sequential, load_model
But I got this error:
ValueError: No model found in config file.
I was able to recreate the issue using your code and the weights file you mentioned. I am not sure about the root cause, but I can offer an alternative way for you to use the pretrained VGG16 model from Keras.
You need to use the model from the keras.applications module.
Here is the link for your reference: https://keras.io/api/applications/
There are three ways to instantiate this model, depending on the weights argument, which takes one of three values: None, 'imagenet', or a filepath to a weights file.
Since you have already downloaded the weights, I suggest you use the filepath option (option 1, below); but for first-time usage I suggest using 'imagenet' (option 3). It will download the weights file, which can be saved and reused later.
You need to add the following lines of code:
Option 1:
from keras.applications.vgg16 import VGG16
model = VGG16(weights = 'vgg16_weights_tf_dim_ordering_tf_kernels.h5')
Option 2:
from keras.applications.vgg16 import VGG16
model = VGG16(weights = None)
model.load_weights('vgg16_weights_tf_dim_ordering_tf_kernels.h5')
Option 3: for using pretrained imagenet weights
from keras.applications.vgg16 import VGG16
model = VGG16(weights = 'imagenet')
The constructor also takes other arguments like include_top, which can be set as required.
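For example, a common pattern is dropping the classification head to use VGG16 as a feature extractor; the input shape below is just the standard ImageNet size:

from keras.applications.vgg16 import VGG16

# include_top=False removes the fully connected classifier layers,
# leaving only the convolutional base for feature extraction.
base_model = VGG16(weights='imagenet', include_top=False,
                   input_shape=(224, 224, 3))
base_model.summary()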
The problem here is that you're trying to load a file that is not a full model and probably contains just weights; so the problem is not in loading the model but in how it was saved.
When you are saving the model, try the following: if you are saving through a checkpoint callback, set save_weights_only=False; otherwise use the function tf.keras.models.save_model(model, filepath). A sketch of the callback case follows.
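Minimal sketch, assuming a tf.keras ModelCheckpoint callback and a placeholder filepath:

import tensorflow as tf

# save_weights_only=False makes the checkpoint write the full model
# (architecture + weights + optimizer state), not just the weights.
checkpoint = tf.keras.callbacks.ModelCheckpoint(
    filepath='checkpoint.h5',
    save_weights_only=False)

# model.fit(x_train, y_train, epochs=10, callbacks=[checkpoint])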
A complete model has two parts: model architecture and weights.
So if we just have the weights, we must first load the architecture (for example from a Python file that builds it, or from a saved Keras model file), and then load the weights onto that architecture.
For example:
model = tf.keras.models.load_model("facenet_keras.h5")
model.load_weights("facenet_keras_weights.h5")
I am trying to load an LSTM model (created with Keras) after saving it with:
model_json = model.to_json()
with open("model.json", "w") as json_file:
    json_file.write(model_json)
with the command:
json_file = open('model.json', 'r')
loaded_model_json = json_file.read()
json_file.close()
loaded_model = model_from_json(loaded_model_json)
and to print the learning rate and other hyper-parameters via the command:
loaded_model.summary()
I received all the information about the structure of the LSTM, but I did not receive the hyper-parameter information such as the learning rate.
The learning rate is a parameter of the model's optimizer and is not included in the model.summary() output. If you want to find the value of the learning rate, you can use the optimizer attribute of the model and K.eval() to evaluate the learning rate tensor and get its actual value:
from keras import backend as K
print(K.eval(model.optimizer.lr))
Update: the optimizer of the model is not saved when you use the to_json method, and therefore the above solution does not work. If you want to save the whole model, including the layer weights as well as the optimizer (along with its state), you can use the save method:
model.save('my_model.h5')
Then you can load it using load_model:
from keras.models import load_model
model = load_model('my_model.h5')
Alternatively, if you have used save_weights method (to save the weights of layers) along with to_json method (to save only the architecture of the model), then you can load back the weights after loading back the model using model_from_json:
# load the architecture of model from json file ...
# load the weights
model.load_weights('model_weights.h5')
However, the optimizer in this second approach has not been saved, so you need to recompile the model. Note that this means the optimizer's state is lost, so you may not be able to easily continue training without configuring the optimizer first; this is fine, though, if you only want to use the model for prediction or retrain it from scratch.
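For example, a sketch of that recompile step; the loss and optimizer here are placeholders for whatever your original training used:

from keras.models import model_from_json

# Rebuild the architecture from the saved JSON, then load the weights.
with open('model.json', 'r') as json_file:
    model = model_from_json(json_file.read())
model.load_weights('model_weights.h5')

# The optimizer state was not saved, so compile again before
# evaluating or continuing training.
model.compile(loss='categorical_crossentropy', optimizer='adam',
              metrics=['accuracy'])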
I highly recommend reading the related section in the Keras FAQ as well: How can I save a Keras model?
My impression is that it only saves the model's architecture, so I should be able to call it before I start training? And then save_weights() saves the weights I need to restore the model? Any more details on this?
At what stage can I call to_json()? I.e. do I have to call compile() first? Can it be before fit()?
As mentioned in the Keras docs, it only saves the architecture of the model:
Saving/loading only a model's architecture
If you only need to save the architecture of a model, and not its weights or its training configuration, you can do:
# save as JSON
json_string = model.to_json()
# save as YAML
yaml_string = model.to_yaml()
The generated JSON / YAML files are human-readable and can be manually edited if needed.
You can then build a fresh model from this data:
# model reconstruction from JSON:
from keras.models import model_from_json
model = model_from_json(json_string)
# model reconstruction from YAML
from keras.models import model_from_yaml
model = model_from_yaml(yaml_string)
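To the follow-up question: to_json() can be called as soon as the model's layers are defined, before compile() or fit(). A minimal sketch; the single Dense layer is only a placeholder:

from keras.models import Sequential, model_from_json
from keras.layers import Dense

model = Sequential([Dense(1, input_shape=(4,))])

# No compile() or fit() is needed; the architecture already exists.
json_string = model.to_json()
rebuilt = model_from_json(json_string)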
I have saved a Keras model as an HDF5 (.h5) file and now want to load it from disk.
When training the model I use:
from keras.models import Sequential
model = Sequential()
H = model.fit(....)
When the model is trained, I want to load it from disk with
model = load_model()
How can I get H from the model variable? It unfortunately does not have a history attribute that I can just access. Is it because save_model doesn't save the history?
Unfortunately, it seems that Keras hasn't implemented the possibility of loading the history directly from a saved model; instead you have to set it up in advance. This is how I solved it using CSVLogger. It's actually very convenient to store the entire training history in a separate file: that way you can always come back later and plot whatever history you want, instead of depending on a variable held in RAM that you can easily lose.
First we have to set up the logger before initiating the training.
from keras.callbacks import CSVLogger
csv_logger = CSVLogger('training.log', separator=',', append=False)
model.fit(X_train, Y_train, callbacks=[csv_logger])
The entire log history will now be stored in the file 'training.log' (the same information you would get, in your case, by calling H.history). When training is finished, the next step is simply to load the data stored in this file. You can do that with pandas read_csv:
import pandas as pd
log_data = pd.read_csv('training.log', sep=',', engine='python')
From here on you can treat the data stored in log_data just as you would the data in H.history.
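For example, to plot the loss curve from the logged data (the available columns depend on the metrics and validation data used during training; 'epoch' and 'loss' are typically present):

import matplotlib.pyplot as plt

plt.plot(log_data['epoch'], log_data['loss'], label='training loss')
plt.xlabel('epoch')
plt.ylabel('loss')
plt.legend()
plt.show()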
More information is available in the Keras callbacks docs.
Using pickle to save the History object itself threw a whole host of errors. As it turns out, you can instead pickle H.history (the plain dictionary) rather than H to save your history!
It's a bit annoying having to keep both a model file and a history file, but it works.
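A minimal sketch of that approach; H is the object returned by model.fit, as in the question, and the filename is a placeholder:

import pickle

# Save only the history dictionary, not the History object itself.
with open('history.pkl', 'wb') as f:
    pickle.dump(H.history, f)

# Later, load it back alongside the saved model.
with open('history.pkl', 'rb') as f:
    history = pickle.load(f)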