Given that deep learning models can take hours, days, or weeks to train, it is paramount to know how to *save* and *load* them from disk.

Let us take the ResNet50 model as an example:

```
from keras.applications import resnet50

# Build ResNet50 with its ImageNet weights
model = resnet50.ResNet50(include_top=True, weights='imagenet')
# (Or load the weights from a local HDF5 file instead)
model.load_weights('resnet50_weights_tf_dim_ordering_tf_kernels.h5')
model.compile(optimizer='rmsprop', loss='categorical_crossentropy')
```

The task is to **save and load it on another computer**.

There are two ways to achieve this. Here's how:

**Option 1: Weights + Model Architecture (⭐️)**

This is the ⭐️ preferred method ⭐️ because it is modular, and the saved files are compatible with Keras.js.

#### Saving

Specifically, you want to save:

- the weights (`.h5`)
- the model architecture (`.json`)

Here's how you do them both:

```
# Save the weights
model.save_weights('model_weights.h5')

# Save the model architecture
with open('model_architecture.json', 'w') as f:
    f.write(model.to_json())
```

#### Loading

Should you ever need to load them:

```
from keras.models import model_from_json

# Model reconstruction from the JSON file
with open('model_architecture.json', 'r') as f:
    model = model_from_json(f.read())

# Load weights into the new model
model.load_weights('model_weights.h5')
```
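As a quick end-to-end check, the full round trip looks like this (a minimal sketch using a tiny `Sequential` model instead of ResNet50, so nothing needs downloading; the file names are illustrative). One gotcha: a model rebuilt from JSON is *not* compiled, so call `compile()` again before training or evaluating:

```python
import numpy as np
from keras import Input
from keras.models import Sequential, model_from_json
from keras.layers import Dense

# A tiny stand-in model (illustrative; any Keras model works the same way)
model = Sequential([Input(shape=(8,)), Dense(4, activation='relu'), Dense(1)])
model.compile(optimizer='rmsprop', loss='mse')

# Save the architecture (JSON) and weights (HDF5) separately
with open('model_architecture.json', 'w') as f:
    f.write(model.to_json())
model.save_weights('model.weights.h5')  # recent Keras versions require the .weights.h5 suffix

# Rebuild: architecture first, then weights
with open('model_architecture.json', 'r') as f:
    new_model = model_from_json(f.read())
new_model.load_weights('model.weights.h5')

# The rebuilt model is NOT compiled yet -- compile again before train/evaluate
new_model.compile(optimizer='rmsprop', loss='mse')

# Both models now make identical predictions
x = np.random.rand(3, 8).astype('float32')
print(np.allclose(model.predict(x, verbose=0), new_model.predict(x, verbose=0)))  # True
```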

**Option 2: Save/Load the Entire Model**

```
from keras.models import load_model
# Creates a HDF5 file 'my_model.h5'
model.save('my_model.h5')
# Deletes the existing model
del model
# Returns a compiled model identical to the previous one
model = load_model('my_model.h5')
```

This single HDF5 file will contain:

- the architecture of the model (allowing the recreation of the model)
- the weights of the model
- the training configuration (e.g. loss, optimizer)
- the state of the optimizer (allows you to resume the training from exactly where you left off)
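Because the optimizer state is saved too, a model restored with `load_model` can continue training seamlessly. Here is a minimal sketch with a toy model (the `my_model.h5` name matches the snippet above; the model and data are purely illustrative):

```python
import numpy as np
from keras import Input
from keras.models import Sequential, load_model
from keras.layers import Dense

# Toy model and data (illustrative only)
model = Sequential([Input(shape=(4,)), Dense(8, activation='relu'), Dense(1)])
model.compile(optimizer='adam', loss='mse')
x, y = np.random.rand(32, 4), np.random.rand(32, 1)
model.fit(x, y, epochs=1, verbose=0)

# One file holds architecture + weights + compile config + optimizer state
model.save('my_model.h5')
del model

# load_model returns an already-compiled model...
model = load_model('my_model.h5')
# ...so fit() resumes training where it left off, optimizer state included
model.fit(x, y, epochs=1, verbose=0)
```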

Note: It is **not recommended** to use `pickle` or `cPickle` to save a Keras model.

**Summary**

```
# Suppose we have a model
from keras.applications import resnet50
model = resnet50.ResNet50(include_top=True, weights='imagenet')
model.load_weights('resnet50_weights_tf_dim_ordering_tf_kernels.h5')
model.compile(optimizer='rmsprop', loss='categorical_crossentropy')

# Import dependencies
from keras.models import model_from_json, load_model

# Option 1: Save Weights + Architecture
model.save_weights('model_weights.h5')
with open('model_architecture.json', 'w') as f:
    f.write(model.to_json())

# Option 1: Load Weights + Architecture
with open('model_architecture.json', 'r') as f:
    new_model_1 = model_from_json(f.read())
new_model_1.load_weights('model_weights.h5')

# Option 2: Save/load the entire model
model.save('my_model.h5')
new_model_2 = load_model('my_model.h5')
```

**Bonus: Saving Scikit-Learn Models**

You can easily save Scikit-Learn (sklearn) models by using Python's `pickle` module or sklearn's `sklearn.externals.joblib`, which is more efficient at serializing large NumPy arrays.

```
from sklearn.externals import joblib

# Saving a model
joblib.dump(my_model, 'my_model.pkl')

# Loading a model
loaded_model = joblib.load('my_model.pkl')
```
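Note that recent scikit-learn releases have removed `sklearn.externals.joblib`; the standalone `joblib` package or the standard-library `pickle` module works instead. A minimal sketch with `pickle` (the fitted model and file name here are illustrative):

```python
import pickle
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

# A toy fitted model to save (illustrative)
X, y = load_iris(return_X_y=True)
my_model = LogisticRegression(max_iter=200).fit(X, y)

# Saving a model
with open('my_model.pkl', 'wb') as f:
    pickle.dump(my_model, f)

# Loading a model
with open('my_model.pkl', 'rb') as f:
    loaded_model = pickle.load(f)

# The round trip preserves the fitted parameters
print((loaded_model.predict(X) == my_model.predict(X)).all())  # True
```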

If you enjoyed this post and want to buy me a cup of coffee... The thing is, I'll *always* accept a cup of coffee. So feel free to buy me one. Cheers! ☕️