Can I use a .h5 file in a Django project? - python

I'm building an AI web page using Django and TensorFlow, and I wonder how to add a .h5 file to a Django project.
I'm writing all the code in the views.py file,
but I want to use a pre-trained model,
not do online learning in the web page.

Yeah, you can use a .h5 file in Django. You can use h5py for operations on .h5 files. Example:
import h5py

filename = "filename.h5"
# Open read-only; the context manager closes the file automatically
with h5py.File(filename, 'r') as h5:
    # logic
    ...
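h5py gives you raw access to the file's datasets, but since the goal is a pre-trained Keras model, a more direct route is tf.keras.models.load_model. A minimal sketch, assuming the .h5 file holds a full model saved with model.save; the path and the predict view here are hypothetical, not part of the original question:
import tensorflow as tf
from django.http import JsonResponse

# Load once at import time so every request reuses the same model
# ('model.h5' is a placeholder path inside your Django project)
model = tf.keras.models.load_model('model.h5')

def predict(request):
    # Hypothetical view: read comma-separated features from the query string
    raw = request.GET.get('x', '0,0,0,0')
    features = [[float(v) for v in raw.split(',')]]
    prediction = model.predict(features).tolist()
    return JsonResponse({'prediction': prediction})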

Related

'UnpicklingError: invalid load key, '\x0a'. Trying to save and load a model

I have been stuck on this error for days.
I have created and saved a model in Google Colab. It is saved in a '.tar' file. I want to save and load this model with the help of the torch library in Python. This is my code so far.
import torch
import pickle
import json
torch.save('/content/drive/MyDrive/model.tar',open('/content/drive/MyDrive/saved.tar', 'wb'))
filename = '/content/drive/MyDrive/saved.tar'
loaded =(torch.load(filename, map_location=torch.device('cpu')))
'model.tar' is the tar file of the model I have on Colab and need to load. I know that 'loaded' is now of type 'string', which means I am doing something wrong in my torch.save() call. It would be great if anyone could help. Thanks in advance.
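For reference, torch.save(obj, f) pickles its first argument, so the call above pickles the path string '/content/drive/MyDrive/model.tar' rather than a model, which is why loaded comes back as a string. A minimal sketch of the intended round trip, assuming model is the trained module object:
import torch

# The first argument to torch.save is the object to serialize, not a path;
# saving the state_dict is the more portable choice
torch.save(model.state_dict(), '/content/drive/MyDrive/saved.tar')

# Load the weights back on CPU and restore them into the model
state = torch.load('/content/drive/MyDrive/saved.tar', map_location=torch.device('cpu'))
model.load_state_dict(state)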

Loading a FastText Model from s3 without Saving Locally

I am looking to use a FastText model in an ML pipeline I built; the model is saved as a .bin file on S3. My hope is to keep everything in a cloud-based pipeline, so I don't want local files. I feel like I am really close, but I can't figure out how to make a temporary .bin file. I am also not sure whether I am saving and reading the FastText model in the most efficient way. The code below works, but it saves the file locally, which I want to avoid.
import smart_open
import fasttext

file = smart_open.smart_open('s3://...')  # S3 location of the .bin model
listed = b''.join([i for i in file])
with open("ml_model.bin", "wb") as binary_file:
    binary_file.write(listed)
model = fasttext.load_model("ml_model.bin")
If you want to use the fasttext wrapper for the official Facebook FastText code, you may need to create a local temporary copy - your troubles make it seem like that code relies on opening a local file path.
You could also try the Gensim package's separate FastText support, which should accept an S3 path via its load_facebook_model() function:
https://radimrehurek.com/gensim/models/fasttext.html#gensim.models.fasttext.load_facebook_model
(Note, though, that Gensim doesn't support all FastText functionality, like the supervised mode.)
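A minimal sketch of that Gensim route, assuming (as the answer suggests, though I have not verified it) that Gensim's smart_open-backed I/O accepts the S3 URI directly:
from gensim.models.fasttext import load_facebook_model

# bucket_name and key are the same placeholders used elsewhere in this thread
model = load_facebook_model(f's3://{bucket_name}/{key}')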
As partially answered by the response above, a temporary file was needed. But on top of that, the temporary file path needed to be passed as a string object, which is sort of strange. Working code below:
import tempfile
import fasttext
import smart_open
from pathlib import Path

file = smart_open.smart_open(f's3://{bucket_name}/{key}')
listed = b''.join([i for i in file])
with tempfile.TemporaryDirectory() as tdir:
    tfile = Path(tdir).joinpath('tempfile.bin')
    tfile.write_bytes(listed)
    model = fasttext.load_model(str(tfile))
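The str() conversion is probably needed because fasttext.load_model is a thin binding over the C++ library and seems to expect a plain string path rather than a pathlib.Path object.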

How to load tf.keras model directly from cloud bucket?

I am trying to load a tf.keras model directly from a cloud bucket, but I can't see an easy way to do it.
I would like to load the whole model structure, not only the weights.
I see 3 possible directions:
1. Is it possible to load a Keras model directly from a Google Cloud bucket? The command tf.keras.models.load_model('gs://my_bucket/model.h5') doesn't work.
2. I tried to use tensorflow.python.lib.io.file_io, but I don't know how to load the result as a model.
3. I copied the model to a local directory with the gsutil cp command, but I don't know how to wait until the operation completes; tf tries to load the model before the download has finished, so errors occur.
I will be thankful for any suggestions.
Peter
Load the file from Google Cloud Storage
from tensorflow.python.lib.io import file_io
model_file = file_io.FileIO('gs://mybucket/model.h5', mode='rb')
Save a temporary copy of the model locally
temp_model_location = './temp_model.h5'
with open(temp_model_location, 'wb') as temp_model_file:
    temp_model_file.write(model_file.read())
model_file.close()
Load model saved locally
model = tf.keras.models.load_model(temp_model_location)
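As an aside on direction 3 (waiting for the copy to complete): TensorFlow's public tf.io.gfile API copies synchronously, so the load only starts once the download has finished. A sketch using the same bucket path:
import tensorflow as tf

# tf.io.gfile.copy blocks until the file has been fully copied
tf.io.gfile.copy('gs://mybucket/model.h5', './temp_model.h5', overwrite=True)
model = tf.keras.models.load_model('./temp_model.h5')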

Can I pickle a tensorflow model?

Will I be able to pickle all the .meta, .data, and checkpoint files of a tensorflow model? I ask because I want to run predictions on my model, and if I deploy it, the files can't be on disk, right? I know about TensorFlow Serving, but I don't really understand it. I want to be able to load the tensorflow files without accessing the drive all the time.
Using pickle is not recommended. Instead, TensorFlow provides a format called the SavedModel format that serves this exact purpose.
See: https://www.tensorflow.org/guide/saved_model
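A minimal sketch of that round trip via tf.keras; the Dense layer here is just a stand-in for your trained model:
import tensorflow as tf

# Tiny stand-in model; substitute your trained model here
model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])

# Saving to a directory (no .h5 extension) uses the SavedModel format,
# which bundles the graph, weights, and checkpoint data together
model.save('saved_model_dir')

# Later (e.g. at serving time), restore the full model from the directory
restored = tf.keras.models.load_model('saved_model_dir')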

Access .rds file from Python

I am working on a project where I have an .rds file containing a trained model, generated by R code as per my requirements.
Now I need to load the trained model in Python and use it to process the records.
Is there any way to do so? If not, what are the alternatives?
Thanks
We can use feather, a columnar format that both R and Python can read, to exchange data between the two languages:
import feather
path = 'my_data.feather'
feather.write_dataframe(df, path)
df = feather.read_dataframe(path)
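Note that feather exchanges data frames rather than fitted model objects, so this moves the data between R and Python but not the model itself; to reuse the trained model you would likely need to call into R (for example via rpy2) or re-train an equivalent model in Python.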
