All of the examples I have seen with TensorFlow Transform export the model after training using an Estimator's export_saved_model function. I am working with existing training code that does not use Estimators and saves with tf.saved_model.simple_save.
What is the best-practice method for saving a transform_fn with a model for serving, without using Estimators?
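For context, the existing save path looks roughly like this (a sketch for TF 1.x; the tiny placeholder graph stands in for the real training code):

import tensorflow as tf

# Stand-in graph; in the real code this is the trained model.
x = tf.placeholder(tf.float32, [None, 3], name='x')
y = tf.layers.dense(x, 1, name='y')

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # simple_save writes a servable SavedModel, but nothing here attaches
    # the tf.Transform transform_fn to the exported graph.
    tf.saved_model.simple_save(
        sess,
        export_dir='export/1',
        inputs={'x': x},
        outputs={'y': y})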
I've trained and tested several supervised models using scikit-learn and XGBoost (on the same data). The XGBoost model performs slightly better than scikit-learn's LassoCV.
I'm trying to find a way to export the model object so that non-technical folks can interact with it in Excel and/or VBA. Specifically, they need to be able to enter all feature values for a new observation and have the exported scikit-learn or XGBoost model output a prediction.
I know I can save scikit-learn and XGBoost objects as .pkl files. Is there an interface or API that can take input from Excel, pass it to the .pkl model file, and return the correct scalar prediction? There are about 40-50 features that will need to be entered and passed to the exported model. After exporting the model from Python, it will not need to be retrained, only used for prediction.
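One common pattern (a sketch only; the endpoint name and JSON layout are assumptions, not an established API) is to wrap the pickled model in a small local HTTP service that Excel/VBA can call, e.g. with Flask:

import joblib
from flask import Flask, jsonify, request

app = Flask(__name__)
model = joblib.load('model.pkl')  # the exported scikit-learn/XGBoost object

@app.route('/predict', methods=['POST'])
def predict():
    # Expects JSON like {"features": [f1, f2, ..., f50]} from the client.
    features = request.get_json()['features']
    prediction = model.predict([features])[0]
    return jsonify({'prediction': float(prediction)})

if __name__ == '__main__':
    app.run(port=5000)

VBA can then POST the 40-50 feature values to http://localhost:5000/predict (e.g. via MSXML2.XMLHTTP) and read the scalar prediction back from the JSON response.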
I have a pre-trained TensorFlow model that was trained on a publicly available dataset; I have its .meta file and .ckpt files. I'd like to continue training this model on new data from a privately obtained dataset. Since my dataset is small, I'd like to fine-tune the model according to 'Strategy 2' or 'Strategy 3':
Strategy 2: Train some layers and leave the others frozen.
Strategy 3: Freeze the convolutional base.
Reference site: https://towardsdatascience.com/transfer-learning-from-pre-trained-models-f2393f124751
However, I couldn't find sample code implementing transfer learning and fine-tuning for a plain TensorFlow model; there are many examples for Keras models. How can I implement transfer learning and fine-tuning for my TensorFlow model?
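Concretely, what I'm after is something in this spirit (a rough sketch of 'Strategy 2' on a raw .meta/.ckpt checkpoint in TF 1.x; the tensor name 'loss:0' and the variable scope 'dense' are placeholders for the names in my graph):

import tensorflow as tf

with tf.Session() as sess:
    # Rebuild the graph from the .meta file and restore the weights.
    saver = tf.train.import_meta_graph('model.ckpt.meta')
    saver.restore(sess, 'model.ckpt')

    loss = tf.get_default_graph().get_tensor_by_name('loss:0')

    # Strategy 2: only the variables passed via var_list receive gradient
    # updates; every other layer stays frozen at its restored values.
    train_vars = tf.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES,
                                   scope='dense')
    optimizer = tf.train.AdamOptimizer(1e-4)
    train_op = optimizer.minimize(loss, var_list=train_vars)

    # Initialize only the new optimizer slot variables, not the restored weights.
    sess.run(tf.variables_initializer(optimizer.variables()))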
If you don't have to use low-level TensorFlow functions, you can also do this with the tf.keras module of TensorFlow 2.0, for example as sketched below.
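With tf.keras, both strategies reduce to setting layer.trainable (a minimal sketch using a stock ImageNet base and illustrative layer counts, not your specific checkpoint):

import tensorflow as tf

# Load a pre-trained convolutional base without its classification head.
base = tf.keras.applications.VGG16(include_top=False, weights='imagenet',
                                   input_shape=(200, 200, 3), pooling='avg')

# Strategy 3: freeze the whole convolutional base...
base.trainable = False
# ...or Strategy 2: freeze all but the last few layers instead:
# for layer in base.layers[:-4]:
#     layer.trainable = False

# Attach a new head for the private dataset; only unfrozen layers train.
model = tf.keras.Sequential([
    base,
    tf.keras.layers.Dense(2, activation='softmax'),
])
model.compile(optimizer='adam', loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
# model.fit(new_images, new_labels, epochs=5)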
I have an ensemble model that combines both TensorFlow and scikit-learn, and I would like to save this ensemble model as one box that I can feed data into to generate predictions. My code is below:
from joblib import dump, load
from sklearn.ensemble import BaggingRegressor

def model_base_LSTM(***):
    ***

model = model_base_LSTM(***)

# Bag 15 copies of the TensorFlow/Keras model as the base estimator.
ensem_model = BaggingRegressor(base_estimator=model, n_estimators=15)
ensem_model.fit(x_train, y_train)
bag_mod_pred = ensem_model.predict(x_test_bag)

# Serializing the fitted ensemble is where it fails:
dump(ensem_model, 'LSTM_Ensemble.joblib')

This raises:
TypeError: can't pickle _thread._local objects
How can I solve this problem?
You can save your TensorFlow (and even PyTorch) models with Scikit-Learn, but only if you use Neuraxle and its saving mechanics.
Neuraxle is an extension of Scikit-Learn to make it more compatible with all deep learning libraries.
The trick is performed by using Neuraxle-TensorFlow or Neuraxle-PyTorch.
Why so?
Using Neuraxle-TensorFlow or Neuraxle-PyTorch provides you with a saver that lets your model be serialized correctly. You want correct serialization to ensure compatibility between scikit-learn and your deep learning framework when it comes time to save or parallelize things. You can read how Neuraxle solves this with savers here.
Code Examples
Here is a full project example from A to Z where TensorFlow is used with Neuraxle as if it was used with Scikit-Learn.
Here is another practical example where TensorFlow is used within a scikit-learn-like pipeline.
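Very roughly, the pattern looks like this (a sketch only; the step name Tensorflow2ModelStep, the create_* callbacks, and the save() call are recalled from Neuraxle-TensorFlow and may differ in the current API, so treat every name here as an assumption to verify against the docs):

from neuraxle.base import ExecutionContext
from neuraxle.pipeline import Pipeline
from neuraxle_tensorflow.tensorflow_v2 import Tensorflow2ModelStep

# create_model/create_loss/create_optimizer are your own functions that
# build the TF model, the loss, and the optimizer for the step.
pipeline = Pipeline([
    Tensorflow2ModelStep(create_model=create_model,
                         create_loss=create_loss,
                         create_optimizer=create_optimizer),
])
pipeline = pipeline.fit(x_train, y_train)

# Neuraxle's savers serialize the TF internals that plain pickle chokes on.
pipeline.save(ExecutionContext('cache_folder'), full_dump=True)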
Directory structure:
Data
-Cats
--<images>.jpg
-Dogs
--<images>.jpg
I'm training an n-ary classification model, and I want to create an input_fn that serves these images for training.
The image dimensions are (200, 200, 3). I have a (Keras) generator for them, if that can be used somehow.
I've been looking for a while but haven't found an easy way to do this. I thought this would be a standard use case? E.g. Keras provides flow_from_directory to feed Keras models. I need to use a tf.estimator for AWS SageMaker, so I'm stuck with it.
By using the tf.data module you can feed your data directly into your Estimator. You basically have three ways to integrate this into your input pipeline:
1. Convert your images into TFRecords and use TFRecordDataset.
2. Use tf.data.Dataset.from_generator to wrap your existing (Keras) generator.
3. Introduce decoder functions into your input pipeline and read the images directly (see the sketch after this list).
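For option 3, an input_fn might look roughly like this (a sketch assuming TF 1.x-style Estimators, the Data/Cats and Data/Dogs layout above, and an illustrative feature key 'image'):

import glob
import tensorflow as tf

def input_fn():
    # Build (path, label) pairs from the directory layout: Cats=0, Dogs=1.
    cat_files = glob.glob('Data/Cats/*.jpg')
    dog_files = glob.glob('Data/Dogs/*.jpg')
    paths = cat_files + dog_files
    labels = [0] * len(cat_files) + [1] * len(dog_files)

    def _parse(path, label):
        # Decoder functions: read, decode, and resize inside the pipeline.
        image = tf.image.decode_jpeg(tf.read_file(path), channels=3)
        image = tf.image.resize_images(image, [200, 200])
        return {'image': image}, label

    dataset = tf.data.Dataset.from_tensor_slices((paths, labels))
    dataset = dataset.shuffle(len(paths)).map(_parse).batch(32).repeat()
    return dataset  # Estimators accept a tf.data.Dataset from input_fn

For option 2, tf.data.Dataset.from_generator wraps your existing Keras generator in the same way.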
I have successfully (I hope) trained and evaluated a model using tf.Estimator, reaching a train/eval accuracy of around 83-85%. Now I would like to test my model on a separate dataset using the predict() call in the Estimator class, preferably in a separate script.
I've looked at this, which says that I need to export the model as a SavedModel, but is this really necessary? Looking at the documentation for the Estimator class, it seems like I can just pass the path to my checkpoint and graph files via the model_dir parameter. Does anyone have any experience with this? When I run my model on the same dataset I used for validation, I do not obtain the same performance as during the validation phase... :-(
I think you just need a separate file containing your model_fn definition. Then you instantiate the same Estimator class in another script, using the same model_fn definition and the same model_dir.
That works because the Estimator API recovers the tf.Graph definition and the latest model.ckpt files by itself, so you are able to continue training, evaluation, and prediction.
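A minimal sketch of that second script (the module name my_model and test_input_fn are illustrative placeholders):

import tensorflow as tf
from my_model import model_fn  # the same model_fn definition used for training

# Pointing model_dir at the training directory makes the Estimator load the
# latest checkpoint automatically; no SavedModel export is needed for this.
estimator = tf.estimator.Estimator(model_fn=model_fn,
                                   model_dir='path/to/model_dir')

# test_input_fn is your input_fn for the separate test dataset.
for prediction in estimator.predict(input_fn=test_input_fn):
    print(prediction)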