I originally trained the model with sklearn and then converted it to ONNX. I now want to get the sklearn model back. The only related thing I can find online is converting ONNX to TF Lite.
Thanks in advance!
I used this repo: https://github.com/Turoad/lanedet
to convert a PyTorch model that uses MobileNetV2 as a backbone to ONNX, but I didn't succeed.
I got a RuntimeError that says:
RuntimeError: Exporting the operator eye to ONNX opset version 12 is
not supported. Please open a bug to request ONNX export support for
the missing operator.
It's really disappointing, given the good results this model gives and how quickly it runs.
Is there any way I can fix this? I need to convert it to ONNX and then to a TF Lite model to use it in an Android app.
I will provide the pretrained model I used and the steps I followed for the conversion.
Thank you so much for helping!
My Colab notebook:
https://colab.research.google.com/drive/18udIh8tNJvti7jKmR4jRaRO-oYDgRmvA?usp=sharing
The pretrained model that I used:
https://drive.google.com/file/d/1o3-BgLIQesurIyDCKGliqbo2inUA5cPw/view?usp=sharing
Use torch>=1.7.0 to convert the model, because ONNX export support for the eye operator was added in that release.
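For reference, a minimal export sketch once torch>=1.7.0 is installed. The network, input shape, and tensor names below are placeholders (a stock MobileNetV2 stands in for the actual lanedet model you load in the notebook):
import torch
import torchvision
# placeholder network; substitute the actual lanedet model you load in your notebook
model = torchvision.models.mobilenet_v2(pretrained=False).eval()
dummy_input = torch.randn(1, 3, 288, 800)  # placeholder input shape
torch.onnx.export(
    model,
    dummy_input,
    "lanedet_mobilenetv2.onnx",
    opset_version=12,        # eye export only works once torch>=1.7.0 is installed
    input_names=["input"],
    output_names=["output"],
)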
I am looking for solutions to quantize sklearn models. I am specifically looking for XGBoost models.
I did find solutions to quantize pytorch and tensorflow models but nothing on sklearn.
Solutions tried:
Converted the sklearn model to ONNX and then tried to quantize the ONNX model, but that didn't work either. Here is the link to the bug.
If any pointers or solutions can be shared, it would be of great help.
Someone has answered the question in the bug you linked.
Try not adding the final ZipMap node, using the 'zipmap' option:
from skl2onnx import convert_sklearn
onx = convert_sklearn(clr, initial_types=initial_type,
                      options={'zipmap': False})
I'm interested to know if that works for you?
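If the conversion then goes through, here is a rough sketch of quantizing the resulting file with onnxruntime's dynamic quantization. The file names are placeholders, and I'm assuming onnxruntime's quantizer accepts the converted graph:
import onnx
from onnxruntime.quantization import quantize_dynamic, QuantType
# `onx` is the ModelProto returned by convert_sklearn above
onnx.save(onx, "sklearn_model.onnx")
# dynamic (weight-only) quantization of the saved ONNX file
quantize_dynamic("sklearn_model.onnx", "sklearn_model_quant.onnx",
                 weight_type=QuantType.QUInt8)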
BTW, you can use onnxmltools to convert an XGBoost model to ONNX according to this.
The sample code:
import onnx
import onnxmltools
from onnxmltools.convert.common.data_types import FloatTensorType
from xgboost import XGBClassifier
clf = XGBClassifier()
# fit the classifier, e.g. clf.fit(X_train, y_train) -- see the notes below about feature names
num_features = 10  # placeholder: set to the number of input features used to train clf
onnx_model_path = "xgb_classifier.onnx"
# declare the input name/type/shape, then convert and save the ONNX model
initial_type = [('float_input', FloatTensorType([None, num_features]))]
onnx_model = onnxmltools.convert.convert_xgboost(clf, initial_types=initial_type, target_opset=10)
onnx.save(onnx_model, onnx_model_path)
Note that:
The model must be trained using the scikit-learn API of xgboost.
The training data passed to XGBClassifier().fit() must not have feature names associated with it. For example, if your training data is a DataFrame called df, which has column names, you will need to use a representation without column names (i.e. df.values) when training.
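If it helps, a quick way to sanity-check the exported file is to run it through onnxruntime. This is just a sketch; it assumes the xgb_classifier.onnx file and num_features from the snippet above:
import numpy as np
import onnxruntime as ort
# load the exported model and run a dummy prediction
sess = ort.InferenceSession("xgb_classifier.onnx")
input_name = sess.get_inputs()[0].name
sample = np.random.rand(1, num_features).astype(np.float32)  # placeholder input
outputs = sess.run(None, {input_name: sample})
print(outputs)  # predicted label and class scores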
I created a darknet53.weights model for image classification using my own data in darknet.
(This isn't a YOLOv3 model.)
Is there a way to convert darknet53.weights to a PyTorch .pt model?
I tried various code from GitHub, etc., but all of it can only convert YOLOv3 weights files to PyTorch .pt models.
I want to compare the accuracy of the darknet53 model created with darknet against other image classification models created with PyTorch.
Initially, I tried to build a darknet53 model in PyTorch, but that didn't work. Therefore, I created the darknet53 model with darknet.
If anyone knows a good way, please let me know.
Thanks.
I'm trying to convert a SageMaker XGBoost model to ONNX, in order to use the ONNX model in a .NET application using ML.NET. I've tried converting the model with winmltools and onnxmltools, but both tools returned a similar error.
There is a good resource on using machine learning in the business area. I followed Using Machine Learning to Improve Sales in SageMaker to create the model and then tried to convert it to an ONNX model. The example works well in SageMaker.
After running the example, I got a model whose type is sagemaker.estimator.Estimator. I tried to convert the model using winmltools and onnxmltools, but both returned the same error.
ValueError: No proper operator name found for '<class 'sagemaker.estimator.Estimator'>'
I've tried to follow Convert ML models to ONNX with WinMLTools and ONNXMLTools enables conversion of models to ONNX to convert the SageMaker model to an ONNX model.
After that, I used the xgb.create_model() command to create a SageMaker model, then used the tools to convert it to ONNX, but no luck. I got the same error this time; just the model class is different.
ValueError: No proper operator name found for '<class 'sagemaker.model.Model'>'
Then I loaded the model using pickle and tried to convert it. I got the same error, just with a different model class.
ValueError: No proper operator name found for '<class 'xgboost.core.Booster'>'
At this point, I have no idea what the issue is or how to solve it. I've attached the Improve Sales Classification to ONNX notebook file for reference.
Could you please take a look and let me know a way to solve this?
Thanks in advance!
I'm training TensorFlow Slim based models for image classification on a custom dataset. Before I invest a lot of time training on such a huge dataset, I wanted to know whether I can convert all the models available in the Slim model zoo to TFLite format.
Also, I know that I can convert my custom Slim model to a frozen graph. It is the step after this that I'm worried about, i.e. conversion to .tflite from my custom-trained .pb model.
Is this supported? Or is anyone facing conversion problems that have not yet been resolved?
Thanks.
Many Slim models can be converted to TFLite, but it isn't a guarantee since some models might have ops not supported by TFLite.
What you could do is try converting your model to TensorFlow Lite using TFLiteConverter in Python before training. If the conversion succeeds, you can train your TF model and then convert it once again.
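For example, here is a minimal sketch of the frozen-graph route. The tensor names and input shape below are placeholders; they depend on which Slim model you export:
import tensorflow as tf
# TF1-style frozen graph -> TFLite; adjust names and shapes to your model
converter = tf.compat.v1.lite.TFLiteConverter.from_frozen_graph(
    graph_def_file="frozen_graph.pb",
    input_arrays=["input"],
    output_arrays=["MobilenetV2/Predictions/Reshape_1"],  # placeholder output node
    input_shapes={"input": [1, 224, 224, 3]},
)
tflite_model = converter.convert()
with open("model.tflite", "wb") as f:
    f.write(tflite_model)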