Keras Default Backend in Python & R

I am very confused after reading a lot about Keras and TensorFlow, and I still have some basic questions.
My confusion started with the answer to this question, where the author mixes standalone keras with tensorflow.keras.
1 - (Python case): Does Keras use any backend when I only write import keras, with no TensorFlow-related code at all (e.g. tf.keras or tf.keras.layers) in my implementation of the model? If it does, is there any way to check which backend is being used? (A quick check is sketched after this list.)
2 - Same question in the case of the R language.
3 - Is TensorFlow only used as the backend when we write import tensorflow as tf and use tf.keras?
4 - Is there any discrepancy in performance or accuracy between import keras and tf.keras in the Python case?
5 - Do the versions of Keras and TensorFlow have an impact on performance and accuracy in both languages (R and Python)?
6 - What could cause a 5% accuracy difference between R and Python? Python gives 94% accuracy, while the same implementation in R gives 89%. The versions of keras and TensorFlow in R are 2.3.0 and 2.2.0, while the versions in Python are tf 2.3.0 and keras 2.4.3. Please see this one.
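For reference, a minimal sketch (assuming a standard standalone Keras install) of how to check which backend and which versions you are running in Python; keras.backend.backend() returns the backend name:

import keras
import tensorflow as tf

print(keras.backend.backend())   # e.g. 'tensorflow'
print(keras.__version__)         # e.g. 2.4.3
print(tf.__version__)            # e.g. 2.3.0

In R, the keras package exposes a similar backend() helper, as far as I know.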

Related

How to use produced tf.keras (for TF version > 2) model in C++

I am using this implementation of RetinaNet, https://github.com/fizyr/keras-retinanet, which is built with TensorFlow and Keras. I want to use the produced model in C++ for inference, but when I search I can't find anything that works for TensorFlow version >= 2.0. There is good documentation for this operation for PyTorch (https://pytorch.org/tutorials/advanced/cpp_export.html); I am looking for the TensorFlow equivalent. Thanks.

How to use pretrained model by TF 2.x in C++

I have trained a segmentation and classification network in Python using TensorFlow 2.1. The model is saved in the SavedModel format (.pb). Now I want to test the model, and I need to do this in C++.
I have seen plenty of information about the C++ API for TensorFlow 1.x, but not for TF 2.x.
The official tensorflow site says "Note: There is no libtensorflow support for TensorFlow 2 yet. It is expected in a future release.".
Does anyone know of a possible way?
It would be a great help to me.
Thanks.

tensorflow.contrib.predictor.from_saved_model() in Tensorflow 2

I am currently using TensorFlow 1 and noticed that tensorflow.contrib has been removed in TensorFlow 2. How can I convert tensorflow.contrib.predictor.from_saved_model() to work on TensorFlow 2?
In TF2 the Predictor API is no longer supported (the whole contrib module is gone). You can either try TF-HUB (the link above says Predictor is replaced by it), convert your model to a Keras model (the way I'd recommend if you have a custom model architecture), convert it to an Estimator, or stick with the latest TF1 release.
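If you go the plain TF2 route instead, here is a minimal hedged sketch of loading a SavedModel without the Predictor API; the path, the signature key, and the input below are placeholders you would adjust to your own export:

import tensorflow as tf

loaded = tf.saved_model.load('path/to/saved_model')   # placeholder export directory
infer = loaded.signatures['serving_default']          # placeholder signature key
outputs = infer(tf.constant(my_input_batch))          # my_input_batch is a placeholder input tensor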

Importing dataset.mnist

I am following a deep learning tutorial book in Japanese that uses MNIST for its handwritten images. It has the code from dataset.mnist import load_dataset, and when I tried it, it failed with an error saying there is no module named dataset.mnist. I have installed the modules dataset and mnist individually using pip. The book recommends using Anaconda, but I tried that with no success.
How can I use the module dataset.mnist?
The first question is which deep learning framework you are working with to solve your problem.
There are many deep learning frameworks; PyTorch, TensorFlow, and Keras are examples.
1) If you are working with the PyTorch framework, import torch with:
import torch
and then you can import the MNIST dataset with:
from torchvision.datasets import MNIST
2) For the Keras framework, use the following commands to import the MNIST dataset:
import keras
from keras.datasets import mnist
NOTE: This can also be written as follows, which may make it clearer:
import keras
import keras.datasets as datasets
and then you can access the MNIST dataset through datasets, which is an alias of keras.datasets (a quick usage sketch follows below).
Similarly, you can import the MNIST dataset in other frameworks as well.
Hope this helps.
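For completeness, a small hedged usage sketch of actually loading the data with the Keras dataset API mentioned above:

from keras.datasets import mnist

# downloads MNIST on first use and returns NumPy arrays
(x_train, y_train), (x_test, y_test) = mnist.load_data()
print(x_train.shape)  # (60000, 28, 28)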
Adding on to @SauravRai's answer:
For TensorFlow:
from tensorflow.examples.tutorials.mnist import input_data
input_data.read_data_sets('my/directory')
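Note that tensorflow.examples.tutorials only exists in TF 1.x; if you are on TF 2.x, a hedged equivalent sketch uses the tf.keras dataset API instead:

import tensorflow as tf

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()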

How to make TensorFlow and Theano share GPUs simultaneously?

For some complicated reasons I use both TensorFlow and Theano in my Python code, and I have two GPUs that I want them to share. As stated in another question this causes problems, so I want to know whether there is a trick to achieve it, like telling TensorFlow to use only one GPU while Theano uses the other.
For now I can only disable Theano's GPU usage via os.environ['THEANO_FLAGS'] = 'device=cpu,floatX=float64' and let TensorFlow use everything:
import os

# select the Keras backend and force Theano onto the CPU before anything imports them
os.environ['KERAS_BACKEND'] = 'theano'
os.environ['THEANO_FLAGS'] = 'device=cpu,floatX=float64'

import tensorflow as tf
import keras as ks
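To verify which devices TensorFlow actually picks up after setting those environment variables, one hedged check (TF 1.x-era API, matching the setup above) is:

from tensorflow.python.client import device_lib

# lists the CPUs and GPUs visible to TensorFlow in this process
print(device_lib.list_local_devices())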
I haven't tried this, but if you have multiple GPUs you can force code to run on a specific GPU using the following trick:
import tensorflow as tf
with tf.device('/gpu:0'):
    # Run the tensorflow code

import tensorflow as tf
with tf.device('/gpu:1'):
    # Run the theano code
Hope this helps!
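A slightly more concrete sketch of the pinning idea above, untested with Theano in the same process: tf.device only affects TensorFlow ops, so Theano's GPU would be selected separately via THEANO_FLAGS (device=cuda1 is an assumption based on the libgpuarray backend's device naming):

import os
os.environ['THEANO_FLAGS'] = 'device=cuda1,floatX=float32'  # hypothetical: point Theano at the second GPU

import tensorflow as tf

with tf.device('/gpu:0'):
    # pin these TensorFlow ops to the first GPU
    a = tf.constant([1.0, 2.0, 3.0])
    b = a * 2.0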
