When I import TensorFlow with
import tensorflow as tf
I don't get an error. However, I do get the error below when I import the MNIST tutorial module. I'm using Spyder, if that helps.
As suggested in other questions, I made sure TensorFlow is up to date (v1.8) using both conda and pip installs. This didn't resolve the issue. Please assist.
import tensorflow.examples.tutorials.mnist.input_data as input_data
ModuleNotFoundError: No module named 'tensorflow.examples'
I think you should use something like the below on TensorFlow 2:
import tensorflow_datasets
mnist = tensorflow_datasets.load('mnist')
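A slightly fuller sketch of that approach, in case it helps (this assumes tensorflow_datasets is installed; the split names and feature keys below are the standard ones tfds uses for MNIST):
import tensorflow_datasets

# tensorflow_datasets.load returns a dict of tf.data.Dataset objects keyed by split name
mnist = tensorflow_datasets.load('mnist')
train_ds, test_ds = mnist['train'], mnist['test']

# each element is a dict with an 'image' (28x28x1 uint8 tensor) and a 'label'
for example in train_ds.take(1):
    print(example['image'].shape, example['label'].numpy())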
Use the following; it will download the data. It is from the TensorFlow documentation:
import tensorflow as tf
(train_images, train_labels), (test_images, test_labels) = tf.keras.datasets.mnist.load_data()
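As a small follow-up (not part of the original answer): load_data returns NumPy uint8 arrays with pixel values in [0, 255], so scaling them is a common next step.
import tensorflow as tf

(train_images, train_labels), (test_images, test_labels) = tf.keras.datasets.mnist.load_data()

# pixel values arrive as uint8 in [0, 255]; scale to [0, 1] for training
train_images = train_images / 255.0
test_images = test_images / 255.0

print(train_images.shape)  # (60000, 28, 28)
print(test_images.shape)   # (10000, 28, 28)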
Sometimes the TensorFlow examples are not pre-downloaded, so you might need to run the command below to install them from GitHub:
!pip install -q git+https://github.com/tensorflow/examples.git
To load the mnist dataset in Tensorflow 2.0:
mnist = tf.keras.datasets.mnist
(x_train, y_train), (x_test, y_test) = mnist.load_data()
Here is the reference:
TensorFlow 2 quickstart for beginners
Another method (this also works for a locally saved dataset):
import numpy as np
import tensorflow as tf

DATA_URL = 'https://storage.googleapis.com/tensorflow/tf-keras-datasets/mnist.npz'
path = tf.keras.utils.get_file('mnist.npz', DATA_URL)
with np.load(path) as data:
    train_examples = data['x_train']
    train_labels = data['y_train']
    test_examples = data['x_test']
    test_labels = data['y_test']
Here is the reference:
Load NumPy data
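To go one step further, the referenced guide then wraps those arrays in a tf.data.Dataset; a minimal sketch of that step, reusing the train_examples and train_labels loaded above (the shuffle buffer and batch size are just example values):
import tensorflow as tf

# build a tf.data.Dataset from the in-memory NumPy arrays loaded above
train_dataset = tf.data.Dataset.from_tensor_slices((train_examples, train_labels))
train_dataset = train_dataset.shuffle(10000).batch(64)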
Sometimes, when you install TensorFlow, the examples directory is not included. You can rectify this by linking the 'examples' directory from the GitHub repo into the tensorflow Python wheel folder. That way you don't need to change the code.
If this doesn't work, try replacing import tensorflow.examples.tutorials.mnist.input_data as input_data with import input_data, as mentioned in the link:
TensorFlow MNIST example not running with fully_connected_feed.py
Hope this helps!!!
Different approach
OS: Windows
copy from:
https://github.com/tensorflow/examples/tree/master/tensorflow_examples
to
[python folder]\Lib\site-packages\tensorflow_examples
Use:
import tensorflow_examples
Example:
from tensorflow_examples.models import pix2pix
But for datasets, use:
pip install tensorflow_datasets
I solved this issue by adding the tutorials directory into tensorflow_core; this issue usually pops up when that directory is missing.
Check the directory ..\anaconda3\envs\tensorflow\Lib\site-packages\tensorflow_core\examples to see if it contains the tutorials folder.
If you do not have it, go to https://github.com/tensorflow/tensorflow, download the zip file, and extract it.
Find the tutorials folder in tensorflow-master\tensorflow\examples\ and copy it to ..\anaconda3\envs\tensorflow\Lib\site-packages\tensorflow_core\examples.
Issue resolved.
Run:
from tensorflow.examples.tutorials.mnist import input_data
import matplotlib.pyplot as plt
mnist = input_data.read_data_sets("MNIST_data", one_hot=True)
im = mnist.train.images[1]
im = im.reshape(-1, 28)
plt.imshow(im)
I solved this issue on Mac by simply copying the official examples into the tensorflow_core/examples directory.
Pull the TensorFlow code:
git clone https://github.com/tensorflow/tensorflow
Copy the examples to the system Python 3 directory:
cp -a tensorflow/examples/ /usr/local/lib/python3.7/site-packages/tensorflow_core/examples/
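A quick sanity check afterwards, assuming python3 is the same interpreter whose site-packages you copied into:
python3 -c "from tensorflow.examples.tutorials.mnist import input_data; print('ok')"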
You need to download the dataset in order to use it.
Command:
pip install tensorflow-datasets
Code part:
import tensorflow_datasets as tfds
mnist_train = tfds.load(name="mnist", split="train")
You are done now. Happy coding! :)
You just need to download the missing files and copy them into tensorflow_core/examples.
For me, on Windows 10, that is:
C:\Users\Amirreza\AppData\Local\Programs\Python\Python37\Lib\site-packages\tensorflow_core\examples
This folder was deleted in September 2020; see their repository.
I used the commands:
git clone https://github.com/tensorflow/tensorflow.git
cd tensorflow
git checkout c31acd156c
I solved this issue by installing Keras according to this answer from a different question:
I didn't have Keras installed and had to add it manually.
ImportError: No module named 'keras'
Related
I am looking at this workbook, which comes from the Hugging Face course. I don't have internet access from my Python environment, but I can download files and save them in that environment. I copied all the files from this folder and saved them in the folder bert-base-uncased/. I renamed some of the files to match what is in the above folder.
I have the packages below:
tensorflow.__version__
'2.2.0'
keras.__version__
'2.4.3'
Then I installed transformers
!pip install datasets transformers[sentencepiece]
checkpoint = "bert-base-uncased/"
from transformers import TFAutoModelForSequenceClassification
Successfully installed datasets-1.17.0 dill-0.3.4 fsspec-2022.1.0 huggingface-hub-0.4.0 multiprocess-0.70.12.2 pyarrow-6.0.1 sacremoses-0.0.47 sentencepiece-0.1.96 tokenizers-0.10.3 transformers-4.15.0 xxhash-2.0.2
all the files are available
#files from https://huggingface.co/bert-base-uncased/tree/main
import os
cwd = os.getcwd()
print (cwd)
os.listdir('bert-base-uncased/')
['gitattributes',
'vocab.txt',
'tokenizer_config.json',
'tokenizer.json',
'README.md',
'.dominokeep',
'config.json',
'tf_model.h5',
'flax_model.msgpack',
'rust_model.ot',
'pytorch_model.bin']
But I still get the error below. I don't get this error when I run the same code in Google Colab.
model = TFAutoModelForSequenceClassification.from_pretrained('bert-base-uncased/', num_labels=2)
print ("----------------")
print (type(model))
print ("----------------")
RuntimeError: Failed to import transformers.models.bert.modeling_tf_bert because of the following error (look up to see its traceback):
No module named 'tensorflow.python.keras.engine.keras_tensor'
To replicate your issue, I installed TensorFlow version 2.2.0.
Then I tried to import the problematic module:
from tensorflow.python.keras.engine import keras_tensor
This yields an ImportError.
Meaning that TensorFlow version 2.2.0 doesn't have the keras_tensor module required by transformers.
Then I updated TensorFlow to version 2.7.0, tried to import the keras_tensor module again, and everything worked.
Updating TensorFlow to a newer version should solve your issue.
EDIT
Digging a bit for the lowest working version of TensorFlow, I got to the setup.py of Transformers.
The version requirement for TensorFlow is >= 2.3.0.
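If you want to check programmatically whether your installed TensorFlow meets that requirement, a minimal sketch (it only parses the major and minor version numbers):
import tensorflow as tf

# transformers' setup.py asks for TensorFlow >= 2.3.0
major, minor = (int(x) for x in tf.__version__.split(".")[:2])
if (major, minor) < (2, 3):
    print(f"TensorFlow {tf.__version__} is too old for transformers;"
          " upgrade with: pip install --upgrade 'tensorflow>=2.3.0'")
else:
    print(f"TensorFlow {tf.__version__} satisfies the >= 2.3.0 requirement")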
I'm trying to import my training dataset for my CNN (30,000 images), but something about this line breaks the program:
data_dir = tf.keras.utils.get_file(origin=dataset_url,
                                   fname='functionidentifier',
                                   untar=True)
Here's the full code block for more context:
dataset_url = "https://barisciencelab.tech/functionidentifier.tgz"
data_dir = tf.keras.utils.get_file(origin = dataset_url,
                                   fname = "functionidentifier",
                                   untar = True)
data_dir = pathlib.Path(data_dir)
It was working just last week, but it suddenly broke.
Some things I tried:
Changing the extension of my training set from .tar.gz to .zip, .tar.tar, and .tgz.
Unit testing (all other parts of the code are fine; they're just importing stuff anyway).
Running the exact same code in a new Google Colab notebook and a Jupyter notebook. Neither worked.
Checking the documentation. My code is verbatim the same! The only thing that's different is literally my URL. That's it.
Nothing worked. And no, I can't manually download the whole training set and do this locally (big-time hassle).
MWE (Full Code)
pip install tensorflow
pip install numpy
pip install matplotlib
!git clone https://github.com/Refath/SinusoidalAnalyzer.git
import tensorflow as tf
from tensorflow import keras
from keras.models import Sequential
from keras.layers import Dense, Flatten, Conv2D, MaxPooling2D, Dropout
from tensorflow.keras import layers
from tensorflow.keras.utils import to_categorical
import numpy as np
import matplotlib.pyplot as plt
plt.style.use('fivethirtyeight')
import pathlib
dataset_url = "https://barisciencelab.tech/TrainingSet.tar.gz"
data_dir = tf.keras.utils.get_file(origin = dataset_url, fname = "TrainingSet", untar = True)
data_dir = pathlib.Path(data_dir)
Folks in the SO Chat valiantly tried to debug the issue as well, but to no avail. If anyone can help, that would be great.
Have you tried installing a previous version of TF in the Colab notebook?
pip install tensorflow==2.x
import tensorflow as tf
print(tf.__version__)
Or maybe your URL is no longer reachable from the Colab VM. Try to ping its IP; you can run terminal commands with !ping "IP".
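If ping is awkward to run, a rough Python-side reachability check is another option; this is an HTTP request rather than a real ping, and it uses the URL from the question:
import urllib.request

url = "https://barisciencelab.tech/functionidentifier.tgz"
try:
    with urllib.request.urlopen(url, timeout=10) as resp:
        print("reachable, HTTP status:", resp.status)
except Exception as e:
    # an HTTPError (e.g. 403/406) still means the host answered; a timeout or DNS error means it did not
    print("request failed:", e)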
For anyone in the future battling with this issue:
I came across this problem as well. I did some digging and found that the problem was that I didn't have the right certificates installed for Python3. Fairly easy fix on a Mac - run the file located at Applications>Python 3.X>Install Certificates.command and that should do the trick.
Not sure if this problem pops up on other OSs or not but I'd imagine the fix is similar.
Took me three solid days. In the end, here's what worked:
import os
import pathlib
!wget https://barisciencelab.tech/FunctionIdentifier.zip
!unzip FunctionIdentifier.zip
PATH = os.path.join(os.path.dirname('FunctionIdentifier.zip'), 'FunctionIdentifier')
data_dir = pathlib.Path(PATH)
Here's the output: a listing of the extracted files, which of course goes on for 30,000 lines.
How did I come up with this solution? After combing through the web, I stumbled upon the 403 Forbidden code, different from my 406 Not Acceptable code, but nonetheless worth a look. I ported that solution over with some modifications (i.e., I define the training/test/validation set parameters separately and feed PATH into data_dir).
Also, thanks to @Kevin and @AndrasDeak for spending a significant amount of time helping me debug.
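If you then want to feed the extracted folder into Keras, here is a hedged sketch using image_dataset_from_directory (in older TF 2.x it lives under tf.keras.preprocessing instead of tf.keras.utils). It assumes the archive unpacks into one sub-directory per class, and the image size and batch size are just placeholders:
import pathlib
import tensorflow as tf

data_dir = pathlib.Path("FunctionIdentifier")

# expects one sub-directory per class under data_dir; labels are inferred from the directory names
train_ds = tf.keras.utils.image_dataset_from_directory(
    data_dir,
    validation_split=0.2,
    subset="training",
    seed=123,
    image_size=(180, 180),
    batch_size=32)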
I am very new to using GitHub. I have installed Git on Ubuntu 16.04, along with Python 2.7.12, TensorFlow 1.9, and Keras. I want to use my own custom activation and optimizer in a Keras RNN. I searched the web and learned that I need to install the keras-contrib package to use advanced activations and custom activation functions.
So I installed keras-contrib from GitHub, but I don't know how to work with it or how to run a program that uses it.
I tried the following commands:
git clone https://www.github.com/keras-team/keras-contrib.git
cd keras-contrib
python setup.py install
Then I tried the following code:
from keras.models import Sequential
from keras.layers import Dense
import numpy as np
from keras_contrib.layers.advanced_activations import PELU
It shows the following error:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "keras_contrib/__init__.py", line 4, in <module>
from . import layers
File "keras_contrib/layers/__init__.py", line 3, in <module>
from .convolutional import *
File "keras_contrib/layers/convolutional.py", line 15, in <module>
from keras.utils.conv_utils import normalize_data_format
ImportError: cannot import name normalize_data_format
Could anyone please check this error and help me sort it out?
I updated the keras-contrib source code installed on my Linux machine, following these changes:
https://github.com/ekholabs/keras-contrib/commit/0dac2da8a19f34946448121c6b9c8535bfb22ce2
Now, it works well.
I had the same problem. I installed Keras version 2.2.2 using the following command, and the problem was solved.
pip install -q keras==2.2.2
Refer to this PR:
https://github.com/keras-team/keras-contrib/pull/292
Had the same issue. The problem is that the normalize_data_format function was moved from keras.utils.conv_utils to keras.backend.common in later versions of Keras. You can use
import keras
and then in your code use
keras.utils.conv_utils.normalize_data_format
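Putting the two locations mentioned in these answers together, one way to sketch a version-tolerant import (which branch succeeds depends on your Keras version):
# try the old location first, then fall back to the newer one
try:
    from keras.utils.conv_utils import normalize_data_format
except ImportError:
    from keras.backend.common import normalize_data_format

print(normalize_data_format(None))  # typically prints 'channels_last'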
I found that in Keras version 2.6.0 the normalize function is not lost; it is just "stored" in the file np_utils.py, so all we need to do is change
from keras.utils import normalize
to
from keras.utils.np_utils import normalize
It must be because the keras_contrib you have downloaded is not compatible with the updated version of Keras. Check this link: https://github.com/keras-team/keras/blob/master/keras/utils/conv_utils.py
There is no function there called normalize_data_format; that is where the error is thrown.
Regarding "It must be because the keras_contrib you have downloaded is not compatible with updated version of keras": it does not work...
This bug is reported and fixed here: https://github.com/keras-team/keras-contrib/issues/291
On my Windows 10 system and in Colaboratory, using Python 3.7, I solved this problem by updating Keras and installing the git version of keras-contrib:
pip install -q keras==2.2.2
pip install git+https://www.github.com/keras-team/keras-contrib.git
Check your Keras version with
import keras
print(keras.__version__)
I had the same problem. I solved it by using this:
from tensorflow.keras.utils import normalize
instead of :
from keras.utils import normalize
TensorFlow MNIST example not running with fully_connected_feed.py
I checked this out and realized that input_data was not built in, so I downloaded the whole folder from here. How can I start the tutorial? I tried:
import input_data
mnist = input_data.read_data_sets("MNIST_data/", one_hot=True)
---------------------------------------------------------------------------
ImportError Traceback (most recent call last)
<ipython-input-6-a5af65173c89> in <module>()
----> 1 import input_data
2 mnist = tf.input_data.read_data_sets("MNIST_data/", one_hot=True)
ImportError: No module named input_data
I'm using IPython (Jupyter), so do I need to change my working directory to the folder I downloaded? Or can I add it to my tensorflow directory? If so, where do I add the files? I installed tensorflow with pip (on macOS), and its current location is ~/anaconda/lib/python2.7/site-packages/tensorflow/__init__.py.
Are these files meant to be accessed directly through tensorflow, like the sklearn datasets? Or am I just supposed to cd into the directory and work from there? The example is not clear.
EDIT:
This post is very outdated.
So let's assume that you are in the directory: /somePath/tensorflow/tutorial (and this is your working directory).
All you need to do is download the input_data.py file and place it there. Let's say the file in which you invoke:
import input_data
mnist = input_data.read_data_sets("MNIST_data/", one_hot=True)
...
is main.py and it is also in the same directory.
Once this is done, you can just run main.py, which will download the files and put them in the MNIST_data folder (once they are there, the script will not download them again next time).
The old tutorial said, to import the MNIST data, use:
import input_data
mnist = input_data.read_data_sets('MNIST_data', one_hot=True)
This will cause the error.
The new tutorial uses the following code to do so:
from tensorflow.examples.tutorials.mnist import input_data
mnist = input_data.read_data_sets("MNIST_data", one_hot=True)
And this works well.
I am using a different version (following Install on Windows with Docker here) and had a similar problem.
An easy workaround I found was:
1. From the Linux command line, figure out where input_data.py is on the Docker image (in your case you mentioned that you had to download it manually; in my case, it was already there). I used the following Linux command:
$ sudo find . -print | grep -i '.*[.]py'
I got the files and paths:
./tensorflow/g3doc/tutorials/mnist/mnist.py
./tensorflow/g3doc/tutorials/mnist/input_data.py
2. Launch Python and type the following commands using sys:
>> import sys
>> print(sys.path)
3. You will get the existing paths:
['', '/usr/lib/python2.7', '/usr/lib/python2.7/plat-x86_64-linux-gnu', '/usr/lib/python2.7/lib-tk', '/usr/lib/python2.7/lib-old', '/usr/lib/python2.7/lib-dynload', '/usr/local/lib/python2.7/dist-packages', '/usr/lib/python2.7/dist-packages', '/usr/lib/python2.7/dist-packages/PILcompat']
4. Add the path of input_data.py:
>> sys.path.insert(1,'/tensorflow/tensorflow/g3doc/tutorials/mnist')
Hope that helps. If you find a better option, let me know. :)
"How can I start the tutorial?"
I didn't download the folder you did, but I installed tensorflow via pip and then had a similar problem.
My workaround was to replace
import tensorflow.examples.tutorials.mnist.input_data
with
import tensorflow.examples.tutorials.mnist.input_data as input_data
If you're using Tensorflow 2.0 or higher, you need to install tensorflow_datasets first:
pip install tensorflow_datasets
or if you're using an Anaconda distribution:
conda install tensorflow_datasets
from the command line.
If you're using a Jupyter Notebook, you will need to install and enable ipywidgets. According to the docs (https://ipywidgets.readthedocs.io/en/stable/user_install.html), using pip:
pip install ipywidgets
jupyter nbextension enable --py widgetsnbextension
If you're using an Anaconda distribution, install ipywidgets from the command line like so:
conda install -c conda-forge ipywidgets
With the Anaconda distribution there is no need to enable the extension; conda handles this for you.
Then import into your code:
import tensorflow_datasets as tfds
mnist = tfds.load(name='mnist')
You should be able to use it without error if you follow these instructions.
I might be a bit late, but for TensorFlow version 0.12.1, you might want to use input_data.read_data_sets instead.
Basically, use this function to load the data from your local drive, which you downloaded from http://yann.lecun.com/exdb/mnist/.
from tensorflow.examples.tutorials.mnist import input_data
mnist = input_data.read_data_sets('data_set/')
For TensorFlow API 2.0, the MNIST data moved to tf.keras.datasets.mnist.load_data.
There's now a much easier way to load MNIST data into TensorFlow without having to download it yourself, by using TensorFlow 2 and TensorFlow Datasets.
To get started, make sure you import TensorFlow and specify version 2:
%tensorflow_version 2.x
import tensorflow as tf
Then load the data into a dictionary using the following code:
import tensorflow_datasets as tfds
MNIST_data = tfds.load(name = "mnist")
Then split the data into train and test sets:
train, test = MNIST_data['train'] , MNIST_data['test']
Now you can use these data generators however you like.
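For example, a minimal input pipeline on top of those generators could look like this; the map/shuffle/batch settings are just illustrative, and tf.data.AUTOTUNE needs a reasonably recent TF 2.x:
import tensorflow as tf
import tensorflow_datasets as tfds

MNIST_data = tfds.load(name="mnist")
train, test = MNIST_data['train'], MNIST_data['test']

# each element is a dict with 'image' and 'label'; convert to (image, label) pairs
def to_pair(example):
    image = tf.cast(example['image'], tf.float32) / 255.0
    return image, example['label']

train_pipeline = train.map(to_pair).shuffle(10000).batch(32).prefetch(tf.data.AUTOTUNE)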
Remove the lines:
from tensorflow.examples.tutorials.mnist import input_data
fashion_mnist = input_data.read_data_sets('input/data',one_hot=True)
and the line below will suffice:
fashion_mnist = keras.datasets.fashion_mnist
Note that if the dataset is not already available locally, Keras will download it, which solves the problem. :)
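For reference, the follow-up call that actually fetches (on first use) and returns the arrays looks like this; it is the standard Keras datasets API rather than anything specific to this answer:
from tensorflow import keras

fashion_mnist = keras.datasets.fashion_mnist

# downloads the data on first use, then loads it from the local Keras cache
(train_images, train_labels), (test_images, test_labels) = fashion_mnist.load_data()
print(train_images.shape)  # (60000, 28, 28)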
cd your_mnist_dir &&\
wget https://github.com/HIPS/hypergrad/raw/master/data/mnist/mnist_data.pkl &&\
wget https://github.com/HIPS/hypergrad/raw/master/data/mnist/t10k-images-idx3-ubyte.gz &&\
wget https://github.com/HIPS/hypergrad/raw/master/data/mnist/t10k-labels-idx1-ubyte.gz &&\
wget https://github.com/HIPS/hypergrad/raw/master/data/mnist/train-images-idx3-ubyte.gz &&\
wget https://github.com/HIPS/hypergrad/raw/master/data/mnist/train-labels-idx1-ubyte.gz
MNIST input_data was built in; it's just not an individual module, it's inside the TensorFlow package. Try:
from tensorflow.examples.tutorials.mnist import input_data
The MNIST data set is included as part of the TensorFlow examples tutorials. If we want to use it:
Import the MNIST data to identify handwritten digits:
from tensorflow.examples.tutorials.mnist import input_data
mnist = input_data.read_data_sets("MNIST data", one_hot=True)
As shown on the official TensorFlow website, all MNIST data is hosted at http://yann.lecun.com/exdb/mnist/.
For Tensorflow API above 2.0, to use MNIST dataset following command can be used,
import tensorflow_datasets as tfds
data = tfds.load(name = "mnist")
The following steps work perfectly in my Notebook:
Step 1: get the Python files from GitHub:
!git clone https://github.com/tensorflow/tensorflow.git
Step 2: append these files to the Python path:
import sys
sys.path.append('/content/tensorflow/tensorflow/examples/tutorials/mnist')
Step 3: load the MNIST data with the 'input_data' function:
import input_data
mnist = input_data.read_data_sets("MNIST_data/", one_hot=True)
That's all!
I am able to run the Deep MNIST Example fine, but when running fully_connected_feed.py, I am getting the following error:
File "fully_connected_feed.py", line 19, in <module>
from tensorflow.g3doc.tutorials.mnist import input_data ImportError: No module named
g3doc.tutorials.mnist
I am new to Python, so it could also just be a general setup problem.
This is a Python path issue. Assuming that the directory tensorflow/g3doc/tutorials/mnist is your current working directory (or in your Python path), the easiest way to resolve it is to change the following lines in fully_connected_feed.py from:
from tensorflow.g3doc.tutorials.mnist import input_data
from tensorflow.g3doc.tutorials.mnist import mnist
...to:
import input_data
import mnist
Another alternative is to link the 'g3doc' directory from the GitHub repo into the tensorflow Python wheel folder. That way you don't need to change the code.
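A sketch of what that linking could look like; both paths are illustrative (the site-packages path is the one mentioned earlier in the question, and the clone location is wherever you checked out the repo):
# illustrative paths only: adjust to your clone location and your installed tensorflow package
ln -s /path/to/tensorflow-repo/tensorflow/g3doc \
      ~/anaconda/lib/python2.7/site-packages/tensorflow/g3doc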