IsADirectoryError: [Errno 21] importing data - python

I am trying to access my data on my Drive to train the model, but I received this error. Can anyone please give me some advice?
import matplotlib.pyplot as plt
import numpy as np
import tensorflow as tf
import pandas as pd
import seaborn as sns
import pickle
import random
with open("/content/drive/MyDrive/traffic_light_images/Train", mode='rb') as training_data:
train=pickle.load(training_data)
IsADirectoryError Traceback (most recent call last)
<ipython-input-10-e2ad2e076f9c> in <module>()
----> 1 with open("/content/drive/MyDrive/traffic_light_images/Train", mode='rb') as training_data:
2 train=pickle.load(training_data)
IsADirectoryError: [Errno 21] Is a directory: '/content/drive/MyDrive/traffic_light_images/Train'

The error is raised since you are passing a path to a directory (/content/drive/MyDrive/traffic_light_images/Train) instead of a file. Make sure to pass the path to the file instead of the directory.

You should append the file name and its extension to the directory, for example "/content/drive/MyDrive/traffic_light_images/Train/train_data.csv". Only then does the path refer to a file rather than a directory.
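If you are not sure which file inside the Train directory actually holds the pickled data, a minimal sketch like the following can help; the name train_data.p below is only a placeholder for whatever file is really in that folder:

import os
import pickle

train_dir = "/content/drive/MyDrive/traffic_light_images/Train"

# First list what is actually inside the directory
print(os.listdir(train_dir))

# Then open one concrete file from that listing (train_data.p is a placeholder name)
with open(os.path.join(train_dir, "train_data.p"), mode='rb') as training_data:
    train = pickle.load(training_data)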

Related

OSError: [Errno 36] File name too long: for python package and .txt file, pandas opening

I get the error OSError: [Errno 36] File name too long for the following code:
from importlib_resources import open_text
import pandas as pd
with open_text('package.data', 'librebook.txt') as f:
    input_file = f.read()
dataset = pd.read_csv(input_file)
This is on Ubuntu 20.04, inside a Python package (the code lives in __init__.py).
I don't want to use .readlines().
Can I structure this code differently so that this doesn't happen? Do I need to modify my OS? Some of the help I found suggested changing OS settings, but I don't want to do that if I don't need to. Thank you.
Why not just pass in the name of the file rather than its contents? f.read() returns the whole text of librebook.txt, and pd.read_csv then treats that huge string as a file name, which is why you get "File name too long".
dataset = pd.read_csv('librebook.txt')
from importlib_resources import path
import pandas as pd
with path('package.data', 'librebook.txt') as f:
    dataset = pd.read_csv(f)
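If you do want to keep the file contents in memory rather than pass a path, another option, assuming librebook.txt is a regular CSV file, is to wrap the string in io.StringIO so pandas reads it as a buffer instead of a file name:

import io
import pandas as pd
from importlib_resources import open_text

with open_text('package.data', 'librebook.txt') as f:
    contents = f.read()

# StringIO makes pandas treat the string as file-like data, not as a path
dataset = pd.read_csv(io.StringIO(contents))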

No such file or directory: 'final_data_1.npy'

I am trying this code using tensorflow and numpy. However, I am getting an error.
import numpy as np
from tensorflow.python.framework import ops
np.random.seed(1)
ops.reset_default_graph()
ops.reset_default_graph()
#final_data_1 and 2 are the numpy array files for the images in the folder img and annotations.csv file
#total of 5 GB due to conversion of values to int
Z2= np.load('final_data_1.npy')
Z1= np.load('final_data_2.npy')
print(Z2[:,0])
print(Z1.shape)
My error is:
FileNotFoundError: [Errno 2] No such file or directory: 'final_data_1.npy'
Can you suggest a solution?
As the error message implies, you have to give the path to the directory where the file "final_data_1.npy" is actually located:
Example
import pandas as pd
df = pd.read_csv("./Path/where/you/stored/table/data.csv")
print(df)
The same goes for the np.load() function: you have to include the directory of the file, e.g.
np.load('./User/Desktop/final_data_1.npy')
Without the directory, Python only looks for "final_data_1.npy" in the current working directory, so it doesn't know where the file is.
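A quick way to debug this kind of error is to print the current working directory and check that the file exists before loading it; the absolute path below is only a placeholder for wherever the file really lives:

import os
import numpy as np

print(os.getcwd())  # relative names like 'final_data_1.npy' are resolved from here

path = '/full/path/to/final_data_1.npy'  # placeholder: use the real location of the file
if os.path.exists(path):
    Z2 = np.load(path)
    print(Z2.shape)
else:
    print(f"File not found: {path}")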

can't read directory using listdir command in google colab

Let us suppose I have a lot of images located in this directory:
directory = '/content/drive/My Drive/Colab Notebooks/GAN network/Celebra/img_align_celeba'
I have done the following steps in order to read and show all the images.
Step 1: mount the drive
from google.colab import drive
drive.mount('/content/drive')
Step 2: create a directory variable
directory ='/content/drive/My Drive/Colab Notebooks/GAN network/Celebra/img_align_celeba'
Step 3: create basic code for reading the files
from os import listdir
from numpy import asarray
from PIL import Image
import matplotlib.pyplot as plt
def load_image(filename):
    image = Image.open(filename)
    image = image.convert('RGB')
    pixels = asarray(image)
    return pixels

def load_faces(directory, n_faces):
    faces = list()
    for filename in listdir(directory):
        pixesl = load_image(directory + filename)
        faces.append(pixels)
        if len(faces) >= n_faces:
            break
    return asarray(faces)

def plot_faces(faces, n):
    for i in range(n*n):
        plt.subplot(n, n, 1+i)
        plt.axis('off')
        plt.imshow(faces[i])
    plt.show()
And the final step is to check the program:
faces = load_faces(directory, 25)
print('Loaded: ', faces.shape)
plot_faces(faces, 5)
But it gives me the following error:
---------------------------------------------------------------------------
OSError Traceback (most recent call last)
<ipython-input-9-3392f98fc252> in <module>()
----> 1 faces =load_faces(directory,25)
2 print('Loaded: ', faces.shape)
3 plot_faces(faces,5)
<ipython-input-8-99365c573bbf> in load_faces(directory, n_faces)
10 def load_faces(directory,n_faces):
11 faces =list()
---> 12 for filename in listdir(directory):
13 pixesl =load_image(directory + filename)
14 faces.append(pixels)
OSError: [Errno 5] Input/output error: '/content/drive/My Drive/Colab Notebooks/GAN network/Celebra/img_align_celeba'
Please help me clarify what is wrong.
I had problems with listdir too. My solutions were to force the drive mount and to use glob to list files in the folder.
import glob
from google.colab import drive
drive.mount('/gdrive', force_remount=True)
files = glob.glob("/gdrive/My Drive/path_to_folder/*")
for file in files:
    do_something(file)
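As a side note on the question's own code: directory does not end with a slash, so directory + filename builds a broken path, and the loop assigns to pixesl but appends pixels. A small sketch of the two loading functions with those issues addressed, using os.path.join, might look like this:

import os
from os import listdir
from numpy import asarray
from PIL import Image

def load_image(filename):
    image = Image.open(filename).convert('RGB')
    return asarray(image)

def load_faces(directory, n_faces):
    faces = list()
    for filename in listdir(directory):
        # os.path.join supplies the separator that 'directory + filename' was missing
        pixels = load_image(os.path.join(directory, filename))
        faces.append(pixels)
        if len(faces) >= n_faces:
            break
    return asarray(faces)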

Can't access directory Tensorflow Google Colab

Sorry, I'm new to TensorFlow 2.1 and Google Colab, and I don't understand why I have this error.
My code:
%tensorflow_version 2.x
import tensorflow as tf
from tensorflow import keras
print(tf.__version__)
import pathlib
import os
path_data_dir = tf.keras.utils.get_file(origin='https://www.kaggle.com/c/dogs-vs-cats/download/0iMGwZllApFLiU35zX78%2Fversions%2Fm5lLqMS0KLfxJUozn3gR%2Ffiles%2Ftrain.zip',fname='train',untar= True)
data_dir = pathlib.Path(path_data_dir)
entries = os.listdir(data_dir)
for entry in entries:
    print(entry)
And I have this error (I tried to mount a Google Drive folder and I do have access):
FileNotFoundError Traceback (most recent call last)
<ipython-input-1-88f88035f225> in <module>()
12 data_dir = pathlib.Path(path_data_dir)
13
---> 14 entries = os.listdir(data_dir)
15 for entry in entries:
16 print(entry)
FileNotFoundError: [Errno 2] No such file or directory: '/root/.keras/datasets/train'
Thanks a lot for your help
Lily
I am assuming this is because of the different file system structure between a normal Linux machine and the runtime hosted by Google Colab.
As a workaround, pass the cache_dir='/content' argument to the get_file function, as follows:
path_data_dir = tf.keras.utils.get_file(origin='https://www.kaggle.com/c/dogs-vs-cats/download/0iMGwZllApFLiU35zX78%2Fversions%2Fm5lLqMS0KLfxJUozn3gR%2Ffiles%2Ftrain.zip', fname='train', untar=True, cache_dir='/content')
Be aware that the returned value path_data_dir is the full path to the downloaded file, so the call os.listdir(data_dir) will fail because data_dir points to a file and not a directory.
To fix this, change entries = os.listdir(data_dir) to entries = os.listdir(data_dir.parent)
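Putting those two fixes together, the relevant lines would look roughly like this (keeping the question's Kaggle URL, which may itself need authentication to actually download):

import os
import pathlib
import tensorflow as tf

path_data_dir = tf.keras.utils.get_file(
    origin='https://www.kaggle.com/c/dogs-vs-cats/download/0iMGwZllApFLiU35zX78%2Fversions%2Fm5lLqMS0KLfxJUozn3gR%2Ffiles%2Ftrain.zip',
    fname='train',
    untar=True,
    cache_dir='/content')  # cache under /content instead of /root/.keras

data_dir = pathlib.Path(path_data_dir)

# get_file returns the path to the downloaded file, so list its parent directory
for entry in os.listdir(data_dir.parent):
    print(entry)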
I think the download link is simply bad in the end... On Google Colab I couldn't inspect the downloaded file properly (because I can't browse the folders), but I tried again later on my own computer and it really is just the link.

FileNotFoundError: [Errno 2] I can't figure out why my file path does not exist

I am using Python to write some code in a TensorFlow Google Colab notebook.
I am stuck with this error:
FileNotFoundError: [Errno 2] File b'/home/brandon/Desktop/AnomalyDetection/Code/train/Y_10KHz_left.csv' does not exist: b'/home/brandon/Desktop/AnomalyDetection/Code/train/Y_10KHz_left.csv'
Here is the problematic snippet of code:
from __future__ import absolute_import, division, print_function, unicode_literals
import functools
import numpy as np
import tensorflow as tf
import os
import pandas as pd
import matplotlib.pyplot as plt
import sklearn
#load the data from local file into a dataframe
path = '/home/brandon/Desktop/AnomalyDetection/Code/train/Y_10KHz_left.csv'
df = pd.read_csv(path)
df.head()
And just to confirm I do have the right path,
(base) brandon@brandon:~/Desktop/AnomalyDetection/Code/train$ find $PWD -type f | grep "Y_10KHz_left.csv"
/home/brandon/Desktop/AnomalyDetection/Code/train/Y_10KHz_left.csv
You should check for the existence of the file before attempting to read from it.
import os
import pandas as pd

# load the data from local file into a dataframe
file_path = '/home/brandon/Desktop/AnomalyDetection/Code/train/Y_10KHz_left.csv'
if os.path.exists(file_path):
    df = pd.read_csv(file_path)
    df.head()
else:
    print(f"Unable to find the file at {file_path}")
Try this:
import os
import pandas as pd

# load the data from local file into a dataframe
file_full_path = '/home/brandon/Desktop/AnomalyDetection/Code/train/Y_10KHz_left.csv'
if file_full_path:
    file_path, file_name = os.path.split(file_full_path)
    print(file_path, file_name)
    try:
        data = pd.read_csv(os.path.join(file_path, file_name))
        print(data)
    except Exception as e:
        print(f"error: {e}")
