Python: Load numpy arrays (.npy) with specific columns

I have saved arrays as .npy files with sizes around 2 GB. Can I somehow load only specific columns or rows with numpy.load? I did not find a command for that. Is there a workaround for this case?

This is not directly possible with .npy files. For that kind of problem, it is recommended to use .h5 files with the h5py package. You will find an example in this post: h5py: how to read selected rows of an hdf5 file?.
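As a sketch of that approach (the file and dataset names here are assumptions, not part of the question): convert the .npy array once to HDF5, then slice only the rows or columns you need:

import numpy as np
import h5py

# One-time conversion: copy the existing .npy array into an HDF5 dataset.
# 'data.npy', 'data.h5' and 'mydata' are placeholder names.
arr = np.load('data.npy')
with h5py.File('data.h5', 'w') as f:
    f.create_dataset('mydata', data=arr)

# Later: read only what you need; h5py slices on disk, so the
# full 2 GB array is never loaded into memory.
with h5py.File('data.h5', 'r') as f:
    some_rows = f['mydata'][10:20]       # rows 10..19
    some_cols = f['mydata'][:, [0, 3]]   # columns 0 and 3 (index list must be increasing)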

Related

Handling large numpy array in tensorflow with regression output (51 outputs)

I have a very large dataset: a single .npy file that contains around 1.5M elements, each a 150x150x3 image. The output has 51 columns (51 outputs). Since the dataset can't fit into memory, how do I load it and use it to fit the model? An efficient way would be using TFRecords and tf.data, but I couldn't understand how to do this. I would appreciate the help. Thank you.
One way is to load your NPY file fragment by fragment (to feed your neural network with) rather than loading it into memory all at once. You can use numpy.load as normal and specify the mmap_mode keyword so that the array is kept on disk, and only the necessary bits are loaded into memory upon access (more details here):
numpy.load(file, mmap_mode=None, allow_pickle=False, fix_imports=True, encoding='ASCII')
Memory-mapped files are used for accessing small segments of large files on disk, without reading the entire file into memory. NumPy's memmaps are array-like objects. This differs from Python's mmap module, which uses file-like objects.
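As a minimal sketch (the file name and batch size are assumptions), iterating over the array in batches without ever holding it all in memory:

import numpy as np

# Open the array on disk; nothing is read into memory yet.
# 'images.npy' is a placeholder for your 1.5M x 150x150x3 file.
data = np.load('images.npy', mmap_mode='r')

batch_size = 64
for start in range(0, data.shape[0], batch_size):
    batch = np.asarray(data[start:start + batch_size])  # only this slice is read from disk
    # ... feed `batch` to your model here ...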
If you want to know how to create a TFRecords file from a numpy array, and then read it back using the Dataset API, this link provides a good answer.

Sparse matrix in npz format in Python

I have a sparse matrix in numpy's .npz format. I know that to read this matrix I need to use scipy.sparse.load_npz(), but I would like to understand its internals.
I see in the preview of the .npz file that it contains the following 5 parts:
data
format
indices
indptr
shape
How can I better understand this file format?
An .npz file is a simple ZIP archive that contains numpy (.npy) files. A brief review of the internal structure of ZIP can be found here: http://en.wikipedia.org/wiki/ZIP_(file_format)
Here are the docs:
Format of .npz files: https://docs.scipy.org/doc/numpy/reference/generated/numpy.savez.html
Format of .npy files: http://pyopengl.sourceforge.net/pydoc/numpy.lib.format.html
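Those five members are exactly the components scipy uses to rebuild the matrix (for CSR/CSC: the non-zero values, the column indices, the row pointers, and the overall shape). A sketch of inspecting and reconstructing it by hand, assuming the file was written by scipy.sparse.save_npz as 'sparse.npz':

import zipfile
import numpy as np
import scipy.sparse

# An .npz file is just a ZIP of .npy files; list its members.
with zipfile.ZipFile('sparse.npz') as zf:
    print(zf.namelist())  # ['data.npy', 'format.npy', 'indices.npy', 'indptr.npy', 'shape.npy']

# Load the raw arrays and rebuild the matrix by hand
# (scipy.sparse.load_npz does essentially this internally).
npz = np.load('sparse.npz')
fmt = npz['format'].item()
if isinstance(fmt, bytes):  # stored as ASCII bytes in some scipy versions
    fmt = fmt.decode('ascii')
if fmt == 'csr':
    matrix = scipy.sparse.csr_matrix(
        (npz['data'], npz['indices'], npz['indptr']),
        shape=tuple(npz['shape']))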

Saving numpy array such that it is readily available without loading

I have a 20 GB library of images stored as a high-dimensional numpy array. This library allows me to use these images without having to generate them anew each time. Now my problem is that np.load("mylibrary") takes about as much time as it would take to generate a couple of those images. Therefore my question is: is there a way to store a numpy array such that it is readily accessible without having to load it?
Edit: I am using PyCharm
I would suggest h5py, which is a Pythonic interface to the HDF5 binary data format.
It lets you store huge amounts of numerical data, and easily manipulate that data from NumPy. For example, you can slice into multi-terabyte datasets stored on disk, as if they were real NumPy arrays. Thousands of datasets can be stored in a single file, categorized and tagged however you want.
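As a hedged sketch (the file and dataset names, and the one-time conversion step, are assumptions), the 20 GB library could be exported once to HDF5 and then sliced on demand:

import numpy as np
import h5py

# One-time export; names and the auto-chunking choice are placeholders.
# mmap_mode='r' keeps the 20 GB source on disk while it is copied.
library = np.load('mylibrary.npy', mmap_mode='r')
with h5py.File('library.h5', 'w') as f:
    f.create_dataset('images', data=library, chunks=True)

# Afterwards the file opens instantly and you slice only what you need.
with h5py.File('library.h5', 'r') as f:
    img = f['images'][42]          # a single image
    batch = f['images'][100:200]   # a contiguous batch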
You can also use PyTables. It is another HDF5 interface for Python and NumPy.
PyTables is a package for managing hierarchical datasets, designed to efficiently and easily cope with extremely large amounts of data. You can download PyTables and use it for free. You can access documentation, some examples of use, and presentations here.
numpy.memmap is another option. It would, however, be slower than HDF5. One more caveat: on 32-bit systems, memory-mapped files cannot be larger than about 2 GB.
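A minimal numpy.memmap sketch (file name, shape, and dtype are placeholder assumptions); opening the file is effectively instant because nothing is read until you index into it:

import numpy as np

# Shape and dtype must be known up front for a raw memmap;
# 'library.dat' and the shape are placeholders.
shape = (100000, 150, 150, 3)

# Create once and fill; writes go straight to disk.
mm = np.memmap('library.dat', dtype=np.uint8, mode='w+', shape=shape)
mm[0] = 255
mm.flush()

# Reopen later: effectively instant, nothing loaded into memory.
mm = np.memmap('library.dat', dtype=np.uint8, mode='r', shape=shape)
img = np.asarray(mm[42])  # only this image is read from disk

If the array was saved with np.save instead, np.load(..., mmap_mode='r') gives the same behaviour without you having to repeat the shape and dtype, since they are stored in the .npy header.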

Load numpy .npy files in OpenCV C++

I am trying to load data saved from Python as .npy files in OpenCV C++.
I found out about FileStorage XML/YML in OpenCV, but is there a direct way to do that?
Regards
SMW
There is a C++ .npy reader at https://github.com/rogersce/cnpy
.npy is documented at https://github.com/numpy/numpy/blob/master/doc/neps/npy-format.txt should you want to write your own reader (it doesn't look like a big job).
I know I am late, but you can use xtensor to load and save .npy files:
docs
github

How to export an HDF5 file to NumPy using h5py?

I have an existing HDF5 file with three arrays; I want to extract one of the arrays using h5py.
h5py already reads files in as numpy arrays, so just:
with h5py.File('the_filename', 'r') as f:
    my_array = f['array_name'][()]
The [()] means to read the entire array in; if you don't do that, it doesn't read the whole dataset but instead gives you lazy access to sub-parts (very useful when the array is huge but you only need a small part of it).
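For example, with the same (placeholder) file and dataset names:

import h5py

# Lazy access: read only a slice instead of the whole dataset.
with h5py.File('the_filename', 'r') as f:
    first_rows = f['array_name'][:10]    # just the first 10 rows
    one_column = f['array_name'][:, 0]   # or a single column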
For this question it is way overkill, but if you have a lot of things like this to do, I use the SpacePy package, which makes some of this easier.
datamodel.fromHDF5() (documentation) returns a dictionary of arrays, stored in a similar way to how h5py handles data.
