I'm still getting the hang of working with numpy and array-wise operations.
I'm looking for a way to get the row-wise average of a list of 2D arrays.
E.g. I have a 4x3x25 array and I'm looking to get a 3x25 array of the row-wise averages.
If everything’s in one 3D array already, you can just do:
A.mean(axis=0)
…which takes the mean along the first dimension, giving a 3x25 result.
If it’s actually just a list of 2D arrays, you’ll have to convert it to a 3D array first. I would do:
A = np.dstack(list_of_arrays) # Combine the 2D arrays along a new 3rd dimension
A.mean(axis=2) # Calculate the means along that new dimension
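A minimal sketch with made-up data (four 3x25 arrays), just to show the shapes:

import numpy as np
list_of_arrays = [np.random.rand(3, 25) for _ in range(4)]  # hypothetical example data
A = np.dstack(list_of_arrays)   # shape (3, 25, 4)
print(A.mean(axis=2).shape)     # (3, 25)
B = np.stack(list_of_arrays)    # alternative: one 3D array of shape (4, 3, 25)
print(B.mean(axis=0).shape)     # (3, 25), same values as above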
Suppose I have a numpy array A with shape (j,d,d) and I want to obtain an array with shape (j,), in which each entry is the determinant of the corresponding (d,d) slice.
I tried using np.apply_along_axis(np.linalg.det, axis=0, arr=A), but np.apply_along_axis only seems to work for 1D slices.
Is there an efficient way of doing that using only numpy?
np.linalg.det can already do this for an array of arbitrary shape, as long as the last two dimensions are square; see the np.linalg.det documentation for details.
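For instance, a quick sketch with random data (assuming a stack of five 4x4 matrices):

import numpy as np
A = np.random.rand(5, 4, 4)   # hypothetical (j, d, d) array with j=5, d=4
dets = np.linalg.det(A)       # one determinant per (d, d) slice
print(dets.shape)             # (5,)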
I think this is straightforward but I can't quite get it. I have a large 3d array and I want to reduce the 3rd dim by some factor and then sum the values to get to that reduced size. An example that works to get what I want is:
import numpy as np
arr=np.ones((10,10,16))
processed_data=np.zeros((arr.shape[0], arr.shape[1]), dtype='object')
factor=2
for i in range(arr.shape[0]):
    for j in range(arr.shape[1]):
        processed_data[i][j]=arr[i][j].reshape(int(arr.shape[2]/factor),-1).sum(axis=1)
So we take the last dimension, reshape it into an extra dimension, and then sum along that extra dimension. In the example above the data is a 10x10x16 array of all 1s, so with factor=2 we get a 10x10x8 array out with all the values being 2. I hope this illustrates what I am trying to achieve. If the factor changed to 4 we would get a 10x10x4 array out.
This method is not ideal as it involves creating a separate processed_data 'object' array, where I would rather keep a plain 3D array, just with a reduced third dimension. It also involves iterating over every element of the 2D array, which I don't think is necessary. And it's really slow.
Any help appreciated - I suspect it is a combination of reshaping and transposing but cannot get my head around it.
Thanks.
I think you can reshape the whole array at once and sum (with the factor of 2 hard-coded as the last reshape dimension):
arr.reshape(*arr.shape[:2], -1, 2).sum(axis=-1)
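As a quick check with the factor made explicit (a sketch reusing the 10x10x16 example from the question):

import numpy as np
arr = np.ones((10, 10, 16))
factor = 4
out = arr.reshape(*arr.shape[:2], -1, factor).sum(axis=-1)
print(out.shape)    # (10, 10, 4)
print(out[0, 0])    # [4. 4. 4. 4.]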
I want to calculate the column-wise median of a 2D array. I have a 2D array like arr = [[1,3,4],[5,7,9],[8,1,7]]
so the result I expect is [5,3,7].
I tried to get that array with the code
theory = np.median(arr)
but when I print out theory, only 4.67 is returned. I read in the numpy documentation that np.median can return an array. What is wrong with my code? How should I fix this?
This calculates the median along axis 0, i.e. it collapses the rows so that you get one median per column:
numpy.median(arr, axis=0)
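With the example array from the question:

import numpy as np
arr = np.array([[1, 3, 4], [5, 7, 9], [8, 1, 7]])
print(np.median(arr, axis=0))  # [5. 3. 7.]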
I got an np.ndarray with ~3000 trajectories. Each trajectory has x, y and z coordinates and a different length, between 150 and 250 points in time. Now I want to remove the z coordinate from all of these trajectories.
So arr.shape gives me (3000,) (3000 trajectories), and (for example) arr[0].shape yields (3,178) (three coordinate axes and 178 values).
I have found multiple explanations for removing lines in 2D-arrays and I found np.delete(arr[0], 2, axis=0) working for me. However, I don't just want to delete the z coordinates for the first trajectory; I want to do this for every trajectory.
If I want to do this with a loop for arr[i] I would need to know the exact length of every trajectory (It doesn't suit my purpose to just create the array with the length of the longest and fill it up with zeroes).
TL;DR: So how do I get from a ndarray with [amountOfTrajectories][3][value] to [amountOfTrajectories][2][value]?
The purpose is to use these trajectories as labels for a neural net that creates trajectories. So I guess it's an entirely new question, but is the shape I'm asking for suitable for use as labels in tensorflow?
Also: What would have been a better title and some terms to find results for this with google? I just started with Python and I'm afraid I'm missing some keywords here...
If this comes from loadmat, the source is probably a MATLAB workspace with a cell, which contains these matrices.
loadmat has evidently created a 1d array of object dtype (the equivalent of a cell, with squeeze_me on).
A 1d object array is similar to a Python list - it contains pointers to arrays elsewhere in memory. Most operations on such an array use Python iteration; iterating over the equivalent list (arr.tolist()) is usually faster.
alist = [a[:2,:] for a in arr]
should give you a list of arrays, each of shape (2, n) (n varying). This makes new arrays - but then so does np.delete.
You can't operate on all the arrays inside the 1d object array with a single vectorized operation; it has to be iterative.
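A small sketch with made-up trajectories (the exact lengths don't matter), just to show the iteration:

import numpy as np
# hypothetical 1d object array like the one loadmat produces
arr = np.empty(3, dtype=object)
arr[0] = np.random.rand(3, 178)
arr[1] = np.random.rand(3, 201)
arr[2] = np.random.rand(3, 150)
alist = [a[:2, :] for a in arr]   # drop the z row of every trajectory
print([a.shape for a in alist])   # [(2, 178), (2, 201), (2, 150)]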
I have a numpy ndarray object with the following shape:
(3, 256, 170, 256).
So, basically this represents an array of 3-dimensional vectors. The vector dimension comes first, so that one can write something like array[0] for the relevant vector component.
Now, I am trying to use the scipy pdist function, which computes the distance between the entries. So, I need to modify this array so that it can be represented as a two-dimensional matrix, where the number of rows is 256*170*256 and the number of columns is 3, and pdist should return the matrix where each element is the squared distance between the corresponding 3-dimensional vectors (if I have interpreted the documentation correctly).
Can someone tell me how I can get a view into this numpy array, so that I can generate this matrix. I do not want to copy the data again (as these matrices can be quite large), so looking for some efficient solutions.
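A sketch of the reshape described above, using random stand-in data: assuming the array is C-contiguous, a reshape followed by a transpose gives the (N, 3) layout as a view rather than a copy.

import numpy as np
A = np.random.rand(3, 256, 170, 256)   # stand-in for the real data
X = A.reshape(3, -1).T                 # shape (256*170*256, 3)
print(X.shape)                         # (11141120, 3)
print(np.shares_memory(A, X))          # True -- no data was copied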