Numpy append: Automatically cast an array of the wrong dimension - python

Is there a way to do the following without an if clause?
I'm reading a set of NetCDF files with pupynere and want to build up an array with numpy append. Sometimes the input data is multi-dimensional (see variable "a" below), sometimes one-dimensional ("b"), but the length of the last dimension is always the same (9 in the example below).
> import numpy as np
> a = np.arange(27).reshape(3,9)
> b = np.arange(9)
> a.shape
(3, 9)
> b.shape
(9,)
This works as expected:
> np.append(a,a, axis=0)
array([[ 0, 1, 2, 3, 4, 5, 6, 7, 8],
[ 9, 10, 11, 12, 13, 14, 15, 16, 17],
[18, 19, 20, 21, 22, 23, 24, 25, 26],
[ 0, 1, 2, 3, 4, 5, 6, 7, 8],
[ 9, 10, 11, 12, 13, 14, 15, 16, 17],
[18, 19, 20, 21, 22, 23, 24, 25, 26]])
But appending b does not work so elegantly:
> np.append(a,b, axis=0)
ValueError: arrays must have same number of dimensions
The problem with append is (from the numpy manual):
"When axis is specified, values must have the correct shape."
I'd have to cast first in order to get the right result.
> np.append(a,b.reshape(1,9), axis=0)
array([[ 0, 1, 2, 3, 4, 5, 6, 7, 8],
[ 9, 10, 11, 12, 13, 14, 15, 16, 17],
[18, 19, 20, 21, 22, 23, 24, 25, 26],
[ 0, 1, 2, 3, 4, 5, 6, 7, 8]])
So, in my file reading loop, I'm currently using an if clause like this:
for i in [a, b]:
    if np.size(i.shape) == 2:
        result = np.append(result, i, axis=0)
    else:
        result = np.append(result, i.reshape(1, 9), axis=0)
Is there a way to append "a" and "b" without the if statement?
EDIT: While @Sven answered the original question perfectly (using np.atleast_2d()), he (and others) pointed out that the code is inefficient. In an answer below, I combined their suggestions and replaced my original code. It should be much more efficient now. Thanks.

You can use numpy.atleast_2d():
result = np.append(result, np.atleast_2d(i), axis=0)
That said, note that the repeated use of numpy.append() is a very inefficient way to build a NumPy array -- it has to be reallocated in every step. If at all possible, preallocate the array with the desired final size and populate it afterwards using slicing.
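For example, a minimal sketch of that preallocation idea, reusing a, b and np from the question (the row count of 4 is assumed to be known in advance; in a real file-reading loop it would come from the file metadata):
n_rows = 4                                # assumed known up front: 3 rows from a, 1 from b
result = np.empty((n_rows, 9), dtype=a.dtype)
offset = 0
for i in [a, b]:
    block = np.atleast_2d(i)              # promotes the 1-D case to shape (1, 9)
    result[offset:offset + block.shape[0], :] = block
    offset += block.shape[0]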

You can just add all of the arrays to a list, then use np.vstack() to concatenate them all together at the end. This avoids constantly reallocating the growing array with every append.
|1> a = np.arange(27).reshape(3,9)
|2> b = np.arange(9)
|3> np.vstack([a,b])
array([[ 0, 1, 2, 3, 4, 5, 6, 7, 8],
[ 9, 10, 11, 12, 13, 14, 15, 16, 17],
[18, 19, 20, 21, 22, 23, 24, 25, 26],
[ 0, 1, 2, 3, 4, 5, 6, 7, 8]])

I'm going to improve my code with the help of @Sven, @Henry and @Robert. @Sven answered the question, so he earns the reputation for this question, but - as highlighted by him and others - there is a more efficient way of doing what I want.
This involves using a Python list, which supports appending in amortized O(1) time, whereas building the result with repeated numpy.append() calls costs O(N**2) overall, because the array is copied on every call. Afterwards, the list is converted to a numpy array in one step.
Suppose each chunk i read from a file has the shape of either a or b:
> a = np.arange(27).reshape(3,9)
> b = np.arange(9)
> a.shape
(3, 9)
> b.shape
(9,)
Initialise the list and append all the data as it is read, e.g. if the data appear in the order 'aaba':
> mList = []
> for i in [a, a, b, a]:
      mList.append(i)
Your mList will look like this:
> mList
[array([[ 0, 1, 2, 3, 4, 5, 6, 7, 8],
[ 9, 10, 11, 12, 13, 14, 15, 16, 17],
[18, 19, 20, 21, 22, 23, 24, 25, 26]]),
array([[ 0, 1, 2, 3, 4, 5, 6, 7, 8],
[ 9, 10, 11, 12, 13, 14, 15, 16, 17],
[18, 19, 20, 21, 22, 23, 24, 25, 26]]),
array([0, 1, 2, 3, 4, 5, 6, 7, 8]),
array([[ 0, 1, 2, 3, 4, 5, 6, 7, 8],
[ 9, 10, 11, 12, 13, 14, 15, 16, 17],
[18, 19, 20, 21, 22, 23, 24, 25, 26]])]
Finally, vstack the list to form a numpy array:
> result = np.vstack(mList[:])
> result.shape
(10, 9)
Thanks again for valuable help.

As pointed out, append needs to reallocate the numpy array on every call. An alternative solution that allocates only once would be something like this:
total_size = 0
for i in [a, b]:
    total_size += i.size

result = numpy.empty(total_size, dtype=a.dtype)
offset = 0
for i in [a, b]:
    # copy the flattened array into the preallocated block
    result[offset:offset + i.size] = i.ravel()
    offset += i.size

# if you know it's always divisible by 9:
result = result.reshape(result.size // 9, 9)
If you can't precompute the array size, then perhaps you can put an upper bound on the size and then just preallocate a block that will always be big enough. Then you can just make the result a view into that block:
result = result[0:known_final_size]
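For example, a sketch of that upper-bound variant, reusing the names from the code above (max_number_of_records is a placeholder for whatever bound your file-reading loop can guarantee, not a value from the question):
max_size = 9 * max_number_of_records             # hypothetical upper bound on the element count
block = numpy.empty(max_size, dtype=a.dtype)
offset = 0
for i in [a, b]:
    block[offset:offset + i.size] = i.ravel()
    offset += i.size
result = block[:offset].reshape(offset // 9, 9)  # a view into the filled part of the block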

Related

numpy: is double transposition necessary in this specific case?

I have an array
xx = np.arange(24).reshape(2, 12)
array([[ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11],
[12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23]])
and I would like to reshape it, to obtain
array([[[ 0, 1, 2, 3],
[12, 13, 14, 15]],
[[ 4, 5, 6, 7],
[16, 17, 18, 19]],
[[ 8, 9, 10, 11],
[20, 21, 22, 23]]])
I can achieve it via
xx.T.reshape(3, 4, 2).transpose(0, 2, 1)
But it has to be transposed twice, which seems unnecessary to me. So could somebody confirm that this is the only way of doing it, or provide a more readable solution otherwise?
Thanks!
It is possible to do a single transpose:
data = np.arange(24).reshape(2, 12)
data = data.reshape(2, 3, 4).transpose(1, 0, 2)
Edit:
I checked this using itertools.permutations and itertools.product:
import itertools
import numpy as np

data = np.arange(24).reshape(2, 12)
desired_data = np.array([[[ 0,  1,  2,  3],
                          [12, 13, 14, 15]],
                         [[ 4,  5,  6,  7],
                          [16, 17, 18, 19]],
                         [[ 8,  9, 10, 11],
                          [20, 21, 22, 23]]])

shapes = [2, 3, 4]
transpose_dims = [0, 1, 2]
shape_permutations = itertools.permutations(shapes)
transpose_permutations = itertools.permutations(transpose_dims)

for shape, transpose in itertools.product(
    list(shape_permutations),
    list(transpose_permutations),
):
    new_data = data.reshape(*shape).transpose(*transpose)
    try:
        match = np.allclose(new_data, desired_data)
    except ValueError:
        # incompatible shapes cannot be compared at all
        continue
    if match:
        break
print(f"{shape=}, {transpose=}")
shape=(2, 3, 4), transpose=(1, 0, 2)
I would do it this way: first, generate two arrays (shown separated for the sake of decomposition):
xx.reshape(2, -1, 4)
# Output:
# array([[[ 0, 1, 2, 3],
# [ 4, 5, 6, 7],
# [ 8, 9, 10, 11]],
#
# [[12, 13, 14, 15],
# [16, 17, 18, 19],
# [20, 21, 22, 23]]])
From here, I would then stack along the second dimension in order to combine them like you want:
np.stack(xx.reshape(2, -1, 4), axis=1)
# Output:
# array([[[ 0, 1, 2, 3],
# [12, 13, 14, 15]],
#
# [[ 4, 5, 6, 7],
# [16, 17, 18, 19]],
#
# [[ 8, 9, 10, 11],
# [20, 21, 22, 23]]])
You'd avoid the transposition. Hopefully it's more readable, but in the end, that's highly subjective, right? '^^
To add on top of @Paul's answer, there is some speedup from removing one of the transposes. The time gain is about 15%.
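If you want to verify that on your own machine, here is a quick timeit sketch (the actual numbers will depend on array size and hardware):
import timeit
import numpy as np

xx = np.arange(24).reshape(2, 12)
t_double = timeit.timeit(lambda: xx.T.reshape(3, 4, 2).transpose(0, 2, 1), number=100000)
t_single = timeit.timeit(lambda: xx.reshape(2, 3, 4).transpose(1, 0, 2), number=100000)
print(t_double, t_single)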

How to cast a list into an array with specific ordering of the elements in the array

If I have a list:
lst = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24]
I would like to cast the above list into an array with the following arrangement of the elements:
array([[ 1, 2, 3, 7, 8, 9]
[ 4, 5, 6, 10, 11, 12]
[13, 14, 15, 19, 20, 21]
[16, 17, 18, 22, 23, 24]])
How do I do this or what is the best way to do this? Many thanks.
I have done this in a crude way below, where I just get each sub-matrix and then concatenate all of them at the end:
np.array(results[arr.shape[0]*arr.shape[1]*0:arr.shape[0]*arr.shape[1]*1]).reshape(arr.shape[0], arr.shape[1])
array([[1, 2, 3],
[4, 5, 6]])
np.array(results[arr.shape[0]*arr.shape[1]*1:arr.shape[0]*arr.shape[1]*2]).reshape(arr.shape[0], arr.shape[1])
array([[ 7, 8, 9],
[ 10, 11, 12]])
etc.
But I will need a more generalized way of doing this (if there is one) as I will need to do this for an array of any size.
You could use the reshape function from numpy, with a bit of indexing:
a = np.arange(24)
>>> a
array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16,
17, 18, 19, 20, 21, 22, 23])
Using reshape and a bit of indexing:
a = a.reshape((8,3))
idx = np.arange(2)
idx = np.concatenate((idx,idx+4))
idx = np.ravel([idx,idx+2],'F')
b = a[idx,:].reshape((4,6))
Output:
>>> b
array([[ 0, 1, 2, 6, 7, 8],
[ 3, 4, 5, 9, 10, 11],
[12, 13, 14, 18, 19, 20],
[15, 16, 17, 21, 22, 23]])
Here the tuple (4, 6) passed to reshape indicates that you want your array to be two-dimensional, with 4 rows of 6 elements. Those values can be computed.
Then we compute the index that puts the data in the correct order. Obviously, this is the complicated bit here. As I'm not sure what you mean by "any size of data", it's difficult for me to give you an agnostic way to compute that index.
Obviously, if you are using a list and not an np.array, you may have to convert the list first, for example with np.array(your_list).
Edit:
I'm not sure if this is exactly what you are after, but this should work for any size that is evenly divisible by 12:
def custom_order(size):
    a = np.arange(size)
    a = a.reshape((size // 3, 3))
    idx = np.arange(2)
    idx = np.concatenate([idx + 4 * i for i in range(0, size // (6 * 2))])
    idx = np.ravel([idx, idx + 2], 'F')
    b = a[idx, :].reshape((size // 6, 6))
    return b
>>> custom_order(48)
array([[ 0, 1, 2, 6, 7, 8],
[ 3, 4, 5, 9, 10, 11],
[12, 13, 14, 18, 19, 20],
[15, 16, 17, 21, 22, 23],
[24, 25, 26, 30, 31, 32],
[27, 28, 29, 33, 34, 35],
[36, 37, 38, 42, 43, 44],
[39, 40, 41, 45, 46, 47]])
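Another way to look at the target layout, not from the answer above but perhaps easier to generalize: the desired output is a 2x2 grid of 2x3 blocks, which reshape and transpose can express directly (the block shape (2, 3) is read off the desired output and is an assumption here):
import numpy as np

lst = list(range(1, 25))
blocks = np.array(lst).reshape(2, 2, 2, 3)          # (block row, block col, row, col)
out = blocks.transpose(0, 2, 1, 3).reshape(4, 6)    # interleave block columns with rows
# out now matches the arrangement asked for in the question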

How To Index Range Of 2-D Numpy Array From Different Start Points Sourced From Other Array [duplicate]

This question already has answers here:
Selecting Random Windows from Multidimensional Numpy Array Rows
(2 answers)
Closed 3 years ago.
I am attempting to extract, and perform math on, a subset of a numpy array of shape (3,32). The subset I am attempting to extract has shape (3,9), and the ranges of this data start at the indices contained in another array of size (3). As an example, I have a data set of values from three channels operating in the time domain, and I extract the index of the max value of each channel into an array:
a = np.random.randint(20,size = (3,32))
a
array([[18, 3, 10, 6, 12, 1, 10, 8, 4, 11, 13, 14, 9, 9, 10, 2,
9, 0, 0, 16, 14, 19, 1, 19, 14, 19, 19, 2, 14, 0, 4, 18],
[ 9, 19, 2, 12, 0, 14, 18, 7, 3, 0, 7, 3, 12, 19, 4, 2,
5, 9, 2, 11, 15, 19, 16, 17, 3, 4, 17, 5, 6, 1, 2, 17],
[ 0, 11, 18, 8, 9, 2, 9, 15, 9, 6, 0, 8, 9, 16, 9, 6,
1, 19, 1, 9, 12, 8, 0, 0, 7, 15, 3, 14, 15, 8, 10, 19]])
b = np.argmax(a,1)
b
array([21, 1, 17], dtype=int64)
My goal at this point is to derive a new array consisting of the three values starting at each of the specified indices. For instance, I would be looking to extract:
[21,22,23] from a[0]
[1,2,3] from a[1]
[17,18,19] from a[2]
all into a new array of size [3,3]
I've been able to achieve this using loops already, but I suspect that there is a more efficient way of producing this result without loops (speed is a bit of an issue with this application). I have been able to get a similar result by manually populating a smaller index matrix ...
c = np.asarray([[1,2,3],[2,3,4],[3,4,5]])
a[np.arange(3)[:,None],c]
array([[ 3, 10, 6],
[ 2, 12, 0],
[ 8, 9, 2]])
However, given the dynamic nature of this application, I would like to write this such that it can be dynamically scaled (a range of indices out to 9 values beyond the root index, etc.). I just don't know if there is a way to do this. I have used syntax similar to the following in an effort to slice the array ...
a[np.arange(3)[:,None],b[:]:(b[:] + 2)]
resulting in error messages in the nature of ...
builtins.TypeError: only integer scalar arrays can be converted to a scalar index
Since you say you cannot overflow, this gets much less tricky. In general, since you have your starting indices, you can use basic broadcasting to create an (n, 3) shape array with your indices, and use take_along_axis to pull those elements from the original array.
np.take_along_axis(a, b[:, None] + np.arange(3), axis=1)
array([[19, 1, 19],
[19, 2, 12],
[19, 1, 9]])
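A sketch of how this might be parameterized for an arbitrary window length (the window variable and the clipping of the start indices are additions for illustration, not part of the answer above):
import numpy as np

window = 9                                    # hypothetical window length
a = np.random.randint(20, size=(3, 32))
b = np.argmax(a, 1)
# clip the start indices so a window never runs past the end of its row
start = np.clip(b, 0, a.shape[1] - window)
result = np.take_along_axis(a, start[:, None] + np.arange(window), axis=1)
result.shape    # (3, 9)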

numpy.where condition to select specific vertical columns in 2D array

Say we have
a = numpy.arange(25).reshape(5,5)
> array([[ 0, 1, 2, 3, 4],
[ 5, 6, 7, 8, 9],
[10, 11, 12, 13, 14],
[15, 16, 17, 18, 19],
[20, 21, 22, 23, 24]])
By going
numpy.where(a[1])
> array([0, 1, 2, 3, 4])
and then something like
a[1][numpy.where(a[1])]
> array([5, 6, 7, 8, 9])
I can select the horizontal rows of an array and their respective values. However, how can I use a similar where condition to select only specific vertical columns, i.e.
numpy.where(condition)
> array([1, 6, 11, 16, 21])
I'm not sure exactly if this is what you mean, but you can index columns using [:,column_number], where : stands for "all rows":
a[:,1][numpy.where(a[1])]
# array([ 1, 6, 11, 16, 21])
The above, however, is equivalent to simply a[:,1]:
>>> a[:,1]
array([ 1, 6, 11, 16, 21])
Have a look at this tutorial to learn how to apply slicing on numpy arrays (https://docs.scipy.org/doc/numpy-1.15.1/reference/arrays.indexing.html). As for your question, the answer is:
a[:,1]
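If the intent was to pick columns by a condition rather than by a fixed index, boolean masking works column-wise as well (a guess at what the question is after, not part of the answer above):
mask = a[0] > 2     # condition evaluated along the first row
a[:, mask]
# array([[ 3,  4],
#        [ 8,  9],
#        [13, 14],
#        [18, 19],
#        [23, 24]])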

Preserving sequential order of numpy 2D arrays

Given this 2D numpy array:
a=numpy.array([[31,22,43],[44,55,6],[17,68,19],[12,11,18],...,[99,98,97]])
given the need to flatten it using numpy.ravel:
b=numpy.ravel(a)
and given the need to later dump it into a pandas dataframe, how can I make sure the sequential order of the values in a is preserved when applying numpy.ravel? E.g., how can I check/ensure that numpy.ravel does not mess up the original sequential order?
Of course the intended result should be that the numbers coming before and after 17 in b, for instance, are the same as in a.
First of all, you need to formulate what "sequential" order means to you, as numpy.ravel() does preserve order. Here is a tip for formulating what you need: try the simplest possible toy example:
import numpy as np
X = np.arange(20).reshape(-1,4)
X
#array([[ 0, 1, 2, 3],
# [ 4, 5, 6, 7],
# [ 8, 9, 10, 11],
# [12, 13, 14, 15],
# [16, 17, 18, 19]])
X.ravel()
# array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12,
# 13, 14, 15, 16, 17, 18, 19])
Does this meet your expectation? Or do you want to see this order instead?
Z = X.T
Z
# array([[ 0, 4, 8, 12, 16],
# [ 1, 5, 9, 13, 17],
# [ 2, 6, 10, 14, 18],
# [ 3, 7, 11, 15, 19]])
Z.ravel()
# array([ 0, 4, 8, 12, 16, 1, 5, 9, 13, 17, 2, 6, 10,
# 14, 18, 3, 7, 11, 15, 19])
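For reference, the same column-major reading can be requested directly through ravel's order parameter, without transposing first:
X.ravel(order='F')
# array([ 0,  4,  8, 12, 16,  1,  5,  9, 13, 17,  2,  6, 10, 14, 18,
#         3,  7, 11, 15, 19])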
