I was wondering if it is possible to concatenate two PyTorch tensors with different shapes.
One tensor has shape torch.Size([247, 247]) and the other has shape torch.Size([10, 183]). Is it possible to concatenate these using torch.cat() on dim=1?
For torch.cat to work, all dimensions except the one you concatenate along must match. Here you want dim=1, but dimension 0 differs (247 vs. 10), so you would first have to bring the smaller tensor up to 247 along dimension 0 (note that plain broadcasting only works on size-1 dimensions, so you would need to repeat or otherwise tile it) or adjust the other tensor instead.
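A minimal sketch of making the dimensions agree before concatenating. Tiling the smaller tensor with repeat and slicing is only one possible choice, and it duplicates data, which may or may not be meaningful for your use case:

```python
import torch

a = torch.randn(247, 247)
b = torch.randn(10, 183)

# torch.cat along dim=1 requires dim 0 to match (247 vs. 10 here),
# so tile b's rows and trim the excess to reach exactly 247 rows.
b_tiled = b.repeat(25, 1)[:247]          # (250, 183) sliced to (247, 183)
result = torch.cat([a, b_tiled], dim=1)  # (247, 247 + 183) = (247, 430)
```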
Suppose you have a tf.data.Dataset of the following definition:
tf.data.Dataset.from_generator(gen, output_signature=(
    tf.TensorSpec(shape=(1, self.n_channels, self.height, self.width)),
    tf.RaggedTensorSpec(shape=(1, None, self.agent_dim), ragged_rank=1)
))
The leading 1s in these shapes do you no good, since calling .batch(batch_size) on this Dataset adds another dimension. Now you have two options: reshape the ragged tensors as soon as they reach keras.Model.train_step, squeezing the excess dimensions, or drop the 1 in tf.RaggedTensorSpec(shape=(1, None, self.agent_dim), ragged_rank=1).
For the first option: can one cast a RaggedTensor to a given shape? A related question
For the second option: can one create a RaggedTensor whose first dimension is None?
Thank you for your attention in advance
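For what it's worth, one way to squeeze an excess leading size-1 axis on a ragged tensor is RaggedTensor.merge_dims, which folds the batch axis into the ragged axis. A sketch, assuming the leading dimension really is 1 (the value 2 below merely stands in for agent_dim):

```python
import tensorflow as tf

# A ragged value of shape (1, None, 2), like one element produced by
# the generator above.
rt = tf.ragged.constant([[[1, 2], [3, 4], [5, 6]]], ragged_rank=1)

# Fold axes 0 and 1 into one, dropping the excess leading dimension.
squeezed = rt.merge_dims(0, 1)  # shape (3, 2)
```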
I have a list of tensors, each with a different shape. For example, there are two tensors in my list: the shape of the first one is 3*3 and the second one is 4*4. I want to randomly sample a tensor from them in TensorFlow, but I don't know how to do it.
My approach now is to reshape all the tensors to 1*N and use tf.concat to create a new tensor, then use tf.gather. But this is too slow. I want to choose the tensor by using the index directly.
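If the choice can happen in Python (eager mode), indexing the list directly avoids the reshape-and-concat round trip entirely. A sketch with two dummy tensors (3*3 and 4*4 as in the question):

```python
import random
import tensorflow as tf

tensors = [tf.zeros((3, 3)), tf.ones((4, 4))]

# Pick a random index and index the Python list directly --
# no reshaping or tf.concat needed.
idx = random.randrange(len(tensors))
sampled = tensors[idx]
```

If the sampling must happen inside a graph, tf.switch_case driven by a random integer index is one graph-compatible alternative.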
I'm looking at LSTM neural networks. I saw code like this below:
X_train = np.reshape(X_train, (X_train.shape[0], X_train.shape[1], 1))
This code is meant to change a 2-D array into a 3-D array, but the syntax looks off to me, or at least I don't understand it. For example, I would have assumed a 3-D reshape looks like the code below:
np.reshape(rows , columns, dimensions)
Could someone elaborate on what the syntax is and what it is trying to do?
The function numpy.reshape gives a new shape to an array without changing its data. It is a NumPy package function. First of all, it needs to know what to reshape, which is the first argument of this function (in your case, you want to reshape X_train).
Then it needs to know the shape of your new array. This argument must be a single tuple, not separate arguments: for a 2-D reshape you pass (W, H), for three dimensions (W, H, D), for four dimensions (W, H, D, T), and so on.
You can also call reshape as a method on a NumPy array: X_train.reshape((W, H, D)). In this case, since reshape is a method of the X_train object, you do not pass the array itself, only the new shape.
It is also worth mentioning that the total number of elements in the reshaped array must match the original. For example, your 2-D X_train has X_train.shape[0] x X_train.shape[1] elements, and this value must equal W x H x D.
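A small, concrete run of the line from the question, with a toy 3x4 array standing in for X_train:

```python
import numpy as np

X_train = np.arange(12).reshape(3, 4)   # 2-D array, shape (3, 4)

# The second argument is one tuple -- the new shape -- not separate
# rows/columns/dimensions arguments.
X_3d = np.reshape(X_train, (X_train.shape[0], X_train.shape[1], 1))

print(X_3d.shape)  # (3, 4, 1): the same 12 elements, plus one length-1 axis
```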
I created a CNN with Python and Keras which compresses 2-D input of various lengths into a single output. All images have a height of 80 pixels but different lengths, e.g. shape (80, length_of_image_i, 2), where 2 is the number of color channels.
I have 5000 images; the training data array X in NumPy has shape (5000, 1) and dtype object. This is because storing content with different shapes is not possible in a single numpy array. Each object in the list has shape (80, length_of_image_i, 2).
With this said, when I call the model.fit(X,y) function of the sequential model, I get the following error:
ValueError: Error when checking input: expected conv2d_1_input to have 4
dimensions, but got array with shape (5000, 1)
Converting the numpy array to Python list of numpy arrays also doesn't work:
AttributeError: 'list' object has no attribute 'ndim'
Zero padding or transformations of my data to get all of my images to the same shape is not an option.
My question now is: how can I call the model.fit(X,y) function when my data does not have a fixed shape?
Thank you in advance!
Edit: Note that I do not have a problem with the architecture of my network (since I am not using dense layers). My problem is that I cannot call the fit function, due to problems with the shape of the numpy array.
My model is a replicate of this network: http://machine-listening.eecs.qmul.ac.uk/wp-content/uploads/sites/26/2017/01/sparrow.pdf
You need to pass NumPy arrays of a float dtype to fit; that is the only possibility.
So you will probably have to group batches of images with the same length, or train on each sample individually:
for image, output in zip(images, outputs):
    model.train_on_batch(image.reshape((1, 80, -1, 2)),
                         output.reshape((1,) + output.shape))
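The grouping-by-length idea can be sketched in pure NumPy (with dummy data standing in for your images; each per-width batch can then be fed to train_on_batch):

```python
from collections import defaultdict
import numpy as np

# Dummy stand-ins for the variable-length images and their targets.
images = [np.zeros((80, w, 2)) for w in (100, 120, 100)]
outputs = [np.zeros((3,)) for _ in images]

# Bucket samples by image width so every bucket stacks into one array.
groups = defaultdict(list)
for img, out in zip(images, outputs):
    groups[img.shape[1]].append((img, out))

batches = {
    width: (np.stack([img for img, _ in pairs]),
            np.stack([out for _, out in pairs]))
    for width, pairs in groups.items()
}
# batches[100][0].shape == (2, 80, 100, 2): each batch has a uniform shape
```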
While debugging TensorFlow code, I would like to output the shape of a tensor, say, print("mask's shape is: ", mask.get_shape()). However, the corresponding output is mask's shape is (?, ?, ?, 1). How should I interpret this output, and is there any way to know the exact values of the first three dimensions of this tensor?
This output means that TensorFlow's shape inference has only been able to infer a partial shape for the mask tensor. It has been able to infer (i) that mask is a 4-D tensor, and (ii) its last dimension is 1; but it does not know statically the shape of the first three dimensions.
If you want to get the actual shape of the tensor, the main approaches are:
Compute mask_val = sess.run(mask) and print mask_val.shape.
Create a symbolic mask_shape = tf.shape(mask) tensor, compute mask_shape_val = sess.run(mask_shape), and print mask_shape_val.
Shapes usually have unknown components if the shape depends on the data, or if the tensor is itself a function of some tensor(s) with a partially known shape. If you believe that the shape of the mask should be static, you can trace the source of the uncertainty by (recursively) looking at the inputs of the operation(s) that compute mask and finding out where the shape becomes partially known.
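Both approaches can be sketched as follows, using tf.compat.v1 to reproduce the TF1-style graph/session setup of the question; a placeholder with unknown batch, height and width stands in for whatever produced mask:

```python
import numpy as np
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

# Shape inference only knows the rank and the last dimension here,
# which is exactly the partially-known-shape situation in the question.
x = tf.placeholder(tf.float32, shape=(None, None, None, 1))
mask = x * 2.0

print(mask.get_shape())  # static shape: first three dimensions unknown

feed = {x: np.zeros((2, 3, 4, 1), np.float32)}
with tf.Session() as sess:
    # Approach 1: run the tensor and inspect the resulting ndarray.
    mask_val = sess.run(mask, feed_dict=feed)
    print(mask_val.shape)              # (2, 3, 4, 1)
    # Approach 2: run a symbolic tf.shape(mask) tensor.
    mask_shape_val = sess.run(tf.shape(mask), feed_dict=feed)
    print(mask_shape_val)              # [2 3 4 1]
```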