I have a 2D array of values and I'm trying to analyze spatial correlations. To calculate a 2D spatial autocorrelation measure like Moran's I in Python, pysal provides an implementation.
1) How do I transform my 2D data into a 1D array expected by pysal?
2) How do I construct a distance-based weights object w (and what does the input array of points in the Kernel distance function represent)?
1) The weights array should be flattened in the same way as you flatten the data array. The order doesn't matter, as long as the indices agree.
2) The input array can be spatial coordinates (e.g. x and y, or lat and long). By far the easiest choice is the indices of your original matrix (e.g. 1 to n by 1 to m).
In the end, each entry of your data will have 3 elements: x, y and value. Each entry of your weights will have 5 elements: x_from, y_from, x_to, y_to and weight.
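For concreteness, here is a minimal sketch of both steps, assuming the current split packages (libpysal for the weights, esda for Moran's I) and a hypothetical 10x12 grid:

import numpy as np
from libpysal.weights import Kernel
from esda.moran import Moran

data = np.random.rand(10, 12)   # hypothetical 2D grid of values

# 1) Flatten the values; any order works as long as the coordinates
#    below are generated in the same order.
y = data.ravel()

# 2) Use the matrix indices as the point coordinates for the Kernel weights.
rows, cols = np.indices(data.shape)
coords = np.column_stack([rows.ravel(), cols.ravel()])

w = Kernel(coords)   # distance-based weights built from the index coordinates
mi = Moran(y, w)
print(mi.I, mi.p_norm)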
Related
How can I turn this log-likelihood equation, written in 2D matrix form, into a 1D array format so that I can use scipy.minimize on it? The log-likelihood function is as follows:
term = A @ (X.shift(1).fillna(0)) - X
min_fun = term.T @ np.diag(mat) @ term
where X is a known 2D array of time series data (MxT), A is a square 2D array (MxM) whose elements I want to estimate, and mat is a vector of length M (so np.diag(mat) is an MxM diagonal matrix). Note that the problem is high-dimensional, so I end up with many equations and am looking for the best way to put this parameter estimation into a 1D format.
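One common pattern, sketched here with made-up small dimensions: let scipy.optimize.minimize work on A flattened to a 1D vector and reshape it back to MxM inside the objective. I am reading min_fun as the scalar sum of weighted squared residuals (the trace of the matrix expression above), which is an assumption about the intended objective:

import numpy as np
from scipy.optimize import minimize

M, T = 4, 50                       # small stand-in dimensions
X = np.random.rand(M, T)           # hypothetical observed series
mat = np.ones(M)                   # hypothetical weight vector

# numpy stand-in for X.shift(1).fillna(0), assuming the shift is along time
X_lag = np.hstack([np.zeros((M, 1)), X[:, :-1]])

def objective(a_flat):
    A = a_flat.reshape(M, M)       # minimize passes a 1D vector; reshape inside
    term = A @ X_lag - X
    return np.einsum('it,i,it->', term, mat, term)  # sum_t term_t' diag(mat) term_t

res = minimize(objective, x0=np.zeros(M * M))
A_hat = res.x.reshape(M, M)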
I need to implement an algorithm. But it takes a lot of time to compute and I need to make it as fast as possible.
Right now I have two numpy arrays:
Array A -> 2000 vectors of 512 elements,
Array B -> 1000 vectors of 512 elements.
I need to calculate every single distance between the vectors from Array A and Array B. Right now, I take one vector from Array A and calculate its distances to all vectors in Array B as follows:
np.sum(np.abs(B-A[0])**2,axis=-1)**(0.5)
But this way I have to loop through 2000 iterations, which takes a lot of time.
Any alternatives?
sklearn.metrics.pairwise_distances solves exactly this problem.
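A short sketch with made-up data; scipy.spatial.distance.cdist is an equivalent alternative:

import numpy as np
from sklearn.metrics import pairwise_distances
from scipy.spatial.distance import cdist

A = np.random.rand(2000, 512)
B = np.random.rand(1000, 512)

D = pairwise_distances(A, B)   # (2000, 1000) Euclidean distances, no Python loop
D2 = cdist(A, B)               # SciPy equivalent, also Euclidean by default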
I have an array of points xy with shape (2, 100). I want to take the dot product with a 2x2 matrix as follows:
g = xy.T @ W @ xy
which should result in a vector of 100 values. How can I do this with Python?
I know it should result in 100 values because the above expression works well if I feed in a single 2D point. How can I vectorize it?
We can use np.einsum:
np.einsum('ij,ik,kj->j',xy,W,xy, optimize=True)
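A quick sanity check with made-up data, comparing against the one-point-at-a-time expression:

import numpy as np

xy = np.random.rand(2, 100)
W = np.random.rand(2, 2)

g = np.einsum('ij,ik,kj->j', xy, W, xy, optimize=True)

# matches evaluating xy[:, j] @ W @ xy[:, j] for each point j
g_loop = np.array([xy[:, j] @ W @ xy[:, j] for j in range(xy.shape[1])])
assert np.allclose(g, g_loop)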
I have a set of numpy arrays of shape NxM (two dimensions: range and azimuth).
I need to form a three-dimensional stack and extract single-dimension vectors along the stack axis to compute a covariance matrix (the red vectors in the picture).
How do I do this easily and efficiently in Python?
You can make a 3D numpy array pretty easily and then just use the indexing to pull out the bits that you're interested in:
import numpy as np

stackOfImages = np.array((image1, image2))  # stacks the 2D images along a new first axis; iterate over these if many more
redData = stackOfImages[:, N-1, M-1]        # every image's value at pixel (N-1, M-1)
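If the "red vector" is the per-pixel series through the stack, one common reading (e.g. in SAR stacks) is that the covariance matrix for a pixel comes from the outer product of that vector; this is a guess at the intended computation, sketched with hypothetical indices i, j:

v = stackOfImages[:, i, j]    # one pixel's values through the whole stack
C = np.outer(v, v.conj())     # single-look covariance matrix for that pixel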
I have a numpy ndarray object with the following shape:
(3, 256, 170, 256).
So, basically this represents an array of 3-dimensional vectors. The vector dimension is the first axis, which lets one write array[0] for the relevant vector component.
Now, I am trying to use scipy's pdist function, which computes the distance between the entries. For that I need to view this array as a two-dimensional matrix with 256*170*256 rows and 3 columns; pdist should then return the matrix where each element is the squared distance between the corresponding 3-dimensional vectors (if I have interpreted the documentation correctly).
Can someone tell me how I can get a view into this numpy array so that I can generate this matrix? I do not want to copy the data (these arrays can be quite large), so I'm looking for an efficient solution.
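One copy-free way, sketched below: reshape the contiguous array to (3, -1), which returns a view, and transpose it, which is also a view. Whether pdist itself copies internally is up to SciPy, but the (256*170*256, 3) matrix you pass in shares the original memory:

import numpy as np
from scipy.spatial.distance import pdist

a = np.random.rand(3, 256, 170, 256)   # stand-in for the actual data

pts = a.reshape(3, -1).T               # (256*170*256, 3); both steps are views

# squared Euclidean distances in condensed form; beware that with ~11 million
# vectors the condensed output is astronomically large, so this is illustrative
# d = pdist(pts, metric='sqeuclidean')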