I am using clr to import a C# DLL in Python.
One of the functions returns ushort[,], which appears as System.UInt16[,] in Python.
How can I convert System.UInt16[,] to a NumPy uint16 matrix?
I can do the conversion only by looping over the matrix, reading each element and assigning its value to the corresponding position in another NumPy matrix, but this solution is very slow.
Is there a faster conversion method that can use NumPy vectorization?
Here's a sample of my loop:
import clr
import os
import numpy as np
dll_name = os.path.join(os.path.abspath(os.path.dirname(__file__)), ("mydll") + ".dll")
clr.AddReference(dll_name)
from mynamespace import myclass
myobject = myclass()
numpy_matrix = np.empty((80, 260), dtype=np.uint16)
SystemInt16_matrix = myobject.Getdata()
for i in range(20):
    for j in range(32):
        numpy_matrix[i, j] = SystemInt16_matrix[i, j]
I found the solution: instead of the loop, I should use np.fromiter and reshape.
import clr
import os
import numpy as np
dll_name = os.path.join(os.path.abspath(os.path.dirname(__file__)), ("mydll") + ".dll")
clr.AddReference(dll_name)
from mynamespace import myclass
myobject = myclass()
SystemInt16_matrix = myobject.Getdata()
numpy_matrix = np.fromiter(SystemInt16_matrix, np.uint16).reshape((20, 32))
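A slightly more general sketch of the same idea, assuming the .NET array exposes the standard System.Array.GetLength method and that iterating over it yields elements row-major (which is what fromiter relies on here); count= lets fromiter pre-allocate the output instead of growing it:
rows = SystemInt16_matrix.GetLength(0)  # size of the first dimension of the .NET array
cols = SystemInt16_matrix.GetLength(1)  # size of the second dimension
# pre-sized fromiter, reshaped back to the original 2-D layout
numpy_matrix = np.fromiter(SystemInt16_matrix, dtype=np.uint16, count=rows * cols).reshape((rows, cols))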
I'm using the following code to convert all my 332 .npy slices into the .nii.gz format:
import numpy as np
import nibabel as nib
file_dir = "D:/Volumes convertidos LIDC/"
fileNPY1 = "slice200.npy"
img_array1 = np.load(file_dir + fileNPY1)
nifti_file = nib.Nifti1Image(img_array1 , np.eye(4))
nib.save(nifti_file, "D:/slices convertidos/slice200converted.nii.gz")
There are just too many slices (and tons of images) for me to keep doing it that way. Is there a way to convert them all at once?
I don’t know nibabel, and I am not sure if I understand correctly what you are trying to do, but perhaps this will be helpful:
import numpy as np
import nibabel as nib
file_dir = "D:/Volumes convertidos LIDC/"
for i in range(332):
    fileNPY1 = f"slice{i}.npy"
    img_array1 = np.load(file_dir + fileNPY1)
    nifti_file = nib.Nifti1Image(img_array1, np.eye(4))
    nib.save(nifti_file, f"D:/slicesconvertidos/slice{i}converted.nii.gz")
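If the files do not follow a strict slice0.npy … slice331.npy numbering, a variation of the same loop (just a sketch; the folder paths are the placeholders from the question) can pick up whatever .npy files are in the folder:
import glob
import os

for path in glob.glob(os.path.join(file_dir, "*.npy")):
    name = os.path.splitext(os.path.basename(path))[0]  # e.g. "slice200"
    img_array = np.load(path)
    nifti_file = nib.Nifti1Image(img_array, np.eye(4))
    nib.save(nifti_file, f"D:/slices convertidos/{name}converted.nii.gz")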
I have a function that I created, and I want to apply it to several different values using a for loop or something similar.
How do I create a for loop that takes each value and stores the results in different arrays?
I have this so far:
import numpy as np
from scipy import stats
import matplotlib.pyplot as plt
import matplotlib.patches as patches
import xarray as xr
import cartopy.crs as ccrs
import cartopy.feature as cfeature
import netCDF4 as s
import numpy.ma as ma
fwf_tot = fwf_ice + ds.runoff_tundra * ds.LSMGr  # data input I am using
# function I want to apply to the data
def ob_annual(ob_monthly, id_number):
    ann_sum = ob_monthly.where(ds.ocean_basins == id_number).resample(TIME='1AS').sum().sum(dim=('X','Y'))
    return ann_sum
Here is where my problem is: creating the for loop that saves the results for these different values. I think this loop only keeps the function applied to the last value (87) and not the others. How might I fix this? I expected an output of 7 arrays, each of size 59.
obs = np.array([26, 28, 29, 30, 76, 84, 87])
total_obs = []
for i in obs:
    total_obs = ob_annual(fwf_tot_grnl, i)
print(total_obs.shape)
(59)
You replace your list total_obs at each iteration. You must append each value to it instead:
for i in obs:
    total_obs.append(ob_annual(fwf_tot_grnl, i))
or use a list comprehension:
total_obs = [ob_annual(fwf_tot_grnl, i) for i in obs]
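As a small follow-up (just a sketch, reusing the names from the question), keeping the results in a dict keyed by basin id makes each one easy to look up afterwards:
# one annual-sum result per ocean basin id
total_obs = {i: ob_annual(fwf_tot_grnl, i) for i in obs}
print(total_obs[26].shape)  # result for basin 26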
I'm looking for smart ways to optimise this looped Euclidean distance calculation. The calculation finds each vector's mean distance to all other vectors.
My vector arrays are too big to simply do eucl_dist = euclidean_distances(eigen_vs_cleaned),
so I'm running a loop row by row.
A typical eigen_vs_cleaned shape is at least (300000, 1000) at the moment, and I will have to go up much further (to something like (2000000, 10000)).
Any smarter way to do this?
from sklearn.metrics.pairwise import euclidean_distances

eucl_dist_meaned = np.zeros(eigen_vs_cleaned.shape[0], dtype=float)
for z in range(eigen_vs_cleaned.shape[0]):
    if z % 10000 == 0:
        print(z)
    eucl_dist_temp = euclidean_distances(eigen_vs_cleaned[z].reshape(1, -1), eigen_vs_cleaned)
    eucl_dist_meaned[z] = eucl_dist_temp.mean(axis=1)
I'm no Python/NumPy guru, but this is the first step I took to optimise this. It runs much better on my Mac Pro at least.
from joblib import Parallel, delayed
import multiprocessing
import os
import tempfile
import shutil
import numpy as np
from sklearn.metrics.pairwise import euclidean_distances

# Create a temporary directory and define the array path
path = tempfile.mkdtemp()
out_path = os.path.join(path, 'out.mmap')
out = np.memmap(out_path, dtype=float, shape=eigen_vs_cleaned.shape[0], mode='w+')

eucl_dist_meaned = np.zeros(eigen_vs_cleaned.shape[0], dtype=float)
num_cores = multiprocessing.cpu_count()

def runparallel(row, out):
    if row % 10000 == 0:
        print(row)
    eucl_dist_temp = euclidean_distances(eigen_vs_cleaned[row].reshape(1, -1), eigen_vs_cleaned)
    out[row] = eucl_dist_temp.mean(axis=1)

nothing = Parallel(n_jobs=num_cores)(delayed(runparallel)(r, out) for r in range(eigen_vs_cleaned.shape[0]))
Then I save the output:
eucl_dist_meaned = np.array(out, copy=True, dtype=float)
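If parallelism is not an option, a chunked version of the same loop (a sketch; the chunk size is just an assumption to tune against available memory) lets euclidean_distances work on a block of rows per call instead of a single row, which cuts the Python-level overhead:
import numpy as np
from sklearn.metrics.pairwise import euclidean_distances

def mean_distances_chunked(X, chunk=1000):
    means = np.zeros(X.shape[0], dtype=float)
    for start in range(0, X.shape[0], chunk):
        stop = min(start + chunk, X.shape[0])
        block = euclidean_distances(X[start:stop], X)  # (chunk, n) distance block
        means[start:stop] = block.mean(axis=1)
    return means

eucl_dist_meaned = mean_distances_chunked(eigen_vs_cleaned)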
I have a problem modifying a NumPy array in a dependent module when the array was previously defined in the parent module. I have checked, and it is only modified locally inside the function calc().
How can I modify, inside a function, a NumPy array that was defined in another module?
main_module.py
import numpy as np
from pprint import pprint
test_array = np.array([1, 2, 3])
pprint(test_array)
process.py
from main_module import *
def calc():
    global test_array
    test_array = np.append(test_array, [4])
    pprint(test_array)
calc()
pprint(test_array)
In Python, globals are global to the module, not to the whole program. The standard way to do something like this in an object-oriented language is to attach the relevant array to some object, for example:
main_module:
import numpy as np
from pprint import pprint
class GlobalArrayHolder(object):
    def __init__(self):
        self.test_array = np.array([1, 2, 3])
arrayholder = GlobalArrayHolder()
pprint(arrayholder.test_array)
process:
import numpy as np
from pprint import pprint
from main_module import arrayholder
def calc(arrayholder):
    arrayholder.test_array = np.append(arrayholder.test_array, [4])
    pprint(arrayholder.test_array)
calc(arrayholder)
pprint(arrayholder.test_array)
If you don't want to define your own class for this, you can use a simple built-in class like a dict. For example:
main_module:
import numpy as np
from pprint import pprint
arrayholder = {'test_array':np.array([1, 2, 3])}
pprint(arrayholder['test_array'])
process:
import numpy as np
from pprint import pprint
from main_module import arrayholder
def calc(arrayholder):
    arrayholder['test_array'] = np.append(arrayholder['test_array'], [4])
    pprint(arrayholder['test_array'])
calc(arrayholder)
pprint(arrayholder['test_array'])
I have a script that reads in image data and then iterates over the images with the median filter in scipy.ndimage. From the iteration I create new arrays.
However, when I attempt to run the script with
run filtering.py
the filtering does not seem to work. The new arrays (months_f) are the same as the old ones.
import matplotlib.pyplot as plt
import numpy as numpy
from scipy import ndimage
import Image as Image
# Get images
#Load images
jan1999 = Image.open('jan1999.tif')
mar1999 = Image.open('mar1999.tif')
may1999 = Image.open('may1999.tif')
sep1999 = Image.open('sep1999.tif')
dec1999 = Image.open('dec1999.tif')
jan2000 = Image.open('jan2000.tif')
feb2000 = Image.open('feb2000.tif')
#Compute numpy arrays
jan1999 = numpy.array(jan1999)
mar1999 = numpy.array(mar1999)
may1999 = numpy.array(may1999)
sep1999 = numpy.array(sep1999)
dec1999 = numpy.array(dec1999)
jan2000 = numpy.array(jan2000)
feb2000 = numpy.array(feb2000)
########### Put arrays into a list
months = [jan1999, mar1999, may1999, sep1999, dec1999, jan2000, feb2000]
############ Filtering = 3,3
months_f = []
for image in months:
    image = scipy.ndimage.median_filter(image, size=(5,5))
    months_f.append(image)
Any help would be much appreciated :)
This is rather a comment, but due to reputation limits I'm not able to write one.
The way you import your modules is a bit strange, especially "import ... as" with the identical name. I think a more Pythonic way would be
import matplotlib.pyplot as plt
import numpy as np
from scipy import ndimage
from PIL import Image
and then call
image = ndimage.median_filter(image, size=(...))
When I run your steps with an RGB test image, it seems to work.
What does jan1999.shape return?
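For completeness, a minimal sketch of the filtering loop using those imports (size=(5, 5) is taken from the question's code):
from scipy import ndimage

months_f = []
for image in months:
    filtered = ndimage.median_filter(image, size=(5, 5))  # filter each month via the ndimage namespace
    months_f.append(filtered)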