I tried out the following example, which should display the image processing results.
from scipy import ndimage as ndi
import matplotlib.pyplot as plt
from scipy import misc
import numpy as np
import cv2
from skimage.morphology import watershed, disk
from skimage import data
from skimage.filters import rank
from skimage.util import img_as_ubyte
from skimage import io; io.use_plugin('matplotlib')
image = img_as_ubyte('imagepath.jpg')
# denoise image
denoised = rank.median(image, disk(2))
# find continuous region (low gradient -
# where less than 10 for this image) --> markers
# disk(5) is used here to get a more smooth image
markers = rank.gradient(denoised, disk(5)) < 10
markers = ndi.label(markers)[0]
# local gradient (disk(2) is used to keep edges thin)
gradient = rank.gradient(denoised, disk(2))
# process the watershed
labels = watershed(gradient, markers)
# display results
fig, axes = plt.subplots(nrows=2, ncols=2, figsize=(8, 8),
                         sharex=True, sharey=True)
ax = axes.ravel()
ax[0].imshow(image, cmap=plt.cm.gray, interpolation='nearest')
ax[0].set_title("Original")
ax[1].imshow(gradient, cmap=plt.cm.nipy_spectral, interpolation='nearest')
ax[1].set_title("Local Gradient")
ax[2].imshow(markers, cmap=plt.cm.nipy_spectral, interpolation='nearest')
ax[2].set_title("Markers")
ax[3].imshow(image, cmap=plt.cm.gray, interpolation='nearest')
ax[3].imshow(labels, cmap=plt.cm.nipy_spectral, interpolation='nearest', alpha=.7)
ax[3].set_title("Segmented")
for a in ax:
    a.axis('off')
fig.tight_layout()
plt.show()
I get the following error.
Traceback (most recent call last):
File "/home/workspace/calculate_watershed.py", line 15, in <module>
image = img_as_ubyte('koralle0.jpg')
File "/home/workspace/venv/lib/python3.5/site-packages/skimage/util/dtype.py", line 409, in img_as_ubyte
return convert(image, np.uint8, force_copy)
File "/home/workspace/venv/lib/python3.5/site-packages/skimage/util/dtype.py", line 113, in convert
.format(dtypeobj_in, dtypeobj_out))
ValueError: Can not convert from <U12 to uint8.
The path to the image is a valid one. Do you have any idea how to solve this problem? Thanks in advance
The problem is that you are passing the image path (a string) to img_as_ubyte, so NumPy wraps that string in an array of dtype <U12 (a 12-character unicode string), which cannot be converted to uint8. You can check this by converting the path string to a numpy array yourself; for my image I get a <U11 dtype:
np.array('CAPTURE.jpg')
# array('CAPTURE.jpg', dtype='<U11')
You should first read the image with skimage.io.imread(image_path). This returns an ndarray of shape MxN, MxNx3 or MxNx4. Then reshape the resulting ndarray to 2D if it is 3D or 4D, because skimage.filters.rank.median(image) expects a 2D image array. In the following code I've used my sample image to perform these steps before passing the result to img_as_ubyte(sk_image); the rest of the code stays the same.
from skimage.io import imread
#<---code--->
sk_image = imread('CAPTURE.jpg')  # read the image into a numpy ndarray
sk_image = sk_image.transpose(1, 0, 2).reshape(130, -1)  # convert the 3D array to 2D
image = img_as_ubyte(sk_image)  # convert the image to 8-bit unsigned integer format
#<---code--->
With these changes the script runs and produces the four result panels (Original, Local Gradient, Markers, Segmented).
You should consider the following points:
Check the shape of the array returned from imread: after reading the image with sk_image = imread('CAPTURE.jpg'), inspect sk_image.shape. For my image I get (74, 130, 3), which shows it's a 3D array.
To reshape to 2D, first look at the strides with sk_image.strides; for my image I get (390, 3, 1). Then transpose with sk_image.transpose(1,0,2) and check the strides again: sk_image.transpose(1,0,2).strides gives (3, 390, 1), i.e. the first two values have been swapped. Finally, call sk_image.transpose(1, 0, 2).reshape(130, -1) to get a 2D array; here 130 is simply the second dimension of the original (74, 130, 3) shape, and -1 lets numpy work out the remaining dimension (these steps are sketched below, after the note).
P.S: You can read more about 3D to 2D reshaping of numpy arrays here.
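For reference, here is a minimal sketch of these inspection steps, assuming the same (74, 130, 3) CAPTURE.jpg used above:
from skimage.io import imread

sk_image = imread('CAPTURE.jpg')            # sample image from this answer
print(sk_image.shape)                       # (74, 130, 3) -> a 3D RGB array
print(sk_image.strides)                     # (390, 3, 1)

transposed = sk_image.transpose(1, 0, 2)    # swap the first two axes
print(transposed.strides)                   # (3, 390, 1) -> first two strides swapped

flat_2d = transposed.reshape(130, -1)       # 130 is the original second dimension
print(flat_2d.shape)                        # (130, 222)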
Related
PIL returns IndexError: tuple index out of range when converting a 1D numpy array into a PIL image object.
I am trying to convert a 1D numpy array of length 2048, with values between 0 and 255, into an image using PIL. I think this is an issue with my array being 1D. I have also tried converting a random 1D integer array to an image, and I get the same error.
Random integer example:
from PIL import Image
import numpy as np
arr = np.random.randint(255, size=(2048))
arr = arr.astype('uint8')
img = Image.fromarray(arr, 'L')
img.show()
I would expect the code to show an image of a single line of pixels with varying shades of gray.
When I tried to run your code, the problem was just that your array was a 1D array. So try:
arr2d = arr.reshape(-1,1)
Image.fromarray(arr2d,'L').show()
The input array has to be 2D, even if one dimension is 1. You just need to decide if you want the image to be a horizontal or vertical row of pixels, and add a dimension when creating your array.
arr = np.random.randint(255, size=(2048, 1)) # vertical image
arr = np.random.randint(255, size=(1, 2048)) # horizontal image
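Putting the pieces together, a minimal self-contained sketch (with a random array standing in for your data) could look like this:
from PIL import Image
import numpy as np

arr = np.random.randint(255, size=2048).astype('uint8')  # 1D stand-in for your data

# reshape to 2D: (-1, 1) gives a vertical strip, (1, -1) a horizontal one
arr2d = arr.reshape(-1, 1)

img = Image.fromarray(arr2d, 'L')  # 'L' = 8-bit grayscale
img.show()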
I have a numpy array of shape (74, 743) which represents a spectrogram of a few seconds of human speech. I can easily turn this into a matplotlib plot with matshow, but I want to know if it's possible to convert the plot back into the original numpy array. At the very least, how does matplotlib generate an image from an arbitrarily shaped array?
I am trying to create a Generative Adversarial Network that will produce images of spectrograms (because such networks perform well at image generation). Then I want to convert those spectrogram images back into quantitative spectrograms, i.e. back into numpy arrays.
It seems you want to apply a colormap to a 2D array. Using matplotlib tools this could look like
import numpy as np
from matplotlib.colors import Normalize
import matplotlib.cm as cm
data = np.random.rand(74, 743)
cmap = cm.viridis
norm = Normalize(data.min(), data.max())
output = cmap(norm(data))
print(output.shape)
The output is an array of shape (74, 743, 4) with values between 0 and 1, denoting RGBA colors.
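If you then want to treat that RGBA float array as an ordinary 8-bit image (for example to save it to disk), one possible follow-up, continuing from the snippet above (the filenames here are just placeholders), is:
from PIL import Image
import matplotlib.pyplot as plt

rgb = (output[:, :, :3] * 255).astype(np.uint8)       # drop alpha, scale to 0-255
Image.fromarray(rgb, 'RGB').save('spectrogram.png')   # placeholder output path

# or let matplotlib apply the colormap and write the file in one step
plt.imsave('spectrogram_plt.png', data, cmap=cmap)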
I'm new to Python and I need to draw an RGB spectrum as a numpy array.
It's clear to me that I need to ramp the RGB values across the dimensions to get the spectrum.
import numpy as np
import matplotlib.pyplot as plt
spectrum = np.zeros([255, 255, 3], dtype=np.uint8)  # init the array
#fill the array with rgb values to create the spectrum without the use of loops
plt.imshow(spectrum)
plt.axis('off') # don't show axis
plt.show()
Is there a possibility (e.g. a Python or numpy method) to create the spectrum without the use of loops?
Not sure if this is the result you'd like, but you could define the arrays for the RGB values yourself (see an HSV-RGB comparison). I've used Pillow to turn the resulting array into an image for display.
import numpy as np
import matplotlib.pyplot as plt
from PIL import Image
spectrum = np.zeros([256,256*6, 3], dtype=np.uint8) # init the array
# fill the array with rgb values to create the spectrum without the use of loops
spectrum[:,:,0] = np.concatenate(([255]*256, np.linspace(255,0,256), [0]*256, [0]*256, np.linspace(0,255,256), [255]*256), axis=0)
spectrum[:,:,1] = np.concatenate((np.linspace(0,255,256), [255]*256, [255]*256, np.linspace(255,0,256), [0]*256,[0]*256), axis=0)
spectrum[:,:,2] = np.concatenate(([0]*256, [0]*256,np.linspace(0,255,256),[255]*256, [255]*256, np.linspace(255,0,256)), axis=0)
img = Image.fromarray(spectrum, 'RGB')
img.show()
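Since spectrum is a plain uint8 RGB array, it can also be displayed with matplotlib exactly as in your original snippet, continuing from the code above:
plt.imshow(spectrum)
plt.axis('off')  # don't show axes
plt.show()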
Below is a simple section of code that opens an image with PIL, converts it to a numpy array and then prints the number of elements in the array.
The image in question is a 10x10 image, so it consists of exactly 100 pixels. However, the numpy array contains 300 elements (where I would expect 100 elements). What am I doing wrong?
import numpy as np
import PIL
impath = 'C:/Users/Ricky/Desktop/testim.tif'
im = PIL.Image.open(impath)
arr = np.array(im)
print(arr.size)  # 300
Every image can be composed of 3 bands (Red-Green-Blue, or RGB composition).
Since your image is a black/white image, those three bands are identical. You can see the difference using a coloured image.
Try this to see what I mean:
import matplotlib.pyplot as pyplot
# the line above imports matplotlib's plotting interface, used below to display the image
import numpy as np
import PIL
impath = 'C:/Users/Ricky/Desktop/testim.tif'
im = PIL.Image.open(impath)
arr = np.array(im)
print(arr.shape)  # (10, 10, 3)
print(arr[:, :, 0].size)  # 100
# next lines actually show the image
pyplot.imshow(arr[:, :, 0], cmap='gray')
pyplot.show()
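As an alternative (not what the code above does, just a sketch), you could also let PIL collapse the image to a single band before converting, so the array has 100 elements from the start:
from PIL import Image
import numpy as np

im = Image.open('C:/Users/Ricky/Desktop/testim.tif')  # path from the question
gray = im.convert('L')   # collapse the three identical bands into one 8-bit band
arr = np.array(gray)
print(arr.shape)  # (10, 10)
print(arr.size)   # 100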
I imported matplotlib.pyplot and also NumPy.
I want to display an image from my desktop in the plot, but I get a TypeError.
Code:
img = (image)  # here, do we need to give the location of the file, or the file directly?
imshow(img, extent=[-25,25,-25,25], cmap = cm.bone)
colorbar()
Error: TypeError: Image data can not convert to float
I am using PyCharm as my IDE.
You are a bit ambiguous about:
here do we need to give the location of the file or the file directly
Neither on its own: you need to use an imaging library to actually read the image. img="C:\image.jpg" does not read an image, it just assigns a string!
For example, to read a 'png' image, you could:
# Copypaste from docs
import matplotlib.pyplot as plt
import matplotlib.image as mpimg
import numpy as np
img = mpimg.imread('myimage.png')
# end
# from now on you can use img as an image, but make sure you know what you are doing!
imgplot = plt.imshow(img)
plt.show()
Read more in the Image tutorial in the matplotlib docs.
Is img a numpy array of the right type?
If you read the image using Pillow etc. and have an Image object, you have to get the numpy array out of it (e.g. with np.array(img) or img.getdata()).
X : array_like, shape (n, m) or (n, m, 3) or (n, m, 4)
Display the image in X to current axes. X may be a float array, a
uint8 array or a PIL image. If X is an array, it can have the
following shapes:
MxN – luminance (grayscale, float array only)
MxNx3 – RGB (float or uint8 array)
MxNx4 – RGBA (float or uint8 array)
The value for each component of MxNx3 and MxNx4 float arrays should be in the range 0.0 to 1.0;
Either normalize img so it's between 0.0 and 1.0 or convert it to uint8 ( img=np.array(img, dtype=np.uint8) ).
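A short sketch of both options, assuming img is a float array whose values fall outside the 0.0-1.0 range:
import numpy as np
import matplotlib.pyplot as plt

img = np.random.rand(50, 50) * 255.0   # stand-in for float image data outside [0, 1]

# option 1: normalize the float data into the 0.0-1.0 range
img_norm = (img - img.min()) / (img.max() - img.min())

# option 2: convert the data to 8-bit unsigned integers instead
img_u8 = np.array(img, dtype=np.uint8)

fig, (ax1, ax2) = plt.subplots(1, 2)
ax1.imshow(img_norm, cmap=plt.cm.bone)
ax2.imshow(img_u8, cmap=plt.cm.bone)
plt.show()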