Scipy imsave darkens my image - python

I'm doing image processing and am working with Python 2.7 in a Jupyter Notebook.
But when I save a numpy array as an image with scipy.misc.imsave(), the result appears darker than when I visualize it with matplotlib.
Here is the result when I plot the image in my notebook:
import matplotlib.pyplot as plt
plt.imshow(img)
And here is the image when I save it:
scipy.misc.imsave('img.png', img)
The image appears darker than it should, and I have no idea why. Has anyone ever faced a similar problem?

This is because imsave() rescales the image to the full range between its minimum and maximum values. You can avoid the rescaling like this:
scipy.misc.toimage(img, cmin=0, cmax=255).save('img.png')

It turns out some of my images have values < 0. After clipping them to the range [0, 255], the saved image is now correct. But I still don't know why the plot displayed them correctly.
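A minimal sketch of the clipping fix, using a tiny stand-in array (the real img comes from the question's pipeline):

```python
import numpy as np

# Stand-in for the question's image: some values fall outside 0..255.
img = np.array([[-20.0, 0.0],
                [128.0, 300.0]])

# Clip to the valid 8-bit range before saving, so the writer has no
# reason to rescale the data to its own min/max.
clipped = np.clip(img, 0, 255).astype(np.uint8)
print(clipped)
```

The clipped array can then be saved unchanged with the toimage call from the answer above.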

Related

How to save an image plotted with matplotlib 1.5

I want to save an image plotted with matplotlib, using the savefig function, which takes several parameters.
The problem is that when I save the image, this function adds extra white pixels around it.
In short, I would like to save the image at its original size: if the data I draw is 1000x560, the saved image should be 1000x560 pixels, without any additional white border.
That way, each pixel of the saved image coincides with a pixel of the figure matplotlib displays.
I'm using python 2.7
Can anyone help please?
Thanks
from matplotlib import pyplot as plt
plt.savefig('foo.png', bbox_inches='tight')
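If a pixel-exact file with no border at all is wanted, a sketch of the figsize/dpi route (the array, sizes, and file name here are illustrative stand-ins, not from the question):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend, assumed for a script
import matplotlib.pyplot as plt

data = np.random.rand(560, 1000)  # stand-in for the 1000x560 data

dpi = 100.0
height, width = data.shape
fig = plt.figure(figsize=(width / dpi, height / dpi), dpi=dpi)
ax = fig.add_axes([0, 0, 1, 1])  # axes filling the figure: no margins
ax.axis("off")
ax.imshow(data, cmap="gray")
fig.savefig("foo.png", dpi=dpi)  # 1000x560 pixels, no white border
```

When the figure is nothing but the array, plt.imsave('foo.png', data, cmap='gray') is simpler still: it writes one pixel per array element with no figure machinery at all.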

Show 2D array (grayscale image) in heatmap in python

I'm a beginner at working with images in Python. I have a 2D array, 500px x 500px (array((500, 500))), which I usually display as a grayscale image, and I would like to display it as a color image, as a heatmap.
I tried but couldn't find the answer on the internet, and what I found didn't work for me. Please help.
I don't really have much code, I only know that this one:
my_img = plt.imread(filename)
plt.imshow(my_img, cmap="hot")
doesn't work; it displays the same image, in grayscale.
Try pcolor. It's the more usual analogue of a "heatmap"; imshow is more aligned with displaying images true to the color values in the array. The fact that your ideal image is inverted relative to your current one also suggests pcolor might be a better choice.
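A runnable sketch with pcolormesh, pcolor's faster sibling for regular grids, on a small stand-in array:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend, assumed for a script
import matplotlib.pyplot as plt

# A small stand-in for the question's 500x500 grayscale array.
data = np.random.rand(50, 50)

fig, ax = plt.subplots()
mesh = ax.pcolormesh(data, cmap="hot")  # heatmap-style rendering
fig.colorbar(mesh, ax=ax)
fig.savefig("heatmap.png")
```

Separately, note that cmap has no effect when the array is RGB: if plt.imread returned shape (500, 500, 3), a single-channel array is needed first (e.g. my_img.mean(axis=2)), which may be why cmap="hot" appeared to do nothing.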

Trouble with pyplot displaying resized images in python

This is my first Stack Overflow question, so please correct me if it's not a good one.
I am currently processing a bunch of grayscale images as numpy ndarrays (dtype=uint8) in Python 2.7. When I resize the images using resized = misc.imresize(image, .1), the resulting image sometimes shows up with different gray levels when I plot it with pyplot. Here is what my code looks like. I would post an image of the result, but I do not have the reputation points yet:
import cv2
from scipy import misc
from matplotlib import pyplot as plt
image=cv2.imread("gray_image.tif",cv2.CV_LOAD_IMAGE_GRAYSCALE)
resized = misc.imresize(image, .1)
plt.subplot(1, 2, 1), plt.imshow(image, "gray")
plt.subplot(1, 2, 2), plt.imshow(resized, "gray")
plt.show()
If I write the image to a file, the gray level appears normal.
If I compare the average gray level using numpy:
np.average(image) and np.average(resized),
the average gray level values are about the same, as one would expect.
If I display the image with cv2.imshow, the gray level appears normal.
It's not only an issue with resizing: the gray level also gets screwy when I add images together (even when most of one image is black and shouldn't darken the result), and when I build an image pixel by pixel, as in:
import numpy as np
image_copy = np.zeros(image.shape)
for row in range(image.shape[0]):
    for col in range(image.shape[1]):
        image_copy[row, col] = image[row, col]
plt.imshow(image_copy, "gray")  # <-- will sometimes show up darker than the original image
plt.show()
Does anyone have an idea as to what may be going on?
I apologize for the wordiness and lack of clarity in this question.
imshow automatically scales the color information to fit the whole available range. After resizing, the value range may be smaller, resulting in changes of the apparent color (but not of the actual values, which explains why saved images look right).
You likely want to tell imshow not to scale your colors. This can be done using the vmin and vmax arguments as explained in the documentation. You probably want to use something like plt.imshow(image, "gray", vmin=0, vmax=255) to achieve an invariant appearance.
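The effect can be seen numerically with matplotlib's Normalize, which is what imshow applies under the hood (the array below is a synthetic low-contrast stand-in):

```python
import numpy as np
from matplotlib.colors import Normalize

# A dim image: values span only 10..60 of the 0..255 range.
image = np.linspace(10, 60, 100, dtype=np.uint8).reshape(10, 10)

auto = Normalize()(image)                   # default: stretch min..max to 0..1
fixed = Normalize(vmin=0, vmax=255)(image)  # pinned 8-bit range

print(auto.max())   # the brightest pixel is rendered pure white
print(fixed.max())  # it stays a dark gray, matching the saved file
```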

Some questions about scipy.misc.imshow

I just want to use Python and SciPy to write a program that performs gray-level image histogram equalization; however, I find something is wrong when I display the image.
I first read an image with the misc.imread function, and when I immediately display it, the image shown is not the original one; it looks as if histogram equalization had been applied by default. Here is my code:
from scipy import misc
import matplotlib.pyplot as plt
......
image1 = misc.imread('./images/Fig2.jpg')
misc.imsave('./images/post_Fig2.jpg', image1)
plt.figure()
plt.imshow(image1,cmap=plt.cm.gray)
plt.show()
image1 is an image with low contrast.
post_Fig2.jpg is also an image with low contrast.
However, the image displayed in the figure window has high contrast, significantly different from the two mentioned above.
So I am wondering: does the plt.imshow function do histogram equalization automatically, or is something wrong with my code or my method?
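No equalization is involved here: plt.imshow autoscales by default, stretching the data's min..max over the colormap, which makes a low-contrast image look high-contrast on screen. A sketch of pinning the display range, assuming 8-bit data (the array is a synthetic stand-in for image1):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend, assumed for a script
import matplotlib.pyplot as plt

# Low-contrast stand-in for image1: values span only 100..150.
image1 = np.random.randint(100, 151, size=(64, 64)).astype(np.uint8)

plt.figure()
# vmin/vmax pin the display to the full 8-bit range, so the figure
# looks as flat as the saved file instead of being stretched.
shown = plt.imshow(image1, cmap=plt.cm.gray, vmin=0, vmax=255)
plt.savefig('pinned.png')
```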

Python matplotlib imshow is slow

I want to display an image file using imshow. It is a 1600x1200 grayscale image, and I found out that matplotlib uses float32 to decode the values. It takes about 2 seconds to load the image, and I would like to know if there is any way to make this faster. The point is that I do not really need a high-resolution image; I just want to mark certain points and draw the image as a background. So:
First question: is 2 seconds a good performance for such an image, or can I speed it up?
Second question: if it is good performance, how can I make the process faster by reducing the resolution? Important point: I still want the image to stretch over 1600x1200 pixels in the end.
My code:
import matplotlib
import matplotlib.pyplot
import numpy
plotfig = matplotlib.pyplot.figure()
plotwindow = plotfig.add_subplot(111)
plotwindow.axis([0,1600,0,1200])
plotwindow.invert_yaxis()
img = matplotlib.pyplot.imread("lowres.png")
im = matplotlib.pyplot.imshow(img,cmap=matplotlib.cm.gray,origin='centre')
plotfig.set_figwidth(200.0)
plotfig.canvas.draw()
matplotlib.pyplot.show()
This is what I want to do. Now if the picture saved in lowres.png has a lower resolution than 1600x1200 (e.g. 400x300), it is displayed in the upper corner, as it should be. How can I scale it to the whole area of 1600x1200 pixels?
If I run this program, the slow part is the canvas.draw() command. Is there maybe a way to speed up this command?
Thank you in advance!
According to your suggestions I have updated to the newest version of matplotlib
version 1.1.0svn, checkout 8988
And I also use the following code:
img = matplotlib.pyplot.imread(pngfile)
img *= 255
img2 = img.astype(numpy.uint8)
im = self.plotwindow.imshow(img2,cmap=matplotlib.cm.gray, origin='centre')
and still it takes about 2 seconds to display the image... Any other ideas?
Just to add: I found the feature
zoomed_inset_axes
so in principle matplotlib should be able to do the task; one can also plot a picture in a "zoomed" fashion there.
The size of the data is independent of the pixel dimensions of the final image.
Since you say you don't need a high-resolution image, you can generate the image quicker by down-sampling your data. If your data is in the form of a numpy array, a quick and dirty way would be to take every nth column and row with data[::n,::n].
You can control the output image's pixel dimensions with fig.set_size_inches and plt.savefig's dpi parameter:
import matplotlib.pyplot as plt
import matplotlib.cm as cm
import numpy as np
data = np.arange(300).reshape((10, 30))
plt.imshow(data[::2, ::2], cmap=cm.Greys)
fig = plt.gcf()
# Unfortunately, had to find these numbers through trial and error
fig.set_size_inches(5.163, 3.75)
ax = plt.gca()
extent = ax.get_window_extent().transformed(fig.dpi_scale_trans.inverted())
plt.savefig('/tmp/test.png', dpi=400, bbox_inches=extent)
You can disable the default interpolation of imshow by adding the following line to your matplotlibrc file (typically at ~/.matplotlib/matplotlibrc):
image.interpolation : none
The result is much faster rendering and crisper images.
I found a solution as long as one needs to display only low-resolution images. One can do so using the line
im = matplotlib.pyplot.imshow(img,cmap=matplotlib.cm.gray, origin='centre',extent=(0,1600,0,1200))
where the extent parameter tells matplotlib to plot the figure over this range. If one uses an image with a lower resolution, this speeds up the process quite a lot. Nevertheless, it would be great if somebody knew additional tricks to make the process even faster, in order to use a higher resolution at the same speed.
Thanks to everyone who thought about my problem; further remarks are appreciated!
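A self-contained version of the trick, with a synthetic low-resolution array standing in for lowres.png (sizes are illustrative):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend, assumed for a script
import matplotlib.pyplot as plt

lowres = np.random.rand(300, 400)  # 400x300 stand-in background

fig, ax = plt.subplots()
# extent stretches the small array over the full 1600x1200 data range,
# so points can be marked in the original coordinate system.
ax.imshow(lowres, cmap="gray", origin="lower", extent=(0, 1600, 0, 1200))
ax.plot([800], [600], "r+")  # a marker in full-resolution coordinates
fig.savefig("marked.png")
```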
