Rescaling colormap in matplotlib - python

I'm switching from Matlab to Python and looking for a way to rescale the gray colormap of a plt image. As I understand it, plt.imshow() normalizes the data to [0,1], which becomes [0,255] when saved as an image file. However, I would like to normalize the image to [0,160], for example. In Matlab this would simply be imshow(rescale(image,0,160/255)); I don't see a similarly straightforward method in Python.
I tried
plt.imshow(image, cmap="gray", vmin=0, vmax=160)
which gave me a completely black image, as well as
plt.imshow(image, cmap="gray", vmin=0, vmax=160./255)
which resulted in an overexposed image.
Do I have to renormalize the data "manually" before plotting it, or is there a more straightforward way?
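No answer is shown here, but for reference, one way to get this effect (a sketch, not from the thread): imshow maps the [vmin, vmax] interval onto the full colormap, so stretching vmax above the data maximum compresses the displayed gray range. The attempts above fail for opposite reasons: vmax=160 treats the [0,1] data as nearly black, while vmax=160/255 clips everything above 0.627 to white.

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend for scripting
import matplotlib.pyplot as plt

image = np.linspace(0.0, 1.0, 256).reshape(16, 16)  # stand-in data in [0, 1]
# Stretch vmax so the brightest pixel maps to gray level 160/255
# instead of pure white:
vmax = image.max() * 255.0 / 160.0
im = plt.imshow(image, cmap="gray", vmin=0, vmax=vmax)
# im.norm maps data values onto [0, 1] colormap positions; the data
# maximum now lands at 160/255 rather than 1.0.
top = im.norm(image.max())
```

Equivalently, you could multiply the data by 160/255 and keep vmin=0, vmax=1.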

Related

Image artefacts when using cyclic colormaps for periodic data

I am currently trying to visualize the phase of an electromagnetic field which is 2pi-periodic. To visualize that e.g. 1.9 pi is almost the same as 0, I am using a cyclic colormap (twilight). However, when I plot my images, there are always lines at the sections where the phase jumps from (almost) 2pi to 0. When you zoom in on these lines, these artefacts vanish.
Here is a simple script that demonstrates the issue.
import numpy as np
import matplotlib.pyplot as plt
x = np.linspace(-3,3,501)
x,y = np.meshgrid(x,x)
data = x**2+y**2
data = np.mod(data, 2)
plt.set_cmap('twilight')
plt.imshow(data)
plt.show()
I tested it with "twilight_shifted" and "hsv" as well and got the same issue. The problem also occurs after saving the image via plt.savefig(). I also tried other image formats like svg but it did not change anything.
As suggested in this answer, you can set the image interpolation to "nearest", e.g.,
plt.imshow(data, interpolation="nearest")
See here for a discussion of image antialiasing effects with different interpolation methods.
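Putting that together with the script above (a minimal sketch; recent Matplotlib versions default to an antialiased interpolation that blends cells just below the wrap point with cells just above it, which is what produces the seam lines):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend for scripting
import matplotlib.pyplot as plt

x = np.linspace(-3, 3, 501)
x, y = np.meshgrid(x, x)
data = np.mod(x**2 + y**2, 2)
# "nearest" draws each data cell as a flat block of its own color,
# so no resampling ever averages values across the periodic wrap.
plt.imshow(data, cmap="twilight", interpolation="nearest")
plt.savefig("phase.png")
```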

Matplotlib doesn't render data exactly square

I'm using matplotlib.imshow to render a 2D numpy-array of integer-values as a heatmap. The problem is that the pixels in the final image are not entirely square. Sometimes they're a little bit rectangular. This is a big problem for me as I'm using this "heatmap" as an overlay in a map and this behaviour creates a weird visual glitch.
I'm rendering it like so:
fig = plt.imshow(data2d, cmap=cmap, norm=norm, aspect='equal', interpolation='none')
plt.axis('off')
fig.axes.get_xaxis().set_visible(False)
fig.axes.get_yaxis().set_visible(False)
fig.axes.set_adjustable('box-forced')
plt.savefig("output.png", bbox_inches='tight', pad_inches=0, dpi=72)
I thought setting the "aspect" attribute to "equal" would take care of making the pixels exactly square. I've noticed that if I increase the DPI the effect is less noticeable, as there are more pixels to work with, but rendering time then becomes an issue.
I'd be glad if someone could point me in the right direction.
What I ended up doing is replacing matplotlib with rasterio and handling the colormap myself. It's definitely not an easy solution...
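If the goal is simply a pixel-perfect PNG of the array, a lighter workaround (not mentioned in the thread) is plt.imsave, which bypasses the figure and axes machinery entirely and writes exactly one image pixel per array element:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend for scripting
import matplotlib.pyplot as plt

data2d = np.random.randint(0, 10, size=(20, 30))
# imsave applies the colormap and writes the array pixel-for-pixel:
# no axes, no padding, no resampling, so every cell stays square.
plt.imsave("output.png", data2d, cmap="viridis")

# Reading it back confirms the output has one pixel per array element.
img = plt.imread("output.png")
```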

Image plotted from a FITS file with matplotlib oriented incorrectly

I'm having an issue plotting a FITS image using matplotlib's imshow: it seems that my image is flipped both horizontally and vertically. I'm sure there is something simple I am overlooking; if anyone could point me in the right direction, that would be great.
This is what my image should look like:
So, I'm loading my image as:
from astropy.io import fits
import matplotlib
import matplotlib.pyplot as pyplot
#Opening/reading in my fits file
hdulist = fits.open('.../myfits.fits')
#Accessing the image data and specifying the dimensions I wish to use
image_SWIFT_uvm2_plot = hdulist[0].data[0,0:,0:]
#Plotting the image
pyplot.imshow(image_SWIFT_uvm2_plot, cmap='gray', vmin=0, vmax=0.5)
pyplot.show()
This is what my image in the plot looks like (the actual plot is a little more complex than the code included here, but these are the critical lines and should, hopefully, be self-sufficient):
Those of you with keen eyes should see that the image has flipped both horizontally and vertically.
For FITS files the convention is that the origin is at the lower left hand corner of the image, so you need to use origin='lower' (by default Matplotlib uses origin='upper').
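A minimal, self-contained illustration of that keyword (synthetic data standing in for the FITS array):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend for scripting
import matplotlib.pyplot as pyplot

data = np.arange(12).reshape(3, 4)
# origin="lower" puts row 0 of the array at the bottom of the axes,
# matching the FITS convention of an origin in the lower-left corner.
im = pyplot.imshow(data, cmap='gray', origin='lower')
```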
I have never used the astropy module, but I know that PyFITS opens the image data as a NumPy array (and from what I'm reading, astropy.io.fits has inherited PyFITS's functionality, so it should work the same way). If that is the case, you can use numpy.fliplr and numpy.flipud to flip the array to your desired orientation. Just replace the line
pyplot.imshow(image_SWIFT_uvm2_plot, cmap='gray', vmin=0, vmax=0.5)
with
import numpy as np
pyplot.imshow(np.fliplr(np.flipud(image_SWIFT_uvm2_plot)), cmap='gray',
vmin=0, vmax=0.5)
Alternatively, you could do a little linear algebra to flip it, or just note that performing both of these flips is the same as rotating the array 180°, i.e. applying np.rot90 with k=2:
pyplot.imshow(np.rot90(image_SWIFT_uvm2_plot, k=2), cmap='gray', vmin=0, vmax=0.5)
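A quick check of that equivalence (reversing both axes is exactly a 180° rotation):

```python
import numpy as np

a = np.arange(6).reshape(2, 3)
flipped = np.fliplr(np.flipud(a))   # flip vertically, then horizontally
rotated = np.rot90(a, k=2)          # rotate by 2 * 90 degrees
# Both operations send element [i, j] to [-1 - i, -1 - j]:
same = np.array_equal(flipped, rotated)
```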

Trouble with pyplot displaying resized images in python

This is my first Stack Overflow question, so please correct me if it's not a good one:
I am currently processing a bunch of grayscale images as numpy ndarrays (dtype=uint8) in python 2.7. When I resize the images using resized=misc.imresize(image,.1), the resulting image will sometimes show up with different gray levels when I plot it with pyplot. Here is what my code looks like. I would post an image of the result, but I do not have the reputation points yet:
import cv2
from scipy import misc
from matplotlib import pyplot as plt
image=cv2.imread("gray_image.tif",cv2.CV_LOAD_IMAGE_GRAYSCALE)
resized=misc.imresize(image,.1)
plt.subplot(1,2,1),plt.imshow(image,"gray")
plt.subplot(1,2,2),plt.imshow(resized,"gray")
plt.show()
If I write the image to a file, the gray level appears normal.
If I compare the average gray level using numpy:
np.average(image) and np.average(resized),
the average gray level values are about the same, as one would expect.
If I display the image with cv2.imshow, the gray level appears normal.
It's not only an issue with resizing: the gray level also gets screwy when I add images together (when most of one image is black and shouldn't darken the result), and when I build an image pixel by pixel, as in:
import numpy as np
image_copy = np.zeros(image.shape)
for row in range(image.shape[0]):
    for col in range(image.shape[1]):
        image_copy[row,col] = image[row,col]
plt.imshow(image_copy,"gray") #<-- Will sometimes show up darker than original image
plt.show()
Does anyone have an idea as to what may be going on?
I apologize for the wordiness and lack of clarity in this question.
imshow automatically scales the color information to fit the whole available range. After resizing, the color range may be smaller, resulting in changes of the apparent color (but not of the actual values, which explains why saved images look fine).
You likely want to tell imshow not to scale your colors. This can be done using the vmin and vmax arguments as explained in the documentation. You probably want to use something like plt.imshow(image, "gray", vmin=0, vmax=255) to achieve an invariant appearance.
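A small demonstration of the difference (a sketch; the norm attached to the returned image object shows which data range imshow actually mapped onto the colormap):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend for scripting
import matplotlib.pyplot as plt

small = np.array([[10, 20], [30, 40]], dtype=np.uint8)
# Without vmin/vmax, imshow stretches the data range [10, 40] across
# the whole colormap, so the apparent gray levels depend on the data:
auto = plt.imshow(small, cmap="gray")
plt.figure()
# Pinning the limits ties gray levels to absolute values, so the image
# keeps the same appearance before and after resizing or copying:
fixed = plt.imshow(small, cmap="gray", vmin=0, vmax=255)
```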

Editing Image using Python

I have to edit a few image files using Python. I have to open each image file, add a few points at particular locations, and save the new edited image file (for my post-processing work).
The problems I am facing are:
1) I could not resize my plot axes. My plot axes should be 0-1 on both x and y, without any loss of image quality.
2) I could not save the edited image file; only the original file is getting saved.
This is what I tried:
im = Image.open('vortex.png')
implot = plt.plot(im)
fig, ax= plt.subplots()
myaximage = ax.imshow(im, aspect='auto', extent=(0,1,0,1),
alpha=0.5, origin='upper',
zorder=-2)
plt.implot([0.5], [0.5])
plt.show()
im.save("new","png")
Besides some small problems with your code, it seems you're basing your work on a wrong assumption: that you can turn an image into a matplotlib plot.
An image is simply a collection of pixels. While your brain interprets it as a plot, with axes and maybe a grid, you can't expect the computer to do so. You can't manipulate a collection of pixels as if it were a plot - it isn't.
You need to forget about matplotlib and use the image-editing resources of PIL.
Not sure about the axis change, but for saving the file, see this post:
Python Imaging Library save function syntax
From the PIL Handbook:
im.save(outfile, options...)
im.save(outfile, format, options...)
Simplest case:
im.save('my_image.png')
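A sketch of the whole edit done in PIL alone (the file names and the drawing call are illustrative, not from the thread): points are placed by converting 0-1 axis coordinates to pixel coordinates yourself, using PIL's ImageDraw module.

```python
from PIL import Image, ImageDraw

# Stand-in for Image.open('vortex.png'), so the example is self-contained:
im = Image.new("RGB", (200, 200), "white")
draw = ImageDraw.Draw(im)
w, h = im.size
# Place a point at (0.5, 0.5) in 0-1 axis coordinates; PIL's y axis
# runs top-down, so the vertical coordinate is inverted:
x, y = 0.5 * w, (1 - 0.5) * h
r = 3  # point radius in pixels
draw.ellipse((x - r, y - r, x + r, y + r), fill="red")
# Save under a new name so the original file is left untouched:
im.save("vortex_edited.png")
```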
