Displaying random RGB pixels in Python

I am starting a little project and am having some difficulty finding the answer I am looking for. I don't really know what terms to search for, and couldn't find anything similar, so I am sorry if this has been asked before.
I am essentially trying to produce a 2D plot of a set size, 300x300 for example, filled with random RGB pixels. So far I have figured out how to plot them with imshow and remove the axis labels, but it looks odd, as if it is zoomed in too far. Also, I know imshow accepts RGB arguments, but the matplotlib manual only touches on this and never gives any examples. The closest cmap I have found to RGB is hsv, so I am using that for now until I find the RGB solution.
Can anyone help me assign a random RGB value to each pixel instead of using a cmap, and maybe adjust the apparent size of the image so the pixels look less "zoomed in"? I am open to using something other than imshow for flexibility; it is just the only thing I found that does what I want. Thank you very much in advance!
import matplotlib.pyplot as plt
import numpy as np

Z = np.random.random((300, 300))
plt.imshow(Z, cmap='hsv', interpolation='nearest')
plt.axis('off')
plt.show()

Here's how you do it with PIL:
from PIL import Image
import numpy as np
data = (np.random.random((300, 300, 3)) * 256).astype(np.uint8)
img = Image.fromarray(data)
img.show()
img.save('rand.png')
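For completeness, matplotlib's imshow can also consume an (H, W, 3) RGB array directly, with no cmap at all, which answers the original question without leaving matplotlib. A minimal sketch (the figure size and filename are arbitrary; a larger figsize addresses the "zoomed in" look):

```python
import numpy as np
import matplotlib
matplotlib.use('Agg')  # non-interactive backend; drop this line to see the window
import matplotlib.pyplot as plt

# An (H, W, 3) uint8 array is interpreted as RGB, one random color per pixel
rgb = np.random.randint(0, 256, size=(300, 300, 3), dtype=np.uint8)

fig = plt.figure(figsize=(6, 6))  # bigger figure -> pixels look less "zoomed in"
plt.imshow(rgb, interpolation='nearest')
plt.axis('off')
fig.savefig('random_rgb.png', bbox_inches='tight', pad_inches=0)
```

Float arrays work too, as long as the values are in [0, 1].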

Related

How to draw a Heatmap in a image using the coordinates in Python OpenCV?

I have a huge list of coordinates (x, y) of people walking in the streets, and I would like to build a heatmap from those coordinates, i.e., it should look something like this. I want a much hotter spot where multiple coordinates fall in a single place; hotter means a red-colored spot. I've tried this but ended up with a heatmap like this, not the one I expected nor the one from the site mentioned at the beginning (the Wikipedia heatmap). This is the code I tried:
import cv2
import numpy as np
import matplotlib.pyplot as plt
import seaborn as sns
x, y = [230, 150, 200], [200, 100, 150]
img = cv2.imread('street.jpg')  # background photo (filename is a placeholder)
plt.imshow(img)
ax = sns.kdeplot(x, y, cmap="Blues", shade=True, shade_lowest=False)
ax.set_frame_on(False)
plt.axis('off')
plt.savefig('heatmap_pic.png')
This code also made the resulting image smaller; the actual image was much bigger. I'm new to OpenCV and Matplotlib. Could someone please help me plot a heatmap (like this) on an image from a bunch of coordinates?
You need to take the image, create the heatmap from the coordinates, and superimpose the two.
Check this or this.

Uneven squares in grid image upon saving

So, I am trying to save a matplotlib figure as a TIFF. The image has all of the correct information and content, but for some reason the squares on the grid appear uneven after saving. In the matplotlib window that comes up after running the code, though, the image has perfect squares. I have attached a code snippet and samples of the produced images below; they are screenshots of a much larger, 332x335 grid. The image generally looks okay, but if it is to be used in scientific papers, as I intend, it should be as close to perfect as possible. If someone could help here, I would greatly appreciate it.
fname = tif_file_name + '.tif'
aspect = grid_x / grid_y
plt.figure()
plt.imshow(circ_avg, cmap='gray', aspect=aspect, interpolation='none')
plt.gca().invert_yaxis()
plt.savefig(fname, dpi=1000, format='tif', bbox_inches='tight', pad_inches=0)
plt.show()
Perfect squares from screenshot in plt.show() window:
Uneven squares when viewed after saving:
I was actually able to resolve this. It turns out the more effective way is to use the PIL library, which also greatly reduced the overall file size.
from PIL import Image
import numpy as np

# Scale to pixel values (only multiplied by 255 here since my data already had 1 as the maximum)
vals = orig_vals * 255
final_image = Image.fromarray(np.uint8(vals), mode='L')
final_image.save('blah.tif')

Image plotted from a FITS file with matplotlib oriented incorrectly

I'm having a little issue plotting a FITS image with matplotlib's imshow: the image comes out flipped both horizontally and vertically. I'm sure there is something simple I am overlooking; if anyone could point me in the right direction, that would be great.
This is what my image should look like:
So, I'm loading my image as:
from astropy.io import fits
import matplotlib
import matplotlib.pyplot as pyplot
# Opening/reading in my FITS file
hdulist = fits.open('.../myfits.fits')
# Accessing the image data and specifying the dimensions I wish to use
image_SWIFT_uvm2_plot = hdulist[0].data[0, :, :]
# Plotting the image
pyplot.imshow(image_SWIFT_uvm2_plot, cmap='gray', vmin=0, vmax=0.5)
pyplot.show()
This is what my image in the plot looks like (the plot is a little more complex than the code I have included, but I have given the critical lines as, hopefully, a self-sufficient code):
Those of you with keen eyes should see that the image has flipped both horizontally and vertically.
For FITS files the convention is that the origin is at the lower left hand corner of the image, so you need to use origin='lower' (by default Matplotlib uses origin='upper').
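In code, the fix is a single keyword argument (the array below is a synthetic stand-in for the FITS data):

```python
import numpy as np
import matplotlib
matplotlib.use('Agg')  # non-interactive backend
import matplotlib.pyplot as plt

# Synthetic stand-in for the FITS image data, values in [0, 0.5]
image_SWIFT_uvm2_plot = np.random.random((100, 100)) * 0.5

# origin='lower' puts array row 0 at the bottom of the plot,
# matching the FITS convention (matplotlib's default is origin='upper')
plt.imshow(image_SWIFT_uvm2_plot, cmap='gray', vmin=0, vmax=0.5, origin='lower')
plt.savefig('fits_plot.png')
```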
I have never used the astropy module, but I know that PyFITS opens the image data as a NumPy array (and from what I'm reading, astropy.io.fits has inherited the functionality of PyFITS, so it should work the same way). If that is the case, you may use numpy.fliplr and numpy.flipud to flip the array into your desired orientation. Just replace the line
pyplot.imshow(image_SWIFT_uvm2_plot, cmap='gray', vmin=0, vmax=0.5)
with
import numpy as np
pyplot.imshow(np.fliplr(np.flipud(image_SWIFT_uvm2_plot)), cmap='gray',
              vmin=0, vmax=0.5)
Alternatively, you could do a little linear algebra to flip it, or just note that performing both of these flips is the same as applying np.rot90 twice:
pyplot.imshow(np.rot90(image_SWIFT_uvm2_plot, k=2), cmap='gray', vmin=0, vmax=0.5)

Trouble with pyplot displaying resized images in python

This is my first Stack Overflow question, so please correct me if it's not a good one:
I am currently processing a bunch of grayscale images as NumPy ndarrays (dtype=uint8) in Python 2.7. When I resize the images using resized = misc.imresize(image, .1), the resulting image will sometimes show up with different gray levels when I plot it with pyplot. Here is what my code looks like. I would post an image of the result, but I do not have the reputation points yet:
import cv2
from scipy import misc
from matplotlib import pyplot as plt
image = cv2.imread("gray_image.tif", cv2.CV_LOAD_IMAGE_GRAYSCALE)
resized = misc.imresize(image, .1)
plt.subplot(1, 2, 1), plt.imshow(image, "gray")
plt.subplot(1, 2, 2), plt.imshow(resized, "gray")
plt.show()
If I write the image to a file, the gray level appears normal.
If I compare the average gray level using numpy:
np.average(image) and np.average(resized),
the average gray level values are about the same, as one would expect.
If I display the image with cv2.imshow, the gray level appears normal.
It's not only an issue with resizing: the gray level also gets screwy when I add images together (when most of one image is black and shouldn't darken the result), and when I build an image pixel by pixel, as in:
import numpy as np
image_copy = np.zeros(image.shape)
for row in range(image.shape[0]):
    for col in range(image.shape[1]):
        image_copy[row, col] = image[row, col]
plt.imshow(image_copy, "gray")  # <-- will sometimes show up darker than the original image
plt.show()
Does anyone have an idea as to what may be going on?
I apologize for the wordiness and lack of clarity in this question.
imshow automatically scales the color information to fit the whole available range. After resizing, the color range may be smaller, resulting in changes of the apparent color (but not of the actual values, which explains why saved images look fine).
You likely want to tell imshow not to scale your colors. This can be done using the vmin and vmax arguments as explained in the documentation. You probably want to use something like plt.imshow(image, "gray", vmin=0, vmax=255) to achieve an invariant appearance.
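The pixel-by-pixel case is easy to reproduce: np.zeros defaults to float64, so the copy silently drops the uint8 dtype, and imshow then stretches whatever value range it finds. A minimal sketch (the array here is synthetic):

```python
import numpy as np

# A dim uint8 image: values only span 0..99 out of the possible 0..255
image = np.random.randint(0, 100, size=(50, 50)).astype(np.uint8)

# np.zeros defaults to float64, so this copy loses the uint8 dtype;
# imshow rescales its 0..99 range to the full gray range -> apparent brightening
float_copy = np.zeros(image.shape)
float_copy[:, :] = image

# Fix 1: keep the dtype so imshow treats it like the original
uint8_copy = np.zeros(image.shape, dtype=np.uint8)
uint8_copy[:, :] = image

# Fix 2: pin the color limits explicitly, regardless of dtype:
# plt.imshow(float_copy, 'gray', vmin=0, vmax=255)
```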

Python matplotlib imshow is slow

I want to display an image file using imshow. It is a 1600x1200 grayscale image, and I found out that matplotlib uses float32 to decode the values. It takes about 2 seconds to load the image, and I would like to know if there is any way to make this faster. The point is that I do not really need a high-resolution image; I just want to mark certain points and draw the image as a background. So:
First question: is 2 seconds good performance for such an image, or can I speed it up?
Second question: if it is good performance, how can I make the process faster by reducing the resolution? Important point: I still want the image to stretch over 1600x1200 pixels in the end.
My code:
import matplotlib.pyplot
import matplotlib.cm
import numpy

plotfig = matplotlib.pyplot.figure()
plotwindow = plotfig.add_subplot(111)
plotwindow.axis([0,1600,0,1200])
plotwindow.invert_yaxis()
img = matplotlib.pyplot.imread("lowres.png")
im = matplotlib.pyplot.imshow(img,cmap=matplotlib.cm.gray,origin='centre')
plotfig.set_figwidth(200.0)
plotfig.canvas.draw()
matplotlib.pyplot.show()
This is what I want to do. Now if the picture saved in lowres.png has a lower resolution than 1600x1200 (e.g. 400x300), it is displayed in the upper corner, as it should be. How can I scale it to the whole area of 1600x1200 pixels?
If I run this program, the slow part comes from the canvas.draw() command. Is there maybe a way to speed up this command?
Thank you in advance!
According to your suggestions I have updated to the newest version of matplotlib
version 1.1.0svn, checkout 8988
And I also use the following code:
img = matplotlib.pyplot.imread(pngfile)
img *= 255
img2 = img.astype(numpy.uint8)
im = self.plotwindow.imshow(img2,cmap=matplotlib.cm.gray, origin='centre')
and still it takes about 2 seconds to display the image... Any other ideas?
Just to add: I found the following feature
zoomed_inset_axes
So in principle matplotlib should be able to do the task. There one can also plot a picture in a "zoomed" fashion...
The size of the data is independent of the pixel dimensions of the final image.
Since you say you don't need a high-resolution image, you can generate the image quicker by down-sampling your data. If your data is in the form of a numpy array, a quick and dirty way would be to take every nth column and row with data[::n,::n].
You can control the output image's pixel dimensions with fig.set_size_inches and plt.savefig's dpi parameter:
import matplotlib.pyplot as plt
import matplotlib.cm as cm
import numpy as np

data = np.arange(300).reshape((10, 30))
plt.imshow(data[::2, ::2], cmap=cm.Greys)
fig = plt.gcf()
# Unfortunately, had to find these numbers through trial and error
fig.set_size_inches(5.163, 3.75)
ax = plt.gca()
extent = ax.get_window_extent().transformed(fig.dpi_scale_trans.inverted())
plt.savefig('/tmp/test.png', dpi=400, bbox_inches=extent)
You can disable the default interpolation of imshow by adding the following line to your matplotlibrc file (typically at ~/.matplotlib/matplotlibrc):
image.interpolation : none
The result is much faster rendering and crisper images.
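If you'd rather not edit the rc file, the same setting can be applied per script through rcParams (a minimal sketch):

```python
import matplotlib
matplotlib.use('Agg')  # non-interactive backend
import matplotlib.pyplot as plt

# Equivalent to "image.interpolation : none" in matplotlibrc,
# but scoped to this script only
plt.rcParams['image.interpolation'] = 'none'
```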
I found a solution as long as one needs to display only low-resolution images. One can do so using the line
im = matplotlib.pyplot.imshow(img,cmap=matplotlib.cm.gray, origin='centre',extent=(0,1600,0,1200))
where the extent parameter tells matplotlib to plot the image over this range. If one uses an image with a lower resolution, this speeds things up quite a lot. Nevertheless, it would be great if somebody knew additional tricks to make the process even faster, in order to use a higher resolution at the same speed.
Thanks to everyone who thought about my problem, further remarks are appreciated!!!

Categories

Resources