This question already has answers here:
Display image as grayscale using matplotlib
(9 answers)
Closed 5 years ago.
I've been trying to convert an image to grayscale using OpenCV in Python, but it converts the image to some kind of thermal-camera image. What am I doing wrong?
Here is the code that produces the image below:
import cv2
import numpy as np
from matplotlib import pyplot as plt

img = X_tr[9999]  # one 32x32 RGB image from the CIFAR10 training set
plt.imshow(img)
plt.show()
img = cv2.cvtColor(img.astype(np.uint8), cv2.COLOR_RGB2GRAY)
plt.imshow(img)
plt.show()
img.shape
This image is taken from CIFAR10 dataset.
Thanks.
Grayscale images, i.e. images with only one color channel, are interpreted by imshow as data to be plotted through a colormap. You therefore need to specify the colormap you want to use (and the normalization, if it matters):
plt.imshow(img, cmap="gray", vmin=0, vmax=255)
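As a side note, the conversion that cv2.COLOR_RGB2GRAY performs can be reproduced in plain numpy using the BT.601 luma weights (0.299, 0.587, 0.114); this is only an illustrative sketch with a made-up 2x2 array, not OpenCV's actual implementation:

```python
import numpy as np

# Made-up 2x2 RGB image: red, green, blue, and white pixels
rgb = np.array([[[255, 0, 0], [0, 255, 0]],
                [[0, 0, 255], [255, 255, 255]]], dtype=np.uint8)

# BT.601 luma weights, the same weighting cv2.COLOR_RGB2GRAY uses
weights = np.array([0.299, 0.587, 0.114])
gray = (rgb.astype(float) @ weights).round().astype(np.uint8)

# gray now has a single channel, which is why imshow falls back to a colormap
print(gray.shape)  # (2, 2)
```

The result has shape (2, 2) rather than (2, 2, 3), which is exactly the case where imshow needs cmap="gray" plus vmin/vmax to display it as a plain grayscale picture.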
Related
I'm switching from Matlab to Python and looking for a way to rescale the gray colormap of plt image. As I understand, plt.imshow() normalizes data to [0,1] and then to [0,255] when saved as an image file. However, I would like to normalize image to [0,160], for example. In Matlab it would be simply imshow(rescale(image,0,160/255)); however I don't see a similar straightforward method in Python.
I tried
plt.imshow(image, cmap="gray", vmin=0, vmax=160)
which gave me a completely black image, as well as
plt.imshow(image, cmap="gray", vmin=0, vmax=160./255)
which resulted in an overexposed image.
Do I have to renormalize the data "manually" before plotting it, or does exist a more straightforward way to do it?
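Renormalizing the data manually before plotting is one straightforward way; a minimal numpy sketch (the [0, 160] target range comes from the question above, the sample array is made up):

```python
import numpy as np

image = np.array([[0, 64], [128, 255]], dtype=np.uint8)  # made-up sample data

# Map the full uint8 range [0, 255] linearly onto [0, 160]
rescaled = (image.astype(float) * 160.0 / 255.0).round().astype(np.uint8)

# Then plot with a fixed full-range normalization so imshow does not stretch it back:
# plt.imshow(rescaled, cmap="gray", vmin=0, vmax=255)
```

The key point is that the fixed vmin=0, vmax=255 prevents imshow from re-normalizing the darkened data back to full brightness.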
I am using python-opencv for simple conveniences like binarizing images and counting pixels.
There is a series of images captured from a camera, and some of them are completely black, and some have more information. For example:
The following code is loading the images:
import cv2
from matplotlib import pyplot as plt

for img in file_list:
    image = cv2.imread(img)
    plt.imshow(image)
    plt.show()
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    plt.imshow(gray, cmap='gray')
    plt.show()
I need to convert them to grayscale because I'm applying binarization and Otsu's thresholding.
The result of the previous code yields the following images (the first one corresponds to the left side of the example, and the other corresponds to the right side):
So, if the problem were only with displaying the image, I would have no problem. But for pixel counting and binarization this is a real hurdle for my work. I tried other color spaces, but the same thing happens. Any suggestions?
Btw, I'm developing this with jupyter-notebook and using py-opencv-3.4.2 and python 3.7.3 on conda 4.6.10. It does not seem to be a version problem since I also tried using Google Colab, and the same problem appears.
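For what it's worth, the Otsu step mentioned above can be sketched in plain numpy; this is an illustrative reimplementation of the idea behind cv2.threshold(..., cv2.THRESH_OTSU) (pick the threshold maximizing between-class variance), not OpenCV's actual code:

```python
import numpy as np

def otsu_threshold(gray):
    """Illustrative Otsu: choose t maximizing between-class variance."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    total = hist.sum()
    sum_all = np.dot(np.arange(256), hist)
    w0 = sum0 = 0.0
    best_t, best_var = 0, 0.0
    for t in range(256):
        w0 += hist[t]            # weight of pixels at or below t
        if w0 == 0:
            continue
        w1 = total - w0          # weight of pixels above t
        if w1 == 0:
            break
        sum0 += t * hist[t]
        mean0 = sum0 / w0
        mean1 = (sum_all - sum0) / w1
        var_between = w0 * w1 * (mean0 - mean1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Made-up bimodal "image": dark half at 50, bright half at 200
demo = np.array([[50] * 4 + [200] * 4] * 8, dtype=np.uint8)
t = otsu_threshold(demo)
binary = (demo > t).astype(np.uint8) * 255  # binarize against the threshold
```

On a genuinely all-black frame the histogram has a single peak, so any Otsu-style threshold is close to meaningless there; checking the image's value range before thresholding may be worthwhile.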
This question already has answers here:
How do I resize an image using PIL and maintain its aspect ratio?
(23 answers)
Closed 3 years ago.
I have to classify medical images, but these images are big (3000x2900), so I need a way to resize them and then train on them.
Can I train on these images without resizing?
You can train your model without resizing your pictures, but it will be time-consuming and you may not get the best results. I recommend you resize them to 128x128 pixels. You can use the PIL library to resize each picture and then save it to a different directory.
from PIL import Image

img = Image.open('/your image path/image.jpg')  # image extension *.png, *.jpg
new_width = 128
new_height = 128
img = img.resize((new_width, new_height), Image.ANTIALIAS)  # Image.ANTIALIAS is called Image.LANCZOS in newer Pillow
img.save('/new directory path/output image name.png')  # output format can be whatever you want: *.png, *.jpg, *.gif
This process is called preprocessing the dataset.
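If distortion matters (a fixed 128x128 squashes a 3000x2900 image slightly), Pillow's thumbnail method shrinks while preserving the aspect ratio, which is what the linked duplicate discusses. A self-contained sketch using a blank in-memory stand-in image (the sizes are just examples):

```python
from PIL import Image

# Blank stand-in for a large medical image (only the size matters here)
img = Image.new("L", (3000, 2900))

# resize: exact target size, aspect ratio NOT preserved
exact = img.resize((128, 128), Image.LANCZOS)

# thumbnail: fits within 128x128, aspect ratio preserved (modifies img in place)
img.thumbnail((128, 128), Image.LANCZOS)
```

With thumbnail the result is 128 pixels on the longer side and proportionally smaller on the other, so you would still need padding if the model requires exactly square inputs.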
This question already has answers here:
OpenCV giving wrong color to colored images on loading
(7 answers)
Closed 4 years ago.
I am following this course on computer vision: https://in.udacity.com/course/introduction-to-computer-vision--ud810
The instructor explains how a Gaussian filter blurs an image. He uses Matlab to demonstrate it, but I am using Python 3 with OpenCV. I ran the following code:
import cv2
from matplotlib import pyplot as pl

image = cv2.imread("Desert.jpg")
blur = cv2.GaussianBlur(image, (95, 95), 5)
cv2.imshow("desert", image)
cv2.waitKey(0)  # without this, the OpenCV window never renders
pl.imshow(blur)
pl.xticks([]), pl.yticks([])
pl.show()
This is the original image:
And this is the "blur" image:
The image is blurred, no doubt. But why have the colors been interchanged? The mountain is blue while the sky is brick red.
Because you display one with OpenCV and the other with matplotlib.
The explanation given here is as follows:
There is a difference in channel ordering between OpenCV and matplotlib: OpenCV stores images in BGR order, while matplotlib expects RGB order.
Since you read and show the image with OpenCV, it is in BGR order throughout and you see nothing wrong. But when you show it with matplotlib, it assumes the array is in RGB format, so the blue and red channels appear swapped.
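The usual fix is a one-line channel reorder before handing the array to matplotlib, either cv2.cvtColor(image, cv2.COLOR_BGR2RGB) or, equivalently in plain numpy, a reversal of the last axis; a sketch with a made-up two-pixel array:

```python
import numpy as np

# Made-up 1x2 BGR "image": one pure-blue pixel, one pure-red pixel
bgr = np.array([[[255, 0, 0], [0, 0, 255]]], dtype=np.uint8)

rgb = bgr[:, :, ::-1]  # reverse the channel axis: BGR -> RGB

# pl.imshow(rgb) would now show the colors the right way around
```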
This is my first Stack Overflow question, so please correct me if it's not a good one:
I am currently processing a bunch of grayscale images as numpy ndarrays (dtype=uint8) in Python 2.7. When I resize the images using resized = misc.imresize(image, .1), the resulting image will sometimes show up with different gray levels when I plot it with pyplot. Here is what my code looks like. I would post an image of the result, but I do not have the reputation points yet:
import cv2
from scipy import misc
from matplotlib import pyplot as plt

image = cv2.imread("gray_image.tif", cv2.CV_LOAD_IMAGE_GRAYSCALE)  # cv2.IMREAD_GRAYSCALE in OpenCV 3+
resized = misc.imresize(image, .1)
plt.subplot(1, 2, 1), plt.imshow(image, "gray")
plt.subplot(1, 2, 2), plt.imshow(resized, "gray")
plt.show()
If I write the image to a file, the gray level appears normal.
If I compare the average gray level using numpy:
np.average(image) and np.average(resized),
the average gray level values are about the same, as one would expect.
If I display the image with cv2.imshow, the gray level appears normal.
It's not only an issue with resizing the image: the gray level also gets screwy when I add images together (when most of one image is black and shouldn't darken the result), and when I build an image pixel by pixel, as in:
import numpy as np

image_copy = np.zeros(image.shape)
for row in range(image.shape[0]):
    for col in range(image.shape[1]):
        image_copy[row, col] = image[row, col]
plt.imshow(image_copy, "gray")  # <-- will sometimes show up darker than the original image
plt.show()
Does anyone have an idea as to what may be going on?
I apologize for the wordiness and lack of clarity in this question.
imshow automatically scales the color information to fit the whole available range. After resizing, the range of values in the data may be smaller, resulting in a change of the apparent colors (but not of the actual values, which explains why the saved images look fine).
You likely want to tell imshow not to scale your colors. This can be done using the vmin and vmax arguments, as explained in the documentation. Use something like plt.imshow(image, "gray", vmin=0, vmax=255) to achieve an invariant appearance.
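The scaling imshow applies by default can be mimicked by hand, which makes the effect easy to see; a numpy sketch with made-up data:

```python
import numpy as np

def autoscale(a):
    """Mimic imshow's default normalization: stretch data min..max onto 0..1."""
    a = a.astype(float)
    return (a - a.min()) / (a.max() - a.min())

original = np.array([0, 128, 255], dtype=np.uint8)
shrunk = np.array([10, 128, 240], dtype=np.uint8)  # narrower value range, e.g. after resizing

# Both arrays get stretched onto the full 0..1 range, so they display equally
# bright even though their raw values differ. Fixing vmin=0, vmax=255 instead
# keeps the mapping stable:
fixed = shrunk.astype(float) / 255.0
```

This is why the averages and the saved files looked normal: the pixel values never changed, only the display mapping did.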