How to fit the figure axes to the data after using 'equal'? - python

I am working with some satellite images. After I extract the data and convert it into arrays, I use matplotlib's ax.pcolor() to visualize it, and I got this after calling plt.axis('equal').
How can I fit the figure axes to the data automatically? I can do it manually by adjusting the axes dimensions, but is there a better way?
import rasterio
import matplotlib.pyplot as plt

# lon_mtx and lat_mtx are the longitude/latitude grids built earlier from the raster
src = rasterio.open('imagem_teste.tif')
red = src.read(1)

fig = plt.figure()
ax = fig.add_axes([.1, .1, .465, .8])
ax.pcolor(lon_mtx, lat_mtx, red, cmap='jet', shading='auto')
plt.axis('equal')
ax.set_xlabel('Longitude')
ax.set_ylabel('Latitude')
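One way to get a tight fit while keeping a 1:1 aspect ratio is to set the aspect with adjustable='box' instead of calling plt.axis('equal'), and then autoscale tightly; a minimal sketch, assuming the same ax as above:

# Instead of plt.axis('equal'):
ax.set_aspect('equal', adjustable='box')  # keep a 1:1 data aspect by resizing the axes box
ax.autoscale(tight=True)                  # snap the x/y limits to the data extent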

Related

Removing Lines from Contourf in Matplotlib

I am using the following code to contour plot some data using contourf in matplotlib. I have set the transparency to 0.6, but there are annoying lines between each colour interval that I can't get rid of. There doesn't seem to be a way to set linestyle in contourf; any ideas?
import numpy as np
import matplotlib.pyplot as plt

# instantiating and titling the figure
fig, ax1 = plt.subplots(figsize=(7, 5))
fig.suptitle('Testing Simple Neural Networks', y=0.96, fontsize=16, fontweight='bold')

# defining the colour map
cm = plt.cm.coolwarm

# plotting the contour plot (p1_mesh, p2_mesh and y_mesh are the meshgrid
# coordinates and predicted values computed earlier)
levels = np.linspace(0, 1, 25)
cont1 = ax1.contourf(p1_mesh, p2_mesh, y_mesh, levels=levels, cmap=cm, alpha=0.6, linewidths=10)

# plotting the entire dataset - training and test data
scat1 = ax1.scatter(X['p1'],
                    X['p2'],
                    c=y,
                    cmap=cm,
                    edgecolors='k')

# setting axes and legend
ax1.set(ylabel='p2',
        xlabel='p1',
        xlim=(0, 255),
        ylim=(0, 255))
ax1.legend(*scat1.legend_elements(), title='Target')
ax1.set_axisbelow(True)
ax1.grid(color='xkcd:light grey')
cbar = fig.colorbar(cont1)
You can add the option antialiased=True to ax1.contourf; that should fix it.
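Applied to the contourf call from the question, that would look like this (all other arguments unchanged):

cont1 = ax1.contourf(p1_mesh, p2_mesh, y_mesh, levels=levels, cmap=cm,
                     alpha=0.6, linewidths=10, antialiased=True)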

Using imshow() to create higher quality hist2d

I am playing around with volumetric data and am trying to project a "cosmic web"-like image.
I pretty much create a file path and open the data with a module that opens HDF5 files. The x and y values come from indexing into the gas_pos array from the file, and the histogram is weighted by different properties, gas_density in this case:
import matplotlib
import matplotlib.pyplot as plt
import numpy as np
from matplotlib.colors import LinearSegmentedColormap
from matplotlib.ticker import LogFormatter

# gas_pos and gas_density are read from the HDF5 file beforehand
cmap = LinearSegmentedColormap.from_list('mycmap', ['black', 'steelblue', 'mediumturquoise', 'darkslateblue'])
fig = plt.figure()
ax = fig.add_subplot(111)
H = ax.hist2d(gas_pos[:,0]/0.7, gas_pos[:,1]/0.7, bins=500, cmap=cmap, norm=matplotlib.colors.LogNorm(), weights=gas_density)
cb = fig.colorbar(H[3], ax=ax, shrink=0.8, pad=0.01, orientation="horizontal", label=r'$ \rho\ [M_{\odot}\ \mathrm{kpc}^{-3}]$')
ax.tick_params(axis='both', which='both', length=0)
ax.get_xaxis().set_visible(False)
ax.get_yaxis().set_visible(False)
plt.show()
giving me this:
which is nice, but I want to improve the quality and remove the graininess. When I try imshow interpolation:
cmap = LinearSegmentedColormap.from_list('mycmap', ['black', 'steelblue', 'mediumturquoise', 'darkslateblue'])
fig = plt.figure()
ax = fig.add_subplot(111)
H = ax.hist2d(gas_pos[:,0]/0.7, gas_pos[:,1]/0.7, bins=500, cmap=cmap, norm=matplotlib.colors.LogNorm(), weights=gas_density)
ax.tick_params(axis='both', which='both', length=0)
ax.get_xaxis().set_visible(False)
ax.get_yaxis().set_visible(False)
im = ax.imshow(H[0], cmap=cmap, interpolation='sinc', norm=matplotlib.colors.LogNorm())
cb = fig.colorbar(H[3], ax=ax, shrink=0.8, pad=0.01, orientation="horizontal", label=r'$ \rho\ [M_{\odot}\ \mathrm{kpc}^{-3}]$')
plt.show()
Am I using this incorrectly, or is there something better I can use to reduce the pixelation?
If anyone wants to play with my data, I will upload it later today!
Using interpolation='sinc' is indeed a good way to smooth a plot. Other options include 'gaussian', 'bicubic' or 'spline16'.
The problem you observe is that the imshow image is drawn on top of the hist2d plot and therefore inherits its axis limits. Those limits appear to be smaller than the pixel extent of the imshow image, so you only see part of the total data.
The solution is either not to draw the hist2d plot at all, or at least to draw it in another subplot or figure.
Pursuing the first idea, you would calculate the histogram without plotting it, using numpy.histogram2d:
H, xedges, yedges = np.histogram2d(gas_pos[:,0]/0.7, gas_pos[:,1]/0.7,
                                   bins=500, weights=gas_density)
im = ax.imshow(H.T, cmap=cmap, interpolation='sinc', norm=matplotlib.colors.LogNorm())
I would also recommend reading the numpy.histogram2d documentation, which includes an example of plotting the histogram output in matplotlib.
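If you also want the axes to show the real coordinates rather than bin indices, the bin edges returned by numpy.histogram2d can be passed to imshow as the extent; a sketch along the lines of the histogram2d docs example:

extent = [xedges[0], xedges[-1], yedges[0], yedges[-1]]
im = ax.imshow(H.T, cmap=cmap, interpolation='sinc', origin='lower',
               norm=matplotlib.colors.LogNorm(), extent=extent)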
You'll probably want to set interpolation='none' in the call to imshow instead of interpolation='sinc'.

Specify boundaries on matplotlib colorbar

I have plotted some data using matplotlib's imshow. I have also managed to combine three different colormaps for the plot (see figure), but I want the greens to go from 0-1000, the yellow/orange to go from 1000-1600 and reds from 1600 and up. It's almost correct out of the box, but not entirely. Anyone know how I can accomplish this?
import numpy as np
import matplotlib.pyplot as plt
from matplotlib import colors

# stack slices of three existing colormaps into one
colors1 = plt.cm.Greens_r(np.linspace(0, 0.4, 256))
colors2 = plt.cm.Oranges(np.linspace(0.1, 0.5, 256))
colors3 = plt.cm.Reds(np.linspace(0.6, 1, 256))
cmap = np.vstack((colors1, colors2, colors3))
cmap_test = colors.LinearSegmentedColormap.from_list('colormap', cmap)

# z is the 2D data array being displayed
fig, ax = plt.subplots()
plt.imshow(z, origin='upper', cmap=cmap_test, interpolation='none', extent=[0, 25, 5, -20])
cbar = plt.colorbar()
ax.grid(linestyle='-')
plt.show()
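One way to pin the colours to those value ranges is to allocate the stacked colormap's entries in proportion to the ranges and fix vmin/vmax on imshow; a minimal sketch, assuming the data run from roughly 0 to 2000 (the actual limits are not given in the question):

import numpy as np
import matplotlib.pyplot as plt
from matplotlib import colors

z = np.random.uniform(0, 2000, size=(20, 20))   # placeholder data for illustration
vmin, vmax = 0.0, 2000.0
n = 256                                          # total number of colormap entries

# entries proportional to the ranges 0-1000 (greens), 1000-1600 (oranges), 1600+ (reds)
n_green = int(n * (1000 - vmin) / (vmax - vmin))
n_orange = int(n * (1600 - 1000) / (vmax - vmin))
n_red = n - n_green - n_orange

stack = np.vstack([plt.cm.Greens_r(np.linspace(0, 0.4, n_green)),
                   plt.cm.Oranges(np.linspace(0.1, 0.5, n_orange)),
                   plt.cm.Reds(np.linspace(0.6, 1, n_red))])
cmap_test = colors.LinearSegmentedColormap.from_list('colormap', stack)

fig, ax = plt.subplots()
im = ax.imshow(z, origin='upper', cmap=cmap_test, interpolation='none',
               vmin=vmin, vmax=vmax, extent=[0, 25, 5, -20])
fig.colorbar(im, ax=ax)
ax.grid(linestyle='-')
plt.show()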

Add image behind scatter subplot independent of scatter points axes

I am trying to add an image behind each subplot of my scatter plot figure. I want the image to take up all of the subplot space, but I do not want to map my scatter points onto the image: that is, I want the axes of my scatter points to be independent of those of the image.
When I simply use imread() and imshow() while making a subplot to insert the image, like so:
import matplotlib.pyplot as plt
import matplotlib.cm as cm

# X, Y and AveragedHursts4to8 are computed earlier
im = plt.imread("/Users/mac/Desktop/image.jpeg")
two = plt.subplot(222)
implot = plt.imshow(im)
plt.title('4-8 Hz')
plt.scatter(X, Y, s=100, marker='o', c=AveragedHursts4to8, cmap=cm.plasma)
plt.colorbar()
two.axis('off')
I get the right-most image down below, where, clearly, the image axes and the scatter-point axes are shared.
I tried to use the twiny() function to make a new set of axes for the image, with the image drawn on the first axes and the scatter points on the second, like so:
onetwin = plt.subplot(221)
plt.title('1-4 Hz')
implot = plt.imshow(im, zorder=1)
onetwin.axis('off')
one = onetwin.twiny()
plt.scatter(X, Y, s=100, marker='o', c=AveragedHursts1to4, cmap=cm.plasma, zorder=2)
plt.colorbar()
one.axis('off')
There I get the leftmost image, where the scatter points are squished on the y axis and the image, for some reason, has been shrunk.
And when I switch the ordering of the creation of the axes for twiny, the image takes up the whole subplot and the scatter points do not show at all.
Suggestions?
My suggestion would be to leave the points' positions untouched and scale the background image accordingly. One can use the extent keyword to imshow for that purpose.
In the example below I plot some random points on four different scales. Each time the image is scaled to the scatterplot's dimensions using the extent keyword.
import matplotlib.pyplot as plt
import numpy as np

x = np.random.rand(8*8).reshape((8,8))
image = plt.imread("https://upload.wikimedia.org/wikipedia/en/2/27/EU_flag_square.PNG")

fig, ax = plt.subplots(ncols=4, figsize=(11, 3.8))

for i in range(len(ax)):
    # plot the points on this panel's own scale
    ax[i].scatter(x[2*i,:]*10**(i-1), x[2*i+1,:]*10**(i-1), c="#ffcc00", marker="*", s=280, edgecolors='none')
    # stretch the background image to the scatter plot's limits
    xlim = ax[i].get_xlim()
    ylim = ax[i].get_ylim()
    mini = min(xlim[0], ylim[0])
    maxi = max(xlim[1], ylim[1])
    ax[i].imshow(image, extent=[mini, maxi, mini, maxi])

plt.tight_layout()
plt.show()
The simplest, fastest solution I came up with is to solve for x and y in:
largest_x_coordinate_value(x) = x_dimension_of_image_in_pixels
largest_y_coordinate_value(y) = y_dimension_of_image_in_pixels
and then do a vectorized multiplication over the numpy arrays containing the X and Y coordinates with those calculated x, y values, effectively scaling the coordinates to the size of the image.
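A minimal sketch of that scaling, with placeholder coordinates and an assumed image size of 640 by 480 pixels:

import numpy as np

img_width, img_height = 640, 480       # assumed image size in pixels
X = np.random.rand(50) * 12.5          # placeholder scatter coordinates
Y = np.random.rand(50) * 7.0

# scale factors so that the largest coordinate maps onto the image edge
x_scale = img_width / X.max()
y_scale = img_height / Y.max()

X_scaled = X * x_scale                 # vectorized multiplication over the arrays
Y_scaled = Y * y_scale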

Matplotlib format the scale label

I have searched around SO and haven't been able to find out how to format this text (I've also checked Google and the matplotlib docs).
I'm currently creating a figure and then adding 4 subplots in a 2x2 matrix format, so I'm trying to scale down all the text:
import matplotlib.pyplot as plt

# v, results and bins are defined earlier; results[v] holds the data
# and results[v + '_sim'] the simulated truth
fig = plt.figure()
ax1 = fig.add_subplot(221)
ax1.tick_params(labelsize='xx-small')
ax1.set_title(v, fontdict={'fontsize': 'small'})
ax1.hist(results[v], histtype='bar', label='data', bins=bins, alpha=0.5)
ax1.hist(results[v + '_sim'], histtype='bar', label='truth', bins=bins, alpha=0.8)
ax1.legend(loc='best', fontsize='x-small')
You can set these parameters before plotting:
plt.rcParams['xtick.labelsize'] = "xx-small"
plt.rcParams['ytick.labelsize'] = "xx-small"
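For example, applied before building the 2x2 grid from the question (a minimal sketch with placeholder data):

import numpy as np
import matplotlib.pyplot as plt

# set the tick label sizes globally, before any figure is created
plt.rcParams['xtick.labelsize'] = 'xx-small'
plt.rcParams['ytick.labelsize'] = 'xx-small'

fig, axes = plt.subplots(2, 2)
for ax in axes.flat:
    ax.hist(np.random.randn(100), bins=20, alpha=0.5)   # placeholder histograms
plt.show()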
