I would like to make a contourf plot of (lat, depth, temp) with vertical spacing similar to the figure below (the temperature varies more near the surface than at depth, so I want to emphasize that region).
My depth array is not uniform (i.e. depth = [5,15,...,4975,5185,...]), and I want the plot to use this non-uniform vertical spacing.
I would also like to show yticks = [10,100,500,1000,1500,2000,3000,4000,5000], but the depth array does not contain those exact values.
import matplotlib.pyplot as plt
import numpy as np

# lat, depth and temp come from my dataset
z = np.arange(0, 50)  # dummy y-axis with uniform spacing
pos = [0, 2, 5, 10, 15, 20, 30, 40, 48]  # indices of the yticks I want (not all of them)
cs = plt.contourf(lat, z, temp)  # temp is a variable with dimensions (lat, depth)
plt.colorbar()
plt.yticks(z[pos], depth[pos].astype(int))  # replace the dummy z values with the real depths
plt.gca().invert_yaxis()
plt.grid(linestyle=':')
plt.gca().set(ylabel='depth (m)', xlabel='Latitude')
[Figure: Potential Temperature of the Atlantic Ocean]
Per the matplotlib docs on yticks, you can specify the labels you want to use. In your case, if you want to show the labels [10,100,500,1000,1500,2000,3000,4000,5000], you can simply pass that list as the second argument to plt.yticks(), like so:
plt.yticks(z[pos], [10,100,500,1000,1500,2000,3000,4000,5000])
and it will display the yticks accordingly. The difficulty is in specifying the positions: since the depth array does not have points corresponding exactly to the desired ytick values, you will need to interpolate to find the exact positions at which to place the labels (unless the approximate positions given in pos are already sufficient, in which case the above suffices).
If the depth data are not uniformly spaced, you can use numpy.interp to perform the interpolation, as shown below:
import matplotlib.pyplot as plt
import numpy as np

# Create some depth data that is not uniformly spaced over [0, 5500]
depth = [(np.random.random() - 0.5) * 25 + ii for ii in np.linspace(0, 5500, 50)]
lat = np.linspace(-75, 75, 50)
z = np.linspace(0, 50, 50)
yticks = [10, 100, 500, 1000, 1500, 2000, 3000, 4000, 5000]

# Interpolate the desired depths to find the corresponding z-positions
pos = np.interp(yticks, depth, z)

temp = np.outer(lat, z)  # arbitrarily populate temp for demonstration
cs = plt.contourf(lat, z, temp)
plt.colorbar()
plt.yticks(pos, yticks)  # place the yticks at the interpolated z-positions
plt.gca().invert_yaxis()
plt.grid(linestyle=':')
plt.gca().set(ylabel='Depth (m)', xlabel='Latitude')
plt.show()
This finds the exact positions where the yticks would fall if the depth array had data at those values, and places the labels accordingly.
I have a heatmap whose ticks have unequal deltas between themselves: for example, in the attached image, the deltas range between 0.015 and 0.13. The current scale doesn't show the real scenario, since all cells are drawn with equal sizes.
Is there a way to place the ticks at their realistic positions, such that the cell sizes would also change accordingly?
Alternatively, is there another method to generate this figure such that it provides a realistic representation of the tick values?
As mentioned in the comments, a Seaborn heatmap uses categorical labels. However, the underlying structure is a pcolormesh, which can have a different size for each cell.
Also mentioned in the comments: updating the private attributes of the pcolormesh isn't recommended. Moreover, the heatmap can be created directly by calling pcolormesh.
Note that if there are N cells, there will be N+1 boundaries. The example code below supposes you have x-positions for the centers of the cells; it calculates boundaries midway between successive cells, and the first and last distances are repeated for the outermost boundaries.
The ticks and tick labels for the x and y axes are set from the given x-values, which are taken to be the cell centers.
import matplotlib.pyplot as plt
import numpy as np
import seaborn as sns

sns.set()
N = 10
xs = np.random.uniform(0.015, 0.13, N).cumsum().round(3)  # some random x values (cell centers)
values = np.random.rand(N, N)  # a random matrix

# set bounds in the middle of successive cells; repeat the first and last distance at the ends
bounds = (xs[:-1] + xs[1:]) / 2
bounds = np.concatenate([[2 * bounds[0] - bounds[1]], bounds, [2 * bounds[-1] - bounds[-2]]])

fig, ax = plt.subplots()
ax.pcolormesh(bounds, bounds, values)
ax.set_xticks(xs)
ax.set_xticklabels(xs, rotation=90)
ax.set_yticks(xs)
ax.set_yticklabels(xs, rotation=0)
plt.tight_layout()
plt.show()
PS: In case the ticks are meant to be the boundaries, the code can be simplified. One extra boundary is needed, for example a zero at the start:
bounds = np.concatenate([[0], xs])
ax.tick_params(bottom=True, left=True)
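For completeness, a minimal sketch of what that simplified variant might look like end to end (same random data as above; treating the ticks as the cell boundaries is the assumption here):
import matplotlib.pyplot as plt
import numpy as np
import seaborn as sns

sns.set()
N = 10
xs = np.random.uniform(0.015, 0.13, N).cumsum().round(3)  # random boundary positions
values = np.random.rand(N, N)

bounds = np.concatenate([[0], xs])  # the ticks are the boundaries; prepend zero as the first one

fig, ax = plt.subplots()
ax.pcolormesh(bounds, bounds, values)
ax.set_xticks(bounds)
ax.set_xticklabels(bounds, rotation=90)
ax.set_yticks(bounds)
ax.tick_params(bottom=True, left=True)  # the seaborn style hides tick marks by default
plt.tight_layout()
plt.show()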
I am trying to plot graphs in Matplotlib and embed them into a PyQt5 GUI. Everything is working fine, except that my y axis has loads of leading zeros which I cannot seem to get rid of.
I have tried googling how to format the axis, but nothing seems to work. I can't set the ticks directly because there's no way of determining what they will be, as I am going to be working with data sets of varying sizes.
import math
import numpy as np

num_bins = 50
# create an axis
ax = self.figure.add_subplot(111)
# discard the old graph
ax.clear()
# draw the bars and legend
colours = ['blue', 'red']
ax.hist(self.histoSets, num_bins, density=True, histtype='bar', color=colours, label=colours)
ax.legend(prop={'size': 10})
# set x ticks; avoid shadowing the built-in min/max
dataMin, dataMax = self.getMinMax()
scaleMax = math.ceil(dataMax / 10000) * 10000
scaleMin = math.floor(dataMin / 10000) * 10000
scaleRange = scaleMax - scaleMin
ax.xaxis.set_ticks(np.arange(scaleMin, scaleMax + 1, scaleRange / 4))
# refresh canvas
self.draw()
All those numbers on your y-axis are tiny, i.e. on the order of 1e-5. This is because the integral of the density is defined to be 1, and your x-axis spans such a large range.
I can mostly reproduce your plot with:
import matplotlib.pyplot as plt
import numpy as np
y = np.random.normal([190000, 220000], 20000, (5000, 2))
a, b, c = plt.hist(y, 40, density=True)
giving me a plot that looks much like yours.
The tuple returned from hist contains useful information: notably, the first element (a above) holds the densities, and the second element (b above) holds the bin edges it picked. You can see that this all sums to one by doing:
sum(a[0] * np.diff(b))
and getting 1 back.
As ImportanceOfBeingErnest says, you can use tight_layout() to resize the plot if it doesn't fit into the available area.
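For example, a minimal standalone sketch of that suggestion, reusing the random data from above (the styling choices are illustrative, not the asker's exact setup):
import matplotlib.pyplot as plt
import numpy as np

y = np.random.normal([190000, 220000], 20000, (5000, 2))
fig, ax = plt.subplots()
ax.hist(y, 40, density=True, color=['blue', 'red'], label=['blue', 'red'])
ax.legend(prop={'size': 10})
fig.tight_layout()  # shrink margins so the tick labels fit inside the drawing area
plt.show()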
My question is, as far as I can see, closely related to this post.
I need to plot some data with marker size strictly proportional to the value of the axes (I already asked the question here).
My approach is the following:
- create an empty scatterplot for pixel reference
- scatter 2 points on the lower left and upper right corners
- limit the axes to exactly these two points
- use transData.transform to get the pixel values of those two points
- get the pixel distance as the difference in pixel coordinates of these two points
- (now that I have the distance-to-pixel ratio, scatter my data with s=(size*dist_to_pix_ratio)**2; but that is not important right now)
The problem is that when I do exactly what I've described, I get two different pixel values for the y-axis and the x-axis.
Here is a minimal example:
import matplotlib.pyplot as plt
fig = plt.figure(figsize=(7,7))
ax1 = fig.add_subplot(111, aspect='equal')
#setting up an empty scatterplot for pixel reference
xedges=[0.0, 1.0]
yedges=[0.0, 1.0]
emptyscatter=ax1.scatter(xedges, yedges, s=0.0)
#set axes limits
ax1.set_xlim(0.00,1.00)
ax1.set_ylim(0.00,1.00)
# Calculating the ratio of pixel-to-unit
upright = ax1.transData.transform((1.0,1.0))
lowleft = ax1.transData.transform((0.0,0.0))
x_to_pix_ratio = upright[0] - lowleft[0]
y_to_pix_ratio = upright[1] - lowleft[1]
print x_to_pix_ratio, y_to_pix_ratio
which returns:
434.0 448.0
Can anybody explain why they are not equal?
I'm not sure if it's relevant, but I'm using Python 2.7.12 and matplotlib 1.5.1
I would like to plot a 2D rectangular discretization mesh with non-regular x and y axis values, e.g. the typical discretization meshes used in CFD.
An example of the code may be:
import matplotlib
import matplotlib.pyplot as plt

fig = plt.figure(1, figsize=(12, 8))
axes = fig.add_subplot(111)
matplotlib.rcParams.update({'font.size': 17})
axes.set_xticks(self.xPoints)
axes.set_yticks(self.yPoints)
plt.grid(color='black', linestyle='-', linewidth=1)
myName = "2D.jpg"
fig.savefig(myName)
where self.xPoints and self.yPoints are 1D non-regular vectors.
This piece of code produces a good discretization mesh; the problem is the xtick and ytick labels, because they appear for every value of xPoints and yPoints and overlap each other.
How can I easily redefine the values printed on the axes?
Let's say I only want to show the minimum and maximum value for x and y, and not all values from the discretization mesh.
I can't post an example figure because it is the first time I ask something here (I can send it by mail if requested).
The problem is that you explicitly told matplotlib to label each point when you wrote:
axes.set_xticks(self.xPoints)
axes.set_yticks(self.yPoints)
Comment out those lines and see what the result looks like.
Of course, if you only want the first and last point labelled, it becomes:
axes.set_xticks([self.xPoints[0], self.xPoints[-1]])
...
If the gridlines are specified via axes.set_xticks(), I don't think it is possible to show the ticks without overlap in your case.
I may have a solution for you:
...
import matplotlib.pyplot as plt

ax = plt.gca()

# Arr_y: y-direction data, 1D numpy array or list.
for j in range(len(Arr_y)):
    plt.hlines(y=Arr_y[j], xmin=Arr_x.min(), xmax=Arr_x.max(), color='black')

# Arr_x: x-direction data, 1D numpy array or list.
for i in range(len(Arr_x)):
    plt.vlines(x=Arr_x[i], ymin=Arr_y.min(), ymax=Arr_y.max(), color='black')

# Customise your ticks here, 1D numpy array or list.
ax.set_xticks(Arr_xticks)
ax.set_yticks(Arr_yticks)

plt.xlim(Arr_x.min(), Arr_x.max())
plt.ylim(Arr_y.min(), Arr_y.max())
plt.show()
...
hlines and vlines draw horizontal and vertical lines; you can specify those lines with the boundary data in both the x and y directions.
I tried it with a 60×182 non-uniform mesh grid, which took about 1.2 s; hope I can post a picture here.
I'm plotting a 2D histogram as an image in pyqtgraph. I would like to set the axis scales correctly (i.e. representing the actual values of the binned data).
I found this article but I'm not quite sure how to translate it to my case.
I do:
import numpy as np
import pyqtgraph as pg

h = np.histogram2d(x, y, 30, density=True)
w = pg.ImageView(view=pg.PlotItem())
w.setImage(h[0])
but the scale of the PlotItem axes runs from 0 to 30 (the number of bins), which is not what I would like.
You need to set the position and scale of the image. The link you provided has the following code:
view.setImage(img, pos=[x0, y0], scale=[xscale, yscale])
You only need to determine the correct values of [x0, y0] and [xscale, yscale] from the bin edges returned in h[1] and h[2].
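For instance, a minimal sketch of how those values could be derived from the histogram output (this assumes uniform bins; the random data and variable names are illustrative):
import numpy as np
import pyqtgraph as pg

x = np.random.normal(size=10000)
y = np.random.normal(size=10000)
h = np.histogram2d(x, y, 30, density=True)

app = pg.mkQApp()
w = pg.ImageView(view=pg.PlotItem())

x0, y0 = h[1][0], h[2][0]  # lower-left corner = first bin edges
xscale = (h[1][-1] - h[1][0]) / h[0].shape[0]  # bin width in data units
yscale = (h[2][-1] - h[2][0]) / h[0].shape[1]

w.setImage(h[0], pos=[x0, y0], scale=[xscale, yscale])
w.show()
app.exec_()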