Plotting a grid in Python

I'm trying to start a 2D contour plot for a flow net, and I'm having trouble getting the initial grid to show up properly.
Given the number of columns and the number of rows, how can I write a function that will plot a grid so that all points in the given range appear?
I tried plotting for 4 columns and 3 rows of points by doing this:
r = 3
c = 4
x = [i for i in range(c)]
y = [i for i in range(r)]
plot(x,y,'ro')
grid()
show()
and get this error:
'ValueError: x and y must have same first dimension'
So I tried testing it on a 4x4 grid and got close to what I want; however, it only plots the points (0,0), (1,1), (2,2), and (3,3).
However, I also want the points (0,0), (1,0), (2,0), (3,0), (0,1), (1,1), ..., (3,2), (3,3) to appear, as I will later need to plot vectors from each point indicating the direction of flow for my flow net.
Sorry, I know my terminology isn't that great. Does anyone know how to do this and how to make it work for grids that aren't square?

You could use itertools.product to generate the desired points.
Use plt.scatter to plot the points.
Use plt.quiver to plot the vector field (relevant code adapted from these SO answers).
import numpy as np
import matplotlib.pyplot as plt
import itertools
r = 3
c = 4
x = np.linspace(0, c, c+1)
y = np.linspace(0, r, r+1)
pts = itertools.product(x, y)
plt.scatter(*zip(*pts), marker='o', s=30, color='red')
X, Y = np.meshgrid(x, y)
deg = np.arctan(Y**3 - 3*Y-X)
QP = plt.quiver(X, Y, np.cos(deg), np.sin(deg))
plt.grid()
plt.show()
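If you prefer to stay in NumPy, the same set of grid points can be built with np.meshgrid plus ravel instead of itertools.product. A sketch; here np.arange is used so exactly c columns and r rows of points appear:

```python
import numpy as np

r, c = 3, 4

# meshgrid builds the full 2D coordinate matrices; ravel flattens them
# into the 1D point lists that plt.scatter expects
X, Y = np.meshgrid(np.arange(c), np.arange(r))
xs, ys = X.ravel(), Y.ravel()
```

The pair (xs, ys) then holds all r*c grid points, including the off-diagonal ones like (1, 0) and (3, 2).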

r = 3
c = 4
x = [i % c for i in range(r*c)]
y = [i // c for i in range(r*c)]  # integer division (Python 3)
print(x)
print(y)
Gives:
[0, 1, 2, 3, 0, 1, 2, 3, 0, 1, 2, 3]
[0, 0, 0, 0, 1, 1, 1, 1, 2, 2, 2, 2]
When used to draw the graph as you did, this produces the desired result.
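The same index arithmetic can also be written with divmod, which returns the quotient and remainder in one call (a Python 3 sketch of the trick above):

```python
r, c = 3, 4

# divmod(i, c) is (i // c, i % c), i.e. the (row, column) of point i
coords = [divmod(i, c) for i in range(r * c)]
y = [row for row, col in coords]
x = [col for row, col in coords]
```

This yields the same two lists as the print output above.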

The first two arguments specify your x and y components, so the number of points must match. I think what you want is something like:
from itertools import product
import numpy as np
import matplotlib.pyplot as plt
points = np.array(list(product(range(3), range(4))))
plt.plot(points[:,0],points[:,1],'ro')
plt.show()

Related

How to generate an array with points of this curve?

I want to code a program that generates an array of coordinates to follow for drawing a shape like the white one here; the blue points are given. Does anyone know how to do something like that, or can at least give me a tip?
You could use e.g. InterpolatedUnivariateSpline to interpolate the points. As these spline functions are usually 1D, you can calculate the x and y positions separately, each as a function of a new variable t going from 0 to 1.
import matplotlib.pyplot as plt
import numpy as np
from scipy import interpolate
# positions of the given points
px = [1, 4, 3, 2, 5]
py = [1, 3, 4, 3, 1]
# 5 t-values, at t=0 in point 1, at t=1 reaching point 5
pt = np.linspace(0, 1, len(px))
# sx and sy are functions that interpolate the points at the given t-values
sx = interpolate.InterpolatedUnivariateSpline(pt, px)
sy = interpolate.InterpolatedUnivariateSpline(pt, py)
# calculate many intermediate values
t = np.linspace(0, 1, 500)
x = sx(t)
y = sy(t)
# show the original points together with the spline
fig, ax = plt.subplots(facecolor='black')
ax.axis('off')
plt.scatter(px, py, s=80, color='skyblue')
plt.plot(x, y, color='white')
for i, (xi, yi) in enumerate(zip(px, py), start=1):
    ax.text(xi, yi, f'\n {i}', ha='left', va='center', size=30, color='yellow')
plt.show()
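If the outline to draw is closed, InterpolatedUnivariateSpline will leave a visible seam between the last and first point. A possible alternative is scipy's splprep/splev with per=True, which fits a smooth periodic spline; a sketch using the same hypothetical points as above, with the first repeated to close the loop:

```python
import numpy as np
from scipy import interpolate

# the given points, with the first one repeated at the end to close the outline
px = [1, 4, 3, 2, 5, 1]
py = [1, 3, 4, 3, 1, 1]

# per=True requests a smooth *periodic* spline through the points;
# s=0 forces exact interpolation rather than smoothing
tck, u = interpolate.splprep([px, py], s=0, per=True)

# evaluate many positions along the closed curve
t = np.linspace(0, 1, 200)
x, y = interpolate.splev(t, tck)
```

Because the spline is periodic, the curve at t=0 and t=1 meets smoothly, with no kink at the starting point.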

What are the required data for pyplot trisurf?

I cannot work out how pyplot's trisurf works. All the examples I have seen on the Internet use numpy, pandas and other libraries, which makes it harder to understand the tool itself.
The pyplot docs say it requires X, Y and Z as 1D arrays. But if I try to provide them, it raises a RuntimeError: "Error in qhull Delaunay triangulation calculation: singular input data (exitcode=2); use python verbose option (-v) to see original qhull error". I tried using a Python list and numpy's arange.
What are exactly those 1D arrays the tool wants me to provide?
plot_trisurf, when no explicit triangles are given, connects nearby 3D points with triangles to form a surface. X is a 1D array (or a list) of the x-coordinates of these points (and similarly for Y and Z); the n-th point uses the n-th x, the n-th y and the n-th z.
It doesn't work when all points lie on the same 3D line. For example, setting X, Y and Z each to [1, 2, 3] gives the points P1=(1,1,1), P2=(2,2,2), P3=(3,3,3), which form a line, not a triangle; that is what triggers the "singular input data" qhull error. A simple working example would be `ax.plot_trisurf([0, 1, 1], [0, 0, 1], [1, 2, 3])`.
Here is an example:
from mpl_toolkits import mplot3d
import matplotlib.pyplot as plt
from math import sin, cos, pi
fig = plt.figure(figsize=(14, 9))
ax1 = fig.add_subplot(1, 2, 1, projection='3d')
ax1.plot_trisurf([0, 1, 1], [0, 0, 1], [1, 2, 3],
                 facecolor='cornflowerblue', edgecolor='crimson', alpha=0.4, linewidth=4, antialiased=True)
ax2 = fig.add_subplot(1, 2, 2, projection='3d')
N = 12
X = [0] + [sin(a * 2 * pi / N) for a in range(N)]
Y = [0] + [cos(a * 2 * pi / N) for a in range(N)]
Z = [1] + [0 for a in range(N)]
ax2.plot_trisurf(X, Y, Z,
                 facecolor='cornflowerblue', edgecolor='crimson', alpha=0.4, linewidth=4, antialiased=True)
plt.show()
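If qhull's automatic triangulation fails (as with the collinear points from the question), you can bypass it by supplying the triangles yourself via the triangles keyword, as index triples into X, Y and Z. A minimal sketch:

```python
from mpl_toolkits import mplot3d  # registers the 3d projection
import matplotlib.pyplot as plt

# four corner points of a bent sheet; two triangles sharing the edge (0, 2)
X = [0, 1, 1, 0]
Y = [0, 0, 1, 1]
Z = [0, 1, 1, 0]
triangles = [[0, 1, 2], [0, 2, 3]]  # each row = indices of one triangle

fig = plt.figure()
ax = fig.add_subplot(projection='3d')
ax.plot_trisurf(X, Y, Z, triangles=triangles)
plt.show()
```

With explicit triangles no Delaunay triangulation is computed, so the "singular input data" error cannot occur.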

Matplotlib plot is plotting the wrong way

Using numpy and matplotlib, I'm trying to plot a polyfitted set of data points:
x = [0, 5, 10, 15, 20]
y = [0, 0.07, 0.14, 0.2, 0.27]
Using this code:
import numpy as np
import matplotlib.pyplot as plt
x = [0, 5, 10, 15, 20]
y = [0, 0.07, 0.14, 0.2, 0.27]
poly = np.polyfit(x, y, 1)
f = np.poly1d(poly)
plt.plot(f)
plt.show()
The variable f in the above code is 0.0134 x + 0.002. This polynomial, when plotted, is supposed to slope upward to the right. But when I plot it, it shows this:
What could be wrong with the code?
What you see is a plot of the coefficients of the linear function f, not of its values. It is the same as plotting two points:
plt.plot([0.0134, 0.002])
This happens because f is converted to list inside plt.plot:
print(list(f))
[0.0134, 0.002]
The points are displayed with coordinates (0, 0.0134) and (1, 0.002), because 0 and 1 are default x-values in plt.plot.
What you want is to evaluate f at the points x and plot the resulting values:
plt.plot(x, [f(xi) for xi in x])
[f(xi) for xi in x] can be shortened to just f(x), because f accepts array arguments, so the code becomes:
plt.plot(x, f(x))
as already mentioned in other answers.
Because f is a linear function, just two points are enough: x[0] is the first point and x[-1] is the last:
plt.plot([x[0], x[-1]], [f(x[0]), f(x[-1])])
You need to pass x values into the polynomial to get the corresponding y values:
plt.plot(x, f(x)) # this should solve your issue
If you print f, it shows poly1d([0.0134, 0.002]). So if you try to plot that, it draws a line between 0.0134 and 0.002 over the [0, 1] interval.
What you really want to do is evaluate f at x:
plt.plot(x, f(x))
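For this degree-1 fit two points suffice, but for higher-degree fits it is safer to evaluate the polynomial on a dense grid. A sketch (the quadratic degree here is purely for illustration):

```python
import numpy as np
import matplotlib.pyplot as plt

x = [0, 5, 10, 15, 20]
y = [0, 0.07, 0.14, 0.2, 0.27]

f = np.poly1d(np.polyfit(x, y, 2))     # quadratic fit, for illustration
xs = np.linspace(min(x), max(x), 100)  # 100 evenly spaced sample points

plt.plot(x, y, 'ro')       # original data
plt.plot(xs, f(xs), 'b-')  # smooth fitted curve
plt.show()
```

The dense xs grid makes the curvature visible, which two endpoints alone cannot show.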

The convention for how connecting lines connect the dots

I have two matrices, x and y. x has 10 rows and 50 columns, and so does y.
My data is paired row by row, meaning that:
x[0][:] <-> y[0][:]
x[1][:] <-> y[1][:]
x[2][:] <-> y[2][:]
......
x[9][:] <-> y[9][:]
When I use the following command to plot:
plot(x[:][:], y[:][:], 'b-o')
or
plot(x, y, 'b-o')
the '-' connects the dots in the vertical direction, like the following:
However, when I plot only one row of the signal:
plot(x[0][:], y[0][:], 'b-o')
it looks correct:
I would like the '-' to connect the dots in a horizontal fashion, something like this:
Instead of doing a for loop, how do I do it in matrix format? Thanks.
Make some data to demonstrate.
import numpy as np
from matplotlib import pyplot as plt
# np.array instead of the deprecated np.matrix
x = np.array(
    [
        [1, 1, 1, 1],
        [2, 2, 2, 2],
        [3, 3, 3, 3],
        [4, 4, 4, 4]
    ]
)
y = x.transpose()
# Horizontal lines (matplotlib plots each *column* as one line, and here
# each column of x varies while the matching column of y is constant):
plt.plot(x, y, 'b-o')
plt.show()
# Vertical lines (swap the roles of x and y):
plt.plot(y, x, 'b-o')
plt.show()
# Together (this is what I think you want)
plt.plot(y, x, 'b-o')
plt.plot(x, y, 'b-o')
plt.show()
If you try to concatenate them to do it in one large matrix, it does some seemingly silly things, connecting a couple of points that we really do not want connected.
# sillyness
x1 = np.concatenate((x, y), axis=0)
y1 = np.concatenate((y, x), axis=0)
plt.plot(x1, y1, 'b-o')
plt.show()
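To get both sets of grid lines out of a single plot call without those spurious segments, a common trick is to insert NaN values between the rows; matplotlib simply leaves gaps where the data is NaN. A sketch, assuming plain np.array data:

```python
import numpy as np
import matplotlib.pyplot as plt

x = np.arange(1, 5)
X, Y = np.meshgrid(x, x)

def with_breaks(a, b):
    """Flatten the rows of a and b, appending a NaN after each row so
    matplotlib does not connect the end of one row to the start of the next."""
    nan_col = np.full((a.shape[0], 1), np.nan)
    return (np.hstack([a, nan_col]).ravel(),
            np.hstack([b, nan_col]).ravel())

# horizontal lines: walk along each row of X
hx, hy = with_breaks(X, Y)
# vertical lines: walk along each column, i.e. the rows of the transposes
vx, vy = with_breaks(X.T, Y.T)

plt.plot(np.concatenate([hx, vx]), np.concatenate([hy, vy]), 'b-o')
plt.show()
```

Everything ends up in one line object, but the NaN separators keep the rows visually disconnected.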

How to get unit vectors using `quiver()` in matplotlib?

I'm trying to wrap my head around the quiver function to plot vector fields. Here's a test case:
import numpy as np
import matplotlib.pyplot as plt
X, Y = np.mgrid[1:1.5:0.5, 1:1.5:0.5]
print(X)
print(Y)
u = np.ones_like(X)
v = np.zeros_like(Y)
plt.quiver(X,Y, u, v)
plt.axis([0, 3, 0, 3], units='xy', scale=1.)
plt.show()
I am trying to get a vector of length 1, pointing from (1,0) to (2,0), but here is what I get:
I have tried adding the scale='xy' option, but the behaviour doesn't change. So how does this work?
The first (funny) mistake is that you passed the quiver arguments to the axis call. ;-)
Next, looking at the documentation, it says
If scale_units is ‘x’ then the vector will be 0.5 x-axis units. To plot vectors in the x-y plane, with u and v having the same units as x and y, use angles='xy', scale_units='xy', scale=1.
So let's do as the documentation tells us,
import numpy as np
import matplotlib.pyplot as plt
X, Y = np.mgrid[1:1.5:0.5, 1:1.5:0.5]
u = np.ones_like(X)
v = np.zeros_like(Y)
plt.quiver(X,Y, u, v, units='xy', angles='xy', scale_units='xy', scale=1.)
plt.axis([0, 3, 0, 3])
plt.show()
and indeed we get a one unit long arrow:
