Create surface points with normals on a mesh in Python

I basically would like to generate random surface points on a mesh of a 3D object, including surface normals, in Python. I don't have much experience in this field, so can anyone recommend packages, approaches, or methods to solve this task?
I have looked into open3d and trimesh, but I'm still having some trouble.
Thank you!

That's pretty straightforward with trimesh:
In [1]: import trimesh
In [2]: m = trimesh.creation.icosphere()
In [3]: m
Out[3]: <trimesh.base.Trimesh at 0x7f99bec7d550>
In [4]: m.sample?
Signature: m.sample(count, return_index=False)
Docstring:
Return random samples distributed normally across the
surface of the mesh
Parameters
---------
count : int
Number of points to sample
return_index : bool
If True will also return the index of which face each
sample was taken from.
Returns
---------
samples : (count, 3) float
Points on surface of mesh
face_index : (count, ) int
Index of self.faces
File: /media/psf/Dropbox/robotics/trimesh/trimesh/base.py
Type: method
In [5]: points, index = m.sample(1000, return_index=True)
In [6]: points
Out[6]:
array([[ 0.79934465, 0.58103816, 0.1308479 ],
[-0.07373243, 0.08338232, 0.99055759],
[ 0.71660325, 0.21369974, 0.65889903],
...,
[-0.08330094, 0.98915598, 0.10205582],
[ 0.2558548 , -0.68523377, -0.6770221 ],
[-0.11483521, 0.97023335, 0.19696663]])
In [8]: normals = m.face_normals[index]
In [9]: normals
Out[9]:
array([[ 0.79167915, 0.58957859, 0.16012871],
[-0.04950537, 0.06905681, 0.99638365],
[ 0.73810358, 0.21732806, 0.63872656],
...,
[-0.06905681, 0.99638365, 0.04950537],
[ 0.23921922, -0.7022584 , -0.67052763],
[-0.08142553, 0.97123096, 0.22378629]])
You could get nicer normals by finding the barycentric coordinates of each point and then interpolating vertex normals, but just using the face normals is super easy.
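For the smoother normals mentioned above, here is a minimal NumPy sketch of just the interpolation step. It assumes you have already obtained, for one sampled point, the three vertex normals of its face and the point's barycentric coordinates (trimesh can compute the latter, e.g. with trimesh.triangles.points_to_barycentric); the numbers below are made up for illustration:

```python
import numpy as np

# hypothetical data: the three vertex normals of one face, and the
# barycentric coordinates of a sample point inside that face
vertex_normals = np.array([
    [0.0, 0.0, 1.0],
    [1.0, 0.0, 0.0],
    [0.0, 1.0, 0.0],
])
bary = np.array([0.5, 0.25, 0.25])  # non-negative weights summing to 1

# weighted sum of the vertex normals, renormalized to unit length
n = bary @ vertex_normals
n /= np.linalg.norm(n)
print(n)
```

The renormalization matters: a convex combination of unit vectors is generally shorter than unit length.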

I would use matplotlib. Maybe 'Tri-Surface plots' is what you need on this page here? You can Ctrl+F 'Tri-Surface plots' to quickly find it and look at the image it generates to see if it's what you are looking for.

Related

reproject mesh points of vtu file using pyvista and pyproj

Is there a chance I can replace the coordinates of a pyvista_ndarray with an ndarray?
My objective is to reproject the point coordinates of a vtu file (unstructured grid). The current coordinates of the mesh object are in the coordinate reference system EPSG 27562 and I want them in EPSG 4326 (WGS84).
For this, I open the vtu file using the pyvista module:
mesh = pv.read("arg_Tm2__t0002.pvtu")
type(mesh.points)
>pyvista.core.pyvista_ndarray.pyvista_ndarray
As a result, mesh.points gives the 3 spatial coordinates. Then, I use the pyproj module to reproject the 3 coordinates into EPSG 4326. By combining the 3 resulting x,y,z numpy.ndarrays, I get a NumPy array with the same shape and size as mesh.points.
# set the pyproj transformer
crs1 = CRS.from_epsg(27562)
crs2 = CRS.from_epsg(4326)
reproj = Transformer.from_crs(crs1, crs2)
# Reprojection
reproj_dataY,reproj_dataX,altitude = reproj.transform(mesh.points[:,0],mesh.points[:,1],mesh.points[:,2])
reprojData = np.column_stack((reproj_dataX,reproj_dataY,altitude))
#compare objects
print('original Mesh points -> ', mesh.points)
print('original Mesh type: ', type(mesh.points))
print('Reprojected points-> ', reprojData)
print('Reprojected type: ', type(reprojData))
Original Mesh points -> [[958427.33 119680.95 2396.288549 ]
[957754.39 120023.85 2430.1833881 ]
[957256.56 120241.02 2112.22953263]
...
[963366.748527 115096.364632 3054.75408138]
[963401.840285 113351.753238 3024.50286566]
[963497.913738 113339.696062 3048.83674197]]
Original Mesh type: <class 'pyvista.core.pyvista_ndarray.pyvista_ndarray'>
Reprojected points-> [[ 6.96487903 45.9823843 2396.288549 ]
[ 6.95646936 45.98581994 2430.1833881 ]
[ 6.95021969 45.98803333 2112.22953263]
...
[ 7.02498443 45.93857775 3054.75408138]
[ 7.02409542 45.92289079 3024.50286566]
[ 7.02532248 45.92273099 3048.83674197]]
Reprojected type: <class 'numpy.ndarray'>
Now it's time to replace the coordinates of the vtu object:
mesh.points = reprojData
Finally, I check the modified mesh: the X bounds and Y bounds have been modified and the ranges are correct. However, the plot shows a line of points instead of a nice 3D object. :(
Do you have any idea what is wrong? Do you see another way to manage reprojection?
The ranges of the X, Y and Z values after the transformation are significantly different:
>>>np.array(mesh.bounds).reshape((3,-1)).ptp(axis=1)
array([1.22515302e-01, 7.78657599e-02, 2.47978788e+03])
X and Y are indeed in degrees while Z is still in metres. A visual representation of this data is unrealistic unless the data is rescaled accordingly.
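One simple way to get a plottable mesh again is to rescale the Z axis into degree-like units before assigning the points back. A rough NumPy sketch (the 111320 metres-per-degree factor is only an approximation, not an exact conversion):

```python
import numpy as np

# sample of the reprojected points: X, Y in degrees, Z still in metres
reprojData = np.array([
    [6.96487903, 45.9823843, 2396.288549],
    [6.95646936, 45.98581994, 2430.1833881],
    [6.95021969, 45.98803333, 2112.22953263],
])

METRES_PER_DEGREE = 111320.0  # rough metres per degree of latitude

scaled = reprojData.copy()
scaled[:, 2] /= METRES_PER_DEGREE  # bring Z to the same order of magnitude as X/Y

# the per-axis ranges are now comparable, so the plot is no longer a "line"
print(np.ptp(scaled, axis=0))
```

Alternatively, you can keep the metre values in the mesh and stretch the Z axis only at render time (pyvista plotters have a set_scale method for this).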

3D interpolation between two clouds of points

I want to interpolate a set of temperatures, defined on each node of a CFD simulation mesh, onto a different mesh.
Data from the original set are in CSV (X1,Y1,Z1,T1) and I want to find new T2 values on an X2,Y2,Z2 mesh.
Of the many possibilities that SciPy provides, which is the most suitable for this application? What are the differences between a linear and a nearest-node approach?
Thank you for your time.
EDIT
Here is an example:
import numpy as np
from scipy.interpolate import griddata
from scipy.interpolate import LinearNDInterpolator
data = np.array([
[ -3.5622760653000E-02, 8.0497122655290E-02, 3.0788827491158E-01],
[ -3.5854682326000E-02, 8.0591522802259E-02, 3.0784350432341E-01],
[ -2.8168760240000E-02, 8.0819296043557E-02, 3.0988532075795E-01],
[ -2.8413346037000E-02, 8.0890746063578E-02, 3.1002054434659E-01],
[ -2.8168663383000E-02, 8.0981744777379E-02, 3.1015319609412E-01],
[ -3.4150537103000E-02, 8.1385114641365E-02, 3.0865343388355E-01],
[ -3.4461673349000E-02, 8.1537336777452E-02, 3.0858242919307E-01],
[ -3.4285601228000E-02, 8.1655884824782E-02, 3.0877386496235E-01],
[ -2.1832991391000E-02, 8.0380712111108E-02, 3.0867371621337E-01],
[ -2.1933870390000E-02, 8.0335713699008E-02, 3.0867959866155E-01]])
temp = np.array([1.4285955811000E+03,
1.4281038818000E+03,
1.4543135986000E+03,
1.4636379395000E+03,
1.4624763184000E+03,
1.3410919189000E+03,
1.3400545654000E+03,
1.3505817871000E+03,
1.2361110840000E+03,
1.2398562012000E+03])
linInter= LinearNDInterpolator(data, temp)
print (linInter(np.array([[-2.8168760240000E-02, 8.0819296043557E-02, 3.0988532075795E-01]])))
This code works, but I have a dataset of 10 million points to be interpolated onto a dataset of the same size.
The problem is that this operation is very slow for all of my points: is there a way to improve my code?
I used LinearNDInterpolator because it seems to be faster than NearestNDInterpolator (LinearVSNearest).
One solution would be to use RegularGridInterpolator (if your grid is regular). Another approach I can think of is to reduce your data size by taking intervals:
step = 4  # you can increase this based on your data size (e.g. 100)
m = ((data.argsort(0) % step) == 0).any(1)  # boolean mask keeping a subsample of the points
linInter = LinearNDInterpolator(data[m], temp[m])
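If nearest-node accuracy is acceptable, a k-d tree query also scales far better than Delaunay-based linear interpolation at the 10-million-point scale, since the tree is built once and each lookup is then roughly logarithmic. A sketch reusing a few of the points from the example above:

```python
import numpy as np
from scipy.spatial import cKDTree

# a few of the source points and temperatures from the example above
data = np.array([
    [-3.5622760653000e-02, 8.0497122655290e-02, 3.0788827491158e-01],
    [-2.8168760240000e-02, 8.0819296043557e-02, 3.0988532075795e-01],
    [-2.1832991391000e-02, 8.0380712111108e-02, 3.0867371621337e-01],
])
temp = np.array([1.4285955811000e+03, 1.4543135986000e+03, 1.2361110840000e+03])

tree = cKDTree(data)            # built once; queries are then very fast
query = np.array([[-2.8168760240000e-02, 8.0819296043557e-02, 3.0988532075795e-01]])
dist, idx = tree.query(query)   # nearest source node for each query point
print(temp[idx])                # temperature of the nearest node
```

This is effectively what NearestNDInterpolator does internally, but querying the tree directly avoids any interpolation overhead.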

Creating temporary curve in maya python plugin

I would like to get a world position of a specific parameter along a nurbs curve defined by a list of positions.
Currently I'm creating a temporary curve in the plugin just to get an array of positions along this curve:
# targetXforms = array of MPoints
# knots = knots list
# uValue = array of floats (0->1)
#
curveFn = om.MFnNurbsCurve()
curveFn.create(targetXforms, knots, 3, om.MFnNurbsCurve.kOpen, False, False, nullObj)
for i in range(numRefXforms):
    point = curveFn.getPointAtParam(uValue[i])
    print point
Is there a better way to do this (i.e. not have the overhead of creating a curve)? Some math libraries perhaps?
If not how do I delete this curve so I don't have a curve created every time the plugin is evaluated (MDGModifier seems to be a bit crashy)
Also, is there a way to find length along a curve for a given parameter value. Maya 2016 Extension 2 has a function for this:
MFnNurbsCurve::findLengthFromParam()
But we don't have this extension yet. :(
Thanks in advance!
If you provide a Nurbs curve data object to MFnNurbsCurve.create() as the parent, instead of leaving it null, then the data doesn't appear as a curve in the scene graph and therefore you don't have to remove it.
Try this:
import maya.api.OpenMaya as om
import pymel.core as pm
pts = ( [ 0,0,0 ], [ 0,10,0 ], [ 0,10,10 ], [ 0,0,10 ])
knots = [0,0,0,1,1,1]
curveFn = om.MFnNurbsCurve()
dataCreator = om.MFnNurbsCurveData()
curveDataObject = dataCreator.create()
curveFn.create(pts, knots, 3, om.MFnNurbsCurve.kOpen, False, False, curveDataObject)
for i in range(11):
    point = curveFn.getPointAtParam(i / 10.0)
    pm.spaceLocator(p=(point.x, point.y, point.z))
To get the arc length at a parameter without the API, you could create an arcLengthDimension node. It means you would have to create a curve in the scene and connect it up.
dimNode = pm.arcLengthDimension( 'curveShape1.u[0]' )
dimNode.uParamValue.set( 0.5 )
print(dimNode.arcLength.get())
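Without the API call or an arcLengthDimension node, arc length up to a parameter can also be approximated by evaluating the curve at many parameters and summing chord lengths. A NumPy sketch with a hypothetical evaluation function standing in for curveFn.getPointAtParam (here a straight segment, so the true length is known):

```python
import numpy as np

def arc_length(eval_curve, u_max, samples=1000):
    """Approximate arc length from u=0 to u=u_max by summing chord lengths."""
    u = np.linspace(0.0, u_max, samples)
    pts = np.array([eval_curve(t) for t in u])
    seg = np.diff(pts, axis=0)                  # vectors between consecutive samples
    return np.sum(np.linalg.norm(seg, axis=1))  # total polyline length

# stand-in curve: the straight segment (0,0,0) -> (3,4,0), true length 5.0
line = lambda t: (3.0 * t, 4.0 * t, 0.0)
print(arc_length(line, 1.0))
```

In the plugin, eval_curve would simply call curveFn.getPointAtParam; increasing samples tightens the approximation.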

How do I make perspective transform of point with x and y coordinate

So I wrote this little program which allows me to select 4 points on two images.
Using those points I get a transformation matrix. After that I select a point on one of the images and want to get a visualization of where that point will be on the other image.
Say my point is marked like this -> (x,y) - so it's a tuple. How should I format this "position" on the image so it can be transformed?
I have looked at the documentation for the perspectiveTransform() method and figured that I should store it in the following shape:
numpy.array([
    [self.points[self.length-1][0]],
    [self.points[self.length-1][1]]
], dtype="float32")
On a single click, this would give me the following format:
Point= [[ 2300.]
[ 634.]]
This format doesn't seem to work. I use this transformation matrix:
M = [[ -1.71913123e+00 -4.76850572e+00 5.27968944e+03]
[ 2.07693562e-01 -1.09738424e+01 6.35222770e+03]
[ 1.02865125e-04 -4.80067600e-03 1.00000000e+00]]
in this method (and get the following error):
cv2.perspectiveTransform(src, M)
OpenCV Error: Assertion failed (scn + 1 == m.cols) in cv::perspectiveTransform, file C:\builds\master_PackSlaveAddon-win64-vc12-static\opencv\modules\core\src\matmul.cpp
Any advice or tip is welcome.
I figured out the answer.
Found it on this link
The key is to put your point like this:
pts = numpy.array([[x,y]], dtype = "float32")
And then wrap the existing variable pts in another numpy.array call:
pts = numpy.array([pts])
The procedure is the same after this.
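For reference, the same mapping can also be computed by hand with plain NumPy, which makes the underlying operation explicit: each point is lifted to homogeneous coordinates, multiplied by M, and divided by the last component. A sketch using the matrix from the question:

```python
import numpy as np

M = np.array([[-1.71913123e+00, -4.76850572e+00, 5.27968944e+03],
              [ 2.07693562e-01, -1.09738424e+01, 6.35222770e+03],
              [ 1.02865125e-04, -4.80067600e-03, 1.00000000e+00]])

def warp_point(x, y, M):
    # lift to homogeneous coordinates, apply the homography, dehomogenize
    vec = M @ np.array([x, y, 1.0])
    return vec[:2] / vec[2]

print(warp_point(2300.0, 634.0, M))
```

This is exactly what cv2.perspectiveTransform does for each point; the fiddly part there is only the required array shape.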

In Python, what is a good way to match expected values to real values?

Given a dictionary of ideal x,y locations, I have a list of unordered real x,y locations that are close to the ideal locations, and I need to assign them to the corresponding ideal-location dictionary key. Sometimes I get no data at all (0,0) for a given location.
An example dataset is:
idealLoc= {1:(907,1026),
2:(892,1152),
3:(921,1364),
4:(969,1020),
5:(949,1220),
6:(951,1404),
'No_Data':(0,0)}
realLoc = [[ 892., 1152.],
[ 969., 1021.],
[ 906., 1026.],
[ 949., 1220.],
[ 951., 1404.],
[ 0., 0.]]
The output would be a new dictionary with the real locations assigned to the correct dictionary key from idealLoc. I have considered the brute force approach (scan the whole list n times for each best match), but I was wondering if there is a more elegant/efficient way?
Edit: Below is the "brute" force method
Dest = {}
dp = 6
for (y, x) in realLoc:
    for key, (r, c) in idealLoc.items():
        if x > c - dp and x < c + dp and y > r - dp and y < r + dp:
            Dest[key] = [y, x]
            break
K-d trees are an efficient way to partition data in order to perform fast nearest-neighbour searches. You can use scipy.spatial.cKDTree to solve your problem like this:
import numpy as np
from scipy.spatial import cKDTree
# convert inputs to numpy arrays
ilabels, ilocs = (np.array(vv) for vv in zip(*idealLoc.items()))
rlocs = np.array(realLoc)
# construct a K-d tree that partitions the "ideal" points
tree = cKDTree(ilocs)
# query the tree with the real coordinates to find the nearest "ideal" neighbour
# for each "real" point
dist, idx = tree.query(rlocs, k=1)
# get the corresponding labels and coordinates
print(ilabels[idx])
# ['2' '4' '1' '5' '6' 'No_Data']
print(ilocs[idx])
# [[ 892 1152]
# [ 969 1020]
# [ 907 1026]
# [ 949 1220]
# [ 951 1404]
# [ 0 0]]
By default cKDTree uses the Euclidean norm as the distance metric, but you could also specify the Manhattan norm, max norm etc. by passing the p= keyword argument to tree.query().
There is also the scipy.interpolate.NearestNDInterpolator class, which is basically just a convenience wrapper around scipy.spatial.cKDTree.
Assuming you want to use Euclidean distance, you can use scipy.spatial.distance.cdist to compute the distance matrix and then choose the nearest point.
import numpy
from scipy.spatial import distance
ideal = numpy.array(list(idealLoc.values()))
real = numpy.array(realLoc)
dist = distance.cdist(ideal, real)
nearest_indexes = dist.argmin(axis=0)
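With either answer, the last step is turning the nearest-neighbour indices back into the requested dictionary. A small self-contained sketch building on the cdist variant (Python 3, so .values() is wrapped in list()):

```python
import numpy as np
from scipy.spatial import distance

idealLoc = {1: (907, 1026), 2: (892, 1152), 'No_Data': (0, 0)}
realLoc = [[892.0, 1152.0], [906.0, 1026.0], [0.0, 0.0]]

keys = list(idealLoc)                     # dict keys in insertion order
ideal = np.array(list(idealLoc.values()))
real = np.array(realLoc)

# one row per ideal point, one column per real point
nearest = distance.cdist(ideal, real).argmin(axis=0)

# map each real location to the key of its nearest ideal location
Dest = {keys[j]: realLoc[i] for i, j in enumerate(nearest)}
print(Dest)
```

Note that argmin, unlike the brute-force loop, always assigns some key even when the real point is far from every ideal location; add a distance threshold if that matters.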
