reproject mesh points of vtu file using pyvista and pyproj - python

Is there a way to replace the coordinates of a pyvista_ndarray with an ndarray?
My objective is to reproject the point coordinates of a vtu file (unstructured grid). The current coordinates of the mesh object are in the coordinate reference system EPSG 27562 and I want them in EPSG 4326 (WGS84).
For this, I open the vtu file using the pyvista module:
mesh = pv.read("arg_Tm2__t0002.pvtu")
type(mesh.points)
>pyvista.core.pyvista_ndarray.pyvista_ndarray
As a result, mesh.points gives the 3 spatial coordinates. Then I use the pyproj module to reproject the 3 coordinates into EPSG 4326. By combining the three resulting x, y, z numpy.ndarrays, I get a NumPy array with the same shape and size as mesh.points.
# set the pyproj transformer
crs1 = CRS.from_epsg(27562)
crs2 = CRS.from_epsg(4326)
reproj = Transformer.from_crs(crs1, crs2)
# Reprojection
reproj_dataY,reproj_dataX,altitude = reproj.transform(mesh.points[:,0],mesh.points[:,1],mesh.points[:,2])
reprojData = np.column_stack((reproj_dataX,reproj_dataY,altitude))
#compare objects
print('original Mesh points -> ', mesh.points)
print('original Mesh type: ', type(mesh.points))
print('Reprojected points-> ', reprojData)
print('Reprojected type: ', type(reprojData))
Original Mesh points -> [[958427.33 119680.95 2396.288549 ]
[957754.39 120023.85 2430.1833881 ]
[957256.56 120241.02 2112.22953263]
...
[963366.748527 115096.364632 3054.75408138]
[963401.840285 113351.753238 3024.50286566]
[963497.913738 113339.696062 3048.83674197]]
Original Mesh type: <class 'pyvista.core.pyvista_ndarray.pyvista_ndarray'>
Reprojected points-> [[ 6.96487903 45.9823843 2396.288549 ]
[ 6.95646936 45.98581994 2430.1833881 ]
[ 6.95021969 45.98803333 2112.22953263]
...
[ 7.02498443 45.93857775 3054.75408138]
[ 7.02409542 45.92289079 3024.50286566]
[ 7.02532248 45.92273099 3048.83674197]]
Reprojected type: <class 'numpy.ndarray'>
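As an aside (this relies on pyproj's always_xy option rather than anything in the original code), the manual swap of reproj_dataY and reproj_dataX can be avoided by asking the transformer to always take and return coordinates in x, y (lon, lat) order:
# variant of the reprojection above, with axis order forced to x, y
reproj = Transformer.from_crs(crs1, crs2, always_xy=True)
lon, lat, altitude = reproj.transform(mesh.points[:, 0], mesh.points[:, 1], mesh.points[:, 2])
reprojData = np.column_stack((lon, lat, altitude))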
Now, it's time to replace the coordinates of the vtu object:
mesh.points = reprojData
Finally, I check the modified mesh: the X bounds and Y bounds have been modified and the ranges are correct. However, the plot shows a line of points instead of a nice 3D object. :(
Do you have any idea what is wrong? Do you see another way to manage reprojection?

The ranges of the X/Y and Z values after the transformation are significantly different:
>>>np.array(mesh.bounds).reshape((3,-1)).ptp(axis=1)
array([1.22515302e-01, 7.78657599e-02, 2.47978788e+03])
X and Y are now in degrees while Z is still in metres, so the plot is squashed into what looks like a line. The data has to be rescaled (or kept in a metric CRS) before it can be visualised sensibly.
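A minimal sketch of one way to adapt it for a quick visual check (assuming you only want the plot to look right rather than keep geographic units): scale the altitude from metres into degree-like units, since one degree of latitude is roughly 111 km:
import numpy as np

scaled = reprojData.copy()
scaled[:, 2] = scaled[:, 2] / 111000.0   # metres -> approximate degrees
mesh.points = scaled
mesh.plot()
Alternatively, keep everything in metres by reprojecting to a metric CRS (e.g. the appropriate UTM zone) instead of EPSG 4326.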

Related

Exporting 3D points to Blender via the .ply format creates an empty object, while it works in MeshLab

I made a Python script to create a .ply file using the image and the point cloud of a 3D scan, stored as NumPy arrays.
I can open the resulting file.ply in MeshLab. It works well.
But when I import it into Blender, there are no points. The resulting object is empty.
Do you have an idea on how to solve that?
Thanks
# Requires: import os, numpy as np, pandas as pd
# and: from pyntcloud import PyntCloud
def row_col_xyz_to_ply(self, xyz, rgb, name="output"):
    """Converts a numpy (row, col, xyz) cloud of points to ply format
    Parameters:
        xyz (NDArray): 3D points for each image pixel (row, col, (x,y,z))
        rgb (NDArray): RGBA values for each image pixel (row, col, (r,g,b,a))
    Returns:
        None: saves the .ply file to disk instead
    """
    # Extract the indices of the points with actual values (not NaN) in the xyz cloud of points
    points_rows, points_cols = np.where(~np.isnan(xyz[:, :, 0]))
    # Grab the corresponding points in the xyz cloud of points
    points_xyz = xyz[points_rows, points_cols, :]       # n*3 array of 3D points (after NaN filtering)
    # Grab the corresponding pixels in the image
    points_image = rgb[points_rows, points_cols, 0:3]   # n*3 array of RGB values (after NaN filtering)
    # Create a dict of data
    data = {
        'x': points_xyz[:, 0],
        'y': points_xyz[:, 1],
        'z': points_xyz[:, 2],
        'red': points_image[:, 0],
        'green': points_image[:, 1],
        'blue': points_image[:, 2]
    }
    # Convert it to a cloud of points
    cloud = PyntCloud(pd.DataFrame(data=data))
    # Path where to save it
    filename = f"{name}.ply"
    path = os.path.join(self.path_exports, filename)
    # Save it
    cloud.to_file(path)
    # Debug
    print("row_col_xyz_to_ply > saved: ", filename)
The problem is in the Blender .ply importer. It doesn't support points that are not used by any triangle.
I ran into the same problem that Rockcat pointed out. I'm not sure if you're still looking for an answer, but I found that this custom importer works as a workaround. It imports every point as a vertex and doesn't need them to be connected.
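If installing a custom importer is not an option, another possibility (a sketch using Blender's bundled Python API, assuming the point array is also saved separately, e.g. as a hypothetical points.npy) is to build the mesh directly from the vertices inside Blender, so no faces are needed at all:
import bpy
import numpy as np

points_xyz = np.load("points.npy")   # hypothetical path to the saved n*3 point array

# Create a mesh whose vertices are the points; empty edge and face lists
# mean every point stays a loose vertex.
mesh = bpy.data.meshes.new("scan_points")
mesh.from_pydata(points_xyz.tolist(), [], [])
mesh.update()

# Wrap the mesh in an object and link it into the current collection.
obj = bpy.data.objects.new("scan_points", mesh)
bpy.context.collection.objects.link(obj)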

Why latitudes and longitudes are two dimensional arrays in netcdf file?

I have a netCDF file which contains temperature data over some location. The data shape is 1450x900.
I am creating search functionality in my app to locate temperature data by lat/lon values.
So I extracted the lat and lon coordinate data from the netCDF file, but I was expecting 1D arrays and instead got 2D arrays of shape 1450x900 for both coordinates.
So my question: why are they 2D arrays instead of 1450 latitude values and 900 longitude values? Don't 1450 lat values and 900 lon values describe the whole grid?
Let's say we have a 4x5 grid; the indices for locating the rightmost, bottom-most point will be [4, 5]. So my indices for x are [1, 2, 3, 4] and for y: [1, 2, 3, 4, 5]. Nine indices in total are enough to locate any point on that grid (consisting of 20 cells). So why do the lat (x) and lon (y) coordinates in the netCDF file contain 20 values each (40 in total), instead of 4 and 5 values respectively (9 in total)? Hope you get what confuses me.
Is it possible to somehow map those 2D arrays and "downgrade" them to 1450 latitude values and 900 longitude values? Or is it OK as it is right now? How can I use those values for my purpose? Do I need to zip the lat/lon arrays?
Here are the shapes:
>>> DS = xarray.open_dataset('file.nc')
>>> DS.tasmin.shape
(31, 1450, 900)
>>> DS.projection_x_coordinate.shape
(900,)
>>> DS.projection_y_coordinate.shape
(1450,)
>>> DS.latitude.shape
(1450, 900)
>>> DS.longitude.shape
(1450, 900)
Consider that projection_x_coordinate and projection_y_coordinate are easting/northing values, not lat/lons.
Here is the metadata of the file if needed:
Dimensions: (bnds: 2, projection_x_coordinate: 900, projection_y_coordinate: 1450, time: 31)
Coordinates:
* time (time) datetime64[ns] 2018-12-01T12:00:00 ....
* projection_y_coordinate (projection_y_coordinate) float64 -1.995e+0...
* projection_x_coordinate (projection_x_coordinate) float64 -1.995e+0...
latitude (projection_y_coordinate, projection_x_coordinate) float64 ...
longitude (projection_y_coordinate, projection_x_coordinate) float64 ...
Dimensions without coordinates: bnds
Data variables:
tasmin (time, projection_y_coordinate, projection_x_coordinate) float64 ...
transverse_mercator int32 ...
time_bnds (time, bnds) datetime64[ns] ...
projection_y_coordinate_bnds (projection_y_coordinate, bnds) float64 ...
projection_x_coordinate_bnds (projection_x_coordinate, bnds) float64 ...
Attributes:
comment: Daily resolution gridded climate observations
creation_date: 2019-08-21T21:26:02
frequency: day
institution: Met Office
references: doi: 10.1002/joc.1161
short_name: daily_mintemp
source: HadUK-Grid_v1.0.1.0
title: Gridded surface climate observations data for the UK
version: v20190808
Conventions: CF-1.5
Your data adheres to version 1.5 of the Climate and Forecast conventions.
The document describing this version of the conventions is here, although the relevant section is essentially unchanged across many versions of the conventions.
See section 5.2:
5.2. Two-Dimensional Latitude, Longitude, Coordinate Variables
The latitude and longitude coordinates of a horizontal grid that was
not defined as a Cartesian product of latitude and longitude axes, can
sometimes be represented using two-dimensional coordinate variables.
These variables are identified as coordinates by use of the coordinates attribute.
It looks like you are using the HadOBS 1km resolution gridded daily minimum temperature, and this file in particular:
http://dap.ceda.ac.uk/thredds/fileServer/badc/ukmo-hadobs/data/insitu/MOHC/HadOBS/HadUK-Grid/v1.0.1.0/1km/tasmin/day/v20190808/tasmin_hadukgrid_uk_1km_day_20181201-20181231.nc (warning: >300MB download)
As it states, the data is on a transverse mercator grid.
If you look at output from ncdump -h <filename> you will also see the following description of the grid expressed by means of attributes of the transverse_mercator dummy variable:
int transverse_mercator ;
transverse_mercator:grid_mapping_name = "transverse_mercator" ;
transverse_mercator:longitude_of_prime_meridian = 0. ;
transverse_mercator:semi_major_axis = 6377563.396 ;
transverse_mercator:semi_minor_axis = 6356256.909 ;
transverse_mercator:longitude_of_central_meridian = -2. ;
transverse_mercator:latitude_of_projection_origin = 49. ;
transverse_mercator:false_easting = 400000. ;
transverse_mercator:false_northing = -100000. ;
transverse_mercator:scale_factor_at_central_meridian = 0.9996012717 ;
and you will also see that the coordinate variables projection_x_coordinate and projection_y_coordinate have units of metres.
The grid in question is the Ordnance Survey UK grid using numeric grid references.
See for example this description of the OS grid (from Wikipedia).
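If the 2D latitude/longitude variables were not already stored in the file, they could be reconstructed from the 1D projected coordinates. A sketch, assuming the grid mapping above corresponds to EPSG:27700 (OSGB 1936 / British National Grid), whose parameters it matches:
import numpy as np
import xarray as xr
from pyproj import Transformer

ds = xr.open_dataset("tasmin_hadukgrid_uk_1km_day_20181201-20181231.nc")

# 2D easting/northing grids from the 1D projection coordinates (units: metres)
x2d, y2d = np.meshgrid(ds.projection_x_coordinate.values, ds.projection_y_coordinate.values)

# EPSG:27700 (British National Grid) -> EPSG:4326 (WGS84)
to_wgs84 = Transformer.from_crs("EPSG:27700", "EPSG:4326", always_xy=True)
lon2d, lat2d = to_wgs84.transform(x2d, y2d)
# lon2d / lat2d should agree closely with ds.longitude and ds.latitude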
If you wish to express the data on a regular longitude-latitude grid then you will need to do some type of interpolation. I see that you are using xarray. You can combine this with pyresample to do the interpolation. Here is an example:
import xarray as xr
import numpy as np
from pyresample.geometry import SwathDefinition
from pyresample.kd_tree import resample_nearest, resample_gauss
ds = xr.open_dataset("tasmin_hadukgrid_uk_1km_day_20181201-20181231.nc")
# Define a target grid. For sake of example, here is one with just
# 3 longitudes and 4 latitudes.
lons = np.array([-2.1, -2., -1.9])
lats = np.array([51.7, 51.8, 51.9, 52.0])
# The target grid is regular (1-d lon, lat coordinates) but we will need
# a 2d version (similar to the input grid), so use numpy.meshgrid to produce this.
lon2d, lat2d = np.meshgrid(lons, lats)
origin_grid = SwathDefinition(lons=ds.longitude, lats=ds.latitude)
target_grid = SwathDefinition(lons=lon2d, lats=lat2d)
# get a numpy array for the first timestep
data = ds.tasmin[0].to_masked_array()
# nearest neighbour interpolation example
# Note that radius_of_influence has units metres
interpolated = resample_nearest(origin_grid, data, target_grid, radius_of_influence=1000)
# GIVES:
# array([[5.12490065, 5.02715332, 5.36414835],
# [5.08337723, 4.96372838, 5.00862833],
# [6.47538931, 5.53855722, 5.11511239],
# [6.46571817, 6.17949381, 5.87357538]])
# gaussian weighted interpolation example
# Note that radius_of_influence and sigmas both have units metres
interpolated = resample_gauss(origin_grid, data, target_grid, radius_of_influence=1000, sigmas=1000)
# GIVES:
# array([[5.20432465, 5.07436805, 5.39693221],
# [5.09069187, 4.8565934 , 5.08191639],
# [6.4505963 , 5.44018209, 5.13774416],
# [6.47345359, 6.2386732 , 5.62121948]])
I figured out the answer myself.
As it turns out, the 2D lat/lon arrays are used to define the grid of the location.
In other words, if we zip the lat/lon values and project them on a map, we get a "curved grid" (that is, the Earth's curvature is taken into account) over the location, which is then used as the grid reference of the location.
Hope it's clear for anyone interested.
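Coming back to the original search use case: the 2D latitude/longitude arrays can be used as they are for a nearest-cell lookup. A minimal sketch, assuming the file and variable names from the metadata above and a hypothetical target point; the plain squared-degree distance is a rough but adequate metric on a 1 km grid:
import numpy as np
import xarray as xr

ds = xr.open_dataset("file.nc")
target_lat, target_lon = 51.5, -0.1   # hypothetical search point

# squared "distance" in degrees, enough to pick the nearest grid cell
dist2 = (ds.latitude - target_lat) ** 2 + (ds.longitude - target_lon) ** 2
iy, ix = np.unravel_index(np.argmin(dist2.values), dist2.shape)

# daily minimum temperatures at that grid cell
tasmin_at_point = ds.tasmin[:, iy, ix]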

Create surface points with normals on a mesh in python

I would basically like to generate random surface points on the mesh of a 3D object, including surface normals, in Python. I don't have a lot of experience in that field, so can anyone recommend packages, approaches, or methods to solve the task?
I've looked into open3d and trimesh, but still have some trouble.
Thank you!
that's pretty straightforward with trimesh:
In [1]: import trimesh
In [2]: m = trimesh.creation.icosphere()
In [3]: m
Out[3]: <trimesh.base.Trimesh at 0x7f99bec7d550>
In [4]: m.sample?
Signature: m.sample(count, return_index=False)
Docstring:
Return random samples distributed normally across the
surface of the mesh
Parameters
---------
count : int
Number of points to sample
return_index : bool
If True will also return the index of which face each
sample was taken from.
Returns
---------
samples : (count, 3) float
Points on surface of mesh
face_index : (count, ) int
Index of self.faces
File: /media/psf/Dropbox/robotics/trimesh/trimesh/base.py
Type: method
In [5]: points, index = m.sample(1000, return_index=True)
In [6]: points
Out[6]:
array([[ 0.79934465, 0.58103816, 0.1308479 ],
[-0.07373243, 0.08338232, 0.99055759],
[ 0.71660325, 0.21369974, 0.65889903],
...,
[-0.08330094, 0.98915598, 0.10205582],
[ 0.2558548 , -0.68523377, -0.6770221 ],
[-0.11483521, 0.97023335, 0.19696663]])
In [8]: normals = m.face_normals[index]
In [9]: normals
Out[9]:
array([[ 0.79167915, 0.58957859, 0.16012871],
[-0.04950537, 0.06905681, 0.99638365],
[ 0.73810358, 0.21732806, 0.63872656],
...,
[-0.06905681, 0.99638365, 0.04950537],
[ 0.23921922, -0.7022584 , -0.67052763],
[-0.08142553, 0.97123096, 0.22378629]])
You could get nicer normals by finding the barycentric coordinates of each point and then interpolating vertex normals, but just using the face normals is super easy.
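For completeness, here is a sketch of that barycentric interpolation, continuing from the session above and using trimesh's points_to_barycentric helper:
import numpy as np
import trimesh

m = trimesh.creation.icosphere()
points, face_index = m.sample(1000, return_index=True)

# barycentric coordinates of each sample inside the triangle it came from
bary = trimesh.triangles.points_to_barycentric(m.triangles[face_index], points)

# weight the three vertex normals of each face by the barycentric coordinates
vertex_normals = m.vertex_normals[m.faces[face_index]]     # shape (n, 3, 3)
normals = (vertex_normals * bary[:, :, None]).sum(axis=1)  # shape (n, 3)
normals /= np.linalg.norm(normals, axis=1, keepdims=True)  # re-normalise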
I would use matplotlib. Maybe 'Tri-Surface plots' is what you need on this page? You can Ctrl+F 'Tri-Surface plots' to find it quickly and look at the image it generates to see whether it's what you are looking for.

With max/min lat and lon and number of grid points, how to get lat/lon grid?

I'm downloading a netCDF dataset for contour plotting and analysis but cannot get the data projection quite right. The file says that it is a Lambert Conformal Projection and provides the lat/lon min/max:
geospatial_lat_min: 17.5812268306
geospatial_lat_max: 55.5349606426
geospatial_lon_min: -140.027321405
geospatial_lon_max: -57.2098720419
However, the x and y data points are a bit confusing (y shown below for reference):
print data.variables['y'][:]
[ -4.26348724e+02 -4.06030731e+02 -3.85712738e+02 -3.65394714e+02
-3.45076721e+02 -3.24758728e+02 -3.04440735e+02 -2.84122711e+02
-2.63804718e+02 -2.43486725e+02 -2.23168716e+02 -2.02850723e+02
-1.82532715e+02 -1.62214722e+02 -1.41896713e+02 -1.21578720e+02
-1.01260719e+02 -8.09427185e+01 -6.06247177e+01 -4.03067169e+01
-1.99887161e+01 3.29285234e-01 2.06472855e+01 4.09652863e+01
6.12832870e+01 8.16012878e+01 1.01919289e+02 1.22237289e+02
1.42555298e+02 1.62873291e+02 1.83191299e+02 2.03509293e+02
2.23827301e+02 2.44145294e+02 2.64463287e+02 2.84781311e+02
3.05099304e+02 3.25417297e+02 3.45735291e+02 3.66053314e+02
3.86371307e+02 4.06689301e+02 4.27007294e+02 4.47325317e+02
4.67643311e+02 4.87961304e+02 5.08279297e+02 5.28597290e+02
5.48915283e+02 5.69233337e+02 5.89551331e+02 6.09869324e+02
6.30187317e+02 6.50505310e+02 6.70823303e+02 6.91141296e+02
7.11459290e+02 7.31777344e+02 7.52095337e+02 7.72413330e+02
7.92731323e+02 8.13049316e+02 8.33367310e+02 8.53685303e+02
8.74003296e+02 8.94321350e+02 9.14639343e+02 9.34957336e+02
9.55275330e+02 9.75593323e+02 9.95911316e+02 1.01622931e+03
1.03654736e+03 1.05686536e+03 1.07718335e+03 1.09750134e+03
1.11781934e+03 1.13813733e+03 1.15845532e+03 1.17877332e+03
1.19909131e+03 1.21940930e+03 1.23972729e+03 1.26004529e+03
1.28036328e+03 1.30068140e+03 1.32099939e+03 1.34131738e+03
1.36163538e+03 1.38195337e+03 1.40227136e+03 1.42258936e+03
1.44290735e+03 1.46322534e+03 1.48354333e+03 1.50386133e+03
1.52417932e+03 1.54449731e+03 1.56481531e+03 1.58513330e+03
1.60545129e+03 1.62576941e+03 1.64608740e+03 1.66640540e+03
1.68672339e+03 1.70704138e+03 1.72735938e+03 1.74767737e+03
1.76799536e+03 1.78831335e+03 1.80863135e+03 1.82894934e+03
1.84926733e+03 1.86958533e+03 1.88990332e+03 1.91022131e+03
1.93053931e+03 1.95085742e+03 1.97117542e+03 1.99149341e+03
2.01181140e+03 2.03212939e+03 2.05244727e+03 2.07276538e+03
2.09308325e+03 2.11340137e+03 2.13371948e+03 2.15403735e+03
2.17435547e+03 2.19467334e+03 2.21499146e+03 2.23530933e+03
2.25562744e+03 2.27594531e+03 2.29626343e+03 2.31658130e+03
2.33689941e+03 2.35721729e+03 2.37753540e+03 2.39785327e+03
2.41817139e+03 2.43848950e+03 2.45880737e+03 2.47912549e+03
2.49944336e+03 2.51976147e+03 2.54007935e+03 2.56039746e+03
2.58071533e+03 2.60103345e+03 2.62135132e+03 2.64166943e+03
2.66198730e+03 2.68230542e+03 2.70262329e+03 2.72294141e+03
2.74325928e+03 2.76357739e+03 2.78389551e+03 2.80421338e+03
2.82453149e+03 2.84484937e+03 2.86516748e+03 2.88548535e+03
2.90580347e+03 2.92612134e+03 2.94643945e+03 2.96675732e+03
2.98707544e+03 3.00739331e+03 3.02771143e+03 3.04802930e+03
3.06834741e+03 3.08866553e+03 3.10898340e+03 3.12930151e+03
3.14961938e+03 3.16993750e+03 3.19025537e+03 3.21057349e+03
3.23089136e+03 3.25120947e+03 3.27152734e+03 3.29184546e+03
3.31216333e+03 3.33248145e+03 3.35279932e+03 3.37311743e+03
3.39343530e+03 3.41375342e+03 3.43407153e+03 3.45438940e+03
3.47470752e+03 3.49502539e+03 3.51534351e+03 3.53566138e+03
3.55597949e+03 3.57629736e+03 3.59661548e+03 3.61693335e+03
3.63725146e+03 3.65756934e+03 3.67788745e+03 3.69820532e+03
3.71852344e+03 3.73884155e+03 3.75915942e+03 3.77947754e+03
3.79979541e+03 3.82011353e+03 3.84043140e+03 3.86074951e+03
3.88106738e+03 3.90138550e+03 3.92170337e+03 3.94202148e+03
3.96233936e+03]
Edit: The x/y projected coordinates are defined to have units of km.
I have the x and y lengths (301 and 217, respectively) and feel like I could define the lat and lon values manually or convert the projection using PyProj. However, I'm a bit lost on where to start.
My initial thought was to use the min lat and lon and the average difference between grid points:
for i in range(0, len(lat_vals)):
    lat_vals[i] = 16.28100013732909 + 0.18297297297*i
for j in range(0, len(lon_vals)):
    lon_vals[j] = -139.9440104734173 + 0.27634406361*j
But that was before I remembered that the longitude spacing will not be constant with latitude.
Thanks for your help!
For anyone who is using siphon and TDSCatalog to grab data via the THREDDS server at thredds.ucar.edu (as I was), you can use the .add_lonlat('true') option on your query to get lat/lon coordinates. That solved it!
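If you are reading the file directly (without siphon/THREDDS), another route is to build the Lambert Conformal projection with pyproj and transform the projected x/y grid yourself. This is only a sketch: the projection parameters below are placeholders and should be read from the file's grid-mapping variable, and data is the netCDF Dataset from the question:
import numpy as np
from pyproj import CRS, Transformer

# Placeholder parameters: read the real standard parallels and origin from the
# file's grid-mapping variable (standard_parallel, longitude_of_central_meridian,
# latitude_of_projection_origin).
lcc = CRS.from_proj4("+proj=lcc +lat_1=25 +lat_2=25 +lat_0=25 +lon_0=-95 +ellps=WGS84 +units=km")
to_wgs84 = Transformer.from_crs(lcc, CRS.from_epsg(4326), always_xy=True)

x = data.variables['x'][:]   # projected x, km
y = data.variables['y'][:]   # projected y, km
x2d, y2d = np.meshgrid(x, y)
lon2d, lat2d = to_wgs84.transform(x2d, y2d)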

How do I make perspective transform of point with x and y coordinate

So I wrote this little program which allows me to select 4 points on two images.
Using those points I get a transformation matrix. After that I select a point on one of the images and want to get a visualization of where that point will be on the other image.
Say my point is marked like this -> (x, y), so it's a tuple. How should I format this "position" so that it can be transformed?
I have looked at the documentation for the perspectiveTransform() method and figured that I should store the point in the following shape:
numpy.array([
    [self.points[self.length-1][0]],
    [self.points[self.length-1][1]]
], dtype="float32")
Which would give me on a single click this format:
Point= [[ 2300.]
[ 634.]]
This format doesn't seem to work. I use this transformation matrix:
M = [[ -1.71913123e+00 -4.76850572e+00 5.27968944e+03]
[ 2.07693562e-01 -1.09738424e+01 6.35222770e+03]
[ 1.02865125e-04 -4.80067600e-03 1.00000000e+00]]
in this method (and get the following error):
cv2.perspectiveTransform(src, M)
OpenCV Error: Assertion failed (scn + 1 == m.cols) in cv::perspectiveTransform, file C:\builds\master_PackSlaveAddon-win64-vc12-static\opencv\modules\core\src\matmul.cpp
Any advice or tip is welcome.
I figured out the answer.
Found it on this link
The key is to put your point like this:
pts = numpy.array([[x,y]], dtype = "float32")
And then call another numpy.array on existing variable pts:
pts = numpy.array([pts])
The procedure is the same after this.
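Putting the pieces together, a minimal self-contained sketch (the matrix and point values are just placeholders taken from the question):
import numpy as np
import cv2

# placeholder 3x3 transformation matrix and clicked point from the question
M = np.array([[-1.71913123e+00, -4.76850572e+00, 5.27968944e+03],
              [ 2.07693562e-01, -1.09738424e+01, 6.35222770e+03],
              [ 1.02865125e-04, -4.80067600e-03, 1.00000000e+00]], dtype="float32")

pts = np.array([[2300.0, 634.0]], dtype="float32")   # shape (1, 2)
pts = np.array([pts])                                # shape (1, 1, 2), as perspectiveTransform expects

dst = cv2.perspectiveTransform(pts, M)
print(dst)   # transformed point, shape (1, 1, 2)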
