Triangular mesh in Python

I have points in a plane. I want each point to be a vertex of at least one triangle. I also want to fill the plane confined within the vertices with no overlapping triangles. Something like:
Note that for each pair of points in any triangle, the line connecting them consists of points that are also in some triangle.
I want to get the list of all vertex triplets from which a triangle can be defined. In the picture, there would be 6 such triplets.
My attempt is this one:
indices = range(locs.shape[0])
visited = []
ind_list = []
for ind in indices:
    if ind not in visited:
        visited.append(ind)
        nearest_idx = np.argsort(distances[ind])[1:3]
        for ni in nearest_idx:
            visited.append(ni)
        ind_list.append([ind] + list(nearest_idx))
where locs is an N x 2 array containing the (x, y) coordinates of each vertex, and distances is an N x N matrix whose (i, j) component is the Euclidean distance between the i-th and the j-th vertex. Note that ind_list is a list of index triplets, so the vertices can be recovered with locs[ind_list]. What I want is the correct ind_list. In my case I clearly omit some of the triplets. An example of this failure can be seen in this figure:
where there are blank regions. Instead I want all the space to be filled with no overlapping triangles. Any idea of how to achieve this? Thanks a lot!
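A common way to get such a triangulation is scipy.spatial.Delaunay, which covers the convex hull of the points with non-overlapping triangles and uses the points themselves as vertices. A minimal sketch, assuming locs is the N x 2 coordinate array described above:
import numpy as np
from scipy.spatial import Delaunay

# locs: (N, 2) array of (x, y) coordinates
tri = Delaunay(locs)
# each row of tri.simplices is a triplet of indices into locs defining one triangle
ind_list = tri.simplices.tolist()
triangle_coords = locs[tri.simplices]   # (n_triangles, 3, 2) corner coordinates
Note that the Delaunay triangulation fills the convex hull of the point set, so if the outline of the points is non-convex it will also contain triangles outside the intended region.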

Related

How to ignore implicit zeros with scipy.sparse.csr_matrix.min?

Goal
I have a list of about 500K points in 3D space. I want to find the two coordinates with the maximum first nearest neighbor distance.
Approach
I am using scipy to calculate a sparse distance matrix:
from scipy.spatial import cKDTree

tree = cKDTree(points, 40)                     # 40 is the leafsize
spd = tree.sparse_distance_matrix(tree, 0.01)  # 0.01 is the max_distance cutoff
spo = spd.tocsr()
spo.eliminate_zeros()
I eliminate the explicitly stored zeros to get rid of the diagonal elements, where the distance between each point and itself is stored.
I now wanted to find the coordinates of the minimum distance in each row/column, which should correspond to the first nearest neighbor of each point, with something like:
spo.argmin(axis=0)
By finding the maximum distance for the elements in this array I should be able to find the two elements with the maximum first nearest neighbor distance.
The problem
The issue is that the min and argmin functions of scipy.sparse.csr_matrix also take the implicit zeros into account, which for this application I do not want. How do I solve this issue? With this huge matrix, performance and memory are both issues. Or is there an entirely different approach to what I want to do?
I didn't find a solution with the distance matrix, but it appears I overlooked the most obvious solution: using the query method of the tree.
So to find the maximum distance between first nearest neighbors I did (with vectors a numpy array of shape (N, 3)):
import numpy as np
from scipy.spatial import cKDTree

tree = cKDTree(vectors, leaf_size)
# indices of the first nearest neighbor of each point;
# we use k=2 because the k=1 result is the point itself at distance 0
nn1 = tree.query(vectors, k=2)[1][:, 1]
# the corresponding coordinates: row i of nn1_vec is the first nearest
# neighbor of point i in "vectors"
nn1_vec = vectors[nn1]
# distance between each point and its first nearest neighbor
nn_dist = np.sqrt(np.sum((vectors - nn1_vec)**2, axis=1))
# maximum first-nearest-neighbor distance
max_nn_dist = np.max(nn_dist)
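Since query also returns the distances alongside the indices, the explicit norm computation can be skipped; a slightly shorter variant of the same idea:
# query returns (distances, indices); column 1 skips the point itself (distance 0)
dists, idx = tree.query(vectors, k=2)
max_nn_dist = dists[:, 1].max()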

How to check for vertex matches and replace the repeat index on the edge?

After developing code for Delaunay triangulation, I made a list of node coordinates and found that it contains a few duplicate nodes.
So I removed the duplicate nodes:
import numpy as np
import matplotlib.tri as tri

distort = tri.Triangulation(mesh_x, mesh_y)  # triangulation
# build the list of node coordinates
data = np.array([mesh_x, mesh_y])
data = np.transpose(data)
# np.unique removes the duplicated nodes (and sorts the rows)
unique_data = np.unique(data, axis=0)
Now I have a problem with the edges that connect the nodes. After removing the duplicate nodes, the edges are reallocated, and instead of a smooth grid I got something like the first image.
How can I check for vertex matches and replace the repeated index on each edge to get a smooth grid (like the second image)?
Every time you remove a node, you need to subtract one from every triangle vertex index that is greater than the index of the removed node.
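An alternative that avoids the manual bookkeeping is to let np.unique build the mapping for you via return_inverse; a minimal sketch, assuming distort.triangles holds the triangle indices into the original node list (matplotlib's Triangulation exposes this as .triangles):
import numpy as np

# data: the original (N, 2) node coordinates, possibly containing duplicates;
# distort.triangles: (M, 3) vertex indices into those original nodes
unique_data, inverse = np.unique(data, axis=0, return_inverse=True)
inverse = np.ravel(inverse)   # ensure a flat index array
# inverse[k] is the row of unique_data that original node k maps to,
# so every edge and triangle keeps pointing at the same geometric node
new_triangles = inverse[distort.triangles]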

Selecting faces of a mesh based on vertices coordinates in numpy

I have two numpy arrays, one is for 3D vertices of a mesh, call it vert and one is for the triangular faces, call it faces:
The vert array is an N x 3 array of floats, hence N three-dimensional points. The x coordinate of each point can be either positive or negative.
As a pure example this can be the vert array:
[[ 2.886495 24.886948 15.909558]
[ -13.916695 -58.985245 19.655312]
[ 40.415527 8.968353 8.515955]
...
[ 13.392465 -58.20602 18.752457]
[ -12.504704 -58.307934 18.912386]
[ 13.322185 -58.52817 19.165733]]
Since the mesh is centered, the left part of the mesh is the one with positive x components, and the corresponding vertex indices are found with np.where:
i_vert_left = np.where(vert[:,0]>0)[0]
I would now like to select those faces whose triangle vertices lie entirely at positive x.
However I have a problem in doing this indexing operation correctly.
My first attempt was to subset the faces such that their corresponding vertices have x>0
faces_left = np.asarray([f for f in faces if np.all(np.isin(f, i_vert_left))])
but the operation is incredibly slow on large meshes.
How can I exploit a smart indexing of the faces?
Assuming faces is an M x 3 array of integers indexing the three vertices of each triangle, I think you should just need:
# Check whether each vertex is left or not
vert_left_mask = vert[:, 0] > 0
# Check whether each face has all vertices on left or not
faces_left_mask = np.all(vert_left_mask[faces], axis=1)
# Select resulting left faces
faces_left = faces[faces_left_mask]
The main "trick" here is in vert_left_mask[faces], which replaces each integer vertex number with a boolean indicating whether the vertex is left or not, so it's easy to tell which face is fully left with np.all.

Finding Intersections: Region-Based Trajectories vs. Line Trajectories

I have two trajectories (i.e. two lists of points), and I am trying to find the intersection points of these trajectories. However, if I represent the trajectories as lines, I might miss real-world intersections (near misses).
What I would like to do is to represent the line as a polygon with certain width around the points and then find where the two polygons intersect with each other.
I am using the Python spatial library, but I was wondering if anyone has done this before. Here is a picture of the line segments, which don't intersect because they just miss each other. Below is sample code with the data that represents the trajectories of the two objects.
import numpy as np
import matplotlib.pyplot as plt

object_trajectory = np.array([[-3370.00427248, 3701.46800775],
[-3363.69164715, 3702.21408203],
[-3356.31277271, 3703.06477984],
[-3347.25951787, 3704.10740164],
[-3336.739511 , 3705.3958357 ],
[-3326.29355823, 3706.78035903],
[-3313.4987339 , 3708.2076586 ],
[-3299.53433345, 3709.72507366],
[-3283.15486406, 3711.47077376],
[-3269.23487255, 3713.05635557]])
target_trajectory=np.array([[-3384.99966703, 3696.41922372],
[-3382.43687562, 3696.6739521 ],
[-3378.22995178, 3697.08802862],
[-3371.98983789, 3697.71490469],
[-3363.5900481 , 3698.62666805],
[-3354.28520354, 3699.67613798],
[-3342.18581931, 3701.04853915],
[-3328.51519511, 3702.57528111],
[-3312.09691577, 3704.41961271],
[-3297.85543763, 3706.00878621]])
plt.plot(object_trajectory[:, 0], object_trajectory[:, 1], color='b')
plt.plot(target_trajectory[:, 0], target_trajectory[:, 1], color='r')
Let's say you have two lines defined by numpy arrays x1, y1, x2, and y2.
import numpy as np
You can create an array distances[i, j] containing the distances between the ith point in the first line and the jth point in the second line.
distances = ((x1[:, None] - x2[None, :])**2 + (y1[:, None] - y2[None, :])**2)**0.5
Then you can find indices where distances is less than some threshold you want to define for intersection. If you're thinking of the lines as having some thickness, the threshold would be half of that thickness.
threshold = 0.1
intersections = np.argwhere(distances < threshold)
intersections is now an N by 2 array containing all point pairs that are considered to be "intersecting" (intersections[i, 0] is the index from the first line, and intersections[i, 1] is the index from the second line). If you want the set of all indices from each line that are intersecting, you can use something like
first_intersection_indices = np.asarray(sorted(set(intersections[:, 0])))
second_intersection_indices = np.asarray(sorted(set(intersections[:, 1])))
From here, you can also determine how many distinct intersections there are by collapsing each run of consecutive indices to its center value.
L1 = []
current_intersection = []
for idx in first_intersection_indices:
    if len(current_intersection) == 0:
        current_intersection.append(idx)
    elif idx == current_intersection[-1] + 1:
        # still in the same run of consecutive indices
        current_intersection.append(idx)
    else:
        # run ended: keep its center index as one intersection
        L1.append(int(np.median(current_intersection)))
        current_intersection = [idx]
if current_intersection:
    L1.append(int(np.median(current_intersection)))
print(len(L1))
You can use these to print the coordinates of each intersection.
for i in L1:
    print(x1[i], y1[i])
Turns out that the shapely package already has a ton of convenience functions that get me very far with this.
from shapely.geometry import Point, LineString, MultiPoint

# self.line is assumed to be a LineString (i.e. a line trajectory).
# buffer() generates a nicely interpolated bounding polygon around the trajectory.
region_polygon = self.line.buffer(self.lane_width)
# Now identify all points of the other trajectory that intersect region_polygon.
# You can also use .intersection if you want to buffer both trajectories into
# polygons and compute the intersecting polygon instead.
is_in_region = [region_polygon.intersects(point) for point in points]
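For reference, a small self-contained variant of the same idea applied to the two sample trajectories from the question (the buffer width of 2.0 is an arbitrary assumption):
from shapely.geometry import LineString

# object_trajectory and target_trajectory are the (N, 2) arrays from the question
width = 2.0
object_region = LineString(object_trajectory).buffer(width)
target_region = LineString(target_trajectory).buffer(width)

# the overlap of the two buffered regions marks where the trajectories (nearly) meet
overlap = object_region.intersection(target_region)
if not overlap.is_empty:
    print("near-intersection around", overlap.centroid)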

Alternating triangulation pattern on square lattice

I have an array of points in Python forming a square lattice. I want to triangulate it like this:
so that the braces alternate in a checkerboard pattern from square to square.
My attempts have been based on deforming it before triangulation. For example, shearing the lattice before triangulating via
import numpy as np
import scipy.spatial

xy_skew = np.dstack((xypts[:, 0] + 0.1*xypts[:, 1], xypts[:, 1]))[0]
tri = scipy.spatial.Delaunay(xy_skew)
TRI = tri.vertices
can give me all 'rightward' diagonals or all 'leftward' diagonals, but I haven't found a deformation that can lead to the desired triangulation.
How can I do this efficiently (for lattices of ~million points)?
If relevant, the indices of my points increase in Y first, then increase in X.
Thanks!
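Since the lattice is regular, one way to avoid Delaunay entirely is to build the triangle index triplets directly and flip the diagonal based on the checkerboard parity of each square. A vectorized sketch, assuming an nx-by-ny lattice whose point index is i*ny + j (y increasing first, as described):
import numpy as np

def checkerboard_triangles(nx, ny):
    # lower-left corner of every unit square; points are assumed indexed as
    # i*ny + j with the y index j varying fastest (as described in the question)
    i, j = np.meshgrid(np.arange(nx - 1), np.arange(ny - 1), indexing='ij')
    i, j = i.ravel(), j.ravel()
    a = i * ny + j        # lower-left corner
    b = a + 1             # upper-left  (next point in y)
    c = a + ny            # lower-right (next column in x)
    d = c + 1             # upper-right
    right = (i + j) % 2 == 0   # checkerboard mask: which diagonal each square gets

    tris = np.empty((a.size, 2, 3), dtype=np.int64)
    # squares braced along the a-d diagonal
    tris[right, 0] = np.column_stack((a, c, d))[right]
    tris[right, 1] = np.column_stack((a, d, b))[right]
    # squares braced along the b-c diagonal
    tris[~right, 0] = np.column_stack((a, c, b))[~right]
    tris[~right, 1] = np.column_stack((b, c, d))[~right]
    return tris.reshape(-1, 3)   # (2*(nx-1)*(ny-1), 3) array of vertex triplets
The resulting array plays the same role as tri.vertices from the Delaunay attempt, and building it this way scales to lattices of around a million points without any geometric computation.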
