Algorithm for Chinese Postman where some edges are optional - python

I have a graph that contains edges that must be visited, as well as edges that are optional. The edges have varying weights and can be traveled in either direction and as many times as required. I am trying to determine the route that minimises the total weight.
As I understand it, the Chinese Postman Problem deals with graphs where every edge must be visited at least once. Can anyone tell me if the variant described above has a 'name', or point me in the direction of algorithms that might deal with solving this type of graph?
I am attempting to program a solution in Python, so any solutions that use it would be great; otherwise I'm sure I will be able to work through a solution.
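This variant, in which only a subset of the edges must be traversed while the rest may be used freely, is commonly known as the Rural Postman Problem (NP-hard in general). As a rough starting point, here is a brute-force sketch for very small instances using networkx (not mentioned in the question, just a convenient choice); the edge data is made up, and it assumes an open route that may start and end anywhere:

```python
# Brute-force sketch for the "required edges" variant (Rural Postman Problem).
# Assumptions: networkx is available, the route is open (start/end anywhere),
# and the number of required edges is small enough to enumerate every order
# and orientation. The example edges below are made up.
from itertools import permutations, product

import networkx as nx

G = nx.Graph()
edges = [                       # (u, v, weight, required?)
    ("A", "B", 2, True),
    ("B", "C", 3, True),
    ("C", "D", 1, False),
    ("A", "D", 4, True),
    ("B", "D", 2, False),
]
for u, v, w, req in edges:
    G.add_edge(u, v, weight=w, required=req)

required = [(u, v) for u, v, w, req in edges if req]
# Shortest paths over the *whole* graph, so optional edges can serve as links.
dist = dict(nx.all_pairs_dijkstra_path_length(G, weight="weight"))

best = None
for order in permutations(required):
    for flips in product((False, True), repeat=len(order)):
        cost, prev_end = 0, None
        for (u, v), flip in zip(order, flips):
            start, end = (v, u) if flip else (u, v)
            if prev_end is not None:
                cost += dist[prev_end][start]   # walk to the next required edge
            cost += G[start][end]["weight"]     # traverse the required edge itself
            prev_end = end
        if best is None or cost < best:
            best = cost

print("minimum total weight:", best)
```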

Related

Check edges in a graph

I'm given a graph that contains a path from vertex S to E which goes through every vertex and then ends at E. My goal is to remove all extra edges so that there is just that path from S to E. To help me with this I have a function called check(node). I'm not given its code, but it returns True or False depending on whether there still exists a path from S to E such that all nodes are visited only once before we end at E.
Example:
The plan is to remove an edge from vertex a to b, run check(node) on the mutated graph, and see if it still returns True, so we know it's safe to remove that edge; if it returns False, add it back. Do that for every edge so that only the needed edges remain. However, I have no idea how to iterate through the edges.
I stored the graphs in a dictionary
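To make the plan concrete, here is a minimal sketch of that removal loop, assuming the graph is stored as an adjacency dict of the form {node: set(neighbors)} and that check is the provided black box (its real signature may differ from the check(node) mentioned above):

```python
# Sketch of the "remove an edge, test with check, restore if needed" loop.
# Assumes graph = {node: set(neighbors)} and that check(graph) returns True
# while a valid S -> E path still exists; adapt to the real check() signature.
def edges_of(graph):
    """Yield each undirected edge (a, b) exactly once."""
    seen = set()
    for a, neighbors in graph.items():
        for b in neighbors:
            if (b, a) not in seen:
                seen.add((a, b))
                yield a, b

def prune(graph, check):
    for a, b in list(edges_of(graph)):   # snapshot the edges before mutating
        graph[a].discard(b)
        graph[b].discard(a)
        if not check(graph):             # removal broke the path, so put it back
            graph[a].add(b)
            graph[b].add(a)
    return graph
```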
Usually the approach to an algorithms problem like this is to first figure out what algorithmic tools you can use. Most basic problems can be solved with an existing algorithm. Your first objective is to see whether you can modify the problem input (i.e. the given graph) in such a way that you don't need to modify the algorithm, because modifying the algorithm makes it harder to assess the Big-O of the solution. If the graph can't be modified in any way that makes running a black-boxed algorithm easy, then you modify the algorithm. The last resort is to come up with your own algorithm to solve the problem.
If my algorithms recollection is correct, in short this is the Travelling Salesman Problem. If I'm understanding your question correctly, you want the shortest path possible that visits every node. You don't even need to modify your given graph in order to use the algorithm. It should theoretically find you the desired path. Only after the algorithm has run do you need to reduce the graph to its desired state. So I suggest finding some way to implement TSP to your specifications, and remove all edges that aren't part of the solution.
Here is some sample code from GeeksForGeeks that could help you get started

Finding edges that have the property that if you follow them, you'll have to come back to the node you just left to reach the rest of the graph

I'm working on a travelling salesman problem with multiple salesmen, and I wish to find and mark the entrances to "pockets" (I don't know a better word, that's the problem), where if one salesman enters a pocket, there's no point in another one going in there unless the job is too large for the first one.
These are all over the place in real street networks. If you go in this way, you have to come out again the same way sooner or later, because there's no other way out. There may be some inner structure, loops and branches, but no way back to the city proper except where you came in.
I don't care about sub-pockets, I just want to get a list of lists of nodes, where one of them is most of the city, and the others are all these pockets that are connected as described above to the main road network.
I'm working on a MultiDiGraph as provided by osmnx.
Your starting point is the term cut: a set of edges whose removal partitions the graph. You're looking for any "minimal cut" of size 1, i.e. a single edge whose removal disconnects part of the graph (also known as a bridge).
For something as connected as street maps, I think you'll want Karger's algorithm. This searches for minimal cut points indirectly, by collapsing any heavily-connected set of nodes into a single node, until finally there are relatively few single-edge connections remaining. From here, it's easier to find cuts.
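If you would rather avoid the randomized contraction approach, a size-1 cut in an undirected graph is exactly a bridge, and networkx ships a bridge finder. A sketch, assuming G is the osmnx MultiDiGraph described in the question:

```python
# Sketch: find "pockets" as the smaller side of every single-edge cut (bridge).
# Assumes G is an osmnx MultiDiGraph; it is collapsed to a simple undirected
# graph first, since nx.bridges only accepts undirected, non-multi graphs.
import networkx as nx

def find_pockets(G):
    U = nx.Graph(G)                                   # drop direction and parallel edges
    U.remove_edges_from(list(nx.selfloop_edges(U)))   # self-loops can never be bridges
    pockets = []
    for u, v in nx.bridges(U):
        H = U.copy()
        H.remove_edge(u, v)
        pockets.append(min(nx.connected_components(H), key=len))  # smaller side = pocket
    return pockets
```

Note that this also returns sub-pockets (a pocket inside a pocket shows up for its own bridge), so you may want to keep only the maximal ones.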

Find geometry (shapes) from node cloud

I am working on some code that needs to recognize some fairly basic geometry based on a cloud of nodes. I would be interested in detecting:
plates (simple bounded planes)
cylinders (two node loops)
half cylinders (arc+line+arc+line)
domes (n*loop+top node)
I tried searching for "geometry from node cloud" and "get geometry from nodes", but I can't find a nice reference. There is probably a whole field on this; can someone point me in the right direction? I already started coding something, but I feel like I'm re-inventing the wheel...
A good start is to just get the convex hull (the tightest-fitting polygon that can surround your node cloud) of the nodes, using either Graham's scan or QuickHull. Note that QuickHull is easier to code and probably faster, unless you are really unlucky. There is a pure Python implementation of QuickHull here. But I'm sure a quick Google search will show many other results.
Usually the convex hull is the starting point for most other shape-recognition algorithms. If your cloud can be described as a sequence of strokes, there are many algorithms and approaches:
Recognizing multistroke geometric shapes: an experimental evaluation
This may be even better: once you have the convex hull, break the polygon down into pairs of vertices and run this algorithm to match shapes based on similarity to training data:
Hierarchical shape recognition using polygon approximation and dynamic alignment
Both of these papers are fairly old, so you can use Google Scholar to see who cites them; that gives you a nice literature trail of attempts to solve this problem.
There are a multitude of different methods and approaches; this has been well studied in the literature. Which method you take really depends on the level of accuracy you hope to achieve, the number of shapes you want to recognize, and your input data set.
Either way, using a convex hull algorithm to produce polygons out of point clouds is the very first step, and usually the input to the more sophisticated algorithms.
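As a concrete starting point, here is what that first step can look like with SciPy's ConvexHull (used here as a stand-in for a hand-rolled QuickHull); the random points are just a placeholder for your node cloud:

```python
# Minimal convex-hull example: SciPy's ConvexHull (Qhull) on a 3D node cloud.
# The random points below are a placeholder for the real node coordinates.
import numpy as np
from scipy.spatial import ConvexHull

points = np.random.rand(200, 3)            # replace with your actual node cloud
hull = ConvexHull(points)

print("hull vertices:", hull.vertices)     # indices of the points on the hull
print("hull facets:", hull.simplices)      # triangles that make up the hull surface
print("area:", hull.area, "volume:", hull.volume)
```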
EDIT:
I did not consider the 3D case; for that, there is a lot of really interesting work in computer graphics that has focused on this, for instance the paper Efficient RANSAC for Point-Cloud Shape Detection.
Selections from the abstract:
We present an automatic algorithm to detect basic shapes in unorganized point clouds. The algorithm decomposes the point cloud into a concise, hybrid structure of inherent shapes and a set of remaining points. Each detected shape serves as a proxy for a set of corresponding points. Our method is based on random sampling and detects planes, spheres, cylinders, cones and tori...We demonstrate that the algorithm is robust even in the presence of many outliers and a high degree of noise...Moreover the algorithm is conceptually simple and easy to implement...
To complement Josiah's answer -- since you didn't say whether there is a single such object to be detected in your point cloud -- a good solution can be to use a (generalized) Hough transform.
The idea is that each point will vote for a set of candidates in the parameter space of the shape you are considering. For instance, if you think the current object is a cylinder, you have a 7D parameter space consisting of the cylinder center (3D), direction (2D), height (1D) and radius (1D), and each point in your point cloud will vote for all parameters that agree with the observation of that point. Doing so allows you to find the parameters of the actual cylinder by taking the set of parameters that receives the highest number of votes.
Doing the same thing for planes, spheres, etc., will give you the best-matching shape.
A strength of this method is that it allows for multiple objects in the same point cloud.
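To make the voting idea concrete, here is a toy sketch for the simplest case, a plane (two angles plus an offset rather than the 7D cylinder space described above); the grid resolution and the test data are made up:

```python
# Toy Hough-style voting for a plane: each point votes for every (theta, phi, d)
# plane it could lie on, and the accumulator bin with the most votes wins.
# Resolution and example data are arbitrary choices for illustration.
import numpy as np

def hough_plane(points, n_theta=45, n_phi=90, n_d=100):
    thetas = np.linspace(0, np.pi, n_theta)
    phis = np.linspace(0, 2 * np.pi, n_phi, endpoint=False)
    d_max = np.linalg.norm(points, axis=1).max()
    acc = np.zeros((n_theta, n_phi, n_d), dtype=int)

    for i, t in enumerate(thetas):
        for j, p in enumerate(phis):
            normal = np.array([np.sin(t) * np.cos(p),
                               np.sin(t) * np.sin(p),
                               np.cos(t)])
            d = points @ normal                                    # signed plane offset per point
            bins = ((d + d_max) / (2 * d_max) * (n_d - 1)).astype(int)
            np.add.at(acc, (i, j, np.clip(bins, 0, n_d - 1)), 1)   # one vote per point

    i, j, k = np.unravel_index(acc.argmax(), acc.shape)
    return thetas[i], phis[j], k / (n_d - 1) * 2 * d_max - d_max

# Noisy points scattered near the plane z = 1 should recover a normal close
# to (0, 0, 1) (theta ~ 0) and an offset close to 1.
pts = np.random.rand(500, 3) * [10, 10, 0.05] + [0, 0, 1]
print(hough_plane(pts))
```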

Running AStar on an updating graph

We are working on a project which involves running a shortest path algorithm on a big map.
We are using AStar with the air-distance heuristic for now.
Our project involves receiving updates to links in the database.
Currently we restart the search for each link update or at every predefined interval.
Is there a way to update the AStar algorithm to update the search without restarting the search on every update received? Is there a better algorithm suited for this task?
Disclosure: this is part of a student project.
Thank you.
You might be looking for a routing algorithm (that by nature deals with constantly changing graphs).
One way to achieve it is with the Distance Vector Routing Protocol (a distributed version of the Bellman-Ford algorithm), which works as follows [1]:
Periodically, every vertex sends its "distance vector" to its neighbors (the vector indicates how much it 'costs' to travel from the sending vertex to every other vertex).
Each neighbor then tries to update its routing table (which records through which edge it is best to move toward each target).
In your case, each node knows the fastest way to get to its neighbors (1 if the graph is unweighted), and it adds this number to each entry in the received distance vector in order to know how, and how quickly, it can get to each destination. Every time a modification arrives, the relevant node invokes a new iteration of the protocol until it re-converges.
Note, however, that this algorithm is uninformed (though it deals well with changing graphs, with certain limitations; there is still the count-to-infinity problem).
[1] The explanation of the algorithm is based on an explanation I provided some time back in this thread, with some modifications. (It is the same suggested algorithm, after all.)
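For intuition, here is a toy, synchronous simulation of the distance-vector idea (not production routing code); the graph format and costs are made up:

```python
# Toy distance-vector / distributed Bellman-Ford simulation. Each node keeps a
# distance estimate to every other node and repeatedly relaxes it using only
# its neighbors' vectors, until no estimate changes.
import math

def distance_vector(graph):
    """graph: {node: {neighbor: cost}}, with each undirected cost listed both ways."""
    nodes = list(graph)
    dist = {u: {v: 0 if u == v else graph[u].get(v, math.inf) for v in nodes}
            for u in nodes}
    changed = True
    while changed:                       # keep exchanging vectors until stable
        changed = False
        for u in nodes:
            for v in nodes:
                if u == v:
                    continue
                best = min((graph[u][n] + dist[n][v] for n in graph[u]),
                           default=math.inf)
                if best < dist[u][v]:
                    dist[u][v] = best
                    changed = True
    return dist

g = {"A": {"B": 1, "C": 4}, "B": {"A": 1, "C": 2}, "C": {"A": 4, "B": 2}}
print(distance_vector(g)["A"]["C"])      # 3, via B, rather than 4 direct

# When a link update arrives, change the cost and let the protocol re-converge;
# in a real distributed setting only the affected nodes exchange new vectors.
g["A"]["C"] = g["C"]["A"] = 1
print(distance_vector(g)["A"]["C"])      # 1
```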

What algorithms can I use to make inferences from a graph?

Edited question to make it a bit more specific.
I'm not trying to base it on the content of the nodes, but solely on the structure of the directed graph.
For example, PageRank (at first) used solely the link structure (a directed graph) to make inferences about what was more relevant. I'm not totally sure, but I think Elo (chess ranking) does something similar to rank players (although it also adds scores).
I'm using python's networkx package but right now I just want to understand any algorithms that accomplish this.
Thanks!
Eigenvector centrality is a network metric that can be used to model the probability that a node will be encountered in a random walk. It factors in not only the number of edges a node has, but also the number of edges its neighboring nodes have, the number their neighbors have, and so on. It can be implemented with a random walk, which is how Google's PageRank algorithm works.
That said, the field of network analysis is broad and continues to develop with new and interesting research. The way you ask the question implies that you might have a different impression. Perhaps start by looking over the three links I included here and see if that gets you started and then follow up with more specific questions.
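For a quick, hands-on illustration with networkx (which the question already uses), both measures can be computed directly from a made-up directed graph, using nothing but its link structure:

```python
# PageRank and eigenvector centrality on a small, made-up directed graph.
# Both scores are derived purely from the link structure, not node content.
import networkx as nx

G = nx.DiGraph([("a", "b"), ("b", "c"), ("c", "a"),
                ("a", "c"), ("c", "b"), ("b", "d"), ("d", "a")])

print(nx.pagerank(G, alpha=0.85))                   # random-walk based ranking
print(nx.eigenvector_centrality(G, max_iter=1000))  # eigenvector centrality
```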
You should probably take a look at Markov Random Fields and Conditional Random Fields. Perhaps the closest thing to what you're describing is a Bayesian Network.
