How to add the weights to the transition graph of a Markov chain?

The following program works; it uses Graph together with DiscreteMarkovProcess and MarkovProcessProperties:

P = {{1/2, 1/2, 0, 0}, {1/2, 1/2, 0, 0}, {1/4, 1/4, 1/4, 1/4}, {0, 0, 0, 1}};
proc = DiscreteMarkovProcess[3, P];
Graph[proc, GraphStyle -> "DiagramBlue",
 EdgeLabels ->
  With[{sm = MarkovProcessProperties[proc, "TransitionMatrix"]},
   Flatten@Table[DirectedEdge[i, j] -> sm[[i, j]], {i, 2}, {j, 2}]]]

sm = MarkovProcessProperties[proc, "TransitionMatrix"]
sm == P

Since I couldn’t make it work for larger matrices, I checked in the last two lines that sm is just P. But if I replace sm with P in the first part, all hell breaks loose. So I tried copying and pasting the code, changing only P to a larger matrix, but this does not work. Why?

P = {{0, 1/4, 1/2, 1/4, 0, 0}, {0, 1, 0, 0, 0, 0}, {0, 0, 1/3, 0, 2/3, 0},
     {0, 0, 0, 0, 0, 1}, {0, 0, 1/4, 0, 3/4, 0}, {1/4, 0, 0, 0, 3/4, 0}};
P // MatrixForm
proc = DiscreteMarkovProcess[1, P];
Graph[proc,
 EdgeLabels ->
  With[{sm = MarkovProcessProperties[proc, "TransitionMatrix"]},
   Flatten@Table[DirectedEdge[i, j] -> sm[[i, j]], {i, 6}, {j, 6}]]]

Is there a way to solve the optimal branching / arborescence problem with path-dependent weights?

The optimal branching problem (solved by Edmonds' algorithm or Tarjan's algorithm) finds an optimal (minimum- or maximum-weight) spanning arborescence of a directed graph. [0]

I’m looking for a formulation of the problem that allows for path-dependent weights.

That is, if I have a path from ROOT through A and B to C, with a weight of 2 on each edge:

ROOT =[2]> A =[2]> B =[2]> C

The overall cost of the path is 8, not 6.
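
If the accumulation in the example is read as multiplicative (2 * 2 * 2 = 8 rather than 2 + 2 + 2 = 6), a tiny Python sketch of the path cost looks like this (the function names are made up):

    # Assumption (not stated in the question): "path-dependent" means the cost of
    # a root-to-node path is the product of its edge weights, matching 8 above.
    def additive_path_cost(edge_weights):
        return sum(edge_weights)

    def multiplicative_path_cost(edge_weights):
        cost = 1
        for w in edge_weights:
            cost *= w
        return cost

    # ROOT =[2]> A =[2]> B =[2]> C
    print(additive_path_cost([2, 2, 2]))        # 6
    print(multiplicative_path_cost([2, 2, 2]))  # 8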

Is such a formulation possible?

[0] https://en.wikipedia.org/wiki/Edmonds%27_algorithm

Bipartite graphs with min weights

I have a complete bipartite graph with node sets $A=\{a_1,a_2,\ldots,a_n\}$ and $B=\{b_1,b_2,\ldots,b_n\}$. Each node has a weight, $v_i$ for $a_i$ and $w_i$ for $b_i$. In the mapping I am looking for, each node $a_i$ is matched to exactly one node of $B$, say $b_j$, via an edge $e_i$ whose weight is $\min(v_i,w_j)$. I want to find a one-to-one mapping from $A$ to $B$ whose sum of edge weights is as small as possible.

My idea is to sort the $v_i$ in increasing order and the $w_i$ in decreasing order, and then take the sum of $\min(v_i,w_i)$ over the sorted sequences. Is this correct? Can you give a proof or a counterexample?
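
For what it's worth, here is a small Python sketch (function names are made up) that compares this sorting idea against brute force over all one-to-one mappings on random small instances. It is only a sanity check, not a proof:

    from itertools import permutations
    import random

    def greedy(v, w):
        # The idea from the question: sort v increasingly, w decreasingly,
        # pair them positionally and sum the minima.
        return sum(min(a, b) for a, b in zip(sorted(v), sorted(w, reverse=True)))

    def brute_force(v, w):
        # Minimum over all one-to-one mappings; only feasible for small n.
        return min(sum(min(a, b) for a, b in zip(v, perm))
                   for perm in permutations(w))

    for _ in range(2000):
        n = random.randint(1, 6)
        v = [random.randint(0, 20) for _ in range(n)]
        w = [random.randint(0, 20) for _ in range(n)]
        if greedy(v, w) != brute_force(v, w):
            print("counterexample:", v, w)
            break
    else:
        print("no counterexample found on these random instances")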

Minimum spanning tree with small set of possible edge weights

Given an undirected graph that has only two distinct edge weights $x$ and $y$, is it possible to find a minimum spanning tree faster than with Prim’s algorithm or Kruskal’s algorithm?

I have seen it noted for Kruskal’s algorithm that the edges can already be sorted in linear time with counting sort, so can we gain any further benefit from knowing the weights of all of the edges ahead of time?

I don’t think there is any benefit, but I am not sure…
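
To make the question concrete, here is a rough Python sketch (all names are made up) of the observation about sorting: with only two distinct weights, the sort step of Kruskal's algorithm collapses into two buckets, so the remaining cost is the union-find work.

    def mst_two_weights(n, edges, x, y):
        # Kruskal's algorithm where the "sort" is just two buckets, since every
        # weight is x or y. Nodes are 0..n-1; edges is a list of (u, v, w) with
        # w in {x, y}. Assumes the graph is connected. Sketch only.
        parent = list(range(n))

        def find(a):
            while parent[a] != a:
                parent[a] = parent[parent[a]]  # path halving
                a = parent[a]
            return a

        lo, hi = min(x, y), max(x, y)
        light = [e for e in edges if e[2] == lo]
        heavy = [e for e in edges if e[2] == hi]

        total, tree = 0, []
        for u, v, w in light + heavy:   # already in nondecreasing weight order
            ru, rv = find(u), find(v)
            if ru != rv:
                parent[ru] = rv
                total += w
                tree.append((u, v, w))
        return total, tree

Even with the sort gone, the union-find operations still cost nearly linear time (an inverse-Ackermann factor), which is why I suspect the two-weight restriction does not buy anything asymptotically over ordinary Kruskal with counting sort.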

Labeled points in $\{0,1\}^n$ such that every linear separator requires exponential weights

I want to find labeled samples in $\{0,1\}^n$ such that the Perceptron algorithm takes $2^{\Omega(n)}$ steps to converge. One way to do this would be to find a sequence of labeled examples that are linearly separable but require every linear separator to have at least one exponentially large weight. To show that the samples are linearly separable, it is enough to show that they are consistent with a decision list, which should be apparent from the list of samples. So my question is:

Does there exist a set of labeled samples $S$ in $\{0,1\}^n$ that is consistent with a decision list and such that any linear threshold function that correctly labels $S$ has at least one exponentially large weight $w_i = 2^{\Omega(n)}$?

Here are the definitions that I’m working with: a linear threshold function $f \colon \{0,1\}^n \to \{0,1\}$ with associated weights $w_0, \dots, w_n \in \mathbb{R}$ satisfies $f(x) = 1$ if and only if $w_1x_1 + w_2x_2 + \dots + w_nx_n \geq w_0$. Given a set $S$ of points in $\{0,1\}^n$ labeled $0$ or $1$, we say that a linear threshold function $f$ correctly labels $S$ if, for all $x \in S$, $f(x) = 1$ whenever $x$ is labeled $1$ and $f(x) = 0$ whenever $x$ is labeled $0$.
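
In code, the definitions above translate directly into the following small Python sketch (a literal transcription; the function names are made up):

    def ltf(w0, w, x):
        # Linear threshold function: f(x) = 1 iff w_1 x_1 + ... + w_n x_n >= w_0.
        return 1 if sum(wi * xi for wi, xi in zip(w, x)) >= w0 else 0

    def correctly_labels(w0, w, samples):
        # samples: iterable of (x, label) with x in {0,1}^n and label in {0,1}.
        return all(ltf(w0, w, x) == label for x, label in samples)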

Note: I had asked the same question on math.stackexchange since it seemed relevant to both fields. Here is the link for that.

How to best calculate the best possible path with weights

Given a set of nodes with connections in certain directions (see image), what is the maximum number of coins you can collect between the first and the last given node? Not all rooms have coins, and we want to output both the path taken and the total number of coins collected. You may only travel in the direction of the arrows, and we are allowed to take the same path twice. The solution for the situation in the figure below is the path 1 – 4 – 3 – 2 – 1 – 5 – 7 – 9 – 10 – 8 – 12, with a total of 7 coins.

I think this should be possible in linear time. My idea so far is to start at the last node and work backwards, saving the “best attainable score” for each node. However, this approach runs into issues when there are cycles in the graph. Is there a better way of doing this?

edit: Assuming n nodes, there will never be more than 10n connections total.
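
For reference, here is a rough Python sketch (all names are made up) of one way the cycle issue might be handled, under two assumptions the question leaves implicit: each room's coins can be collected at most once, and paths may be re-walked freely. Under those assumptions every room in a strongly connected component touched by the path can be looted, so the components can be contracted (summing their coins) and the backwards "best attainable score" idea becomes a DP over the resulting DAG, which is linear given the 10n bound on connections.

    from collections import defaultdict

    def max_coins(n, edges, coins, start, goal):
        # Rooms are numbered 0..n-1, edges is a list of directed (u, v) pairs,
        # coins[u] is the number of coins in room u. Assumes coins are collected
        # at most once per room and paths may be re-walked freely. Sketch only.
        graph = defaultdict(list)
        rgraph = defaultdict(list)
        for u, v in edges:
            graph[u].append(v)
            rgraph[v].append(u)

        # Kosaraju: first pass records finish order with an iterative DFS.
        order, seen = [], [False] * n
        for s in range(n):
            if seen[s]:
                continue
            seen[s] = True
            stack = [(s, iter(graph[s]))]
            while stack:
                node, it = stack[-1]
                for nxt in it:
                    if not seen[nxt]:
                        seen[nxt] = True
                        stack.append((nxt, iter(graph[nxt])))
                        break
                else:
                    order.append(node)
                    stack.pop()

        # Second pass labels strongly connected components on the reversed graph;
        # component ids come out in topological order of the condensation.
        comp, ncomp = [-1] * n, 0
        for root in reversed(order):
            if comp[root] != -1:
                continue
            comp[root] = ncomp
            stack = [root]
            while stack:
                u = stack.pop()
                for v in rgraph[u]:
                    if comp[v] == -1:
                        comp[v] = ncomp
                        stack.append(v)
            ncomp += 1

        # Contract: each component is worth the sum of its rooms' coins.
        ccoins = [0] * ncomp
        for u in range(n):
            ccoins[comp[u]] += coins[u]
        cedges = defaultdict(set)
        for u, v in edges:
            if comp[u] != comp[v]:
                cedges[comp[u]].add(comp[v])

        # DP over the DAG, sinks first: best[c] = max coins on a path starting in
        # component c and ending in the goal's component.
        NEG = float("-inf")
        best = [NEG] * ncomp
        best[comp[goal]] = ccoins[comp[goal]]
        for c in range(ncomp - 1, -1, -1):
            for d in cedges[c]:
                if best[d] != NEG and ccoins[c] + best[d] > best[c]:
                    best[c] = ccoins[c] + best[d]
        return best[comp[start]]   # -inf means the goal is unreachable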


A pathfinding algorithm for graphs in which arc weights can change over time

So I’m not really sure what I should even be googling for solutions to this. Hence this question; hopefully someone can point me in the right direction.

Here’s the situation: I have a weighted undirected graph of nodes and arcs, and an implementation that uses A* for pathfinding on this graph. However, I now have a situation where the weight (cost) of each arc can change over time. That is, at each step of the A* search, the weights of the entire graph can change.

So I’m trying to find out whether there is an existing algorithm, or an alteration of an A*-like algorithm, that handles changing weights well. If anyone has any keywords I should be looking into, I’d appreciate any pointers you can provide.
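
To make the setup concrete, here is a minimal Python sketch (not my actual code; all names are made up) of A* where the arc cost comes from a callback that is re-queried at every node expansion, so the caller can make it depend on a time step. Of course, once the costs really do change mid-search, the usual optimality guarantees no longer hold, which is exactly the problem:

    import heapq
    from itertools import count

    def astar(neighbors, cost, heuristic, start, goal):
        # Plain A*, except the arc cost comes from a callback, so the caller can
        # make it depend on a time step or anything else. Sketch only.
        tie = count()                        # tie-breaker so the heap never compares nodes
        open_heap = [(heuristic(start), next(tie), start)]
        g = {start: 0}
        parent = {start: None}
        closed = set()
        while open_heap:
            _, _, u = heapq.heappop(open_heap)
            if u in closed:
                continue
            closed.add(u)
            if u == goal:
                path = []
                while u is not None:
                    path.append(u)
                    u = parent[u]
                return list(reversed(path)), g[goal]
            for v in neighbors(u):
                new_g = g[u] + cost(u, v)    # cost re-queried at expansion time
                if v not in closed and (v not in g or new_g < g[v]):
                    g[v] = new_g
                    parent[v] = u
                    heapq.heappush(open_heap, (new_g + heuristic(v), next(tie), v))
        return None, float("inf")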

Graph theory question involving weights of edges

I’m trying to solve the following problem but I can’t understand it. Could you guys kindly break it down for me? I’m not asking for anyone to solve it. I just want to be able to grasp the problem.

Given a graph of siblings who have different interests, you’d like to know which groups of siblings have the most interests in common. You will then use a little math to determine a value to return.

You are given integers siblings_nodes and siblings_edges, representing the number of nodes and edges in the graph respectively. You are also given three integer arrays, siblings_from, siblings_to and siblings_weight, which describe the edges between siblings.

The graph consists of nodes numbered consecutively from 1 to siblings_nodes. Any members or groups of members who share the same interest are said to be connected by that interest (note that two members can be connected by some interest even if they are not directly connected by the corresponding edge).

Once you’ve determined the node pair with the maximum number of shared interests, return the product of that pair’s labels. If there are multiple pairs with the maximum number of shared interests, return the maximum such product.

For example, you are given a graph with siblings_nodes = 4 and siblings_edges = 5:

   FROM   TO   WEIGHT
   1      2    2
   1      2    3
   2      3    1
   2      3    3
   2      4    4

If we look at each interest, we have the following connections:

   INTEREST   CONNECTIONS
   1          2,3
   2          1,2
   3          1,2,3
   4          2,4

Example input:

   siblings_nodes: 4
   siblings_edges: 5
   siblings_from: [1, 1, 2, 2, 2]
   siblings_to: [2, 2, 3, 3, 4]
   siblings_weight: [1, 2, 1, 3, 3]

output:

   6 
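
For what it's worth, here is how I currently read the mechanics of the statement, as a rough Python sketch (all names are made up): for each interest (edge weight), take the connected components formed by the edges of that weight, count one shared interest for every node pair inside a component, then return the largest label product among the pairs with the maximum count. On the example input it prints 6.

    from collections import defaultdict
    from itertools import combinations

    def max_shared_interest_product(siblings_nodes, siblings_from, siblings_to,
                                    siblings_weight):
        # Straightforward reading of the statement above; sketch only.
        by_interest = defaultdict(list)
        for u, v, w in zip(siblings_from, siblings_to, siblings_weight):
            by_interest[w].append((u, v))

        shared = defaultdict(int)   # (a, b) with a < b -> number of shared interests
        for edges in by_interest.values():
            # Nodes in the same connected component of this interest's edges are
            # all "connected by" that interest, even without a direct edge.
            parent = {}

            def find(a):
                parent.setdefault(a, a)
                while parent[a] != a:
                    parent[a] = parent[parent[a]]  # path halving
                    a = parent[a]
                return a

            for u, v in edges:
                ru, rv = find(u), find(v)
                if ru != rv:
                    parent[ru] = rv

            groups = defaultdict(list)
            for node in parent:
                groups[find(node)].append(node)
            for group in groups.values():
                for a, b in combinations(sorted(group), 2):
                    shared[(a, b)] += 1

        best = max(shared.values())
        return max(a * b for (a, b), c in shared.items() if c == best)

    print(max_shared_interest_product(4, [1, 1, 2, 2, 2], [2, 2, 3, 3, 4],
                                      [1, 2, 1, 3, 3]))   # prints 6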

Is it possible to keep weights of left and right subtree at each node of BST that has duplicate values?



I must be able to delete a node completely (irrespective of how many times its value is present).

Currently, in my code, I keep a count variable in each node that records the number of times its value is present in the tree.

During insertion, I can increase the left or right subtree weight at each node along the path, depending on whether my value is smaller or larger. But how do I adjust the weights when I delete a node (given that I may delete a node with count > 1)?
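
For concreteness, here is a rough Python sketch (all names are made up) of one way this might work: each node stores its key, a duplicate count, and the total number of keys (with multiplicity) in its left and right subtrees; on the way back up from a recursive insert or delete, the weights are recomputed in O(1) from the children, so deleting a node with count > 1 needs no special handling.

    class Node:
        def __init__(self, key):
            self.key = key
            self.count = 1            # how many times `key` was inserted
            self.left = None
            self.right = None
            self.left_weight = 0      # total keys (with multiplicity) in left subtree
            self.right_weight = 0     # total keys (with multiplicity) in right subtree

    def total(node):
        # Total number of keys stored in the subtree rooted at `node`.
        return 0 if node is None else node.count + node.left_weight + node.right_weight

    def insert(node, key):
        if node is None:
            return Node(key)
        if key < node.key:
            node.left = insert(node.left, key)
            node.left_weight = total(node.left)
        elif key > node.key:
            node.right = insert(node.right, key)
            node.right_weight = total(node.right)
        else:
            node.count += 1
        return node

    def delete_all(node, key):
        # Remove every copy of `key` and return the new subtree root. Weights are
        # recomputed from the children on the way back up, so a count > 1 at the
        # deleted node is handled automatically.
        if node is None:
            return None
        if key < node.key:
            node.left = delete_all(node.left, key)
            node.left_weight = total(node.left)
        elif key > node.key:
            node.right = delete_all(node.right, key)
            node.right_weight = total(node.right)
        elif node.left is None:
            return node.right
        elif node.right is None:
            return node.left
        else:
            # Two children: move the smallest key of the right subtree (and its
            # whole count) into this node, then remove its old position.
            succ = node.right
            while succ.left is not None:
                succ = succ.left
            node.key, node.count = succ.key, succ.count
            node.right = delete_all(node.right, succ.key)
            node.right_weight = total(node.right)
        return node

With total() available, the explicit left and right weights could also be derived on demand from a single stored subtree size; keeping both fields is just a convenience.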