## Adding edges to a DAG to make it strongly connected with minimum cost

I have a weighted DAG and a function that gives the weight of any edge not present in the DAG. The weight from u to v equals the weight from v to u.

I want to add edges to make the DAG strongly connected with the minimum total weight of added edges.

I know that the minimum number of added edges equals $$\max(|\text{sources}|, |\text{sinks}|)$$, but which vertex should be connected to which so that the total weight is minimized?
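For reference, the two counts in that formula can be computed directly. A minimal sketch, assuming the DAG is given as a hypothetical dict `dag` mapping each vertex to its successor list:

```python
def count_sources_and_sinks(dag):
    """Count vertices with no incoming / no outgoing edges in a DAG.

    `dag` maps each vertex to a list of its successors.
    """
    has_in = set()
    for u, succs in dag.items():
        has_in.update(succs)
    sources = [u for u in dag if u not in has_in]
    sinks = [u for u in dag if not dag[u]]
    return sources, sinks

# Example: 1 -> 2, 1 -> 3, 2 -> 4, 3 -> 4
dag = {1: [2, 3], 2: [4], 3: [4], 4: []}
sources, sinks = count_sources_and_sinks(dag)
# minimum number of edges to add is max(len(sources), len(sinks))
```

This only gives the count from the formula above; choosing *which* source-sink pairs to connect for minimum total weight is the open part of the question.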

## Strongly connected components algorithm implementation (Python)

My goal is to implement the strongly connected components algorithm in Python. I have split my code into 3 parts:

1. Reading the graph from the input file:

```python
import sys
import threading
import time

sys.setrecursionlimit(800000)
threading.stack_size(67108864)

start = time.time()

num_nodes = 160000
graph = [[] for i in range(num_nodes)]
reverse_graph = [[] for i in range(num_nodes)]
graph_2_step = [[] for i in range(num_nodes)]

# Raw string: "\U..." inside a normal string is an invalid escape in Python 3
file = open(r"C:\Users\yefida\Desktop\Study_folder\Online_Courses\Algorithms\Project 5\test7.txt", "r")

data = file.readlines()
for line in data:
    if line.strip():
        items = line.split()
        tail = int(items[0])
        head = int(items[1])
        # Each adjacency list stores the node itself first, then its neighbours
        if head not in reverse_graph[head - 1]:
            reverse_graph[head - 1].append(head)
        reverse_graph[head - 1].append(tail)
        if tail not in graph[tail - 1]:
            graph[tail - 1].append(tail)
        graph[tail - 1].append(head)

# Isolated nodes: store the node as its own (dummy) neighbour
for i in range(len(graph)):
    if len(graph[i]) == 0:
        graph[i] = [i + 1, i + 1]
    if len(reverse_graph[i]) == 0:
        reverse_graph[i] = [i + 1, i + 1]

end = time.time()
print('Time: ', end - start)
```
2. Depth-first search algorithm on the reversed graph:

```python
# 2. Run DFS loop on the reversed graph:
start = time.time()

t = 0            # finishing time: how many nodes are fully processed so far
s = None         # current source vertex
explored = set()
finish_time = {}

def DFS(graph, node):
    explored.add(node)
    global s
    for vertex in graph[node - 1][1:]:
        if vertex not in explored:
            DFS(graph, vertex)
    global t
    t += 1
    finish_time[node] = t

# Nodes run from n down to 1
for i in range(max(reverse_graph)[0], 0, -1):
    if i not in explored:
        s = i
        DFS(reverse_graph, i)

# Relabel every node with its finishing time, in increasing order
for edge in range(len(graph)):
    for vertex in range(len(graph[edge])):
        graph[edge][vertex] = finish_time[graph[edge][vertex]]
    graph_2_step[graph[edge][0] - 1] = graph[edge]

end = time.time()
print('Time: ', end - start)
```
3. Depth-first search algorithm on the graph after step 2:

```python
# 3. Run DFS loop on the graph with original directions (relabeled by finishing times):
start = time.time()

all_components = []    # all strongly connected components found so far
all_comp_elem = set()  # members of already-saved components
SCC = set()            # component currently being collected
explored = set()
next_elem = 0          # how many elements still have to be checked before deciding

def DFS_2_Path(graph, node):
    global all_components, SCC, next_elem
    explored.add(node)
    next_elem += len(graph[node - 1][1:])

    for vertex in graph[node - 1][1:]:
        next_elem -= 1
        if node not in all_comp_elem:
            SCC.add(node)
        last = graph[node - 1][1:][-1]
        if vertex not in explored:
            SCC.add(vertex)
            DFS_2_Path(graph, vertex)
        elif vertex != last:
            # not the last neighbour in the chain yet: keep scanning
            continue
        elif next_elem == 0:
            # last neighbour and nothing left to check: close the component
            all_components.append(SCC)
            all_comp_elem.update(SCC)
            SCC = set()   # bug in the original: "SSC = set()" never reset SCC

# Main loop
for i in range(max(graph_2_step)[0], 0, -1):
    if i not in explored:
        DFS_2_Path(graph_2_step, i)

end = time.time()
print('Time: ', end - start)
```

I have tested my algorithm on different test cases and it works correctly. The first two parts run fast (on a data set with 160,000 nodes), but when I run the third part the Jupyter kernel dies.

I have improved the speed of the code as much as I could, and I definitely need a fresh look at it.
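Since a 160,000-node graph can produce recursion chains that exhaust the stack even with `setrecursionlimit`, the crash is most likely the recursive DFS. For comparison, a minimal iterative Kosaraju sketch following the same two-pass plan, though not your exact data layout (it assumes 0-indexed adjacency lists):

```python
def kosaraju(n, adj):
    """Return the SCCs of a directed graph given as 0-indexed adjacency lists."""
    radj = [[] for _ in range(n)]
    for u in range(n):
        for v in adj[u]:
            radj[v].append(u)

    # First pass: iterative DFS on the reversed graph, record finish order.
    order = []
    seen = [False] * n
    for start in range(n):
        if seen[start]:
            continue
        stack = [(start, 0)]
        seen[start] = True
        while stack:
            node, i = stack[-1]
            if i < len(radj[node]):
                stack[-1] = (node, i + 1)
                nxt = radj[node][i]
                if not seen[nxt]:
                    seen[nxt] = True
                    stack.append((nxt, 0))
            else:
                order.append(node)
                stack.pop()

    # Second pass: DFS on the original graph in reverse finish order.
    comp = [-1] * n
    components = []
    for start in reversed(order):
        if comp[start] != -1:
            continue
        members = []
        stack = [start]
        comp[start] = len(components)
        while stack:
            node = stack.pop()
            members.append(node)
            for nxt in adj[node]:
                if comp[nxt] == -1:
                    comp[nxt] = len(components)
                    stack.append(nxt)
        components.append(members)
    return components

# Two SCCs: the 3-cycle {0, 1, 2} and the singleton {3}
sccs = kosaraju(4, [[1], [2], [0, 3], []])
```

The explicit stack of `(node, next-neighbour-index)` pairs replaces the call stack, so the depth limit and the thread stack size become irrelevant.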

P.S. Don’t look too closely at the first two parts of the code; I provided them only so you can test.

P.P.S. The link to the file that I used for testing: https://github.com/beaunus/stanford-algs/blob/master/testCases/course2/assignment1SCC/input_mostlyCycles_64_160000.txt

## Distance between Verma modules in certain “strongly” standard filtrations

On p. 128 of the book *Representations of Semisimple Lie Algebras in the BGG Category $$\mathcal{O}$$*:

I quote: “……Delorme arrives at vanishing criteria for $$\mathrm{Ext}^n_{\mathcal{O}}$$ which are more general than those in Theorem 6.11 above. These involve a kind of length function $$\ell(\mu, \lambda)$$ expressing the distance between $$M(\mu)$$ and $$M(\lambda)$$ in certain “strongly” standard filtrations:

$$\mathrm{Ext}^n_\mathcal{O}(M(\mu), M(\lambda)) = 0$$ for all $$n > \ell(\mu, \lambda)$$,

$$\mathrm{Ext}^n_\mathcal{O}(M(\mu), L(\lambda)) = 0$$ for all $$n > \ell(\mu, \lambda)$$,……”

My questions:

1. Does anyone know the definition of the length function $$\ell(\mu, \lambda)$$?

Let $$\rho$$ be the half sum of positive roots in $$\Phi^+$$, $$M(u\cdot(-2\rho))$$ be the Verma module with highest weight $$u\cdot(-2\rho)$$ and $$L(u\cdot(-2\rho))$$ be the simple highest weight module with highest weight $$u\cdot(-2\rho)$$.

2. I want to show that for all $$x\not\le w$$, we have $$\mathrm{Ext}^n_\mathcal{O}(M(x\cdot(-2\rho)), L(w\cdot(-2\rho))) = 0$$ for all $$n\in\mathbb{N}$$. Does anyone know how to show this fact?

## Strongly connected components in a graph (Python)

My goal is to calculate the size of each strongly connected component (SCC).

I have an input that looks like this:

```
[['1', '2'],
 ['1', '2'],
 ['1', '5'],
 ['1', '6'],
 ['1', '7'],
 ['1', '3'],
 ['1', '8'],
 ['1', '4'],
 ['2', '47646'],
 ['2', '47647'],
 ['2', '13019']...
```

where each inner list represents an edge, its two elements being the first and second vertex respectively.

Here is my code:

```python
# 1. Create the reverse graph: flip the direction of every edge
df_reverse = [None] * len(df)
for i in range(len(df)):
    df_reverse[i] = [int(df[i][1]), int(df[i][0])]
# Sort the array according to df_reverse[i][0]
df_reverse = sorted(df_reverse, reverse=True)

# 2. Run the DFS loop on the reversed graph
t = 0            # finishing time: how many nodes are processed so far
explored = []
finish_time = {}

def DFS(graph, node):
    explored.append(node)
    for second_vert in graph:
        if second_vert[0] == node and second_vert[1] not in explored:
            DFS(graph, second_vert[1])
    global t
    t += 1
    finish_time[node] = t

# Nodes run from n down to 1
for i in range(max(df_reverse[0]), 0, -1):
    if i not in explored:
        DFS(df_reverse, i)

# Map finishing times onto the nodes
for ind, val in enumerate(df_reverse):
    df_reverse[ind][0] = finish_time[df_reverse[ind][0]]
    df_reverse[ind][1] = finish_time[df_reverse[ind][1]]

# 3. Run the DFS loop on the graph with original directions
#    (but with labeled finishing times)
df_reversed_back = [None] * len(df_reverse)
for i in range(len(df_reverse)):
    df_reversed_back[i] = [int(df_reverse[i][1]), int(df_reverse[i][0])]
# Sort the array according to df_reversed_back[i][0]
df_reversed_back = sorted(df_reversed_back, reverse=True)

all_components = []
SSC = []
explored = []

def DFS_2_Path(graph, node):
    global all_components
    explored.append(node)
    for second_vert in graph:
        if second_vert[0] == node:
            if second_vert[1] not in explored:
                SSC.append(second_vert[1])
                DFS_2_Path(graph, second_vert[1])
            if second_vert[1] in explored and len(SSC) > 0:
                # second vertex already explored and a component is in progress:
                # close off the current component
                SSC.append(second_vert[1])
                all_components.append(SSC[:])
                SSC[:] = []

for i in range(max(df_reversed_back[0]), 0, -1):
    if i not in explored:
        DFS_2_Path(df_reversed_back, i)
```

(I have stripped the debug `print` calls for readability; one of them referenced an undefined `index` variable.)

The problem is that my code is very slow. I would appreciate any improvements and suggestions.
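One likely culprit: `DFS` scans the entire edge list on every call (`for second_vert in graph:`), which makes the whole search roughly O(V·E). Building an adjacency mapping once makes each neighbour lookup direct. A minimal sketch, assuming the same edge-list-of-string-pairs input:

```python
from collections import defaultdict

def build_adjacency(edges):
    """Turn a list of [u, v] string pairs into an adjacency dict of ints."""
    adj = defaultdict(list)
    for u, v in edges:
        adj[int(u)].append(int(v))
    return adj

edges = [['1', '2'], ['1', '5'], ['2', '3']]
adj = build_adjacency(edges)
# DFS can now iterate adj[node] instead of rescanning the whole edge list
```

With this, each `for second_vert in graph` loop becomes `for second_vert in adj[node]`, and the per-node cost drops from the number of edges in the whole graph to the out-degree of that node.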

## Circle bundles and surface bundles which admit no strongly irreducible Heegaard splittings

Let $$S$$ be a closed connected orientable surface with $$g(S)>0$$. Jennifer Schultens, in her paper “The Classification of Heegaard Splittings for (Compact Orientable Surface)$$\times S^1$$”, proves that $$S\times S^1$$ does not admit any strongly irreducible Heegaard splitting. My questions are:

1. Are there other (i.e. non-trivial) closed circle bundles which admit no strongly irreducible Heegaard splittings? Furthermore, is there a classification of closed circle bundles which admit no strongly irreducible Heegaard splittings?
2. Are there other surface bundles which admit no strongly irreducible Heegaard splittings?

## Algorithm to convert undirected connected graph with no bridges to strongly connected directed graph

I am not sure how to go about this exactly. My attempt: pick an arbitrary node and run DFS, numbering each node in order of discovery. Then orient each edge so it points from the lower number to the higher number. I know this creates a path from one node to another, but it doesn't guarantee a path from the other back to the initial node. Is there any way to modify my algorithm so the result is strongly connected, not just connected?
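This is the setting of Robbins' theorem: a connected bridgeless undirected graph always admits a strongly connected orientation, and DFS gives one. The tweak the attempt above needs is in the non-tree edges: orient tree edges from lower to higher discovery number, but orient every back edge the other way, toward the ancestor. A sketch under the assumption that the input has no parallel edges (adjacency given as sets):

```python
def strong_orientation(adj):
    """Orient a connected, bridgeless undirected graph so it is strongly connected.

    `adj` maps each vertex to the set of its neighbours.
    Returns the set of directed edges (u, v).
    """
    disc = {}          # discovery order of each vertex
    oriented = set()
    counter = 0
    start = next(iter(adj))
    disc[start] = counter
    counter += 1
    stack = [(start, iter(adj[start]))]
    while stack:
        node, it = stack[-1]
        advanced = False
        for nbr in it:
            if nbr not in disc:
                # tree edge: orient away from the root (earlier -> later)
                disc[nbr] = counter
                counter += 1
                oriented.add((node, nbr))
                stack.append((nbr, iter(adj[nbr])))
                advanced = True
                break
            elif (nbr, node) not in oriented and (node, nbr) not in oriented:
                # back edge: orient toward the ancestor (later -> earlier)
                oriented.add((node, nbr))
        if not advanced:
            stack.pop()
    return oriented

# A 4-cycle (bridgeless); the orientation must be strongly connected
adj = {0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {2, 0}}
edges = strong_orientation(adj)
```

Undirected DFS produces no cross edges, so every non-tree edge really does run between an ancestor and a descendant, which is why "later to earlier" is always well-defined here. Bridgelessness is what guarantees the result is strongly connected.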

## Finding topologically sorted strongly connected components in directed acyclic graph

I am aware that topological sort and strongly connected components are closely related algorithms, but I have been looking for an algorithm that computes both simultaneously, rather than one after the other, and I am finding it surprisingly hard.

Assuming a directed acyclic graph, is it possible to simultaneously find the list of strongly connected components while at the same time topologically sorting the nodes in each of the components?

To clarify, the output would be a list of components, where each component is a topological order of the nodes contained in that component.

Bonus points if the list of strongly connected components is sorted following the insertion order of nodes in the graph (i.e. the first connected component will contain the first node ever inserted in the graph, regardless of global topological order; etc for subsequent components).
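One observation that may help: Tarjan's algorithm already emits the SCCs in reverse topological order of the condensation, so a single pass gives a topologically sorted component list after one reversal. A recursive sketch (not tuned for deep graphs):

```python
def tarjan_scc(adj):
    """Return the SCCs of `adj` (dict of adjacency lists) in reverse topological order."""
    index = {}       # discovery index of each vertex
    low = {}         # lowest index reachable from the vertex's subtree
    on_stack = set()
    stack = []
    sccs = []
    counter = [0]

    def strongconnect(v):
        index[v] = low[v] = counter[0]
        counter[0] += 1
        stack.append(v)
        on_stack.add(v)
        for w in adj[v]:
            if w not in index:
                strongconnect(w)
                low[v] = min(low[v], low[w])
            elif w in on_stack:
                low[v] = min(low[v], index[w])
        if low[v] == index[v]:
            # v is the root of an SCC: pop the whole component
            comp = []
            while True:
                w = stack.pop()
                on_stack.discard(w)
                comp.append(w)
                if w == v:
                    break
            sccs.append(comp)

    for v in adj:
        if v not in index:
            strongconnect(v)
    return sccs

# 1 -> 2 -> 3, with 2 and 3 on a cycle; components come out sink-first
adj = {1: [2], 2: [3], 3: [2]}
components = tarjan_scc(adj)
# reversed(components) is then a topological order of the condensation
```

Note that nodes *within* a non-trivial SCC admit no topological order (every pair lies on a cycle), so "topological order inside each component" only pins down anything when the components are singletons.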

## Incremental strongly connected components

For a changing directed graph, I would like to maintain information about strongly connected components. The graph operations are incremental: only vertex addition and edge addition. What data structures achieve the best known amortized complexity for these operations?

If the graph were undirected, the answer would be the union-find structure. And since undirected graphs can be seen as special cases of directed graphs, the (ever so slightly) superconstant lower bound carries over.
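For concreteness, the undirected baseline mentioned above can be sketched as follows (path compression plus union by size gives the near-constant amortized bound):

```python
class UnionFind:
    """Incremental connected components for an undirected graph."""

    def __init__(self):
        self.parent = {}
        self.size = {}

    def add_vertex(self, v):
        if v not in self.parent:
            self.parent[v] = v
            self.size[v] = 1

    def find(self, v):
        root = v
        while self.parent[root] != root:
            root = self.parent[root]
        while self.parent[v] != root:          # path compression
            self.parent[v], v = root, self.parent[v]
        return root

    def add_edge(self, u, v):
        ru, rv = self.find(u), self.find(v)
        if ru == rv:
            return
        if self.size[ru] < self.size[rv]:      # union by size
            ru, rv = rv, ru
        self.parent[rv] = ru
        self.size[ru] += self.size[rv]

uf = UnionFind()
for v in range(4):
    uf.add_vertex(v)
uf.add_edge(0, 1)
uf.add_edge(2, 3)
# components: {0, 1} and {2, 3}
```

What makes the directed case harder is that merging two SCCs is triggered not by a single edge but by the creation of a cycle, which union-find has no way to detect on its own.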

For a linear upper bound, the strongly connected components can be computed from scratch after each edge addition, without recycling any data at all. I wonder if there is any way to do better.

In the setting where I need this, I expect non-trivial SCCs to be the exception rather than the rule. And in the absence of cycles I can achieve linear time in total (that is, amortized constant time per operation). [EDIT:] Let me clarify what I mean. Of course, in the absence of cycles I do not need to keep track of SCCs at all. The amortized constant time is for what I do in my setting besides worrying about SCCs.

So I would also be interested in data structures that are, in general, not better than the above upper bound but use amortized constant time per operation as long as the graph remains a DAG.

## How to prove the logistic loss function is strongly convex?

The logistic loss function is $$\mathcal{L}=\frac{1}{n}\sum_{i=1}^n\log(1+\exp(-y_ix_i^T\theta)),$$ in which $$y_i\in\{-1,+1\}$$ and $$x_i\in \mathbb{R}^d$$. How can I show that $$\mathcal{L}$$ is strongly convex?

My thinking: can we compute $$\nabla^2 \mathcal{L}(\theta)$$ and show that $$\nabla^2 \mathcal{L}(\theta)-mI$$ is PSD for some $$m>0$$?
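Working that out, with $$\sigma(t)=1/(1+e^{-t})$$ and using $$y_i^2=1$$, the derivatives come out as:

```latex
\nabla \mathcal{L}(\theta)
  = -\frac{1}{n}\sum_{i=1}^n \sigma(-y_i x_i^T\theta)\, y_i x_i,
\qquad
\nabla^2 \mathcal{L}(\theta)
  = \frac{1}{n}\sum_{i=1}^n \sigma(y_i x_i^T\theta)\,\sigma(-y_i x_i^T\theta)\, x_i x_i^T
  \succeq 0.
```

The Hessian is PSD, so $$\mathcal{L}$$ is convex; but the weights $$\sigma(\cdot)\sigma(-\cdot)\le 1/4$$ vanish as $$\|\theta\|\to\infty$$, so a uniform $$m>0$$ typically exists only on a bounded set (and requires the $$x_i$$ to span $$\mathbb{R}^d$$). Adding a ridge term $$\frac{\lambda}{2}\|\theta\|^2$$ makes the objective strongly convex with $$m=\lambda$$.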

## Strongly rigid regular graphs

A simple, undirected graph $$G = (V,E)$$ is said to be strongly rigid if the identity is the only graph homomorphism from $$G$$ to itself.

For which positive integers $$k>2$$ is there a strongly rigid $$k$$-regular graph?