## Reduction between the independent set problem and the clique problem

The independent set problem is: given a simple undirected graph, find a maximum set of vertices such that there is no edge between any two of them.

The clique problem is: given a simple undirected graph, find a maximum set of vertices such that every two of them are adjacent (there is an edge between every pair of vertices).

How can I reduce the independent set problem to the clique problem, and vice versa?
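The standard reduction goes through the complement graph: a set $$S$$ is independent in $$G$$ exactly when $$S$$ is a clique in the complement $$\bar{G}$$, so each problem reduces to the other by complementing the edge set. A minimal Python sketch of the idea:

```python
from itertools import combinations

def complement(n, edges):
    """Edge set of the complement of a simple graph on vertices 0..n-1."""
    return {frozenset(p) for p in combinations(range(n), 2)} - set(edges)

def is_independent(edges, S):
    """True iff no two vertices of S are adjacent."""
    return all(frozenset(p) not in edges for p in combinations(S, 2))

def is_clique(edges, S):
    """True iff every two vertices of S are adjacent."""
    return all(frozenset(p) in edges for p in combinations(S, 2))

# 4-cycle 0-1-2-3-0: {0, 2} is independent in G, hence a clique in the complement.
n = 4
G = {frozenset(e) for e in [(0, 1), (1, 2), (2, 3), (3, 0)]}
Gc = complement(n, G)
print(is_independent(G, {0, 2}), is_clique(Gc, {0, 2}))  # True True
```

Since the reduction only rewrites the input (the vertex set stays the same and the answer maps back unchanged), it works in both directions and preserves the size of the optimum.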

## Independence of insertion order in a hash table with open addressing

I’m taking a data-structure class, and the lecturer made the following assertion:

The number of attempts needed to insert n keys into a hash table with linear probing is independent of their order.

No proof was given, so I tried to get one myself. However, I’m stuck.

My approach at the moment: I am trying to show that swapping two adjacent keys in the insertion order does not change the number of attempts. I see the idea behind it, and I think it is going in the right direction, but I cannot manage to turn it into a rigorous proof.
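Not a proof, but the invariance is easy to check empirically. A minimal sketch (with an assumed toy identity hash and table size 7): it counts the total number of slot inspections over every permutation of a deliberately colliding key set and confirms the total never changes.

```python
from itertools import permutations

def total_probes(keys, m, h):
    """Insert the keys into an empty size-m table with linear probing and
    return the total number of slots inspected over all insertions."""
    table = [None] * m
    probes = 0
    for k in keys:
        i = h(k) % m
        while True:
            probes += 1
            if table[i] is None:
                table[i] = k
                break
            i = (i + 1) % m
    return probes

m = 7
h = lambda k: k              # toy hash function: identity mod m
keys = [0, 7, 14, 3, 10]     # 0, 7, 14 collide on slot 0; 3, 10 on slot 3
counts = {total_probes(list(p), m, h) for p in permutations(keys)}
print(counts)  # {9}: every insertion order costs the same total
```

The same harness can be pointed at a quadratic or double-hashing probe sequence to explore the aside experimentally.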

As an aside, does this fact also hold for other probing techniques, such as quadratic probing or double hashing?

## A regular independence induced graph in a $\Delta+1$ coloring

Consider any regular graph $$G$$ with order $$n$$, size $$E$$, and maximum degree $$\Delta$$. Now, we give the vertices a $$\Delta+1$$ coloring such that each vertex and its neighbors receive distinct colors.

Consider a color class (an independent set of vertices) in such a coloring. If we remove that color class, do we obtain a regular subgraph with maximum degree $$\Delta-1$$?

I think yes. It is easily seen to be true for complete graphs. It can also be extended, I think, to graphs with $$\Delta\ge\frac{n}{2}$$, since each color class in such a coloring would have at most $$2$$ vertices. But is the claim true for every regular graph? Thanks in advance.

## Independence number of $4$-uniform regular hypergraph

Let $$H$$ be a $$4$$-uniform hypergraph on $$[1..n]$$, i.e. $$H$$ is a collection of $$4$$-element subsets of $$[1..n]$$. The elements of $$H$$ are called edges. A hypergraph is regular if every element of $$[1..n]$$ lies in the same number of edges.

An independent set $$I$$ of $$H$$ is a subset of $$[1..n]$$ such that $$I$$ does not contain any edge. The independence number of $$H$$ is the maximal cardinality among independent sets of $$H$$.

Question:

If a regular hypergraph $$H$$ has size $$O(n^3)$$, must it have an independent set of size $$\Omega(\sqrt{n})$$?

Motivation:

Consider the problem of finding a Sidon set in $$\mathbb{Z}_n$$. If we relax the problem by allowing 3-term arithmetic progressions, it can be encoded as hypergraph independence: $$H=\{\{a,b,c,d\} \mid a,b,c,d\in\mathbb{Z}_n,\ b-a=d-c \neq 0,\ a \neq c,\ a \neq d\}$$. Sidon sets are independent sets of $$H$$, and the largest have size $$O(\sqrt{n})$$. I would like to find a purely combinatorial analogue of Sidon sets, with similar size and constraints. Randomized methods give independent sets of size $$\Omega(\sqrt[3]{n})$$.
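For small $$n$$, the hypergraph from this encoding can be generated explicitly and its independence number brute-forced; a sketch in pure Python (exponential time, so tiny $$n$$ only):

```python
from itertools import combinations, product

def ap_free_hypergraph(n):
    """4-element edges {a,b,c,d} in Z_n with b - a = d - c != 0 (mod n),
    as in the encoding above."""
    H = set()
    for a, b, c, d in product(range(n), repeat=4):
        if len({a, b, c, d}) == 4 and (b - a) % n == (d - c) % n != 0:
            H.add(frozenset((a, b, c, d)))
    return H

def independence_number(n, H):
    """Largest subset of Z_n containing no edge of H (brute force)."""
    for r in range(n, 0, -1):
        for S in combinations(range(n), r):
            Sset = set(S)
            if not any(e <= Sset for e in H):
                return r
    return 0

H = ap_free_hypergraph(5)
print(len(H), independence_number(5, H))  # 5 3: all five 4-subsets of Z_5 are edges
```

Regularity of this particular $$H$$ follows from symmetry: translations $$x \mapsto x+t$$ of $$\mathbb{Z}_n$$ preserve the defining relation and act transitively, so every element lies in the same number of edges.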

References about hypergraph independence featuring some group structure (hence not “purely combinatorial”) are also welcome.

## Linear Independence of Binary Vectors over reals

Suppose we have $$Y_1, \ldots, Y_n \in \mathbb{R}^m$$, $$n$$ independent random vectors ($$m \geq n$$), where the entries of each $$Y_i$$ are i.i.d. Bernoulli random variables taking the values $$\{0, 1\}$$ with equal probability. I am interested in estimates for the probability that the $$Y_i$$'s are linearly independent over $$\mathbb{R}$$. More generally, for $$k \leq n$$: what is the probability that, for all subsets $$S \subset [n]$$ of size $$k$$, the vectors $$\{ Y_i : i \in S\}$$ are linearly independent?

I have found some references that answer related questions. This post has references that speak on the asymptotic probability of a random square $$\{0, 1\}$$ matrix being invertible: Number of invertible {0,1} real matrices?

Most notably, it includes a reference ( https://arxiv.org/abs/0905.0461 ) showing that the probability that an $$n \times n$$ random $$\{0,1\}$$ matrix is singular is $$\mathcal{O}\left(\left(\tfrac{1}{\sqrt{2}} + o(1)\right)^n \right)$$. Using this I can get some crude asymptotic estimates, but ideally I am looking for a non-asymptotic result.

Similar problems have been investigated where the independence is instead over $$\mathbb{F}_2$$: Expected number of random binary vectors so that they form a basis

I am interested in any references that offer any tools that might help with this problem. I am most interested in the case when $$m$$ is a large constant multiple of $$n$$. Thank you.
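Lacking a closed form, the probability can at least be estimated by Monte Carlo, using exact rank computation over the rationals to avoid floating-point rank issues. A pure-Python sketch; the parameters in the demo call are arbitrary:

```python
import random
from fractions import Fraction

def rank_over_Q(rows):
    """Exact rank of an integer matrix over the rationals (Gauss-Jordan)."""
    A = [[Fraction(x) for x in row] for row in rows]
    ncols = len(A[0]) if A else 0
    rank = 0
    for col in range(ncols):
        piv = next((r for r in range(rank, len(A)) if A[r][col] != 0), None)
        if piv is None:
            continue
        A[rank], A[piv] = A[piv], A[rank]
        for r in range(len(A)):
            if r != rank and A[r][col] != 0:
                f = A[r][col] / A[rank][col]
                A[r] = [x - f * y for x, y in zip(A[r], A[rank])]
        rank += 1
    return rank

def independence_probability(n, m, trials=1000, seed=0):
    """Monte Carlo estimate of P(Y_1, ..., Y_n linearly independent over R)
    for i.i.d. Bernoulli(1/2) entries."""
    rng = random.Random(seed)
    hits = sum(
        rank_over_Q([[rng.randint(0, 1) for _ in range(m)] for _ in range(n)]) == n
        for _ in range(trials))
    return hits / trials

print(independence_probability(5, 20, trials=300))
```

The same harness extends to the subset version of the question by testing the rank of each size-$$k$$ subfamily, at the cost of a $$\binom{n}{k}$$ factor per trial.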

## Vectors linear dependence/ independence

Let $$A$$ be a $$k \times k$$ matrix with real entries and let $$x \neq 0$$ be a vector in $$\mathbb{R}^k$$. Then the vectors $$x, Ax, A^2x, A^3x, A^4x, A^5x, \ldots, A^kx$$ are:

- linearly dependent/independent cannot be determined from the given data
- linearly independent
- linearly dependent
- linearly dependent if and only if $$A$$ is symmetric

## verify linear independence of solutions of ODE

I am trying to verify the linear independence of three solutions of an ODE, which are $$\begin{bmatrix}1\\1\\0 \end{bmatrix} e^{t}, \quad \begin{bmatrix}1\\-1\\1 \end{bmatrix} e^{t}, \quad \begin{bmatrix}1\\-1\\1 \end{bmatrix} e^{2t}.$$ Plugging in $$t=0$$, we can immediately find that only when $$c_{1} = c_{2} = c_{3} = 0$$ can $$\begin{cases}c_{1}e^t +c_{2}e^t+c_{3}e^{2t}=0 \\ c_{1}e^t -c_{2}e^t-c_{3}e^{2t}=0 \\ c_{2}e^t+c_{3}e^{2t}=0\end{cases}$$ hold for all $$t$$.

But since the determinant of linearly independent vectors would be nonzero, the matrix made up of these three vectors should not have determinant equal to zero. I formed the matrix and found that $$\det\begin{bmatrix}e^t&e^t&e^{2t}\\e^t&-e^t&-e^{2t}\\0&e^t&e^{2t} \end{bmatrix} = 0.$$

I don’t know where it goes wrong: if the vectors are linearly independent, their determinant should be nonzero.
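Both observations can be checked numerically: the determinant above vanishes for every $$t$$ (the second and third columns of the matrix are proportional at each fixed $$t$$), yet the solutions are still independent as functions. A small pure-Python sketch:

```python
import math

def det3(M):
    """Determinant of a 3x3 matrix by cofactor expansion along the first row."""
    (a, b, c), (d, e, f), (g, h, i) = M
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

def solution_matrix(t):
    """The matrix whose columns are the three candidate solutions at time t."""
    return [[math.exp(t),  math.exp(t),   math.exp(2 * t)],
            [math.exp(t), -math.exp(t),  -math.exp(2 * t)],
            [0.0,          math.exp(t),   math.exp(2 * t)]]

# The pointwise determinant is (numerically) zero for every t: the third
# column is e^t times the second, so the columns are dependent at each fixed t.
print(max(abs(det3(solution_matrix(t))) for t in (-1.0, 0.0, 0.5, 1.0)))

# As functions, though, the solutions are independent: the third components
# give c2*e^t + c3*e^(2t) = 0 at t = 0 and t = 1, a 2x2 system with
# determinant e^2 - e != 0, so c2 = c3 = 0, and then c1 = 0 as well.
print(math.exp(2) - math.exp(1))  # nonzero
```

This illustrates the gap in the argument: a zero determinant at each fixed $$t$$ only shows the three value vectors are dependent pointwise, with coefficients that may vary with $$t$$, not that a single constant combination vanishes for all $$t$$.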

## Independence of Duistermaat-Heckman measure

Suppose a compact Kähler manifold $$(X,\omega)$$ is toric. Then for any smooth function $$\phi$$ in $$PSH_{tor}(X,\omega) := \{\phi \in PSH(X,\omega)\;|\; \phi \text{ is torus-invariant}\}$$, the form $$\omega_\phi := \omega + i\partial\bar{\partial}\phi$$ is another invariant symplectic form in the same cohomology class, and thus it has the same moment polytope.

Now the question is: if $$J$$ is the moment map for $$\omega$$ and $$J_\phi$$ is the moment map for $$\omega_\phi$$, do they produce the same Duistermaat-Heckman measure on the polytope? I suppose they do, but why?

## Conditioning on future events, strong Markov property, independence

I have a question on an argument appearing in this article P.

Setting

Let $$S=(1,\infty) \times (-1,1) \subset \mathbb{R}^2$$ and let $$X=(\{X_t\},\{P_x\}_{x \in S})$$ be a diffusion process on $$S$$. Imagine something like the Brownian motion on $$S$$ conditioned to hit $$\{1\} \times (-1,1)$$.

We denote by $$r(t)$$, $$y(t)$$ the first coordinate process of $$X$$ and the second coordinate process of $$X$$, respectively. Let $$\tau_r=\inf\{t>0 \mid r(t)=r\}$$, $$r \ge 1$$.

The author considers random variables of the form \begin{align*} R=\int_{0}^{\tau_1}\frac{1}{r(s)^2}\,ds,\quad R_k=\int_{\tau_{k}}^{\tau_{k-1}}\frac{1}{r(s)^2}\,ds,\quad k \ge 2. \end{align*}

My question

Let $$(n,y) \in (2,\infty) \times (-1,1)$$ and let $$\{y_k\}_{k=1}^{n-1} \in (-1,1)^{n-1}$$.

• The author claims that the random variables $$\{R_k\}_{k=2}^{n}$$ are independent under $$P_{n,y}(\cdot \mid y(\tau_k)=y_k,\ k=1,\cdots,n-1)$$. Here, $$P_{n,y}(\cdot \mid y(\tau_k)=y_k,\ k=1,\cdots,n-1)$$ is defined as follows: for any event $$A$$ and any Borel subset $$B \subset (-1,1)^{n-1}$$,

\begin{align*} &P_{n,y}(A \cap \{(y(\tau_1),\cdots, y(\tau_{n-1})) \in B\} ) \\ &=\int_{B}P_{n,y}(A \mid y(\tau_k)=y_k,\ k=1,\cdots,n-1)\,d\nu(y_1,\cdots, y_{n-1}), \end{align*} where $$\nu$$ is the distribution of $$(y(\tau_1),\cdots, y(\tau_{n-1}))$$.

• The author seems to use the strong Markov property for \begin{align*} E_{n,y}\left[\int_{\tau_k}^{\tau_{k-1}}\frac{1}{r(s)^2}\,ds \mid y(\tau_k)=y_k,\ k=1,\cdots,n-1 \right] \end{align*} to obtain the following: \begin{align*} &E_{n,y}\left[\sum_{k=2}^{n}\int_{\tau_{k}}^{\tau_{k-1}}\frac{1}{r(s)^2}\,ds \mid y(\tau_k)=y_k,\ k=1,\cdots,n-1 \right] \\ &=\sum_{k=2}^{n}E_{k,y_{k}}\left[\int_{\tau_k}^{\tau_{k-1}}\frac{1}{r(s)^2}\,ds \mid y(\tau_{k-1})=y_{k-1} \right]. \end{align*}

However, I do not know how to use the strong Markov property. The author seems to consider conditioning on future events. Is this possible?

## Independence of two events with non empty intersection

Let $$A, B$$ be two nonempty events. If they are disjoint, i.e. mutually exclusive, they are not independent (assuming both have positive probability). If they are not disjoint, they can be either independent or not. Intuitively, when two events have nonempty intersection, the occurrence of one event seems to condition the occurrence of the other, since their intersection is nonempty. I am thus confused.

Can you provide an explanation, and an example of two nonempty events with nonempty intersection that are independent?

Many thanks.