Proof that associated Legendre polynomials are orthogonal: integral doesn't evaluate

Wondering why Mathematica can’t solve this integral:

Integrate[LegendreP[l1, 1, x] * LegendreP[l2, 1, x], {x, -1, 1}]

Mathematica just returns the integral unevaluated: $\int_{-1}^1 P_{l_1}^1(x)\, P_{l_2}^1(x) \, dx$

But I see there is an analytical solution: $$\int_{-1}^{1} P^{m}_{l}(x)\, P^{m}_{k}(x)\,dx=\frac{2}{2l + 1}\,\frac{(l + m)!}{(l - m)!}\,\delta_{lk},$$ where $\delta_{lk}$ is the Kronecker delta.

The solution looks like it involves integration by parts and a trig substitution. I'm wondering why Mathematica can't solve it, or whether there is some way to modify the input so that Mathematica can figure it out. I'm trying to build intuition on how to use Mathematica and what its limits are.

I also tried defining the function myself, but it didn't help:

p[x_, l_, m_] := ((-1)^m/(2^l l!) * (1 - x^2)^(m/2) * D[(y^2 - 1)^l, {y, l + m}]) /. y -> x;
Integrate[p[x, l1, 1] * p[x, l2, 1], {x, -1, 1}]
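None of this explains the symbolic failure, but the identity itself is easy to sanity-check numerically outside Mathematica. Below is a Python sketch for $m = 1$ and small $l$, using the standard closed forms of $P_l^1$; the function names are mine, and the sign convention (Condon–Shortley phase) does not affect the squared integrals anyway.

```python
import math

# Closed forms of the associated Legendre functions P_l^1 for small l
# (Condon-Shortley phase; the sign cancels in the products checked below).
def P11(x): return -math.sqrt(1 - x*x)
def P21(x): return -3*x*math.sqrt(1 - x*x)
def P31(x): return -1.5*(5*x*x - 1)*math.sqrt(1 - x*x)

def simpson(f, a, b, n=2000):
    """Composite Simpson rule with n (even) subintervals."""
    h = (b - a) / n
    s = f(a) + f(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(a + i*h)
    return s * h / 3

def rhs(l, m=1):
    """Right-hand side of the orthogonality relation: 2/(2l+1) * (l+m)!/(l-m)!."""
    return 2 / (2*l + 1) * math.factorial(l + m) / math.factorial(l - m)

print(simpson(lambda x: P21(x)**2, -1, 1), rhs(2))  # both ~2.4
print(simpson(lambda x: P11(x)*P31(x), -1, 1))      # ~0, since l=1 != l=3
```

Note that the products $P_l^1 P_k^1$ are polynomials (the square roots pair up), so simple quadrature converges fast.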

Pseudo metric on the orthogonal group

Let $ O(n)$ be the set of all $ n\times n$ orthogonal matrices. Define an equivalence relation $ \sim$ on $ O(n)$ as follows: $ U\sim V$ iff there exists a permutation matrix $ \Pi$ and a diagonal matrix $ \Lambda$ where all diagonal entries are either 1 or -1, such that $ U=V\Pi\Lambda$ . Let $ \tilde{O}(n)$ be the quotient set $ O(n)/\sim$ .

The question is: does there exist a pseudometric $\rho$ on $O(n)$ that induces a well-defined metric $\tilde{\rho}$ on $\tilde{O}(n)$ with $\tilde{\rho}([U],[V])=\rho(U,V)$, where $[U],[V]\in\tilde{O}(n)$ are the equivalence classes with representatives $U,V\in O(n)$, respectively?
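One natural candidate is the orbit distance $\rho(U,V)=\min_{\Pi,\Lambda}\|U-V\Pi\Lambda\|_F$: since the signed permutation matrices $\Pi\Lambda$ form a group and the Frobenius norm is invariant under right multiplication by orthogonal matrices, this $\rho$ is symmetric, satisfies the triangle inequality, and descends to the quotient. A brute-force Python sketch for small $n$ (all names are mine; this is an illustration, not a proof):

```python
import itertools, math

def frob_dist(A, B):
    """Frobenius distance ||A - B||_F."""
    return math.sqrt(sum((a - b)**2 for ra, rb in zip(A, B) for a, b in zip(ra, rb)))

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k]*B[k][j] for k in range(n)) for j in range(n)] for i in range(n)]

def signed_permutations(n):
    """All matrices Pi*Lambda: a permutation matrix times diag(+-1)."""
    for perm in itertools.permutations(range(n)):
        for signs in itertools.product((1, -1), repeat=n):
            M = [[0] * n for _ in range(n)]
            for j in range(n):
                M[perm[j]][j] = signs[j]
            yield M

def rho(U, V):
    """Candidate pseudometric: distance from U to the equivalence class of V."""
    return min(frob_dist(U, matmul(V, M)) for M in signed_permutations(len(U)))

print(rho([[1, 0], [0, 1]], [[0, 1], [1, 0]]))  # 0.0: a column swap stays in the class
```

The brute force is $n!\,2^n$ candidates, so this only scales to small $n$, but it makes the equivalence classes concrete.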

Projection of a polytope along 4 orthogonal axes

Consider the following problem:

Given an $\mathcal{H}$-polytope $P$ in $\mathbb{R}^d$ and $4$ orthogonal vectors $v_1, \dots, v_4 \in \mathbb{R}^d$, compute the projection of $P$ onto the subspace spanned by $v_1, \dots, v_4$ (and output it as an $\mathcal{H}$-polytope).

I know that the problem of computing projections along $k$ orthogonal vectors is NP-hard (if $k$ and $d$ are part of the input), as shown in this paper. But does it help if $k$ is a constant? Specifically, does it help if $k \leq 4$? Do we have a polynomial algorithm in this case?
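For context, the projection itself is classically computed by Fourier–Motzkin elimination: after an orthonormal change of coordinates, projecting onto $\mathrm{span}(v_1,\dots,v_4)$ amounts to eliminating the other $d-4$ variables one at a time, and the quadratic per-step growth of the inequality system (before redundancy removal) is exactly where the difficulty lives. A minimal sketch of one elimination step (names are mine):

```python
def fm_eliminate(ineqs, j):
    """One Fourier-Motzkin step: eliminate variable j from {(a, b) : a.x <= b}.
    With int or Fraction coefficients the arithmetic stays exact.  The output
    can have ~|pos|*|neg| inequalities before redundancy removal."""
    pos, neg, zero = [], [], []
    for a, b in ineqs:
        (pos if a[j] > 0 else neg if a[j] < 0 else zero).append((a, b))
    # inequalities not involving x_j survive with coordinate j dropped
    out = [([c for k, c in enumerate(a) if k != j], b) for a, b in zero]
    # each (positive, negative) pair combines into one new inequality
    for ap, bp in pos:
        for an, bn in neg:
            wp, wn = -an[j], ap[j]          # positive weights cancelling x_j
            a = [wp*ap[k] + wn*an[k] for k in range(len(ap)) if k != j]
            out.append((a, wp*bp + wn*bn))
    return out

# Projecting the unit square {0 <= x <= 1, 0 <= y <= 1} onto the x-axis:
square = [([1, 0], 1), ([-1, 0], 0), ([0, 1], 1), ([0, -1], 0)]
print(fm_eliminate(square, 1))  # x <= 1 and -x <= 0 survive, plus a trivial 0 <= 1
```

This makes the complexity question concrete: for constant $k$ the question is whether the intermediate blowup across $d-k$ elimination steps can be avoided or bounded polynomially.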

Best orthogonal approximation of rank 1 matrix

Let $X=\lambda_0u_0v_0^T\in\mathbb{R}^{n\times n}$ be a rank-1 matrix, where $\lambda_0\in\mathbb{R}$ and $u_0,v_0$ have unit Euclidean norm. What is the solution of the following problem? $$\hat{X}=\arg\min_{\substack{Y:\,Y=aU\\ U\in O(n)\\ a\in\mathbb{R}}}\|Y-X\|_F^2,$$ where $U$ is orthogonal and $\|\cdot\|_F$ is the Frobenius norm. In other words, what is the best scaled orthogonal approximation of a rank-1 matrix?
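A Procrustes-style calculation suggests the answer: for fixed $U$ the optimal scalar is $a=\operatorname{tr}(U^TX)/n$, which reduces the problem to maximizing $|\operatorname{tr}(U^TX)|$ over $O(n)$; that maximum equals the sum of singular values of $X$, here $|\lambda_0|$, so the optimal value should be $\lambda_0^2(1-1/n)$. A brute-force check of this claim for $n=2$ (pure Python, grid over rotations and reflections; all names are mine):

```python
import math

def best_scaled_orthogonal_gap(lam, u, v, steps=20000):
    """Brute-force min over O(2) (rotations and reflections on an angle grid),
    with the scalar a optimized in closed form, of ||a*U - lam*u*v^T||_F^2."""
    n = 2
    X = [[lam*u[i]*v[j] for j in range(n)] for i in range(n)]
    fro2 = sum(x*x for row in X for x in row)        # ||X||_F^2
    best = float('inf')
    for k in range(steps):
        t = 2*math.pi*k/steps
        c, s = math.cos(t), math.sin(t)
        for U in ([[c, -s], [s, c]], [[c, s], [s, -c]]):   # rotation, reflection
            tr = sum(U[i][j]*X[i][j] for i in range(n) for j in range(n))  # tr(U^T X)
            best = min(best, fro2 - tr*tr/n)         # objective at optimal a = tr/n
    return best

# lam = 2, so the predicted optimum is lam^2 * (1 - 1/2) = 2
print(best_scaled_orthogonal_gap(2.0, (1.0, 0.0), (0.0, 1.0)))
```

The grid search agrees with the predicted value $\lambda_0^2/2$ for $n=2$, which supports (but of course does not prove) the SVD-based answer.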

About a family of orthogonal polynomials satisfying a recurrence relation

Let $P_0(x)=0$ and $P_1(x)=1$. For every integer $n \geq 2$ and every real $x$, define $$P_n(x)=\sum_{k=0}^{n-1} \binom{n+k}{n} (-x)^k \alpha_{n,k},$$ where, for all $k$ with $0 \leq k \leq n-1$, $$\alpha_{n,k}=\sum_{p=1}^{n-k} \binom{n}{n-k-p} \frac{(-1)^{p+1}}{p}.$$ It is then easy to check that for every integer $n \geq 1$, the degree of $P_n$ is $n-1$.

I have found the following using Maple: for all $0 \leq n \leq 20$,

$$(n+2)\,P_{n+2}(x)-(2n+3)(1-2x)\,P_{n+1}(x)+(n+1)\,P_{n}(x)=0.$$

I think I can prove that this recurrence relation holds for every integer $n$ by working on the coefficients of the polynomial $P_n$.
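Assuming the notation reads as $C_n^p=\binom{n}{p}$, the recurrence can also be checked in exact rational arithmetic for small $n$ in Python (a numerical sketch, not a proof; function names are mine):

```python
from fractions import Fraction
from math import comb

def alpha(n, k):
    """alpha_{n,k} = sum_{p=1}^{n-k} C(n, n-k-p) * (-1)^(p+1) / p, exactly."""
    return sum(Fraction((-1)**(p + 1) * comb(n, n - k - p), p)
               for p in range(1, n - k + 1))

def P(n, x):
    """P_n at a rational point x, exactly (P_0 = 0, P_1 = 1)."""
    if n == 0: return Fraction(0)
    if n == 1: return Fraction(1)
    return sum(comb(n + k, n) * (-x)**k * alpha(n, k) for k in range(n))

def residual(n, x):
    """(n+2)P_{n+2} - (2n+3)(1-2x)P_{n+1} + (n+1)P_n, conjectured to be 0."""
    return (n + 2)*P(n + 2, x) - (2*n + 3)*(1 - 2*x)*P(n + 1, x) + (n + 1)*P(n, x)

print(all(residual(n, Fraction(1, 3)) == 0 for n in range(8)))  # True
```

Since each $P_n$ has degree $n-1$, vanishing at a single generic rational point for every small $n$ is already strong evidence, and the same loop over $n+2$ sample points gives a rigorous check for those $n$.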

What interests me is to find the weight $w$ such that $\int_{0}^1 P_n(x)\,w(x)\,x^i\,dx=0$ (PS: I'm not sure that such a $w$ exists).

I also need to prove that $\forall n \geq 1$, $\forall x \in [0,1]$, $|P_n(x)| \leq |P_n(0)|$.

Thanks for your help

PS: the family of (shifted) Legendre polynomials $L_n(x)=\frac{1}{n!}\left(x^n (1-x)^n\right)^{(n)}$ satisfies exactly the same recurrence relation, though obviously with different initial conditions.

Find the standard matrix for the orthogonal projection onto col(A), and find the least squares line y = ax + b through (0,1), (1,3), (2,4) and (3,4)


A is the 3×3 matrix $$A=\begin{bmatrix} 1 & 1 & 2\\ 2 & 1 & 3\\ -1 & 2 & 1 \end{bmatrix}$$

(a) Find the standard matrix for the orthogonal projection onto col(A).

  • Calculate the inverse of a matrix.
  • Do not calculate the product of matrices.

(b) Find the least squares line y = ax + b of best fit to the four points (0,1), (1,3), (2,4) and (3,4).
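For part (b), the normal equations for y = ax + b reduce to a 2×2 linear system that can be solved by hand; here is a quick Python check (plain arithmetic, no libraries):

```python
# Least squares line y = a*x + b through (0,1), (1,3), (2,4), (3,4),
# via the normal equations (M^T M) [a, b]^T = M^T y for a 2-column design matrix M.
pts = [(0, 1), (1, 3), (2, 4), (3, 4)]
n = len(pts)
Sx  = sum(x for x, _ in pts)        # 6
Sy  = sum(y for _, y in pts)        # 12
Sxx = sum(x*x for x, _ in pts)      # 14
Sxy = sum(x*y for x, y in pts)      # 23

det = Sxx*n - Sx*Sx                 # determinant of M^T M (here 20)
a = (Sxy*n - Sx*Sy) / det
b = (Sxx*Sy - Sx*Sxy) / det
print(a, b)  # -> 1.0 1.5
```

So the least squares line is y = x + 3/2.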

Orthogonal similarity of adjacency matrices of graphs which are cospectral, have cospectral complements and have a common equitable partition

Let $ G$ and $ H$ be two undirected graphs of the same order (i.e., they have the same number of vertices). Denote by $ A_G$ and $ A_H$ the corresponding adjacency matrices. Furthermore, denote by $ \bar G$ and $ \bar H$ the complement graphs of $ G$ and $ H$ , respectively.

When $ G$ and $ H$ are cospectral, and $ \bar G$ and $ \bar H$ are cospectral, it is known (see e.g., Theorem 3 in Van Dam et al. [1]) that there exists an orthogonal matrix $ O$ such that $ A_G\cdot O=O\cdot A_H$ and furthermore, $ O\cdot \mathbf{1}=\mathbf{1}$ , where $ \mathbf{1}$ denotes the vector consisting of all ones.

Suppose that, in addition, $ G$ and $ H$ have a common equitable partition. That is, there exist partitions $ {\cal V}=\{V_1,\ldots,V_\ell\}$ of the vertices in $ G$ and $ {\cal W}=\{W_1,\ldots,W_\ell\}$ of the vertices in $ H$ such that (i) $ |V_i|=|W_i|$ for all $ i=1,\ldots,\ell$ ; and (ii) $ \text{deg}(v,V_j)=\text{deg}(w,W_j)$ for any $ v$ in $ V_i$ and $ w$ in $ W_i$ , and this for all $ i,j=1,\ldots,\ell$ .
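Conditions (i) and (ii) are straightforward to test mechanically on small examples; here is a Python sketch (the function name and adjacency-set representation are mine):

```python
def is_common_equitable(adjG, adjH, partsG, partsH):
    """Check (i) |V_i| == |W_i| and (ii) deg(v, V_j) is one constant for all
    v in V_i, equal to the corresponding constant deg(w, W_j) for w in W_i.
    Graphs are given as dicts mapping each vertex to its set of neighbours."""
    def deg_into(adj, v, part):                 # deg(v, part)
        return sum(1 for u in part if u in adj[v])
    for Vi, Wi in zip(partsG, partsH):
        if len(Vi) != len(Wi):                  # condition (i)
            return False
        for Vj, Wj in zip(partsG, partsH):      # condition (ii)
            degs = ({deg_into(adjG, v, Vj) for v in Vi}
                    | {deg_into(adjH, w, Wj) for w in Wi})
            if len(degs) != 1:
                return False
    return True

# The 4-cycle with the bipartition {0,2}, {1,3} is equitable (common with itself):
c4 = {0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {0, 2}}
print(is_common_equitable(c4, c4, [{0, 2}, {1, 3}], [{0, 2}, {1, 3}]))  # True
```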

Question:

  • What extra structural conditions on the orthogonal matrix $ O$ , apart from $ A_G\cdot O=O\cdot A_H$ and $ O\cdot \mathbf{1}=\mathbf{1}$ , can be derived when $ G$ and $ H$ are cospectral, have cospectral complements, and have a common equitable partition?

I am particularly interested in showing that one can assume that $O$ is block-structured according to the partitions involved. That is, if $\mathbf{1}_{V_i}$ and $\mathbf{1}_{W_i}$ denote the indicator vectors of the (common) partitions ${\cal V}$ and ${\cal W}$, respectively, can $O$ be assumed to satisfy $$\text{diag}(\mathbf{1}_{V_i})\cdot O=O\cdot \text{diag}(\mathbf{1}_{W_i})$$ for $i=1,\ldots,\ell$? Here, $\text{diag}(v)$ for a vector $v$ denotes the diagonal matrix with $v$ on its diagonal.

[1] Cospectral graphs and the generalized adjacency matrix, E.R. van Dam, W.H. Haemers, J.H. Koolen. Linear Algebra and its Applications 423 (2007) 33–41. https://doi.org/10.1016/j.laa.2006.07.017

Orthogonal range reporting with fixed upper rectangular corner

Consider the following special case of orthogonal range searching:

Given a set $S$ of $n$ points in $d$ dimensions, and rectangular queries whose "upper-left" corner is fixed at the origin $(0,0,\ldots,0)$, report the number of points inside the query rectangle.

This differs from general orthogonal range searching in that all queried rectangles share a fixed corner. In this case, can we get a dynamic algorithm with a better runtime, or is the problem still as hard as the general case?
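For the static/offline version of this fixed-corner ("dominance counting") problem in 2-D, a sweep by $x$ with a Fenwick tree over $y$-ranks answers all queries in $O((n+q)\log(n+q))$; the sketch below only covers that easy static case, not the dynamic question (names are mine):

```python
import bisect

def dominance_counts(points, queries):
    """Offline 2-D dominance counting: for each query (qx, qy) count points
    with x <= qx and y <= qy, i.e. rectangles anchored at the origin.
    Sweep by x, maintaining a Fenwick tree over y-ranks."""
    ys = sorted({y for _, y in points})
    rank = {y: i + 1 for i, y in enumerate(ys)}   # 1-based ranks
    tree = [0] * (len(ys) + 1)

    def add(i):                                   # insert one y-rank
        while i < len(tree):
            tree[i] += 1
            i += i & (-i)

    def prefix(i):                                # count ranks <= i
        s = 0
        while i > 0:
            s += tree[i]
            i -= i & (-i)
        return s

    pts = sorted(points)
    ans, j = [0] * len(queries), 0
    for qi, (qx, qy) in sorted(enumerate(queries), key=lambda t: t[1][0]):
        while j < len(pts) and pts[j][0] <= qx:   # activate points with x <= qx
            add(rank[pts[j][1]])
            j += 1
        ans[qi] = prefix(bisect.bisect_right(ys, qy))
    return ans
```

This at least shows the fixed corner buys something in the static setting; whether the dynamic version beats general orthogonal range counting is exactly the open part of the question.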