## In theory, should neuromorphic computers be more efficient than traditional computers when performing logic?

There is a general sentiment about neuromorphic computers that they are simply "more efficient" than von Neumann machines.

I’ve heard a lot of talk about neuromorphic computers in the context of machine learning.
But is there any research into performing logic and mathematics in general on such computers? How would one translate arithmetic, logic, and algorithms into "instructions" for a neuromorphic computer, if there are no logic structures in the hardware itself?

It is common to draw parallels with a brain in this context, so here’s one: Brains are great at recognising faces and such, but I don’t think I can do maths faster than an Arduino (and that thing doesn’t need much energy).
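On the "how would logic be encoded" part: the oldest answer is the McCulloch-Pitts construction, in which a single threshold neuron implements a Boolean gate, so any logic circuit can in principle be expressed as a network of neuron-like units. A minimal Python sketch of that static textbook model (spiking neuromorphic hardware is far more involved than this, and efficiency is a separate question):

```python
# A McCulloch-Pitts threshold unit: the classical demonstration that
# neuron-like elements can implement Boolean logic. This is the static
# textbook model, not how spiking neuromorphic chips actually operate.

def mp_neuron(weights, threshold, inputs):
    """Fire (return 1) iff the weighted sum of binary inputs reaches the threshold."""
    total = sum(w * x for w, x in zip(weights, inputs))
    return 1 if total >= threshold else 0

# Boolean gates as single threshold units:
def AND(a, b): return mp_neuron([1, 1], 2, [a, b])
def OR(a, b):  return mp_neuron([1, 1], 1, [a, b])
def NOT(a):    return mp_neuron([-1], 0, [a])
```

Since NAND is expressible the same way, such units are universal for combinational logic; whether doing arithmetic this way is ever *efficient* is exactly the question above.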

## How to apply Arden’s theorem to determine a regular expression

If $$P=(ab)$$ and $$Q=(a)^*$$, how do I use Arden’s theorem to determine the regular expression $$R$$? I’m not sure if I am supposed to just substitute the values of $$P$$ and $$Q$$ into the equation $$R = Q + RP$$. Also, how would I use that to check that $$R$$ satisfies Arden’s equation?
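For what it’s worth, Arden’s theorem states that if $$\varepsilon \notin L(P)$$, the equation $$R = Q + RP$$ has the unique solution $$R = QP^{*}$$; with $$Q = a^{*}$$ and $$P = ab$$ that reading gives $$R = a^{*}(ab)^{*}$$. One way to check that this $$R$$ satisfies the equation is to compare both sides on all short strings, as in this Python sketch:

```python
import re
from itertools import product

# Arden's theorem: if epsilon is not in L(P), the equation R = Q + RP
# has the unique solution R = QP*. With Q = a* and P = ab, that gives
# R = a*(ab)*. Below we spot-check the equation on all short strings.

R = re.compile(r"a*(ab)*")
Q = re.compile(r"a*")

def in_R(s):
    return R.fullmatch(s) is not None

def check(max_len=8):
    """Verify L(R) = L(Q) union L(R).{ab} on all strings over {a,b}."""
    for n in range(max_len + 1):
        for chars in product("ab", repeat=n):
            s = "".join(chars)
            rhs = Q.fullmatch(s) is not None or (s.endswith("ab") and in_R(s[:-2]))
            if in_R(s) != rhs:
                return False
    return True
```

This is only a finite spot check, of course; the actual verification is the algebraic one, substituting $$QP^{*}$$ back into $$Q + RP$$.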

## Want help with proving a calculus theorem

Let $$f(x)$$ have a second derivative on the closed interval $$[-2,2]$$, with $$\left| f(x) \right| \le 1$$ when $$-2\le x\le2$$, and suppose $$\frac{1}{2} (f^{\prime}(0))^2+f(0)^3>\frac{3}{2}$$. I need to prove that there must be a point $$x_{0}$$ in the interval $$(-2,2)$$ such that $$f^{\prime \prime}\left(x_{0}\right)+3\left[f\left(x_{0}\right)\right]^{2}=0$$.

    (Series[1/2 (f'[x])^2 + f[x]^3, {x, 0, 1}]) // FullSimplify

    1/2 (Series[f'[x], {x, 0, 1}] // Normal)^2 +
        (Series[f[x], {x, 0, 1}] // Normal)^3 // Expand

    (Series[f''[x], {x, 0, 1}] // Normal)^2 +
        (3*Series[f[x], {x, 0, 1}] // Normal)^2 // Expand

The above code does not get at the substance of the problem, let alone solve it cleverly. How can I approach this proof?
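Series expansion at a point cannot supply the argument here; one classical route (a sketch of a standard auxiliary-function approach, assuming the target identity is $$f''(x_0)+3[f(x_0)]^2=0$$) is to study

```latex
g(x) = \tfrac{1}{2}\bigl(f'(x)\bigr)^{2} + \bigl(f(x)\bigr)^{3},
\qquad
g'(x) = f'(x)\,\Bigl(f''(x) + 3\bigl[f(x)\bigr]^{2}\Bigr).
```

By the mean value theorem on $$[-2,0]$$ and $$[0,2]$$ there are points $$a\in(-2,0)$$ and $$b\in(0,2)$$ with $$|f'(a)|\le 1$$ and $$|f'(b)|\le 1$$, so $$g(a)\le \frac{3}{2}$$ and $$g(b)\le \frac{3}{2}$$, while $$g(0)>\frac{3}{2}$$. Hence $$g$$ attains an interior maximum at some $$x_0\in(a,b)$$, where $$g'(x_0)=0$$; there $$f'(x_0)\neq 0$$ (otherwise $$g(x_0)=f(x_0)^3\le 1<\frac{3}{2}$$), so the second factor must vanish.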

## What are the applications of homotopy type theory to everyday programming?

What are the applications of homotopy type theory to everyday programming?

I know of only two applications, neither of which I understand:

• "Homotopical Patch Theory"
• "HoTTSQL: Proving Query Rewrites with Univalent SQL Semantics"

Is there a capsule summary of how HoTT is relevant to these problems?

Is there a general kind of programming problem for which HoTT is suited? Based on the applications so far, is it likely that future applications will all have to do with program efficiency? Or might there be applications to distributed systems, for example?

Higher inductive types strike me as the most obviously "new" thing from a programmer’s point of view. Is there a capsule summary of why programmers might use higher inductive types? Do these applications only have to do with program correctness, or do they also give us the ability to solve problems differently?

I know it’s early days and that we probably don’t know what the applications may be, but it’s also likely that more is known now than several years ago when the articles above were written.

## Is this a graph theory problem?

My basic problem involves a graph where each node $$i$$ is associated with a weight $$c_i$$, and the task is to find a minimum (or maximum) weighted independent set with a fixed cardinality $$p$$. This is, I believe, a well-known problem in graph theory that has been studied for different types of graphs.

Now, suppose I am dealing with a generalized form of the problem, as follows. The weight of each node can take $$p$$ different values; that is, each node is associated with $$p$$ different weights. The aim is again to find a minimum (or maximum) weighted independent set with a fixed cardinality $$p$$; however, each weight type can be selected only once. Precisely, if weight type $$j$$ is selected for node $$i$$, i.e., we select the weight $$c_{ij}$$, then none of the other selected nodes can take a weight of type $$j$$.

My question is: is this still a graph theory problem? Is it a known generalization among graph theory problems?
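Operationally, the generalized problem amounts to picking an independent set of size $$p$$ together with a bijection from its nodes to the $$p$$ weight types. A brute-force Python sketch, just to pin the definition down (names and the graph encoding are my own):

```python
from itertools import combinations, permutations

def min_weight_generalized_is(adj, c, p):
    """Brute force: choose an independent set S of size p together with a
    bijection from S to the p weight types, minimizing total weight.
    adj[i] is the set of neighbours of node i; c[i][j] is the weight of
    type j at node i. Returns None if no independent set of size p exists."""
    best = None
    for S in combinations(range(len(c)), p):
        if any(v in adj[u] for u, v in combinations(S, 2)):
            continue  # not an independent set
        for types in permutations(range(p)):  # each weight type used once
            w = sum(c[u][t] for u, t in zip(S, types))
            if best is None or w < best:
                best = w
    return best
```

Note that once the set $$S$$ is fixed, choosing the types is a minimum-cost bipartite matching between the selected nodes and the weight types, so the inner loop could be replaced by an assignment-problem solver.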

Any help and/or reference is appreciated.

## Halting problem theory vs. practice

It is often asserted that the halting problem is undecidable. And proving it is indeed trivial.

But that only applies to an arbitrary program.

Has there been any study of the classes of programs humans typically write?

It can sometimes be easy to analyze a program and enumerate all of its degrees of freedom and conclude that it will halt.

For example, has there ever been an effort to create a programming language (scripting, really) that guarantees halting? It would not be widely applicable, but it could still be useful for mission-critical modules.
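Such languages do exist; a classical example is the LOOP language of Meyer and Ritchie, in which every loop bound is fixed before the loop body runs. Every LOOP program halts, and the language computes exactly the primitive recursive functions. A toy interpreter sketch in Python (the tuple encoding of programs is my own):

```python
# A toy interpreter for a LOOP-style language (after Meyer & Ritchie, 1967).
# Program encoding (my own): ("zero", v) | ("inc", v) | ("seq", p1, p2, ...)
# | ("loop", v, body). A "loop" reads its bound from v once, before the
# body runs, so no program can fail to halt; such languages compute
# exactly the primitive recursive functions.

def run(prog, env):
    op = prog[0]
    if op == "zero":
        env[prog[1]] = 0
    elif op == "inc":
        env[prog[1]] = env.get(prog[1], 0) + 1
    elif op == "seq":
        for sub in prog[1:]:
            run(sub, env)
    elif op == "loop":
        for _ in range(env.get(prog[1], 0)):  # bound fixed up front
            run(prog[2], env)
    return env

# Addition as "repeat x times: increment y":
add = ("loop", "x", ("inc", "y"))
```

The price of guaranteed termination is expressiveness: functions that grow faster than every primitive recursive function (such as the Ackermann function) cannot be written in such a language.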

## Consistent theory based on a language L in which not(A -> A) is a theorem

I am working on a problem in which I have a theory T based on a language L, and the only information we have is that T is consistent and that |- not(A -> A). Given this information, how can I tell whether this theory is sound, complete, and/or decidable?

My only guess is that I can say T is sound: since T is consistent and we can derive not(A -> A) from the axioms and inference rules, not(A -> A) is a theorem, and because of that we can assume that T is sound (because the premises and conclusions are true).

Thank you!

## Any reason why Turing machines prevailed over recursion theory?

Nowadays, most introductory books, videos, and commentary about theoretical computer science talk about Turing machines but no longer discuss recursion theory. The two approaches are known to be equivalent.

What is the reason behind this? Has recursion theory simply gone out of fashion or is there a fundamental/mathematical reason explaining this?

## Is Group Theory useful in Computer Science in other areas but cryptography?

I have heard many times that group theory is highly important in computer science, but does it have any use other than cryptography? I tend to believe that it has many other uses, but I cannot find where and how to apply group theory to other areas of CS, such as algorithms, data structures, graphs, complexity, and so forth.

## Ramsey Theory Check

So, I want to create a program that checks whether a graph contains a complete subgraph (see the Party Problem), and my point is to show that the computer fails to do so quickly as the number of the graph’s vertices increases. So far I have come up with only this code, but it doesn’t work for some reason. Any suggestions?

    RamseyNumber[k_, l_] := Module[{i = 3, r = 0},
      While[i <= Length[list] && r == 0,
        If[((#[[1]] >= k || #[[2]] >= l) & /@ (And @@ list[[i]])),
          r = i, i++]
      ];
      r
    ]

    RamseyNumber[n_] := RamseyNumber[n, n]

    Timing[
      in = Table[
        Length /@ MaximumIndependentSet /@ Graphs[n], {n, 8}
      ];
    ]
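For comparison, the blow-up you want to demonstrate is visible even in the most direct brute-force clique check, which must examine all $$\binom{n}{k}$$ vertex subsets. A Python sketch (independent of the Mathematica code above; graph encoding is my own):

```python
from itertools import combinations

def has_clique(edges, n, k):
    """Brute-force check whether an n-vertex graph contains a complete
    subgraph on k vertices. edges is a set of frozenset vertex pairs.
    The search examines C(n, k) subsets, which is where the slowdown
    with growing n comes from."""
    return any(
        all(frozenset((u, v)) in edges for u, v in combinations(sub, 2))
        for sub in combinations(range(n), k)
    )
```

Timing this for increasing $$n$$ (e.g. on random graphs) gives a direct measurement of the growth you are after, without needing the Ramsey-number machinery to work first.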