Theoretically, if you know the hash of a program one intends to install and you generate another file that hashes to that value, what could you do?

If I know the hash of a program you intend to install is d306c9f6c5…, and I generate some other file that hashes to that value, I could wreak all sorts of havoc.
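For concreteness, the kind of integrity check I have in mind looks like the sketch below (Python's `hashlib`; the file path is hypothetical, and in practice the expected digest would be published by the software vendor):

```python
import hashlib

def sha256_of_file(path: str) -> str:
    """Return the hex SHA-256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def verify(path: str, expected_digest: str) -> bool:
    # If an attacker can produce a *second* file with the same digest
    # (a second-preimage attack), this check accepts the malicious
    # file as if it were the genuine installer.
    return sha256_of_file(path) == expected_digest
```

The attack scenario in the question is exactly a second-preimage attack against whatever hash function the check uses.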

Should a math major who wants to get into a PhD program in AI take CS classes that are unrelated to AI research?

I am a second-year math major who wants to do a PhD in AI. I already have a little background in programming and CS. I have some experience in game design, app development, and computer security. I wouldn’t call myself an expert, but I am comfortable with C, C++, Python, JavaScript, C#, Java, and assembly, and I have written and sold a couple of commercial software products to businesses before. I am also somewhat familiar with amortized analysis, data structures, algorithms, and proofs of correctness.

My question is: do I still have to take introductory classes in order to prove to graduate schools that I know programming, or is it okay if I only take classes related to AI research, like algorithms, machine learning, and introduction to artificial intelligence? Also, do I need to take “engineering”-type classes like digital design, operating systems, databases, etc. that are normally required of computer science students but are not related to AI research?

Thanks in advance for taking the time to answer these questions.

How does one program in a tag system?

I’ve played with 2-tag systems a bit and read all about tag/lag systems. They’re great for experimenting with computation, and obviously useful as intermediaries in various proofs.

My question is: does anyone understand how to actually program in them? I mean, with Turing machines, if I sit down and work through it, I can eventually design one to do something I want, and/or understand what’s going on under the hood.

Contrast this with 2-tag systems, where I am lost. I think the most complex thing I was ever able to make without brute-forcing it was one that added two numbers (well, concatenated two strings) together.

Does anyone have any advice on how to look at tag systems as a programming paradigm? Or are they just too chaotic to be intentionally used in that way?
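For anyone who wants to experiment, simulating a 2-tag system takes only a few lines, which at least makes trial and error cheap. A minimal Python sketch, using the well-known Collatz-emulating production rules (a → bc, b → a, c → aaa; starting word a^n encodes the number n) as the example:

```python
def run_2tag(word: str, rules: dict[str, str], max_steps: int = 10_000) -> str:
    """Run a 2-tag system: while the word has at least 2 symbols,
    delete the first two and append the production for the (old)
    first symbol. Halts when fewer than 2 symbols remain."""
    for _ in range(max_steps):
        if len(word) < 2:
            return word  # halted
        word = word[2:] + rules[word[0]]
    raise RuntimeError("step limit reached")

# Collatz-emulating 2-tag system: from a^n, the word a^m appears
# exactly when the Collatz sequence starting at n reaches m.
collatz_rules = {"a": "bc", "b": "a", "c": "aaa"}
```

For example, `run_2tag("aaa", collatz_rules)` walks the Collatz trajectory 3 → 10 → 5 → 16 → 8 → 4 → 2 → 1 through intermediate words and ends at `"a"`. Tracing runs like this is, as far as I can tell, the closest thing to "debugging" a tag system.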

What must a model $X$ be so that one program in $X/O(1)$ solves the problem in $X$?

Let $X = P$; then we can write the function

L = len(input)
k = 0
while (L > 0):
    k = k + 1
    L = log2(L)
if (k mod 4 == c):
    for 2^len(input):
        pass
return 1

which means that, for every $n$, there is a $c \in \{0,1,2,3\}$ such that all inputs shorter than $n$ run in $\mathrm{poly}(n)$ time; but no matter what $c$ is, the program does not run in polynomial time. Yet, for $X = R$, a proof that finding such a program is possible is shown here.
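A runnable version of the function above (a Python sketch; `x` is the input string, `c` the advice constant, and `log2` comes from the `math` module):

```python
import math

def solver(x: str, c: int) -> int:
    """k counts how many times log2 can be applied to len(x) before
    the value drops to zero or below (roughly log-star of len(x))."""
    L = len(x)
    k = 0
    while L > 0:
        k += 1
        L = math.log2(L)
    if k % 4 == c:
        for _ in range(2 ** len(x)):  # exponential-time stall
            pass
    return 1
```

Since k grows without bound (but extremely slowly), every residue class of k mod 4 is eventually hit, so no single choice of `c` avoids the exponential stall on all input lengths.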

Note that the function

return 1 

always returns the same value and runs in polynomial time, but we are required to select one of the given programs rather than create a new one. A proof that creating one is possible is shown here.

So what must $X$ be so that, if the solver needs to consider smaller inputs, at least one of the limits of $X/O(1)$ solves the problem in $X$? And what must $X$ be so that all of the limits of $X/O(1)$ solve the problem in $X$?

How does one translate an automaton (Turing machine) into a program in a high-level programming language?

Every program in a high-level (“industrial”) programming language can be expressed as some Turing machine. I guess that there exists a universal algorithm for doing that (e.g. one can take the Cartesian product of the domains of all the variables, and the resulting space can be the state space of the Turing machine, though the handling of computer-representable floats can be tricky). Is there such a general algorithm or system that does it? There are examples of programming languages for Turing machines, of transpilers that translate C programs into the language of Turing machines, and of a kind of Turing assembler language. But what about the other direction: can a Turing machine be rewritten as a concise program in a high-level programming language, one that uses functions, compositionality of functions, and higher-order functions?
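As a baseline for that other direction, there is always the mechanical (and deliberately non-concise) translation: keep the transition table as data and run a fixed control loop. A minimal Python sketch, with a made-up one-state machine that flips every bit of its input:

```python
def run_tm(transitions, tape, state="q0", blank="_", max_steps=10_000):
    """Direct translation of a Turing machine into a program:
    the transition table is data; the interpreter loop is fixed.
    transitions: (state, symbol) -> (new_state, write, 'L' or 'R')."""
    cells = dict(enumerate(tape))  # sparse tape, indexed by position
    head = 0
    for _ in range(max_steps):
        symbol = cells.get(head, blank)
        if (state, symbol) not in transitions:
            break  # no applicable rule: halt
        state, write, move = transitions[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# Hypothetical example machine: flip each bit, moving right,
# halting at the first blank.
flip = {
    ("q0", "0"): ("q0", "1", "R"),
    ("q0", "1"): ("q0", "0", "R"),
}
```

The interesting version of the question is precisely how to go beyond this interpreter form and recover named functions, composition, and higher-order structure from the table.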

Of course, there can be infinitely many results of that conversion – starting from the naming of the variables and functions and ending with the data structures, the content of the functions, etc. But there are metrics for the quality of software code, and maximizing such metrics can result in a more or less unique answer to the stated problem.

Such a conversion is highly relevant in the current context of reward machines for reinforcement learning (e.g. a symbolic representation of the reward function, as opposed to a tabular or deep neural representation). Such a symbolic representation greatly facilitates skill transfer among different tasks, and it introduces inference into the learning process; in this way reward machines reduce the need for data and for learning time.

One can say that the extraction of first-order and higher-order functions is quite a hard task, but this task is being tackled by higher-order meta-interpretive learning, e.g.

So: are there research trends, works, results, frameworks, ideas, or algorithms concerning the conversion of a Turing machine into a program in a high-level programming language (and possibly back)? I am interested in any answer, be it about functional, logic, or imperative programming.

Is it decidable whether the length of the shortest C program that generates a given string is less than a given number?

I was given this question:

Komplexity(S) is the length of the smallest C program that generates the string S as an output. Is the question “Komplexity(S) < K” decidable?

With respect to decidability, I only know about the halting problem, and I just learned about Rice’s theorem while searching online (though I don’t think it can be applied here?). I couldn’t reduce the problem to any undecidable problem I know about. Thanks in advance for any help.

How can I actually program in assembly?

So, I did some basic assembly programming and had some fun with it, but I feel very limited by ‘only’ using simulators. So I was wondering how I can actually, truly do assembly or even machine-code programming.

From what I’ve seen so far, the only “real” way to do this is to get an actual processor/circuit I can talk to directly, but I did not find such a thing. The only way I saw to program an Arduino like that is by embedding the assembly in C code, which sounds like basically compiling it into C, which is completely against the point.

Are there machines built for exactly that? Will I have to ‘build my own computer’ out of such a processor and a clock?