## Lambda calculus without free variables is as strong as lambda calculus?

First question: How would one prove that removing free (unbound) variables from the lambda calculus, allowing only bound variables, does not reduce its power (i.e. that it is still Turing-complete)?

Second question: Is the proposition given above really true? Is lambda calculus sans free variables really Turing-complete?
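One common route to such a proof (a sketch of the standard combinator argument, not taken from the question itself): bracket abstraction translates any λ-term into an applicative combination of the closed combinators S and K, so the fragment with no free variables already captures everything. The two combinators, mirrored as Python closures:

```python
# S and K are closed lambda terms (no free variables).
S = lambda x: lambda y: lambda z: x(z)(y(z))  # S x y z = x z (y z)
K = lambda x: lambda y: x                     # K x y = x

# Bracket abstraction eliminates variables: e.g. the identity
# I = λx.x is definable as the closed term S K K.
I = S(K)(K)

print(I(42))        # → 42
print(K("a")("b"))  # → a
```

Since S and K suffice to express every computable function, the closed fragment is as strong as the full calculus.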

## What does Lambda Calculus teach us about data?

1. Can we generalize that data is just a suspended computation?
2. Is this true for other models of computation?
3. What books or papers should one read to better understand the nature of data and its relation to computation?

Some context: as a software developer, I have become so used to the concept of data that I never considered its true nature. I’d very much appreciate any references that could help me better understand the general connection between data and computation.
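One way to make "data is a suspended computation" concrete (my own illustration, using the standard Church encodings): booleans and pairs can be represented as nothing but functions waiting for a consumer:

```python
# Church encodings: "data" represented purely as functions.
# A boolean is a choice procedure; a pair is a suspended application.
true  = lambda a: lambda b: a
false = lambda a: lambda b: b

pair = lambda a: lambda b: (lambda select: select(a)(b))
fst  = lambda p: p(true)   # project by passing the right chooser
snd  = lambda p: p(false)

p = pair(1)(2)
print(fst(p))  # → 1
print(snd(p))  # → 2
```

Here the pair stores nothing; it is a computation suspended until a selector arrives, which is exactly the intuition the question asks about.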

## Where is typed lambda calculus on the Chomsky hierarchy?

The functions definable in untyped lambda calculus are the computable functions, for which it is in turn possible to define equivalences to the concepts of Turing machines, recursive enumerability, and Type-0 grammars.

But what about typed lambda calculus — where on the Chomskian computability hierarchy are the functions definable by expressions of simply-typed lambda calculus?

Assuming that there is a natural way of transferring the idea of lambda-definability of a recursive function from untyped to simply-typed lambda calculus, along the lines of:

A $$k$$-ary number-theoretic function $$f$$ is simply-typed-lambda definable iff there is a simply typable $$\lambda$$-term $$P$$ such that for all $$x_1, \ldots, x_k$$, where $$\underline{x}$$ denotes the encoding of $$x$$: if $$f(\vec{x})$$ is defined, then $$P \underline{\vec{x}} =_\beta \underline{y} \text{ iff } f(\vec{x}) = y$$, and $$P \underline{\vec{x}}$$ has no $$\beta$$-normal form otherwise.
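As a concrete instance of this definition (my own illustration, using Church numerals $$\underline{n} = \lambda f.\lambda x. f^n(x)$$ as the encoding): the successor function is lambda-defined by $$P = \lambda n.\lambda f.\lambda x.\, f\,(n\,f\,x)$$, which can be checked extensionally by running both sides as closures:

```python
# Church numeral n̄ = λf.λx. f^n(x), built as Python closures.
def church(n):
    return lambda f: lambda x: x if n == 0 else f(church(n - 1)(f)(x))

# P = λn.λf.λx. f (n f x) lambda-defines the successor function.
succ = lambda n: lambda f: lambda x: f(n(f)(x))

# Decode a numeral by applying it to (+1) and 0.
to_int = lambda n: n(lambda k: k + 1)(0)
print(to_int(succ(church(4))))  # → 5
```

(Whether such a term is simply typable is exactly the kind of constraint the question is probing; successor is, but not every computable function on Church numerals survives the typing discipline.)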

To make the bridge from functions to formal languages and the Chomsky hierarchy, I guess my question is:

Between which levels of the Chomsky hierarchy is the class of languages located such that $$L$$ is in the class iff there is a simply-typed-lambda-definable function $$f$$ such that $$f(w)$$ is defined if and only if $$w \in L$$?

Alternatively, are there other ways of building a correspondence between typed lambda calculus and formal languages or automata that would make it possible to locate it on the known computability scale in a meaningful way?

All I could find so far was about modifications of lambda calculus corresponding to certain types of grammars, or automata that recognize the strings of certain kinds of lambda expressions, but, surprisingly, nothing specifically about (Curry-style) typed lambda calculus.

## lambda calculus reduction: (((lambda f (lambda x (f x))) (lambda y (* y y))) 12)

given the input

(((lambda f (lambda x (f x))) (lambda y (* y y))) 12)

what does this subterm evaluate to at this step: (lambda x (f x))?

I am trying to evaluate this and have drawn a partial evaluation tree so far.

How do I evaluate this? I am looking for guidance on what I might be doing wrong or how to proceed.
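A sketch of the reduction (my own working, assuming `*` is ordinary multiplication): the outer β-step substitutes `(lambda y (* y y))` for `f`, so `(lambda x (f x))` becomes `(lambda x ((lambda y (* y y)) x))`; applying that to `12` then gives `(* 12 12) = 144`. The same steps mirrored in Python:

```python
# ((λf. λx. f x) (λy. * y y)) 12, written with Python lambdas.
# The outer application binds f := (λy. y*y).
term = (lambda f: lambda x: f(x))(lambda y: y * y)

# term is now (λx. ((λy. y*y) x)); applying it to 12:
print(term(12))  # → 144
```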

## Is there an abstract architecture equivalent to Von Neumann’s for Lambda expressions?

In other words, was a physical implementation modelling lambda calculus (so not built on top of a Von Neumann machine) ever devised? Even if just on paper?
If there was, what was it? Did we make use of its concepts somewhere practical (where it can be looked into and studied further)?

— I’m aware of specialised LISP machines. They were equipped with certain hardware components that made them better but eventually they were still the same at their core.

If there isn’t such thing, what stops it from being relevant or worth the effort? Is it just a silly thought to diverge so greatly from the current hardware and still manage to create a general-purpose computer?

## Proof of lambda reductions

I am not sure how to approach this question or what exactly it is asking. I need to prove the following reductions:

I need to prove them knowing that a numeral is encoded as N = λf.λc.(f (f … (f c)) … ) (N applications of f), and that:

- addition: + = λM.λN.λa.λb.((M a)((N a) b))
- multiplication: × = λM.λN.λa.(M (N a))
- exponentiation: ∧ = λM.λN.(N M)
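These definitions can be sanity-checked before attempting the proofs by running them as Python closures (my own harness; `to_int` decodes a Church numeral by applying it to successor and zero):

```python
# Church numerals and the three combinators from the question.
church = lambda n: lambda f: lambda c: c if n == 0 else f(church(n - 1)(f)(c))
plus   = lambda M: lambda N: lambda a: lambda b: M(a)(N(a)(b))  # + = λM.λN.λa.λb.((M a)((N a) b))
times  = lambda M: lambda N: lambda a: M(N(a))                  # × = λM.λN.λa.(M (N a))
power  = lambda M: lambda N: N(M)                               # ∧ = λM.λN.(N M)

to_int = lambda n: n(lambda k: k + 1)(0)
print(to_int(plus(church(2))(church(3))))   # → 5
print(to_int(times(church(2))(church(3))))  # → 6
print(to_int(power(church(2))(church(3))))  # → 8
```

The proofs themselves then amount to β-reducing each combinator applied to arbitrary numerals M̄ and N̄ and counting the resulting applications of f.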

## Lambda Expression Reduction

I am unable to solve the following lambda expression using both normal-order (call-by-name) and applicative-order (call-by-value) reduction; I keep getting different answers for the two. This is the lambda expression that has to be reduced using both techniques:

(λfx.f (f x)) (λfx.f (f x)) f x
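By confluence (Church–Rosser), both strategies must meet at the same normal form here: the term is the Church numeral 2 applied to itself, i.e. 2² = 4, so it reduces to f (f (f (f x))) either way. A quick extensional check (my own harness, picking a concrete f and x):

```python
two = lambda f: lambda x: f(f(x))  # λfx.f (f x)

# two(two) behaves as the Church numeral 4,
# so two(two)(f)(x) = f(f(f(f(x)))).
f = lambda n: n + 1
x = 0
print(two(two)(f)(x))  # → 4
```

If the two orders seem to give different answers, the usual culprit is a substitution or parenthesization slip partway through, not a genuine divergence.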

## How to find a lambda term to complete a function?

I tried to complete this exercise but got stuck. Define a $$\lambda$$-term $$M$$ such that: $$() \: \simeq_{\beta} \: $$

I chose $$M=\lambda m\, \lambda a\, \lambda b\, \lambda p \,((p)m)b$$, and then I have to find a representation $$T$$ of a function, using $$M$$, that yields true if the sequence is empty and false if it is not. A sequence is defined as:

$$[]=\lambda x_0\, \lambda x_1\, \lambda z\, z$$

$$[b]=\lambda x_0\, \lambda x_1\, \lambda z\, (z) x_b$$

$$[b_1 b_2]=\lambda x_0\, \lambda x_1\, \lambda z\, ((z)x_{b_1})x_{b_2}$$

$$\vdots$$

$$[b_1 \ldots b_n]= \lambda x_0\, \lambda x_1\, \lambda z\, (\ldots((z) x_{b_1})x_{b_2}\ldots)x_{b_n}$$

so the sequence of the exercise is:

$$[01101]= \lambda x_0\, \lambda x_1\, \lambda z\, (((((z)x_0)x_1)x_1)x_0)x_1$$

For example, $$T$$ needs to satisfy $$(T)[01101] \simeq_{\beta}$$ false, while $$(T)[] \simeq_{\beta}$$ true. I really find this difficult. How can I do that?
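Without giving the whole exercise away, the encoding itself can be checked mechanically (my own harness; the concrete `x0`, `x1`, `z` below are arbitrary illustrative choices). The key asymmetry a term $$T$$ can exploit is that the empty sequence hands back $$z$$ untouched, while a non-empty sequence applies $$z$$ to its elements:

```python
# Sequences from the exercise: [] = λx0.λx1.λz. z,
# [b1..bn] = λx0.λx1.λz. (…((z x_{b1}) x_{b2})…) x_{bn}
empty = lambda x0: lambda x1: lambda z: z
seq_01101 = lambda x0: lambda x1: lambda z: z(x0)(x1)(x1)(x0)(x1)

# The empty sequence returns z unchanged:
print(empty("x0")("x1")("z"))  # → z

# A non-empty sequence feeds its elements to z, here a curried
# 5-argument concatenator chosen just to expose the structure:
concat5 = lambda a: lambda b: lambda c: lambda d: lambda e: a + b + c + d + e
print(seq_01101("0")("1")(concat5))  # → 01101
```

So designing $$T$$ comes down to choosing the three arguments passed to the sequence so that the "just $$z$$" case β-reduces to true and the "applied $$z$$" case β-reduces to false.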

## Is it secure to rely on the data in a lambda context authorizer claims?

I am working on Lambda authorization, and I have learned that there are generally two options.

Either use the default authorizer at the API Gateway level, which does all the heavy lifting (validating the tokens), or write a custom authorizer, which would require me to implement all the logic myself, including all the token validation, which I would like to avoid if possible. I don’t want to write such code; I want to use something that is time-proven and tested.

My question is: is it considered secure to write code in my Lambda (e.g. a Python decorator) that performs authorization based on the data in the Lambda `context.authorizer.claims`, assuming of course that everything I need is there (e.g. `cognito:groups`, `cognito:username`, etc.)?

Can I treat the authorizer data in the context as trustworthy (i.e. as having already passed security validation)?
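For reference, such a check might look like the following minimal sketch. It assumes a Cognito user-pool authorizer on API Gateway, which forwards the already-validated claims to the function in `event["requestContext"]["authorizer"]["claims"]`; the decorator name, the group name, and the comma-separated `cognito:groups` format are illustrative assumptions, not a definitive implementation:

```python
import functools
import json

def require_group(group):
    """Gate a handler on a Cognito group claim.

    Assumes API Gateway has already validated the token and only
    forwards claims for requests that passed the authorizer.
    """
    def decorator(handler):
        @functools.wraps(handler)
        def wrapper(event, context):
            claims = (event.get("requestContext", {})
                           .get("authorizer", {})
                           .get("claims", {}))
            # cognito:groups is assumed comma-separated here; verify
            # the exact shape your authorizer delivers before relying on it.
            groups = claims.get("cognito:groups", "")
            if group not in groups.split(","):
                return {"statusCode": 403,
                        "body": json.dumps({"message": "Forbidden"})}
            return handler(event, context)
        return wrapper
    return decorator

@require_group("admins")
def handler(event, context):
    return {"statusCode": 200, "body": json.dumps({"ok": True})}
```

The security-relevant point is the one in the question: this is only sound if the claims truly come from the gateway's authorizer and the function is not invocable by any path that skips it.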

## In the lambda calculus with products and sums, is $f : [n] \to [n]$ $\beta\eta$-equivalent to $f^{n!+1}$?

$$\eta$$-reduction is often described as arising from the desire for functions that are point-wise equal to be syntactically equal. In a simply-typed calculus with products, $$\beta\eta$$-conversion is sufficient for this, but when sums are involved I fail to see how to reduce point-wise equal functions to a common term.

For example, it is easy to verify that any function $$f: (1+1) \to (1+1)$$ is point-wise equal to $$\lambda x.f(f(f\,x))$$, or more generally $$f$$ is point-wise equal to $$f^{n!+1}$$ when $$f: A \to A$$ and $$A$$ has exactly $$n$$ inhabitants. Is it possible to reduce $$f^{n!+1}$$ to $$f$$? If not, is there an extension of the simply typed calculus which allows this reduction?
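The $$n = 2$$ instance is easy to brute-force (my own check, modelling the two inhabitants of $$1+1$$ as 0 and 1). Note that the exponent matters: for $$f$$ = negation, $$f^{2}$$ is the identity, so it is $$f^{n!+1}$$ (here $$f^{3}$$), not $$f^{n!}$$, that agrees with $$f$$ point-wise for every $$f$$:

```python
from itertools import product

# Model A with |A| = n = 2 as {0, 1}; a function A -> A is an
# output table, e.g. negation is (1, 0).
n = 2
fact = 2  # n! for n = 2

def iterate(table, k):
    """Table of the k-fold composition of the function given by `table`."""
    result = list(range(n))  # start from the identity
    for _ in range(k):
        result = [table[v] for v in result]
    return result

# Check f^(n!+1) = f point-wise for all n^n functions.
for table in product(range(n), repeat=n):
    assert iterate(list(table), fact + 1) == list(table)
print("all", n ** n, "functions on 2 elements satisfy f^(n!+1) = f")
```

This only establishes the semantic (point-wise) equality, of course; whether the syntactic $$\beta\eta$$-reduction exists is exactly what the question asks.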