We can think of type theory as a catalog of notions of computation. The type structure determines the “programming language features”: whether you have higher-order functions amounts to asking “do you have exponential types?”; whether you have structs or tuples amounts to “do you have Cartesian product types?”; whether you have choices among multiple classes of data corresponds to “do you have sum types?”. A programming language is really just a collection of types, so type theory is a theory of construction. From that point of view, logic is just an application of type theory, because there are particular constructions which correspond to proofs. Other constructions, like natural numbers or geometric objects, don’t correspond to proofs of particular propositions; they are just mathematical constructions. What type theory is interested in is the general concept of what a mathematical construction is. That’s why intuitionistic logic is also called constructive logic. From this point of view, we can say that logic is just a corner of mathematics, and mathematics is just a corner of computer science ;)
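As an illustrative sketch (the names here are mine, not from the lecture), the correspondence between type structure and language features shows up in any typed language; here in Python type hints:

```python
# A sketch of the type-structure / language-feature correspondence,
# written with Python type hints (all names are illustrative).
from dataclasses import dataclass
from typing import Callable, Tuple, Union

# Cartesian product type: structs / tuples.
Point = Tuple[float, float]          # float * float

# Sum type: a choice among multiple classes of data.
@dataclass
class Circle:
    radius: float

@dataclass
class Square:
    side: float

Shape = Union[Circle, Square]        # Circle + Square

def area(s: Shape) -> float:
    # Eliminating a sum type = case analysis.
    if isinstance(s, Circle):
        return 3.141592653589793 * s.radius ** 2
    return s.side ** 2

# Exponential type: higher-order functions.
Transform = Callable[[float], float]  # float -> float

def twice(f: Transform) -> Transform:
    return lambda x: f(f(x))
```

For instance, `twice(lambda x: x + 1)` is itself a `Transform`, which is exactly what having exponential types buys you.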
“Logic as if people mattered.” We are talking about the communication of knowledge. We treat proofs as mathematical objects, i.e., as programs. When we claim that $A$ is true, we actually mean that we have a proof of $A$. $M:A$ means that $M$ is a proof of $A$, or that $M$ is of type $A$; these are the same thing. There are many strong connections among proof theory, type theory, and category theory.
$A\uparrow$ means $A$ has a verification.

$A\downarrow$ means $A$ may be used.

A conversion rule ($\downarrow\uparrow$): from $P\downarrow$ we may conclude $P\uparrow$, where $P$ is atomic.
Here we will talk about computational interpretations through the example of lax logic. Hopefully this example gives a sense of how logic and PL are connected.
We state “$A$ is true”; here “$A$” is a proposition, and “$A$ is true” as a whole is a judgement.
Some examples of applications of branches of logic in programming languages:
| Judgement | Branch of logic | Application in PL |
| --- | --- | --- |
| $K$ knows $A$ | epistemic logic | distributed computing |
| $A$ true at time $t$ | temporal logic | partial evaluation |
| $A$ is a resource | linear logic | concurrent computing |
| $A$ is possible | lax logic | monads |
| $A$ is valid | modal logic | runtime code generation |
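The lax-logic row can be made concrete: the lax modality $\bigcirc A$ behaves like a monad, with its introduction rule playing the role of `return` and its elimination rule the role of `bind`. A minimal sketch in Python, using a Maybe-style monad for “possible failure” (the class and function names are illustrative):

```python
# A Maybe-style monad as one concrete reading of the lax modality ○A:
# ret corresponds to the introduction rule (from A true, conclude ○A),
# bind to the elimination rule (from ○A and A ⊢ ○B, conclude ○B).
from typing import Callable, Generic, Optional, TypeVar

A = TypeVar("A")
B = TypeVar("B")

class Lax(Generic[A]):
    def __init__(self, value: Optional[A]):
        self.value = value

    @staticmethod
    def ret(x: A) -> "Lax[A]":
        # ○I: wrap an ordinary value/proof.
        return Lax(x)

    def bind(self, f: Callable[[A], "Lax[B]"]) -> "Lax[B]":
        # ○E: use the A inside ○A to build a ○B; failure propagates.
        if self.value is None:
            return Lax(None)
        return f(self.value)

def safe_div(n: int, d: int) -> Lax[int]:
    return Lax(None) if d == 0 else Lax.ret(n // d)

# Monadic sequencing: the effect (possible failure) threads through.
result = safe_div(10, 2).bind(lambda x: safe_div(x, 0)).bind(lambda y: Lax.ret(y + 1))
```

The division by zero in the middle makes `result` the failure value, without any explicit error-handling at the call sites; that sequencing discipline is exactly what the lax modality captures.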
To define a connective, we must give introduction rules ($I$) and elimination rules ($E$). They must satisfy local soundness (which checks that the elimination rules are not too strong): every way of applying an elimination rule to the result of an introduction rule can be reduced to a proof that was already present. The process of doing that is called local reduction (the $\beta$ rule). They must also satisfy local completeness (which checks that the elimination rules are not too weak): there is some way to apply the elimination rules so that from the pieces we can re-introduce the original proposition. The process of doing that is called local expansion (the $\eta$ rule).
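For pairs (conjunction), local reduction says that projecting from a just-built pair returns a component, and local expansion rebuilds the pair from its projections. A small sketch in Python (function names are mine):

```python
# β rule (local reduction): eliminating immediately after introducing
# gives back what was put in.
def pair(m, n):          # introduction
    return (m, n)

def fst(p):              # elimination (left projection)
    return p[0]

def snd(p):              # elimination (right projection)
    return p[1]

m, n = "proof of A", "proof of B"
assert fst(pair(m, n)) == m        # fst⟨M,N⟩ reduces to M
assert snd(pair(m, n)) == n        # snd⟨M,N⟩ reduces to N

# η rule (local expansion): any pair can be re-introduced
# from the pieces its eliminations produce.
def eta_expand(p):
    return pair(fst(p), snd(p))    # M expands to ⟨fst M, snd M⟩

p = pair(m, n)
assert eta_expand(p) == p
```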
Note: $M\,N$ means $M$ applied to $N$.
Some notations:

$M : A$ means $M$ is a proof of “$A$ true”, or $M$ is a program of type $A$.

$[N/x]M$ means substitute $N$ for $x$, proceeding by the structure of $M$ (plug in $N$ for $x$).
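The phrase “based on the structure of $M$” can be made concrete: substitution is defined by recursion on the term. A naive sketch over a tiny λ-term AST in Python (names are mine; note it does not handle variable capture, which a real implementation must avoid by renaming bound variables):

```python
# Naive structural substitution [N/x]M over a tiny λ-calculus AST.
# Illustrative only: it does NOT avoid variable capture.
from dataclasses import dataclass

@dataclass
class Var:
    name: str

@dataclass
class Lam:
    param: str
    body: object

@dataclass
class App:
    fn: object
    arg: object

def subst(N, x, M):
    """Compute [N/x]M by recursion on the structure of M."""
    if isinstance(M, Var):
        return N if M.name == x else M
    if isinstance(M, Lam):
        if M.param == x:          # x is shadowed by the binder; stop
            return M
        return Lam(M.param, subst(N, x, M.body))
    if isinstance(M, App):
        return App(subst(N, x, M.fn), subst(N, x, M.arg))
    raise TypeError("unknown term")
```

For example, `subst(Var("z"), "x", App(Var("x"), Var("y")))` yields `App(Var("z"), Var("y"))`, while substitution under a binder for the same name leaves the term unchanged.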
For products, we have types $\tau ::= \dots \mid \tau_1 \times \tau_2$ and expressions $e ::= \dots \mid \langle e_1, e_2 \rangle \mid e \cdot \mathsf{l} \mid e \cdot \mathsf{r}$, where $\langle e_1, e_2 \rangle$ forms ordered pairs and $e \cdot \mathsf{l}$, $e \cdot \mathsf{r}$ are the projections.
Statics: a pair $\langle e_1, e_2 \rangle$ has type $\tau_1 \times \tau_2$ when $e_1 : \tau_1$ and $e_2 : \tau_2$; conversely, if $e : \tau_1 \times \tau_2$ then $e \cdot \mathsf{l} : \tau_1$ and $e \cdot \mathsf{r} : \tau_2$.
Dynamics (two choices, lazy or eager):

lazy: a pair $\langle e_1, e_2 \rangle$ is a value regardless of whether its components are; projections step by $\langle e_1, e_2 \rangle \cdot \mathsf{l} \mapsto e_1$ and $\langle e_1, e_2 \rangle \cdot \mathsf{r} \mapsto e_2$.

eager: $\langle e_1, e_2 \rangle$ is a value only when both $e_1$ and $e_2$ are values; the components are evaluated before the pair is a value, and projections fire only on such a value.
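The lazy/eager difference can be sketched with thunks in Python (illustrative code, not from the lecture): a lazy pair stores unevaluated computations and only runs one on projection, while an eager pair runs both components up front.

```python
# Lazy vs eager dynamics for pairs, sketched with thunks.
evaluated = []

def component(tag, value):
    # A "computation" that records when it actually runs.
    def thunk():
        evaluated.append(tag)
        return value
    return thunk

# Eager pair: both components are evaluated when the pair is built.
def eager_pair(t1, t2):
    return (t1(), t2())

# Lazy pair: components stay as thunks; projection forces one of them.
def lazy_pair(t1, t2):
    return (t1, t2)

def lazy_fst(p):
    return p[0]()

p = lazy_pair(component("left", 1), component("right", 2))
assert evaluated == []                 # nothing has run yet
assert lazy_fst(p) == 1
assert evaluated == ["left"]           # only the projected component ran

evaluated.clear()
q = eager_pair(component("left", 1), component("right", 2))
assert evaluated == ["left", "right"]  # both ran at pair formation
```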
The E. coli of total PLs, Gödel’s T: it codifies higher-order functions and inductive types (nat).
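A sketch of T’s recursor on nat in Python (the encoding is mine): `rec` takes a base case and a step function, and every function defined through it is total, since it peels off exactly as many successors as the input has.

```python
# Gödel's T: natural numbers with a primitive recursor.
# rec(n, base, step) computes by recursion on n = s(s(...s(z)...)),
# so every definable function terminates (T is a total language).
def rec(n, base, step):
    acc = base
    for pred in range(n):
        acc = step(pred, acc)   # step receives the predecessor and the prior result
    return acc

# Addition and multiplication as primitive recursion:
def plus(m, n):
    return rec(m, n, lambda _pred, acc: acc + 1)

def times(m, n):
    return rec(m, 0, lambda _pred, acc: plus(n, acc))
```

Here higher-order functions enter because `step` is itself a function argument; that combination of exponentials and nat is all of T.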
A PL consists of two parts: the statics (the typing rules) and the dynamics (the evaluation rules).

Basic criterion: coherence, i.e., the statics and dynamics fit together (well-typed programs evaluate as their types predict).
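A toy illustration of the statics/dynamics split and their coherence, in Python (the mini-language and function names are mine): a checker assigns types, an evaluator runs programs, and on well-typed programs the evaluator produces a value of the predicted type.

```python
# A tiny language (int and bool literals, plus addition) with
# separate statics and dynamics that cohere.
def typecheck(e):
    if isinstance(e, bool):                 # check bool before int:
        return "bool"                       # in Python, bool <: int
    if isinstance(e, int):
        return "int"
    if isinstance(e, tuple) and e[0] == "+":
        _, a, b = e
        if typecheck(a) == "int" and typecheck(b) == "int":
            return "int"
    raise TypeError("ill-typed")

def evaluate(e):
    if isinstance(e, (bool, int)):
        return e
    if isinstance(e, tuple) and e[0] == "+":
        return evaluate(e[1]) + evaluate(e[2])
    raise ValueError("stuck")

prog = ("+", 1, ("+", 2, 3))
assert typecheck(prog) == "int"
assert evaluate(prog) == 6    # coherence: an int, as the statics predicted
```

An ill-typed term such as `("+", 1, True)` is rejected by the statics before the dynamics could ever get stuck on it, which is the point of the coherence criterion.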