mail gabriel and coppo

This commit is contained in:
Francesco Mecca 2020-04-11 00:26:05 +02:00
parent b0d1da99c1
commit 13b8bb01f2
9 changed files with 1646 additions and 909 deletions

tesi/.gitignore vendored

@@ -2,6 +2,7 @@ tesi.tex
.#tesi.tex
tesi_unicode.tex
part4_unicode.tex
## Core latex/pdflatex auxiliary files:


@@ -2,8 +2,12 @@ import json
import re
from sys import argv
try:
    allsymbols = json.load(open('./unicode-latex.json'))
except:
    allsymbols = json.load(open('../unicode-latex.json'))
mysymbols = ['', '', '', '', '', '', '', '', '', 'ε', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', '', 'ʲ', '', 'π', 'α', 'β', '', 'σ', '', '', '', '', '', '', '', '', '', '', '', 'ˡ', '', '', '', '', '' ]
extrasymbols = {'': '\llbracket', '': r'\rrbracket', '̸': '\neg', '¬̸': '\neg', '': '\in ', '': '_S', '': '_T'}
symbols = {s: allsymbols[s] for s in mysymbols}
@@ -39,8 +43,9 @@ def convert(ch, mathmode):
def latex_errors_replacements(charlist):
    text = ''.join(charlist).split(' ')
    replacements = {
        '\n\end{comment}\n\end{enumerate}\n\end{enumerate}\n\n\subsection{Symbolic': '\n\end{comment}\n\n\subsection{Symbolic',
    }
    r_set = set(replacements.keys())
    for word in text:
        it = r_set.intersection(set([word]))

tesi/gabriel/Makefile Normal file

@@ -0,0 +1,22 @@
SRC = part4.tex
AUX = part4.aux
DEL = part4.aux part4.bbl part4.blg part4.log part4.out part4_unicode.tex part4.pdf part4.tex texput.log
NAME = part4
part4:
	@echo
	@echo "=== Removing temporary files ==="
	rm -f $(DEL)
	@echo
	@echo "=== Building from scratch ==="
	emacs -batch part4_unicode.org -f org-latex-export-to-latex --kill
	python3 ../conv.py part4_unicode.tex part4.tex
	pdflatex $(SRC)
	bibtex $(AUX)
	pdflatex $(SRC)
	pdflatex $(SRC)
	@echo
	@echo "=== All done. Generated $(NAME).pdf ==="
clean:
	rm -f $(DEL)


@@ -1,354 +0,0 @@
# Add headers to export only this section
* Correctness of the algorithm
Running a program tₛ or its translation 〚tₛ〛 against an input vₛ
produces as a result /r/ in the following way:
| ( 〚tₛ〛ₛ(vₛ) = Cₛ(vₛ) ) → r
| tₛ(vₛ) → r
Likewise
| ( 〚tₜ〛ₜ(vₜ) = Cₜ(vₜ) ) → r
| tₜ(vₜ) → r
where result r ::= guard list * (Match blackbox | NoMatch | Absurd)
and guard ::= blackbox.
Having defined equivalence between two inputs of which one is
expressed in the source language and the other in the target language
vₛ ≃ vₜ (TODO define, this talks about the representation of source values in the target)
we can define the equivalence between a couple of programs or a couple
of decision trees
| tₛ ≃ tₜ := ∀vₛ≃vₜ, tₛ(vₛ) = tₜ(vₜ)
| Cₛ ≃ Cₜ := ∀vₛ≃vₜ, Cₛ(vₛ) = Cₜ(vₜ)
The proposed equivalence algorithm, which works on a pair of
decision trees, returns either /Yes/ or /No(vₛ, vₜ)/ where vₛ and
vₜ are a pair of possible counter examples for which the constraint
trees produce different results.
** Statements
Theorem. We say that a translation of a source program to a decision tree
is correct when for every possible input, the source program and its
respective decision tree produce the same result
| ∀vₛ, tₛ(vₛ) = 〚tₛ〛ₛ(vₛ)
Likewise, for the target language:
| ∀vₜ, tₜ(vₜ) = 〚tₜ〛ₜ(vₜ)
Definition: in the presence of guards we can say that two results are
equivalent modulo the guards queue, written /r₁ ≃gs r₂/, when:
| (gs₁, r₁) ≃gs (gs₂, r₂) ⇔ (gs₁, r₁) = (gs₂ ++ gs, r₂)
Definition: we say that Cₜ covers the input space /S/, written
/covers(Cₜ, S)/, when every value vₛ∈S is a valid input to the
decision tree Cₜ. (TODO: rephrase)
Theorem: Given an input space /S/ and a couple of decision trees, where
the target decision tree Cₜ covers the input space /S/, we say that
the two decision trees are equivalent when:
| equiv(S, Cₛ, Cₜ, gs) = Yes ∧ covers(Cₜ, S) → ∀vₛ≃vₜ ∈ S, Cₛ(vₛ) ≃gs Cₜ(vₜ)
Similarly we say that a couple of decision trees in the presence of
an input space /S/ are /not/ equivalent when:
| equiv(S, Cₛ, Cₜ, gs) = No(vₛ,vₜ) ∧ covers(Cₜ, S) → vₛ≃vₜ ∈ S ∧ Cₛ(vₛ) ≠gs Cₜ(vₜ)
Corollary: For a full input space /S/, that is the universe of the
target program we say:
| equiv(S, 〚tₛ〛ₛ, 〚tₜ〛ₜ, ∅) = Yes ⇔ tₛ ≃ tₜ
*** Proof of the correctness of the translation from source programs to source decision trees
We define a source term tₛ as a collection of patterns pointing to blackboxes
| tₛ ::= (p → bb)^{i∈I}
A pattern is defined as either a constructor pattern, an or pattern or
a constant pattern
| p ::= | K(pᵢ)ⁱ, i ∈ I | (p|q) | n ∈
A decision tree is defined as either a Leaf, a Failure terminal or
an intermediate node with different children sharing the same accessor /a/
and an optional fallback.
Failure is emitted only when the patterns don't cover the whole set of
possible input values /S/. The fallback is not needed when the user
doesn't use a wildcard pattern.
%%% Give example of thing
| Cₛ ::= Leaf bb | Node(a, (Kᵢ → Cᵢ)^{i∈S} , C?)
| a ::= Here | n.a
| vₛ ::= K(vᵢ)^{i∈I} | n ∈
\begin{comment}
Are K and Here clear here?
\end{comment}
We define the decomposition matrix /mₛ/ as
| SMatrix mₛ := (aⱼ)^{j∈J}, ((p_{ij})^{j∈J} → bbᵢ)^{i∈I}
\begin{comment}
Fix this taking the accessor into account
\end{comment}
We define the decision tree of source programs
〚tₛ〛
in terms of the decision tree of pattern matrices
〚mₛ〛
by the following:
〚((pᵢ → bbᵢ)^{i∈I}〛 := 〚(Root), (pᵢ → bbᵢ)^{i∈I} 〛
Decision trees computed from pattern matrices respect the following invariant:
| ∀v (vᵢ)^{i∈I} = v(aᵢ)^{i∈I} → 〚m〛(v) = m(vᵢ)^{i∈I} for m = ((aᵢ)^{i∈I}, (rᵢ)^{i∈I})
where
| v(Here) = v
| K(vᵢ)ⁱ(k.a) = vₖ(a) if k ∈ [0;n[
\begin{comment}
TODO: EXPLAIN
\end{comment}
We proceed to show the correctness of the invariant by a case
analysis.
Base cases:
1. [| ∅, (∅ → bbᵢ)ⁱ |] := Leaf bbᵢ where i := min(I), that is a
decision tree [|ms|] defined by an empty accessor and empty
patterns pointing to blackboxes /bbᵢ/. This respects the invariant
because a decomposition matrix in the case of empty rows returns
the first expression and we know that (Leaf bb)(v) := Match bb
2. [| (aⱼ)ʲ, ∅ |] := Failure
Regarding non base cases:
Let's first define
| let Idx(k) := [0; arity(k)[
| let First(∅) := ⊥
| let First((aⱼ)ʲ) := a_{min(j∈J≠∅)}
\[
m := ((a_i)^i ((p_{ij})^i \to e_j)^{ij})
\]
\[
(k_k)^k := headconstructor(p_{i0})^i
\]
\begin{equation}
Groups(m) := ( k_k \to ((a)_{0.l})^{l \in Idx(k_k)} +++ (a_i)^{i \in I\backslash \{0\} }), \\
( if p_{0j} is k(q_l) then \\
(qₗ)^{l \in Idx(k_k)} +++ (p_{ij})^{i \in I\backslash \{0\}} \to e_j \\
if p_{0j} is \_ then \\
(\_)^{l \in Idx(k_k)} +++ (p_{ij})^{i \in I\backslash \{0\}} \to e_j \\
else \bot )^j ), \\
((a_i)^{i \in I\backslash \{0\}}, ((p_{ij})^{i \in I\backslash \{0\}} \to eⱼ if p_{0j} is \_ else \bot)^{j \in J})
\end{equation}
Groups(m) is an auxiliary function that decomposes a matrix m into
submatrices, according to the head constructor of their first pattern.
Groups(m) returns one submatrix m_r for each head constructor k that
occurs on the first row of m, plus one "wildcard submatrix" m_{wild}
that matches on all values that do not start with one of those head
constructors.
Intuitively, m is equivalent to its decomposition in the following
sense: if the first pattern of an input vector (v_i)^i starts with one
of the head constructors k, then running (v_i)^i against m is the same
as running it against the submatrix m_k; otherwise (its head
constructor is none of the k) it is equivalent to running it against
the wildcard submatrix.
We formalize this intuition as follows:
Lemma (Groups):
Let \[m\] be a matrix with \[Groups(m) = (k_r \to m_r)^k, m_{wild}\].
For any value vector \[(v_i)^l\] such that \[v_0 = k(v'_l)^l\] for some
constructor k,
we have:
\[
if k = kₖ for some k then
m(vᵢ)ⁱ = mₖ((v'ₗ)ˡ +++ (vᵢ)^{i∈I\backslash \{0\}})
else
m(vᵢ)ⁱ = m_{wild}(vᵢ)^{i∈I\backslash \{0\}}
\]
*** Proof:
Let \[m\] be a matrix with \[Group(m) = (k_r \to m_r)^k, m_{wild}\].
Let \[(v_i)^i\] be an input matrix with \[v_0 = k(v'_l)^l\] for some k.
We proceed by case analysis:
- either k is one of the kₖ for some k
- or k is none of the (kₖ)ᵏ
Both m(vᵢ)ⁱ and mₖ(vₖ)ᵏ are defined as the first matching result of
a family over each row rⱼ of a matrix
We know, from the definition of
Groups(m), that mₖ is
\[
((a){0.l})^{l∈Idx(kₖ)} +++ (aᵢ)^{i∈I\backslash \{0\}}),
(
if p_{0j} is k(qₗ) then
(qₗ)ˡ +++ (p_{ij})^{i∈I\backslash \{0\}} → eⱼ
if p_{0j} is _ then
(_)ˡ +++ (p_{ij})^{i∈I\backslash \{0\}} → eⱼ
else ⊥
\]
By definition, m(vᵢ)ⁱ is
m(vᵢ)ⁱ = First(rⱼ(vᵢ)ⁱ)ʲ for m = ((aᵢ)ⁱ, (rⱼ)ʲ)
(pᵢ)ⁱ (vᵢ)ⁱ = {
if p₀ = k(qₗ)ˡ, v₀ = k'(v'ₖ)ᵏ, k=Idx(k') and l=Idx(k)
if k ≠ k' then ⊥
if k = k' then ((qₗ)ˡ +++ (pᵢ)^{i∈I\backslash \{0\}}) ((v'ₖ)ᵏ +++ (vᵢ)^{i∈I\backslash \{0\}})
if p₀ = (q₁|q₂) then
First( (q₁pᵢ^{i∈I \backslash \{0\}}) vᵢ^{i∈I \backslash \{0\}}, (q₂pᵢ^{i∈I \backslash \{0\}}) vᵢ^{i∈I \backslash \{0\}} )
}
For this reason, if we can prove that
| ∀j, rⱼ(vᵢ)ⁱ = r'ⱼ((v'ₖ)ᵏ ++ (vᵢ)ⁱ)
it follows that
| m(vᵢ)ⁱ = mₖ((v'ₖ)ᵏ ++ (vᵢ)ⁱ)
from the above definition.
We can also show that aᵢ = a_{0.l}ˡ +++ a_{i∈I\backslash \{0\}} because v(a₀) = K(v(a){0.l})ˡ)
** Proof of equivalence checking
\begin{comment}
TODO: put ^i∈I where needed
\end{comment}
\subsubsection{The trimming lemma}
The trimming lemma allows us to reduce the size of a decision tree given
an accessor → π relation (TODO: expand)
| ∀vₜ ∈ (a→π), Cₜ(vₜ) = C_{t/a→π(kᵢ)}(vₜ)
We prove this by induction on Cₜ:
a. Cₜ = Leaf_{bb}: when the decision tree is a leaf terminal, we
know that
| Leaf_{bb/a→π}(v) = Leaf_{bb}(v)
That means that the result of trimming on a Leaf is the Leaf itself
b. The same applies to Failure terminal
| Failure_{/a→π}(v) = Failure(v)
c. When Cₜ = Node(b, (π→Cᵢ)ⁱ)_{/a→π} then
we look at the accessor /a/ of the subtree Cᵢ and
we define πᵢ' = πᵢ if a≠b else πᵢ∩π
Trimming a switch node yields the following result:
| Node(b, (π→Cᵢ)ⁱ)_{/a→π} := Node(b, (π'ᵢ→C_{i/a→π})ⁱ)
\begin{comment}
Actually in the proof.org file I transcribed:
e. Unreachable → ⊥
This is not correct because you don't have Unreachable nodes in target decision trees
\end{comment}
For the trimming lemma we have to prove that running the value vₜ against
the decision tree Cₜ is the same as running vₜ against the tree
C_{trim} that is the result of the trimming operation on Cₜ
| Cₜ(vₜ) = C_{trim}(vₜ) = Node(b, (πᵢ'→C_{i/a→π})ⁱ)(vₜ)
We can reason by first noting that when vₜ∉(b→πᵢ)ⁱ the node must be a Failure node.
In the case where ∃k| vₜ∈(b→πₖ) then we can prove that
| C_{k/a→π}(vₜ) = Node(b, (πᵢ'→C_{i/a→π})ⁱ)(vₜ)
because when a ≠ b then πₖ'= πₖ and this means that vₜ∈πₖ'
while when a = b then πₖ'=(πₖ∩π) and vₜ∈πₖ' because:
- by the hypothesis, vₜ∈π
- we are in the case where vₜ∈πₖ
So vₜ ∈ πₖ' and by induction
| Cₖ(vₜ) = C_{k/a→π}(vₜ)
We also know that ∀vₜ∈(b→πₖ) → Cₜ(vₜ) = Cₖ(vₜ)
By putting together the last two steps, we have proven the trimming
lemma.
\begin{comment}
TODO: what should I say about covering??? I swap π and π'
Covering lemma:
∀a,π covers(Cₜ,S) → covers(C_{t/a→π}, (S∩a→π))
Uᵢπⁱ ≈ Uᵢπ'∩(a→π) ≈ (Uᵢπ')∩(a→π) %%
%%%%%%% Also: Should I swap π and π' ?
\end{comment}
\subsubsection{Equivalence checking}
The equivalence checking algorithm takes as parameters an input space
/S/, a source decision tree /Cₛ/ and a target decision tree /Cₜ/:
| equiv(S, Cₛ, Cₜ) → Yes | No(vₛ, vₜ)
When the algorithm returns Yes and the input space is covered by /Cₛ/
we can say that the couple of decision trees are the same for
every couple of source value /vₛ/ and target value /vₜ/ that are equivalent.
\begin{comment}
Define "covered"
Is it correct to say the same? How to correctly distinguish in words ≃ and = ?
\end{comment}
| equiv(S, Cₛ, Cₜ) = Yes and cover(Cₜ, S) → ∀ vₛ ≃ vₜ∈S ∧ Cₛ(vₛ) = Cₜ(vₜ)
In the case where the algorithm returns No we have at least a couple
of counter example values vₛ and vₜ for which the two decision trees
output different results.
| equiv(S, Cₛ, Cₜ) = No(vₛ,vₜ) and cover(Cₜ, S) → ∀ vₛ ≃ vₜ∈S ∧ Cₛ(vₛ) ≠ Cₜ(vₜ)
We define the following
| Forall(Yes) = Yes
| Forall(Yes::l) = Forall(l)
| Forall(No(vₛ,vₜ)::_) = No(vₛ,vₜ)
The following functions exist and are injective:
| int(k) ∈ (arity(k) = 0)
| tag(k) ∈ (arity(k) > 0)
| π(k) = {n\vert int(k) = n} x {n\vert tag(k) = n}
where k is a constructor.
\begin{comment}
TODO: explain:
∀v∈a→π, C_{/a→π}(v) = C(v)
\end{comment}
We proceed by case analysis:
\begin{comment}
I start numbering from zero to leave the numbers as they were on the blackboard, where we skipped some things
I think the unreachable case should go at the end.
\end{comment}
0. in case of unreachable:
| Cₛ(vₛ) = Absurd(Unreachable) ≠ Cₜ(vₜ) ∀vₛ,vₜ
1. In the case of an empty input space
| equiv(∅, Cₛ, Cₜ) := Yes
and that is trivial to prove because there is no pair of values (vₛ, vₜ) that could be
tested against the decision trees.
In the other subcases S is always non-empty.
2. When there are /Failure/ nodes at both sides the result is /Yes/:
|equiv(S, Failure, Failure) := Yes
Given that ∀v, Failure(v) = Failure, the statement holds.
3. When we have a Leaf or a Failure at the left side:
| equiv(S, Failure as Cₛ, Node(a, (πᵢ → Cₜᵢ)ⁱ)) := Forall(equiv( S∩a→π(kᵢ)), Cₛ, Cₜᵢ)ⁱ)
| equiv(S, Leaf bbₛ as Cₛ, Node(a, (πᵢ → Cₜᵢ)ⁱ)) := Forall(equiv( S∩a→π(kᵢ)), Cₛ, Cₜᵢ)ⁱ)
The algorithm either returns Yes for every sub-input space Sᵢ := S∩(a→π(kᵢ)) and
subtree Cₜᵢ
| equiv(Sᵢ, Cₛ, Cₜᵢ) = Yes ∀i
or we have a counter example vₛ, vₜ for which
| vₛ≃vₜ∈Sₖ ∧ Cₛ(vₛ) ≠ Cₜₖ(vₜ)
then because
| vₜ∈(a→πₖ) → Cₜ(vₜ) = Cₜₖ(vₜ) ,
| vₛ≃vₜ∈S ∧ Cₛ(vₛ)≠Cₜ(vₜ)
we can say that
| equiv(Sₖ, Cₛ, Cₜₖ) = No(vₛ, vₜ) for some minimal k∈I
4. When we have a Node on the right we define πₙ as the domain of
values not covered by the union of the constructors kᵢ
| πₙ = ¬(⋃π(kᵢ)ⁱ)
The algorithm proceeds by trimming
| equiv(S, Node(a, (kᵢ → Cₛᵢ)ⁱ, C_{sf}), Cₜ) :=
| Forall(equiv( S∩(a→π(kᵢ)ⁱ), Cₛᵢ, C_{t/a→π(kᵢ)})ⁱ +++ equiv(S∩(a→π(kᵢ)), Cₛ, C_{a→πₙ}))
The statement still holds and we show this by first analyzing the
/Yes/ case:
| Forall(equiv( S∩(a→π(kᵢ)ⁱ), Cₛᵢ, C_{t/a→π(kᵢ)})ⁱ = Yes
The constructor k is either included in the set of constructors kᵢ:
| k \vert k∈(kᵢ)ⁱ ∧ Cₛ(vₛ) = Cₛᵢ(vₛ)
We also know that
| (1) Cₛᵢ(vₛ) = C_{t/a→πᵢ}(vₜ)
| (2) C_{T/a→πᵢ}(vₜ) = Cₜ(vₜ)
(1) is true by induction and (2) is a consequence of the trimming lemma.
Putting everything together:
| Cₛ(vₛ) = Cₛᵢ(vₛ) = C_{T/a→πᵢ}(vₜ) = Cₜ(vₜ)
When the k∉(kᵢ)ⁱ [TODO]
The auxiliary Forall function returns /No(vₛ, vₜ)/ when, for a minimum k,
| equiv(Sₖ, Cₛₖ, C_{T/a→πₖ} = No(vₛ, vₜ)
Then we can say that
| Cₛₖ(vₛ) ≠ C_{t/a→πₖ}(vₜ)
that is enough for proving that
| Cₛₖ(vₛ) ≠ (C_{t/a→πₖ}(vₜ) = Cₜ(vₜ))

Binary file not shown.

File diff suppressed because it is too large


@@ -0,0 +1,701 @@
#+LANGUAGE: en
#+LaTeX_CLASS: article
#+LaTeX_HEADER: \linespread{1.25}
#+LaTeX_HEADER: \usepackage{algorithm}
#+LaTeX_HEADER: \usepackage{comment}
#+LaTeX_HEADER: \usepackage{algpseudocode}
#+LaTeX_HEADER: \usepackage{amsmath,amssymb,amsthm}
#+LaTeX_HEADER: \newtheorem{definition}{Definition}
#+LaTeX_HEADER: \usepackage{mathpartir}
#+LaTeX_HEADER: \usepackage{graphicx}
#+LaTeX_HEADER: \usepackage{listings}
#+LaTeX_HEADER: \usepackage{color}
#+LaTeX_HEADER: \usepackage{stmaryrd}
#+LaTeX_HEADER: \newcommand{\semTEX}[1]{{\llbracket{#1}\rrbracket}}
#+LaTeX_HEADER: \newcommand{\EquivTEX}[3]{\mathsf{equiv}(#1, #2, #3)} % \equiv is already taken
#+LaTeX_HEADER: \newcommand{\coversTEX}[2]{#1 \mathrel{\mathsf{covers}} #2}
#+LaTeX_HEADER: \newcommand{\YesTEX}{\mathsf{Yes}}
#+LaTeX_HEADER: \newcommand{\DZ}{\backslash\text{\{0\}}}
#+LaTeX_HEADER: \newcommand{\NoTEX}[2]{\mathsf{No}(#1, #2)}
#+LaTeX_HEADER: \usepackage{comment}
#+LaTeX_HEADER: \usepackage{mathpartir}
#+LaTeX_HEADER: \usepackage{stmaryrd} % llbracket, rrbracket
#+LaTeX_HEADER: \usepackage{listings}
#+LaTeX_HEADER: \usepackage{notations}
#+LaTeX_HEADER: \lstset{
#+LaTeX_HEADER: mathescape=true,
#+LaTeX_HEADER: language=[Objective]{Caml},
#+LaTeX_HEADER: basicstyle=\ttfamily,
#+LaTeX_HEADER: extendedchars=true,
#+LaTeX_HEADER: showstringspaces=false,
#+LaTeX_HEADER: aboveskip=\smallskipamount,
#+LaTeX_HEADER: % belowskip=\smallskipamount,
#+LaTeX_HEADER: columns=fullflexible,
#+LaTeX_HEADER: moredelim=**[is][\color{blue}]{/*}{*/},
#+LaTeX_HEADER: moredelim=**[is][\color{green!60!black}]{/!}{!/},
#+LaTeX_HEADER: moredelim=**[is][\color{orange}]{/(}{)/},
#+LaTeX_HEADER: moredelim=[is][\color{red}]{/[}{]/},
#+LaTeX_HEADER: xleftmargin=1em,
#+LaTeX_HEADER: }
#+LaTeX_HEADER: \lstset{aboveskip=0.4ex,belowskip=0.4ex}
#+EXPORT_SELECT_TAGS: export
#+EXPORT_EXCLUDE_TAGS: noexport
#+OPTIONS: H:2 toc:nil \n:nil @:t ::t |:t ^:{} _:{} *:t TeX:t LaTeX:t
#+STARTUP: showall
* Translation validation of the Pattern Matching Compiler
** Source program
The algorithm takes as its input a source program and translates it
into an algebraic data structure whose type we call /decision_tree/.
#+BEGIN_SRC
type decision_tree =
| Unreachable
| Failure
| Leaf of source_expr
| Guard of source_blackbox * decision_tree * decision_tree
| Switch of accessor * (constructor * decision_tree) list * decision_tree
#+END_SRC
Unreachable, Leaf of source_expr and Failure are the terminals of the tree.
We distinguish
- Unreachable: statically it is known that no value can go there
- Failure: a value matching this part results in an error
- Leaf: a value matching this part results in the evaluation of a
source black box of code
The algorithm doesn't support type-declaration-based analysis
to know the list of constructors at a given type.
Let's consider some trivial examples:
#+BEGIN_SRC
function true -> 1
#+END_SRC
is translated to
|Switch ([(true, Leaf 1)], Failure)
while
#+BEGIN_SRC
function
true -> 1
| false -> 2
#+END_SRC
will be translated to
|Switch ([(true, Leaf 1); (false, Leaf 2)])
It is possible to produce Unreachable examples by using
refutation clauses (a "dot" in the right-hand-side)
#+BEGIN_SRC
function
true -> 1
| false -> 2
| _ -> .
#+END_SRC
that gets translated into
Switch ([(true, Leaf 1); (false, Leaf 2)], Unreachable)
We trust this annotation, which is reasonable as the type-checker
verifies that it indeed holds.
Guard nodes of the tree are emitted whenever a guard is found. A Guard
node contains a blackbox of code that is never evaluated and two
branches, one that is taken when the guard evaluates to true and the
other that is taken when the guard evaluates to false.
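To make the shape of these trees concrete, here is a minimal OCaml sketch that
encodes the boolean examples above as values of the decision_tree type. The
placeholder types (source_expr, source_blackbox, accessor, constructor), the
explicit Here accessor and the guard name "g" are assumptions of this
illustration, not the prototype's actual definitions.
#+BEGIN_SRC
(* Minimal sketch: placeholder types stand in for the prototype's own
   definitions so that the fragment is self-contained; all names here
   are assumptions for illustration. *)
type source_expr = Expr of int      (* stands for a right-hand side such as "1" *)
type source_blackbox = string       (* an opaque guard expression *)
type accessor = Here | Field of int * accessor
type constructor = Bool of bool

type decision_tree =
  | Unreachable
  | Failure
  | Leaf of source_expr
  | Guard of source_blackbox * decision_tree * decision_tree
  | Switch of accessor * (constructor * decision_tree) list * decision_tree

(* [function true -> 1]: the false case is not covered, so the fallback is Failure. *)
let example1 = Switch (Here, [ (Bool true, Leaf (Expr 1)) ], Failure)

(* [function true -> 1 | false -> 2 | _ -> .]: the refutation clause becomes Unreachable. *)
let example2 =
  Switch (Here, [ (Bool true, Leaf (Expr 1)); (Bool false, Leaf (Expr 2)) ], Unreachable)

(* [function true when g -> 1 | _ -> 2]: a Guard node with one branch for each
   outcome of the (never evaluated) guard g. *)
let example3 =
  Switch (Here, [ (Bool true, Guard ("g", Leaf (Expr 1), Leaf (Expr 2))) ], Leaf (Expr 2))
#+END_SRC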
\begin{comment}
[ ] Finish the Switch part
[ ] Explain the fallback
[ ] Revise the compilation to take into account your tree instead of the Lambda code
\end{comment}
The source code of a pattern matching function has the
following form:
|match variable with
|\vert pattern₁ \to expr₁
|\vert pattern₂ when guard \to expr₂
|\vert pattern₃ as var \to expr₃
|⋮
|\vert pₙ \to exprₙ
Patterns may or may not be exhaustive.
Pattern matching code could also be written using the more compact form:
|function
|\vert pattern₁ \to expr₁
|\vert pattern₂ when guard \to expr₂
|\vert pattern₃ as var \to expr₃
|⋮
|\vert pₙ \to exprₙ
This BNF grammar describes formally the grammar of the source program:
| start ::= "match" id "with" patterns \vert{} "function" patterns
| patterns ::= (pattern0\vert{}pattern1) pattern1+
| ;; pattern0 and pattern1 are needed to distinguish the first case in which
| ;; we can avoid writing the optional vertical line
| pattern0 ::= clause
| pattern1 ::= "\vert" clause
| clause ::= lexpr "->" rexpr
| lexpr ::= rule (ε\vert{}condition)
| rexpr ::= _code ;; arbitrary code
| rule ::= wildcard\vert{}variable\vert{}constructor_pattern\vert{}or_pattern ;;
| ;; rules
| wildcard ::= "_"
| variable ::= identifier
| constructor_pattern ::= constructor (rule\vert{}ε) (assignment\vert{}ε)
| constructor ::= int\vert{}float\vert{}char\vert{}string\vert{}bool \vert{}unit\vert{}record\vert{}exn\vert{}objects\vert{}ref \vert{}list\vert{}tuple\vert{}array\vert{}variant\vert{}parameterized_variant ;; data types
| or_pattern ::= rule ("\vert{}" wildcard\vert{}variable\vert{}constructor_pattern)+
| condition ::= "when" bexpr
| assignment ::= "as" id
| bexpr ::= _code ;; arbitrary code
A source program tₛ is a collection of pattern clauses pointing to
/bb/ terms. Running a program tₛ against an input value vₛ produces as
a result /r/:
| tₛ ::= (p → bb)^{i∈I}
| p ::= \vert K(pᵢ)ⁱ, i ∈ I \vert (p\vert{}q) \vert n ∈
| r ::= guard list * (Match bb \vert{} NoMatch \vert{} Absurd)
| tₛ(vₛ) → r
TODO: argument on what it means to run a source program
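For concreteness, the result type described above can be transcribed into OCaml
roughly as follows; the names are illustrative assumptions rather than the
prototype's identifiers.
#+BEGIN_SRC
(* Sketch of the result of running a source program against a value: the list of
   guards encountered along the way, paired with the matching outcome. *)
type blackbox = string        (* hypothetical opaque code fragment *)
type guard = blackbox

type match_outcome =
  | Match of blackbox         (* a clause matched; its right-hand side is selected *)
  | NoMatch                   (* no clause matched the input value *)
  | Absurd                    (* an unreachable branch was reached *)

type result = guard list * match_outcome
#+END_SRC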
/guard/ and /bb/ expressions are treated as blackboxes of OCaml code.
A sound approach for treating these blackboxes would be to inspect the
OCaml compiler during translation to Lambda code and extract the
blackboxes compiled in their Lambda representation.
This would allow us to test for equality with the respective blackbox at
the target level.
Given that this level of introspection is currently not possible, we
decided to restrict the structure of blackboxes to the following (valid) OCaml
code:
#+BEGIN_SRC
external guard : 'a -> 'b = "guard"
external observe : 'a -> 'b = "observe"
#+END_SRC
We assume these two external functions /guard/ and /observe/ with a valid
type that lets the user pass any number of arguments to them.
All the guards are of the form \texttt{guard <arg> <arg> <arg>}, where the
<arg> are expressed using the OCaml pattern matching language.
Similarly, all the right-hand-side expressions are of the form
\texttt{observe <arg> <arg> ...} with the same constraints on arguments.
#+BEGIN_SRC
type t = K1 | K2 of t (* declaration of an algebraic and recursive datatype t *)
let _ = function
| K1 -> observe 0
| K2 K1 -> observe 1
| K2 x when guard x -> observe 2
| K2 (K2 x) as y when guard x y -> observe 3
| K2 _ -> observe 4
#+END_SRC
In our prototype we make use of accessors to encode stored values.
\begin{minipage}{0.2\linewidth}
\begin{verbatim}
let value = 1 :: 2 :: 3 :: []
(* that can also be written *)
let value = []
|> List.cons 3
|> List.cons 2
|> List.cons 1
\end{verbatim}
\end{minipage}
\hfill
\begin{minipage}{0.5\linewidth}
\begin{verbatim}
(field 0 x) = 1
(field 0 (field 1 x)) = 2
(field 0 (field 1 (field 1 x))) = 3
(field 0 (field 1 (field 1 (field 1 x)))) = []
\end{verbatim}
\end{minipage}
An \emph{accessor} /a/ represents the
access path to a value that can be reached by deconstructing the
scrutinee.
| a ::= Here \vert n.a
The above example, in encoded form:
\begin{verbatim}
Here = 1
1.Here = 2
1.1.Here = 3
1.1.1.Here = []
\end{verbatim}
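A small OCaml sketch of how such accessors can be represented and evaluated
against a structured value follows; the value type used here is an assumption
made for the example, not the prototype's internal representation.
#+BEGIN_SRC
(* An access path: either the scrutinee itself, or field n of the value reached
   by the rest of the path. *)
type accessor = Here | Field of int * accessor

(* A toy structured value: an integer constant or a block with numbered fields. *)
type value = Int of int | Block of value array

(* Follow an accessor inside a value; None when the path does not exist. *)
let rec access (v : value) (a : accessor) : value option =
  match a, v with
  | Here, _ -> Some v
  | Field (n, rest), Block fields when n >= 0 && n < Array.length fields ->
      access fields.(n) rest
  | Field _, _ -> None

(* 1 :: 2 :: 3 :: [] encoded as nested blocks: field 0 is the head, field 1 the tail. *)
let example =
  Block [| Int 1; Block [| Int 2; Block [| Int 3; Block [||] |] |] |]

(* access example (Field (0, Here))            = Some (Int 1)
   access example (Field (1, Field (0, Here))) = Some (Int 2) *)
#+END_SRC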
In our prototype the source matrix mₛ is defined as follows
| SMatrix mₛ := (aⱼ)^{j∈J}, ((p_{ij})^{j∈J} → bbᵢ)^{i∈I}
Source matrices are used to build source decision trees Cₛ.
A decision tree is defined as either a Leaf, a Failure terminal or
an intermediate node with different children sharing the same accessor /a/
and an optional fallback.
Failure is emitted only when the patterns don't cover the whole set of
possible input values /S/. The fallback is not needed when the user
doesn't use a wildcard pattern.
%%% Give example of thing
| Cₛ ::= Leaf bb \vert Switch(a, (Kᵢ → Cᵢ)^{i∈S} , C?) \vert Failure \vert Unreachable
| vₛ ::= K(vᵢ)^{i∈I} \vert n ∈
\begin{comment}
Are K and Here clear here?
\end{comment}
We say that a translation of a source program to a decision tree
is correct when for every possible input, the source program and its
respective decision tree produce the same result
| ∀vₛ, tₛ(vₛ) = 〚tₛ〛ₛ(vₛ)
We define the decision tree of source programs
〚tₛ〛
in terms of the decision tree of pattern matrices
〚mₛ〛
by the following:
| 〚(pᵢ → bbᵢ)^{i∈I}〛 := 〚(Here), (pᵢ → bbᵢ)^{i∈I}〛
Decision trees computed from pattern matrices respect the following invariant:
| ∀v (vᵢ)^{i∈I} = v(aᵢ)^{i∈I} → 〚m〛(v) = m(vᵢ)^{i∈I} for m = ((aᵢ)^{i∈I}, (rᵢ)^{i∈I})
| v(Here) = v
| K(vᵢ)ⁱ(k.a) = vₖ(a) if k ∈ [0;n[
\begin{comment}
TODO: EXPLAIN
\end{comment}
We proceed to show the correctness of the invariant by a case
analysis.
Base cases:
1. [| ∅, (∅ → bbᵢ)ⁱ |] ≡ Leaf bbᵢ where i := min(I), that is a
decision tree [|ms|] defined by an empty accessor and empty
patterns pointing to blackboxes /bbᵢ/. This respects the invariant
because a source matrix in the case of empty rows returns
the first expression and (Leaf bb)(v) := Match bb
2. [| (aⱼ)ʲ, ∅ |] ≡ Failure
Regarding non base cases:
Let's first define
| let Idx(k) := [0; arity(k)[
| let First(∅) := ⊥
| let First((aⱼ)ʲ) := a_{min(j∈J≠∅)}
\[
m := ((a_i)^i ((p_{ij})^i \to e_j)^{ij})
\]
\[
(k_k)^k := headconstructor(p_{i0})^i
\]
\begin{equation}
Groups(m) := ( k_k \to ((a)_{0.l})^{l \in Idx(k_k)} +++ (a_i)^{i \in I\DZ}), \\
( if p_{0j} is k(q_l) then \\
(qₗ)^{l \in Idx(k_k)} +++ (p_{ij})^{i \in I\DZ} \to e_j \\
if p_{0j} is \_ then \\
(\_)^{l \in Idx(k_k)} +++ (p_{ij})^{i \in I\DZ} \to e_j \\
else \bot )^j ), \\
((a_i)^{i \in I\DZ}, ((p_{ij})^{i \in I\DZ} \to eⱼ if p_{0j} is \_ else \bot)^{j \in J})
\end{equation}
Groups(m) is an auxiliary function that decomposes a matrix m into
submatrices, according to the head constructor of their first pattern.
Groups(m) returns one submatrix m_r for each head constructor k that
occurs on the first row of m, plus one "wildcard submatrix" m_{wild}
that matches on all values that do not start with one of those head
constructors.
Intuitively, m is equivalent to its decomposition in the following
sense: if the first pattern of an input vector (vᵢ)ⁱ starts with one
of the head constructors k, then running (vᵢ)ⁱ against m is the same
as running it against the submatrix mₖ; otherwise (its head
constructor is none of the k) it is equivalent to running it against
the wildcard submatrix.
We formalize this intuition as follows:
Lemma (Groups):
Let \[m\] be a matrix with \[Groups(m) = (k_r \to m_r)^k, m_{wild}\].
For any value vector $(v_i)^l$ such that $v_0 = k(v'_l)^l$ for some
constructor k,
we have:
| if k = kₖ \text{ for some k then}
| \quad m(vᵢ)ⁱ = mₖ((v_{l}')ˡ +++ (v_{i})^{i∈I\DZ})
| \text{else}
| \quad m(vᵢ)ⁱ = m_{wild}(vᵢ)^{i∈I\DZ}
\begin{comment}
TODO: fix \{0}
\end{comment}
*** Proof:
Let $m$ be a matrix with $Groups(m) = (k_r \to m_r)^k, m_{wild}$.
Let $(v_i)^i$ be an input vector with $v_0 = k(v'_l)^l$ for some k.
We proceed by case analysis:
- either k is one of the kₖ for some k
- or k is none of the (kₖ)ᵏ
Both m(vᵢ)ⁱ and mₖ(vₖ)ᵏ are defined as the first matching result of
a family over each row rⱼ of a matrix
We know, from the definition of
Groups(m), that mₖ is
| ((a){0.l})^{l∈Idx(kₖ)} +++ (aᵢ)^{i∈I\DZ}),
| (
| \quad if p_{0j} is k(qₗ) then
| \quad \quad (qₗ)ˡ +++ (p_{ij})^{i∈I\DZ } → eⱼ
| \quad if p_{0j} is _ then
| \quad \quad (_)ˡ +++ (p_{ij})^{i∈I\DZ} → eⱼ
| \quad else ⊥
| )^{j∈J}
By definition, m(vᵢ)ⁱ is
| m(vᵢ)ⁱ = First(rⱼ(vᵢ)ⁱ)ʲ for m = ((aᵢ)ⁱ, (rⱼ)ʲ)
| (pᵢ)ⁱ (vᵢ)ⁱ = {
| \quad if p₀ = k(qₗ)ˡ, v₀ = k'(v'ₖ)ᵏ, k=Idx(k') and l=Idx(k)
| \quad \quad if k ≠ k' then ⊥
| if k = k' then ((qₗ)ˡ +++ (pᵢ)^{i∈I\DZ}) ((v'ₖ)ᵏ +++ (vᵢ)^{i∈I\DZ} )
| if p₀ = (q₁\vert{}q₂) then
| First( (q₁pᵢ^{i∈I\DZ}) vᵢ^{i∈I\DZ}, (q₂pᵢ^{i∈I \DZ}) vᵢ^{i∈I\DZ})}
For this reason, if we can prove that
| ∀j, rⱼ(vᵢ)ⁱ = r'ⱼ((v'ₖ)ᵏ ++ (vᵢ)ⁱ)
it follows that
| m(vᵢ)ⁱ = mₖ((v'ₖ)ᵏ ++ (vᵢ)ⁱ)
from the above definition.
We can also show that aᵢ = (a_{0.l})ˡ +++ a_{i∈I\DZ} because v(a₀) = K(v(a){0.l})ˡ)
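Before moving to the target language, the Groups decomposition used above can be
sketched in OCaml on a simplified pattern language (constructor patterns and
wildcards only, no accessors or or-patterns); all names are illustrative
assumptions rather than the prototype's code.
#+BEGIN_SRC
(* Sketch of the Groups(m) decomposition on a simplified pattern language:
   constructor patterns and wildcards only. Rows map a vector of patterns
   to an (abstract) action. *)
type pattern = Wildcard | K of string * pattern list
type 'a row = pattern list * 'a
type 'a matrix = 'a row list

(* Specialize the matrix for head constructor [k] of arity [ar]:
   - K (k, qs)  : the head pattern is replaced by its sub-patterns qs;
   - Wildcard   : the head pattern is replaced by [ar] wildcards;
   - another K  : the row cannot match and is dropped (the ⊥ case above). *)
let specialize (k : string) (ar : int) (m : 'a matrix) : 'a matrix =
  List.filter_map
    (fun (ps, act) ->
      match ps with
      | K (k', qs) :: rest when k' = k -> Some (qs @ rest, act)
      | Wildcard :: rest -> Some (List.init ar (fun _ -> Wildcard) @ rest, act)
      | _ -> None)
    m

(* The wildcard submatrix keeps only the rows whose first pattern is a wildcard. *)
let default_matrix (m : 'a matrix) : 'a matrix =
  List.filter_map
    (fun (ps, act) ->
      match ps with Wildcard :: rest -> Some (rest, act) | _ -> None)
    m

(* One specialized submatrix per head constructor occurring in the first column,
   plus the wildcard submatrix; [arity] gives the arity of each constructor. *)
let groups (arity : string -> int) (m : 'a matrix) =
  let heads =
    List.sort_uniq compare
      (List.filter_map
         (fun (ps, _) -> match ps with K (k, _) :: _ -> Some k | _ -> None)
         m)
  in
  (List.map (fun k -> (k, specialize k (arity k) m)) heads, default_matrix m)
#+END_SRC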
** Target translation
The target program, which has the following general form, is parsed using a parser
generated by Menhir, an LR(1) parser generator for the OCaml programming language.
Menhir compiles an LR(1) grammar specification, in this case a subset
of the Lambda intermediate language, down to OCaml code.
This is the grammar of the target language (TODO: check menhir grammar)
| start ::= sexpr
| sexpr ::= variable \vert{} string \vert{} "(" special_form ")"
| string ::= "\"" identifier "\"" ;; string between doublequotes
| variable ::= identifier
| special_form ::= let\vert{}catch\vert{}if\vert{}switch\vert{}switch-star\vert{}field\vert{}apply\vert{}isout
| let ::= "let" assignment sexpr ;; (assignment sexpr)+ outside of pattern match code
| assignment ::= "function" variable variable+ ;; the first variable is the identifier of the function
| field ::= "field" digit variable
| apply ::= ocaml_lambda_code ;; arbitrary code
| catch ::= "catch" sexpr with sexpr
| with ::= "with" "(" label ")"
| exit ::= "exit" label
| switch-star ::= "switch*" variable case*
| switch::= "switch" variable case* "default:" sexpr
| case ::= "case" casevar ":" sexpr
| casevar ::= ("tag"\vert{}"int") integer
| if ::= "if" bexpr sexpr sexpr
| bexpr ::= "(" ("!="\vert{}"=="\vert{}">="\vert{}"<="\vert{}">"\vert{}"<") sexpr digit \vert{} field ")"
| label ::= integer
The prototype doesn't support strings.
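One possible way to picture the parser's output is as an OCaml AST along the
following lines. This is only a sketch under the assumption that each special
form becomes one constructor; the actual types produced by the Menhir-generated
parser may differ.
#+BEGIN_SRC
(* A rough sketch of an AST for the target fragment above, with one constructor per
   special form relevant to pattern matching; names and payloads are assumptions of
   this illustration, not the prototype's actual parser types. *)
type case = Tag of int | IntCase of int            (* "tag n" or "int n" case labels *)

type bexpr = Cmp of string * sexpr * int           (* comparison operator, argument, digit *)

and sexpr =
  | Var of string
  | String of string
  | Field of int * string                          (* field n x *)
  | Apply of string                                (* arbitrary Lambda code, kept opaque *)
  | Catch of sexpr * int * sexpr                   (* catch e1 with (label) e2 *)
  | Exit of int                                    (* exit label *)
  | Switch of string * (case * sexpr) list * sexpr (* switch v, its cases, the default *)
  | SwitchStar of string * (case * sexpr) list     (* switch* v and its cases *)
  | If of bexpr * sexpr * sexpr
#+END_SRC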
The AST built by the parser is traversed and evaluated by the symbolic
execution engine.
Given that the target language supports jumps in the form of "catch - exit"
blocks, the engine tries to evaluate the instructions inside the blocks
and stores the result of the partial evaluation into a record.
When a jump is encountered, the information available at that point allows the
engine to finalize the evaluation of the jump block.
In the environment the engine also stores bindings to values and
functions.
Integer additions and subtractions are simplified in a second pass.
The result of the symbolic evaluation is a target decision tree Cₜ
| Cₜ ::= Leaf bb \vert Switch(a, (πᵢ → Cᵢ)^{i∈S} , C?) \vert Failure
| vₜ ::= Cell(tag ∈ , (vᵢ)^{i∈I}) \vert n ∈
Every branch of the decision tree is "constrained" by a domain
| Domain π = { n\vert{}n∈ x n\vert{}n∈Tag⊆ }
Intuitively, the π condition at every branch tells us the set of
possible values that can "flow" through that path.
π conditions are refined by the engine during the evaluation; at the
root of the decision tree the domain corresponds to the set of
possible values that the type of the function can hold.
C? is the fallback node of the tree that is taken whenever the value
at that point of the execution can't flow to any other subbranch.
Intuitively, the π_{fallback} condition of the C? fallback node is
| π_{fallback} = ¬\bigcup\limits_{i∈I}πᵢ
The fallback node can be omitted in the case where the domain of the
children nodes corresponds to the set of possible values pointed to by the
accessor at that point of the execution
| domain(vₛ(a)) = \bigcup\limits_{i∈I}πᵢ
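For concreteness, these definitions can be transcribed into OCaml types roughly
as follows; the concrete representation of blackboxes, accessors and domains is
an assumption of this sketch, not the prototype's implementation.
#+BEGIN_SRC
(* Sketch of target decision trees and branch domains; the set representation and
   the names below are assumptions of this example. *)
module IntSet = Set.Make (Int)

(* A domain constrains both the possible integer values and the possible block tags. *)
type domain = { ints : IntSet.t; tags : IntSet.t }

type accessor = Here | Field of int * accessor

type target_value =
  | Int of int                        (* an integer constant *)
  | Cell of int * target_value list   (* Cell(tag, (vᵢ)) *)

type target_tree =
  | Leaf of string                    (* target blackbox, kept opaque *)
  | Failure
  | Switch of accessor * (domain * target_tree) list * target_tree option
      (* one domain-constrained child per branch, plus the optional fallback C? *)
#+END_SRC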
We say that a translation of a target program to a decision tree
is correct when for every possible input, the target program and its
respective decision tree produce the same result
| ∀vₜ, tₜ(vₜ) = 〚tₜ〛ₜ(vₜ)
** Equivalence checking
The equivalence checking algorithm takes as input a domain of
possible values \emph{S} and a pair of source and target decision trees and
in case the two trees are not equivalent it returns a counter example.
The algorithm respects the following correctness statement:
\begin{comment}
TODO: we have to define what \coversTEX mean for readers to understand the specifications
(or we use a simplifying lie by hiding \coversTEX in the statements).
\end{comment}
\begin{align*}
\EquivTEX S {C_S} {C_T} \emptyqueue = \YesTEX \;\land\; \coversTEX {C_T} S
& \implies
\forall v_S \approx v_T \in S,\; C_S(v_S) = C_T(v_T)
\\
\EquivTEX S {C_S} {C_T} \emptyqueue = \NoTEX {v_S} {v_T} \;\land\; \coversTEX {C_T} S
& \implies
v_S \approx v_T \in S \;\land\; C_S(v_S) \neq C_T(v_T)
\end{align*}
Our equivalence-checking algorithm $\EquivTEX S {C_S} {C_T} G$ is
exactly a decision procedure for the provability of the judgment
$(\EquivTEX S {C_S} {C_T} G)$, defined below.
\begin{mathpar}
\begin{array}{l@{~}r@{~}l}
& & \text{\emph{constraint trees}} \\
C & \bnfeq & \Leaf t \\
& \bnfor & \Failure \\
& \bnfor & \Switch a {\Fam i {\pi_i, C_i}} \Cfb \\
& \bnfor & \Guard t {C_0} {C_1} \\
\end{array}
\begin{array}{l@{~}r@{~}l}
& & \text{\emph{boolean result}} \\
b & \in & \{ 0, 1 \} \\[0.5em]
& & \text{\emph{guard queues}} \\
G & \bnfeq & (t_1 = b_1), \dots, (t_n = b_n) \\
\end{array}
\begin{array}{l@{~}r@{~}l}
& & \text{\emph{input space}} \\
S & \subseteq & \{ (v_S, v_T) \mid \vrel {v_S} {v_T} \} \\
\end{array}
\\
\infer{ }
{\EquivTEX \emptyset {C_S} {C_T} G}
\infer{ }
{\EquivTEX S \Failure \Failure \emptyqueue}
\infer
{\trel {t_S} {t_T}}
{\EquivTEX S {\Leaf {t_S}} {\Leaf {t_T}} \emptyqueue}
\infer
{\forall i,\;
\EquivTEX
{(S \land a = K_i)}
{C_i} {\trim {C_T} {a = K_i}} G
\\
\EquivTEX
{(S \land a \notin \Fam i {K_i})}
\Cfb {\trim {C_T} {a \notin \Fam i {K_i}}} G
}
{\EquivTEX S
{\Switch a {\Fam i {K_i, C_i}} \Cfb} {C_T} G}
\begin{comment}
% TODO explain somewhere why the judgment is not symmetric:
% we avoid defining trimming on source trees, which would
% require more expressive conditions than just constructors
\end{comment}
\infer
{C_S \in {\Leaf t, \Failure}
\\
\forall i,\; \EquivTEX {(S \land a \in D_i)} {C_S} {C_i} G
\\
\EquivTEX {(S \land a \notin \Fam i {D_i})} {C_S} \Cfb G}
{\EquivTEX S
{C_S} {\Switch a {\Fam i {D_i} {C_i}} \Cfb} G}
\infer
{\EquivTEX S {C_0} {C_T} {G, (t_S = 0)}
\\
\EquivTEX S {C_1} {C_T} {G, (t_S = 1)}}
{\EquivTEX S
{\Guard {t_S} {C_0} {C_1}} {C_T} G}
\infer
{\trel {t_S} {t_T}
\\
\EquivTEX S {C_S} {C_b} G}
{\EquivTEX S
{C_S} {\Guard {t_T} {C_0} {C_1}} {(t_S = b), G}}
\end{mathpar}
Running a program tₛ or its translation 〚tₛ〛 against an input vₛ
produces as a result /r/ in the following way:
| ( 〚tₛ〛ₛ(vₛ) ≡ Cₛ(vₛ) ) → r
| tₛ(vₛ) → r
Likewise
| ( 〚tₜ〛ₜ(vₜ) ≡ Cₜ(vₜ) ) → r
| tₜ(vₜ) → r
| result r ::= guard list * (Match blackbox \vert{} NoMatch \vert{} Absurd)
| guard ::= blackbox.
Having defined equivalence between two inputs of which one is
expressed in the source language and the other in the target language,
vₛ ≃ vₜ, we can define the equivalence between a couple of programs or
a couple of decision trees
| tₛ ≃ tₜ := ∀vₛ≃vₜ, tₛ(vₛ) = tₜ(vₜ)
| Cₛ ≃ Cₜ := ∀vₛ≃vₜ, Cₛ(vₛ) = Cₜ(vₜ)
The result of the proposed equivalence algorithm is /Yes/ or /No(vₛ,
vₜ)/. In particular, in the negative case, vₛ and vₜ are a couple of
possible counter examples for which the decision trees produce a
different result.
In the presence of guards we can say that two results are
equivalent modulo the guards queue, written /r₁ ≃gs r₂/, when:
| (gs₁, r₁) ≃gs (gs₂, r₂) ⇔ (gs₁, r₁) = (gs₂ ++ gs, r₂)
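Under one reading of this definition, the check can be sketched in OCaml as
follows: the guard queue gs₂ must be a prefix of gs₁ and the final outcomes must
agree. Guards and outcomes are kept abstract; this is an illustration, not the
prototype's code.
#+BEGIN_SRC
(* Sketch: r₁ ≃gs r₂ holds when the first guard queue gs₁ extends the second
   one gs₂ (gs₁ = gs₂ ++ gs) and the final outcomes coincide. *)
let rec is_prefix (prefix : 'a list) (l : 'a list) : bool =
  match prefix, l with
  | [], _ -> true
  | x :: xs, y :: ys -> x = y && is_prefix xs ys
  | _ :: _, [] -> false

let equiv_modulo_guards ((gs1, r1) : 'g list * 'r) ((gs2, r2) : 'g list * 'r) : bool =
  is_prefix gs2 gs1 && r1 = r2
#+END_SRC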
We say that Cₜ covers the input space /S/, written
/covers(Cₜ, S)/ when every value vₛ∈S is a valid input to the
decision tree Cₜ. (TODO: rephrase)
Given an input space /S/ and a couple of decision trees, where
the target decision tree Cₜ covers the input space /S/, we say that
the two decision trees are equivalent when:
| equiv(S, Cₛ, Cₜ, gs) = Yes ∧ covers(Cₜ, S) → ∀vₛ≃vₜ ∈ S, Cₛ(vₛ) ≃gs Cₜ(vₜ)
Similarly we say that a couple of decision trees in the presence of
an input space /S/ are /not/ equivalent when:
| equiv(S, Cₛ, Cₜ, gs) = No(vₛ,vₜ) ∧ covers(Cₜ, S) → vₛ≃vₜ ∈ S ∧ Cₛ(vₛ) ≠gs Cₜ(vₜ)
Corollary: For a full input space /S/, that is the universe of the
target program we say:
| equiv(S, 〚tₛ〛ₛ, 〚tₜ〛ₜ, ∅) = Yes ⇔ tₛ ≃ tₜ
\begin{comment}
TODO: put ^i∈I where needed
\end{comment}
\subsubsection{The trimming lemma}
The trimming lemma allows us to reduce the size of a decision tree given
an accessor /a/ → π relation (TODO: expand)
| ∀vₜ ∈ (a→π), Cₜ(vₜ) = C_{t/a→π}(vₜ)
We prove this by induction on Cₜ:
- Cₜ = Leaf_{bb}: when the decision tree is a leaf terminal, the result of trimming on a Leaf is the Leaf itself
| Leaf_{bb/a→π}(v) = Leaf_{bb}(v)
- The same applies to Failure terminal
| Failure_{/a→π}(v) = Failure(v)
- When Cₜ = Switch(b, (π→Cᵢ)ⁱ)_{/a→π} then we look at the accessor
/a/ of the subtree Cᵢ and we define πᵢ' = πᵢ if a≠b else πᵢ∩π. Trimming
a switch node yields the following result:
| Switch(b, (π→Cᵢ)^{i∈I})_{/a→π} := Switch(b, (π'ᵢ→C_{i/a→π})^{i∈I})
\begin{comment}
TODO: understand how to properly align lists
check that every list is aligned
\end{comment}
For the trimming lemma we have to prove that running the value vₜ against
the decision tree Cₜ is the same as running vₜ against the tree
C_{trim} that is the result of the trimming operation on Cₜ
| Cₜ(vₜ) = C_{trim}(vₜ) = Switch(b, (πᵢ'→C_{i/a→π})^{i∈I})(vₜ)
We can reason by first noting that when vₜ∉(b→πᵢ)ⁱ the node must be a Failure node.
In the case where ∃k \vert{} vₜ∈(b→πₖ) then we can prove that
| C_{k/a→π}(vₜ) = Switch(b, (πᵢ'→C_{i/a→π})^{i∈I})(vₜ)
because when a ≠ b then πₖ'= πₖ and this means that vₜ∈πₖ'
while when a = b then πₖ'=(πₖ∩π) and vₜ∈πₖ' because:
- by the hypothesis, vₜ∈π
- we are in the case where vₜ∈πₖ
So vₜ ∈ πₖ' and by induction
| Cₖ(vₜ) = C_{k/a→π}(vₜ)
We also know that ∀vₜ∈(b→πₖ) → Cₜ(vₜ) = Cₖ(vₜ)
By putting together the last two steps, we have proven the trimming
lemma.
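The trimming operation itself can be sketched in OCaml over target-tree types
mirroring the ones introduced earlier (repeated here so the fragment stands
alone); all names are illustrative assumptions.
#+BEGIN_SRC
(* Sketch of the trimming operation C_{t/a→π}: every switch on the same accessor a
   has its branch domains intersected with π, and trimming is pushed into the
   children. *)
module IntSet = Set.Make (Int)

type domain = { ints : IntSet.t; tags : IntSet.t }
type accessor = Here | Field of int * accessor

type target_tree =
  | Leaf of string
  | Failure
  | Switch of accessor * (domain * target_tree) list * target_tree option

let inter (d1 : domain) (d2 : domain) : domain =
  { ints = IntSet.inter d1.ints d2.ints; tags = IntSet.inter d1.tags d2.tags }

(* trim t (a, pi): restrict t to the values whose accessor a lies in pi. *)
let rec trim (t : target_tree) ((a, pi) : accessor * domain) : target_tree =
  match t with
  | Leaf _ | Failure -> t            (* terminals are unchanged, as in the proof above *)
  | Switch (b, children, fallback) ->
      let children' =
        List.map
          (fun (d, child) ->
            let d' = if b = a then inter d pi else d in
            (d', trim child (a, pi)))
          children
      in
      Switch (b, children', Option.map (fun c -> trim c (a, pi)) fallback)
#+END_SRC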
\begin{comment}
TODO: what should I say about covering??? I swap π and π'
Covering lemma:
∀a,π covers(Cₜ,S) → covers(C_{t/a→π}, (S∩a→π))
Uᵢπⁱ ≈ Uᵢπ'∩(a→π) ≈ (Uᵢπ')∩(a→π) %%
%%%%%%% Also: Should I swap π and π' ?
\end{comment}
\subsubsection{Equivalence checking}
The equivalence checking algorithm takes as parameters an input space
/S/, a source decision tree /Cₛ/ and a target decision tree /Cₜ/:
| equiv(S, Cₛ, Cₜ) → Yes \vert{} No(vₛ, vₜ)
When the algorithm returns Yes and the input space is covered by /Cₛ/
we can say that the couple of decision trees are the same for
every couple of source value /vₛ/ and target value /vₜ/ that are equivalent.
\begin{comment}
Define "covered"
Is it correct to say the same? How to correctly distinguish in words ≃ and = ?
\end{comment}
| equiv(S, Cₛ, Cₜ) = Yes and cover(Cₜ, S) → ∀ vₛ ≃ vₜ∈S ∧ Cₛ(vₛ) = Cₜ(vₜ)
In the case where the algorithm returns No we have at least a couple
of counter example values vₛ and vₜ for which the two decision trees
output different results.
| equiv(S, Cₛ, Cₜ) = No(vₛ,vₜ) and cover(Cₜ, S) → ∀ vₛ ≃ vₜ∈S ∧ Cₛ(vₛ) ≠ Cₜ(vₜ)
We define the following
| Forall(Yes) = Yes
| Forall(Yes::l) = Forall(l)
| Forall(No(vₛ,vₜ)::_) = No(vₛ,vₜ)
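In OCaml this combinator looks roughly as follows; as assumptions of the sketch,
the empty list of sub-results is treated as Yes and the counter-example values
are kept abstract.
#+BEGIN_SRC
(* Sketch of the Forall combinator: the first No wins, otherwise the answer is Yes. *)
type ('vs, 'vt) answer = Yes | No of 'vs * 'vt

let rec forall (results : ('vs, 'vt) answer list) : ('vs, 'vt) answer =
  match results with
  | [] -> Yes
  | Yes :: rest -> forall rest
  | (No _ as counterexample) :: _ -> counterexample
#+END_SRC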
The following functions exist and are injective:
| int(k) ∈ (arity(k) = 0)
| tag(k) ∈ (arity(k) > 0)
| π(k) = {n\vert int(k) = n} x {n\vert tag(k) = n}
where k is a constructor.
\begin{comment}
TODO: explain:
∀v∈a→π, C_{/a→π}(v) = C(v)
\end{comment}
We proceed by case analysis:
\begin{comment}
I start numbering from zero to leave the numbers as they were on the blackboard, where we skipped some things
I think the unreachable case should go at the end.
\end{comment}
0. in case of unreachable:
| Cₛ(vₛ) = Absurd(Unreachable) ≠ Cₜ(vₜ) ∀vₛ,vₜ
1. In the case of an empty input space
| equiv(∅, Cₛ, Cₜ) := Yes
and that is trivial to prove because there is no pair of values (vₛ, vₜ) that could be
tested against the decision trees.
In the other subcases S is always non-empty.
2. When there are /Failure/ nodes at both sides the result is /Yes/:
|equiv(S, Failure, Failure) := Yes
Given that ∀v, Failure(v) = Failure, the statement holds.
3. When we have a Leaf or a Failure at the left side:
| equiv(S, Failure as Cₛ, Switch(a, (πᵢ → Cₜᵢ)^{i∈I})) := Forall((equiv(S∩(a→π(kᵢ)), Cₛ, Cₜᵢ))^{i∈I})
| equiv(S, Leaf bbₛ as Cₛ, Switch(a, (πᵢ → Cₜᵢ)^{i∈I})) := Forall((equiv(S∩(a→π(kᵢ)), Cₛ, Cₜᵢ))^{i∈I})
The algorithm either returns Yes for every sub-input space Sᵢ := S∩(a→π(kᵢ)) and
subtree Cₜᵢ
| equiv(Sᵢ, Cₛ, Cₜᵢ) = Yes ∀i
or we have a counter example vₛ, vₜ for which
| vₛ≃vₜ∈Sₖ ∧ Cₛ(vₛ) ≠ Cₜₖ(vₜ)
then because
| vₜ∈(a→πₖ) → Cₜ(vₜ) = Cₜₖ(vₜ) ,
| vₛ≃vₜ∈S ∧ Cₛ(vₛ)≠Cₜ(vₜ)
we can say that
| equiv(Sₖ, Cₛ, Cₜₖ) = No(vₛ, vₜ) for some minimal k∈I
4. When we have a Switch on the right we define π_{fallback} as the domain of
values not covered by the union of the constructors kᵢ
| π_{fallback} = ¬\bigcup\limits_{i∈I}π(kᵢ)
The algorithm proceeds by trimming
| equiv(S, Switch(a, (kᵢ → Cₛᵢ)^{i∈I}, C_{sf}), Cₜ) :=
| Forall((equiv(S∩(a→π(kᵢ)), Cₛᵢ, C_{t/a→π(kᵢ)}))^{i∈I} +++ equiv(S∩(a→π_{fallback}), C_{sf}, C_{t/a→π_{fallback}}))
The statement still holds and we show this by first analyzing the
/Yes/ case:
| Forall((equiv(S∩(a→π(kᵢ)), Cₛᵢ, C_{t/a→π(kᵢ)}))^{i∈I}) = Yes
The constructor k is either included in the set of constructors kᵢ:
| k \vert k∈(kᵢ)ⁱ ∧ Cₛ(vₛ) = Cₛᵢ(vₛ)
We also know that
| (1) Cₛᵢ(vₛ) = C_{t/a→πᵢ}(vₜ)
| (2) C_{T/a→πᵢ}(vₜ) = Cₜ(vₜ)
(1) is true by induction and (2) is a consequence of the trimming lemma.
Putting everything together:
| Cₛ(vₛ) = Cₛᵢ(vₛ) = C_{T/a→πᵢ}(vₜ) = Cₜ(vₜ)
When the k∉(kᵢ)ⁱ [TODO]
The auxiliary Forall function returns /No(vₛ, vₜ)/ when, for a minimum k,
| equiv(Sₖ, Cₛₖ, C_{t/a→πₖ}) = No(vₛ, vₜ)
Then we can say that
| Cₛₖ(vₛ) ≠ C_{t/a→πₖ}(vₜ)
that is enough for proving that
| Cₛₖ(vₛ) ≠ (C_{t/a→πₖ}(vₜ) = Cₜ(vₜ))

Binary file not shown.


@@ -1,7 +1,7 @@
\begin{comment}
TODO: not all todos are explicit. Check every comment section
TODO: ask Gabriel whether T and S are fine, but first check that they are coherent
* TODO Outline [1/5]
- [X] Introduction
- [-] Background [80%]
- [X] Low level representation
@@ -9,16 +9,14 @@ TODO: ask Gabriel whether T and S are fine, but first check that they are coherent
- [X] Pattern matching
- [X] Symbolic execution
- [ ] Translation Validation
- [-] Translation validation of the Pattern Matching Compiler
- [X] Source translation
- [X] Formal Grammar
- [X] Compilation of source patterns
- [ ] Target translation
- [ ] Formal Grammar
- [ ] Symbolic execution of the Lambda target
- [X] Equivalence between source and target
- [ ] Practical results
Maybe pattern matching first, then the compiler?
@@ -61,6 +59,7 @@ clause matrix
#+LaTeX_HEADER: \newcommand{\EquivTEX}[3]{\mathsf{equiv}(#1, #2, #3)} % \equiv is already taken
#+LaTeX_HEADER: \newcommand{\coversTEX}[2]{#1 \mathrel{\mathsf{covers}} #2}
#+LaTeX_HEADER: \newcommand{\YesTEX}{\mathsf{Yes}}
#+LaTeX_HEADER: \newcommand{\DZ}{\backslash\text{\{0\}}}
#+LaTeX_HEADER: \newcommand{\NoTEX}[2]{\mathsf{No}(#1, #2)}
#+LaTeX_HEADER: \usepackage{comment}
#+LaTeX_HEADER: \usepackage{mathpartir}
@@ -899,9 +898,7 @@ following form:
|⋮
|\vert pₙ \to exprₙ
Patterns may or may not be exhaustive.
Pattern matching code could also be written using the more compact form:
|function
@@ -934,87 +931,18 @@ This BNF grammar describes formally the grammar of the source program:
| assignment ::= "as" id
| bexpr ::= _code ;; arbitrary code
A source program tₛ is a collection of pattern clauses pointing to
/bb/ terms. Running a program tₛ against an input value vₛ produces as
a result /r/:
| tₛ ::= (p → bb)^{i∈I}
| p ::= \vert K(pᵢ)ⁱ, i ∈ I \vert (p\vert{}q) \vert n ∈
| r ::= guard list * (Match bb \vert{} NoMatch \vert{} Absurd)
| tₛ(vₛ) → r
Patterns are of the form
| pattern | type of pattern |
|-----------------+---------------------|
| _ | wildcard |
| x | variable |
| c(p₁,p₂,...,pₙ) | constructor pattern |
| (p₁\vert p₂) | or-pattern |
During compilation by the translators, expressions are compiled into
Lambda code and are referred to as lambda code actions lᵢ.
The entire pattern matching code is represented as a clause matrix
that associates rows of patterns (p_{i,1}, p_{i,2}, ..., p_{i,n}) to
lambda code action lⁱ
\begin{equation*}
(P → L) =
\begin{pmatrix}
p_{1,1} & p_{1,2} & \cdots & p_{1,n} & → l₁ \\
p_{2,1} & p_{2,2} & \cdots & p_{2,n} & → l₂ \\
\vdots & \vdots & \ddots & \vdots & → \vdots \\
p_{m,1} & p_{m,2} & \cdots & p_{m,n} & → lₘ
\end{pmatrix}
\end{equation*}
The pattern /p/ matches a value /v/, written as p ≼ v, when one of the
following rules apply
|--------------------+---+--------------------+-------------------------------------------|
| _ | ≼ | v | ∀v |
| x | ≼ | v | ∀v |
| (p₁ \vert p₂) | ≼ | v | iff p₁ ≼ v or p₂ ≼ v |
| c(p₁, p₂, ..., pₐ) | ≼ | c(v₁, v₂, ..., vₐ) | iff (p₁, p₂, ..., pₐ) ≼ (v₁, v₂, ..., vₐ) |
| (p₁, p₂, ..., pₐ) | ≼ | (v₁, v₂, ..., vₐ) | iff pᵢ ≼ vᵢ ∀i ∈ [1..a] |
|--------------------+---+--------------------+-------------------------------------------|
When a value /v/ matches pattern /p/ we say that /v/ is an /instance/ of /p/.
Considering the pattern matrix P we say that the value vector
$\vec{v}$ = (v₁, v₂, ..., vᵢ) matches the line number i in P if and only if the following two
conditions are satisfied:
- p_{i,1}, p_{i,2}, \cdots, p_{i,n} ≼ (v₁, v₂, ..., vᵢ)
- ∀j < i p_{j,1}, p_{j,2}, \cdots, p_{j,n} ⋠ (v₁, v₂, ..., vᵢ)
We can define the following three relations with respect to patterns:
- Pattern p is less precise than pattern q, written p ≼ q, when all
instances of q are instances of p
- Pattern p and q are equivalent, written p ≡ q, when their instances
are the same
- Patterns p and q are compatible when they share a common instance
\subsubsection{Parsing of the source program}
The source program of the following general form is parsed using a parser
generated by Menhir, an LR(1) parser generator for the OCaml programming language.
Menhir compiles an LR(1) grammar specification, in this case the OCaml pattern matching
grammar, down to OCaml code.
|match variable with
|\vert pattern₁ -> e₁
|\vert pattern₂ -> e₂
|⋮
|\vert pₘ -> eₘ
Parsing, when successful, produces a list of clauses
and a list of type declarations.
Every clause consists of three objects: a left-hand-side that is the
kind of pattern expressed, an optional guard and a right-hand-side expression.
Patterns are encoded in the following way:
| pattern | type |
|-----------------+-------------|
| _ | Wildcard |
| p₁ as x | Assignment |
| c(p₁,p₂,...,pₙ) | Constructor |
| (p₁\vert p₂) | Orpat |
Guards and right-hand-sides are treated as a blackbox of OCaml code.
A sound approach for treating these blackbox would be to inspect the
OCaml compiler during translation to Lambda code and extract the
blackboxes compiled in their Lambda representation.
This would allow us to test for equality with the respective blackbox at
@@ -1046,9 +974,74 @@ let _ = function
| K2 _ -> observe 4
#+END_SRC
The source program is parsed using the ocaml-compiler-libs library.
Parsing, when successful, produces a list of clauses
and a list of type declarations.
Every clause consists of three objects: a left-hand-side that is the
kind of pattern expressed, an optional guard and a right-hand-side expression.
Patterns are encoded in the following way:
| pattern | type |
|-----------------+-------------|
| _ | Wildcard |
| p₁ as x | Assignment |
| c(p₁,p₂,...,pₙ) | Constructor |
| (p₁\vert p₂) | Orpat |
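A sketch of the parsed clause representation described above follows; guards and
right-hand sides are kept opaque, and the names are illustrative assumptions
rather than the prototype's types.
#+BEGIN_SRC
(* Sketch of one parsed clause: an encoded pattern, an optional guard,
   and an opaque right-hand-side expression. *)
type parsed_pattern =
  | Wildcard                                        (* _ *)
  | Assignment of parsed_pattern * string           (* p₁ as x *)
  | Constructor of string * parsed_pattern list     (* c(p₁, ..., pₙ) *)
  | Orpat of parsed_pattern * parsed_pattern        (* (p₁ | p₂) *)

type clause = {
  lhs : parsed_pattern;        (* the pattern on the left-hand side *)
  guard : string option;       (* the optional "when" guard *)
  rhs : string;                (* the right-hand-side expression *)
}
#+END_SRC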
Once parsed, the type declarations and the list of clauses are encoded in the form of a matrix
that is later evaluated using a matrix decomposition algorithm.
Patterns are of the form
| pattern | type of pattern |
|-----------------+---------------------|
| _ | wildcard |
| x | variable |
| c(p₁,p₂,...,pₙ) | constructor pattern |
| (p₁\vert p₂) | or-pattern |
The pattern /p/ matches a value /v/, written as p ≼ v, when one of the
following rules apply
|--------------------+---+--------------------+-------------------------------------------|
| _ | ≼ | v | ∀v |
| x | ≼ | v | ∀v |
| (p₁ \vert p₂) | ≼ | v | iff p₁ ≼ v or p₂ ≼ v |
| c(p₁, p₂, ..., pₐ) | ≼ | c(v₁, v₂, ..., vₐ) | iff (p₁, p₂, ..., pₐ) ≼ (v₁, v₂, ..., vₐ) |
| (p₁, p₂, ..., pₐ) | ≼ | (v₁, v₂, ..., vₐ) | iff pᵢ ≼ vᵢ ∀i ∈ [1..a] |
|--------------------+---+--------------------+-------------------------------------------|
When a value /v/ matches pattern /p/ we say that /v/ is an /instance/ of /p/.
During compilation by the translators, expressions are compiled into
Lambda code and are referred to as lambda code actions lᵢ.
The entire pattern matching code is represented as a clause matrix
that associates rows of patterns (p_{i,1}, p_{i,2}, ..., p_{i,n}) to
lambda code action lⁱ
\begin{equation*}
(P → L) =
\begin{pmatrix}
p_{1,1} & p_{1,2} & \cdots & p_{1,n} & → l₁ \\
p_{2,1} & p_{2,2} & \cdots & p_{2,n} & → l₂ \\
\vdots & \vdots & \ddots & \vdots & → \vdots \\
p_{m,1} & p_{m,2} & \cdots & p_{m,n} & → lₘ
\end{pmatrix}
\end{equation*}
Considering the pattern matrix P we say that the value vector
$\vec{v}$ = (v₁, v₂, ..., vᵢ) matches the line number i in P if and only if the following two
conditions are satisfied:
- p_{i,1}, p_{i,2}, \cdots, p_{i,n} ≼ (v₁, v₂, ..., vᵢ)
- ∀j < i p_{j,1}, p_{j,2}, \cdots, p_{j,n} ⋠ (v₁, v₂, ..., vᵢ)
We can define the following three relations with respect to patterns:
- Pattern p is less precise than pattern q, written p ≼ q, when all
instances of q are instances of p
- Pattern p and q are equivalent, written p ≡ q, when their instances
are the same
- Patterns p and q are compatible when they share a common instance
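As a small illustration, the ≼ relation from the table above can be transcribed
directly into OCaml; this is a sketch on a simplified pattern language where
tuple patterns are folded into constructor patterns, and all names are
assumptions of the example.
#+BEGIN_SRC
(* Sketch of the instance relation p ≼ v: constants are constructors with
   no arguments. *)
type pattern =
  | Wildcard
  | Var of string
  | Constr of string * pattern list        (* c(p₁, ..., pₐ) *)
  | Or of pattern * pattern                (* (p₁ | p₂) *)

type value = Value of string * value list  (* c(v₁, ..., vₐ) *)

let rec matches (p : pattern) (v : value) : bool =
  let (Value (c, vs)) = v in
  match p with
  | Wildcard | Var _ -> true                          (* _ ≼ v and x ≼ v for every v *)
  | Or (p1, p2) -> matches p1 v || matches p2 v       (* p₁ ≼ v or p₂ ≼ v *)
  | Constr (c', ps) ->
      c' = c
      && List.length ps = List.length vs
      && List.for_all2 matches ps vs                  (* point-wise on the sub-values *)
#+END_SRC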
\subsubsection{Matrix decomposition of pattern clauses}
The initial input of the decomposition algorithm C consists of a vector of variables
@@ -1076,8 +1069,7 @@ C₀((),
→ l₂ \\
→ \vdots \\
→ lₘ
\end{pmatrix}) = l₁
\end{equation*}
When $\vec{x}$ ≠ () then the compilation advances using one of the
@@ -1118,25 +1110,25 @@ following four rules:
constructor and we define new fresh variables y₁, y₂, ..., yₐ so
that the lambda action lᵢ becomes
\begin{lstlisting}[mathescape,columns=fullflexible,basicstyle=\fontfamily{lmvtt}\selectfont,]
(let (y₁ (field 0 x₁))
     (y₂ (field 1 x₁))
     ...
     (yₐ (field (a-1) x₁))
  lᵢ)
\end{lstlisting}
and the result of the compilation for the set of constructors
{c₁, c₂, ..., cₖ} is:
\begin{lstlisting}[mathescape,columns=fullflexible,basicstyle=\fontfamily{lmvtt}\selectfont,]
switch x₁ with
case c₁: l₁
case c₂: l₂
...
case cₖ: lₖ
default: exit
\end{lstlisting}
3) Orpat rule: there are various strategies for dealing with
or-patterns. The most naive one is to split the or-patterns.
@@ -1161,9 +1153,239 @@ following four rules:
apply, and P₂ → L₂ containing the remaining rows. The algorithm is
applied to both matrices.
In our prototype we make use of accessors to encode stored values.
\begin{minipage}{0.2\linewidth}
\begin{verbatim}
let value = 1 :: 2 :: 3 :: []
(* that can also be written *)
let value = []
|> List.cons 3
|> List.cons 2
|> List.cons 1
\end{verbatim}
\end{minipage}
\hfill
\begin{minipage}{0.5\linewidth}
\begin{verbatim}
(field 0 x) = 1
(field 0 (field 1 x)) = 2
(field 0 (field 1 (field 1 x))) = 3
(field 0 (field 1 (field 1 (field 1 x)))) = []
\end{verbatim}
\end{minipage}
An \emph{accessor} /a/ represents the
access path to a value that can be reached by deconstructing the
scrutinee.
| a ::= Here \vert n.a
The above example, in encoded form:
\begin{verbatim}
Here = 1
1.Here = 2
1.1.Here = 3
1.1.1.Here = []
\end{verbatim}
In our prototype the source matrix mₛ is defined as follows
| SMatrix mₛ := (aⱼ)^{j∈J}, ((p_{ij})^{j∈J} → bbᵢ)^{i∈I}
Source matrices are used to build source decision trees Cₛ.
A decision tree is defined as either a Leaf, a Failure terminal or
an intermediate node with different children sharing the same accessor /a/
and an optional fallback.
Failure is emitted only when the patterns don't cover the whole set of
possible input values /S/. The fallback is not needed when the user
doesn't use a wildcard pattern.
%%% Give example of thing
| Cₛ ::= Leaf bb \vert Switch(a, (Kᵢ → Cᵢ)^{i∈S} , C?) \vert Failure \vert Unreachable
| vₛ ::= K(vᵢ)^{i∈I} \vert n ∈
\begin{comment}
Are K and Here clear here?
\end{comment}
We say that a translation of a source program to a decision tree
is correct when for every possible input, the source program and its
respective decision tree produce the same result
| ∀vₛ, tₛ(vₛ) = 〚tₛ〛ₛ(vₛ)
We define the decision tree of source programs
〚tₛ〛
in terms of the decision tree of pattern matrices
〚mₛ〛
by the following:
| 〚(pᵢ → bbᵢ)^{i∈I}〛 := 〚(Here), (pᵢ → bbᵢ)^{i∈I}〛
Decision trees computed from pattern matrices respect the following invariant:
| ∀v (vᵢ)^{i∈I} = v(aᵢ)^{i∈I} → 〚m〛(v) = m(vᵢ)^{i∈I} for m = ((aᵢ)^{i∈I}, (rᵢ)^{i∈I})
| v(Here) = v
| K(vᵢ)ⁱ(k.a) = vₖ(a) if k ∈ [0;n[
\begin{comment}
TODO: EXPLAIN
\end{comment}
We proceed to show the correctness of the invariant by a case
analysis.
Base cases:
1. [| ∅, (∅ → bbᵢ)ⁱ |] ≡ Leaf bbᵢ where i := min(I), that is a
decision tree [|ms|] defined by an empty accessor and empty
patterns pointing to blackboxes /bbᵢ/. This respects the invariant
because a source matrix in the case of empty rows returns
the first expression and (Leaf bb)(v) := Match bb
2. [| (aⱼ)ʲ, ∅ |] ≡ Failure
Regarding non base cases:
Let's first define
| let Idx(k) := [0; arity(k)[
| let First(∅) := ⊥
| let First((aⱼ)ʲ) := a_{min(j∈J≠∅)}
\[
m := ((a_i)^i ((p_{ij})^i \to e_j)^{ij})
\]
\[
(k_k)^k := headconstructor(p_{i0})^i
\]
\begin{equation}
Groups(m) := ( k_k \to ((a)_{0.l})^{l \in Idx(k_k)} +++ (a_i)^{i \in I\DZ}), \\
( if p_{0j} is k(q_l) then \\
(qₗ)^{l \in Idx(k_k)} +++ (p_{ij})^{i \in I\DZ} \to e_j \\
if p_{0j} is \_ then \\
(\_)^{l \in Idx(k_k)} +++ (p_{ij})^{i \in I\DZ} \to e_j \\
else \bot )^j ), \\
((a_i)^{i \in I\DZ}, ((p_{ij})^{i \in I\DZ} \to eⱼ if p_{0j} is \_ else \bot)^{j \in J})
\end{equation}
Groups(m) is an auxiliary function that decomposes a matrix m into
submatrices, according to the head constructor of their first pattern.
Groups(m) returns one submatrix m_r for each head constructor k that
occurs on the first row of m, plus one "wildcard submatrix" m_{wild}
that matches on all values that do not start with one of those head
constructors.
Intuitively, m is equivalent to its decomposition in the following
sense: if the first pattern of an input vector (vᵢ)ⁱ starts with one
of the head constructors k, then running (vᵢ)ⁱ against m is the same
as running it against the submatrix mₖ; otherwise (its head
constructor is none of the k) it is equivalent to running it against
the wildcard submatrix.
We formalize this intuition as follows:
Lemma (Groups):
Let $m$ be a matrix with $Groups(m) = (k_k \to m_k)^k, m_{wild}$.
For any value vector $(v_i)^i$ such that $v_0 = k(v'_l)^l$ for some
constructor k,
we have:
| if k = kₖ \text{ for some k then}
| \quad m(vᵢ)ⁱ = mₖ((v_{l}')ˡ +++ (v_{i})^{i∈I\DZ})
| \text{else}
| \quad m(vᵢ)ⁱ = m_{wild}(vᵢ)^{i∈I\DZ}
\begin{comment}
TODO: fix \{0}
\end{comment}
*** Proof:
Let $m$ be a matrix with $Groups(m) = (k_k \to m_k)^k, m_{wild}$.
Let $(v_i)^i$ be an input vector with $v_0 = k(v'_l)^l$ for some k.
We proceed by case analysis:
- either k is one of the kₖ for some k
- or k is none of the (kₖ)ᵏ
Both m(vᵢ)ⁱ and mₖ((v_{l}')ˡ +++ (vᵢ)^{i∈I\DZ}) are defined as the first
matching result of a family of rows rⱼ of the corresponding matrix.
We know, from the definition of
Groups(m), that mₖ is
| ((a_{0.l})^{l∈Idx(kₖ)} +++ (aᵢ)^{i∈I\DZ}),
| (
| \quad if p_{0j} is k(qₗ) then
| \quad \quad (qₗ)ˡ +++ (p_{ij})^{i∈I\DZ } → eⱼ
| \quad if p_{0j} is _ then
| \quad \quad (_)ˡ +++ (p_{ij})^{i∈I\DZ} → eⱼ
| \quad else ⊥
| )^{j∈J}
By definition, m(vᵢ)ⁱ is
| m(vᵢ)ⁱ = First(rⱼ(vᵢ)ⁱ)ʲ for m = ((aᵢ)ⁱ, (rⱼ)ʲ)
| (pᵢ)ⁱ (vᵢ)ⁱ = {
| \quad if p₀ = k(qₗ)ˡ, v₀ = k'(v'ₖ)ᵏ, k=Idx(k') and l=Idx(k)
| \quad \quad if k ≠ k' then ⊥
| \quad \quad if k = k' then ((qₗ)ˡ +++ (pᵢ)^{i∈I\DZ}) ((v'ₖ)ᵏ +++ (vᵢ)^{i∈I\DZ})
| \quad if p₀ = (q₁\vert{}q₂) then
| \quad \quad First( (q₁pᵢ^{i∈I\DZ}) vᵢ^{i∈I\DZ}, (q₂pᵢ^{i∈I\DZ}) vᵢ^{i∈I\DZ}) }
For this reason, if we can prove that
| ∀j, rⱼ(vᵢ)ⁱ = r'ⱼ((v'ₖ)ᵏ ++ (vᵢ)ⁱ)
it follows that
| m(vᵢ)ⁱ = mₖ((v'ₖ)ᵏ ++ (vᵢ)ⁱ)
from the above definition.
We can also show that (aᵢ)ⁱ = (a_{0.l})ˡ +++ (aᵢ)^{i∈I\DZ} because v(a₀) = K((v(a_{0.l}))ˡ)
** Target translation
The target program, whose general form is given by the grammar below, is
parsed using a parser generated by Menhir, an LR(1) parser generator for
the OCaml programming language.
Menhir compiles an LR(1) grammar specification, in this case a subset
of the Lambda intermediate language, down to OCaml code.
This is the grammar of the target language (TODO: check menhir grammar)
| start ::= sexpr
| sexpr ::= variable \vert{} string \vert{} "(" special_form ")"
| string ::= "\"" identifier "\"" ;; string between doublequotes
| variable ::= identifier
| special_form ::= let\vert{}catch\vert{}if\vert{}switch\vert{}switch-star\vert{}field\vert{}apply\vert{}isout
| let ::= "let" assignment sexpr ;; (assignment sexpr)+ outside of pattern match code
| assignment ::= "function" variable variable+ ;; the first variable is the identifier of the function
| field ::= "field" digit variable
| apply ::= ocaml_lambda_code ;; arbitrary code
| catch ::= "catch" sexpr with sexpr
| with ::= "with" "(" label ")"
| exit ::= "exit" label
| switch-star ::= "switch*" variable case*
| switch::= "switch" variable case* "default:" sexpr
| case ::= "case" casevar ":" sexpr
| casevar ::= ("tag"\vert{}"int") integer
| if ::= "if" bexpr sexpr sexpr
| bexpr ::= "(" ("!="\vert{}"=="\vert{}">="\vert{}"<="\vert{}">"\vert{}"<") sexpr digit \vert{} field ")"
| label ::= integer
The prototype doesn't support strings.
The AST built by the parser is traversed and evaluated by the symbolic
execution engine.
Given that the target language supports jumps in the form of "catch - exit"
blocks, the engine tries to evaluate the instructions inside the blocks
and stores the result of the partial evaluation into a record.
When a jump is encountered, the information stored at that point allows
the engine to finalize the evaluation of the jump block.
In the environment the engine also stores bindings to values and
functions.
Integer additions and subtractions are simplified in a second pass.
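A rough sketch (ours; field names and types are assumptions, not taken
from the prototype) of the bookkeeping the engine carries while traversing
the target AST:
\begin{verbatim}
(* Sketch (ours): state threaded by the symbolic engine.  Let-bound
   variables and functions live in an environment; for every catch
   label the partial evaluation recorded so far is kept, so that a
   later "exit label" can be finalized. *)
type ('value, 'partial) engine_state = {
  bindings : (string * 'value) list;   (* let bindings seen so far *)
  handlers : (int * 'partial) list;    (* catch label -> partial evaluation *)
}

(* When (exit l) is reached, evaluation resumes from the stored record. *)
let resume (st : ('v, 'p) engine_state) (label : int) : 'p option =
  List.assoc_opt label st.handlers
\end{verbatim}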
The result of the symbolic evaluation is a target decision tree Cₜ
| Cₜ ::= Leaf bb \vert Switch(a, (πᵢ → Cᵢ)^{i∈S} , C?) \vert Failure
| vₜ ::= Cell(tag ∈ ℕ, (vᵢ)^{i∈I}) \vert n ∈ ℕ
Every branch of the decision tree is "constrained" by a domain
| Domain π = { n \vert{} n∈ℕ } x { n \vert{} n∈Tag ⊆ ℕ }
Intuitively, the π condition at every branch tells us the set of
possible values that can "flow" through that path.
π conditions are refined by the engine during the evaluation; at the
root of the decision tree the domain corresponds to the set of
possible values that the type of the function can hold.
C? is the fallback node of the tree that is taken whenever the value
at that point of the execution can't flow to any other subbranch.
Intuitively, the π_{fallback} condition of the C? fallback node is
| π_{fallback} = ¬\bigcup\limits_{i∈I}πᵢ
The fallback node can be omitted when the union of the domains of the
children nodes corresponds to the set of possible values denoted by the
accessor at that point of the execution
| domain(vₛ(a)) = \bigcup\limits_{i∈I}πᵢ
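The shape of target decision trees and of the π conditions can be sketched
in OCaml as follows (a sketch of ours; the prototype may represent domains
differently, here they are approximated as unions of inclusive intervals):
\begin{verbatim}
(* Sketch (ours): a domain pi is a pair of a set of machine integers
   and a set of tags, both approximated as interval unions. *)
type interval = { lo : int; hi : int }
type domain = { ints : interval list; tags : interval list }

type ('a, 'bb) target_tree =
  | Leaf of 'bb
  | Failure
  | Switch of 'a * (domain * ('a, 'bb) target_tree) list
              * ('a, 'bb) target_tree option  (* fallback C?, on pi_fallback *)
\end{verbatim}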
We say that a translation of a target program to a decision tree
is correct when for every possible input, the target program and its
respective decision tree produce the same result
| ∀vₜ, tₜ(vₜ) = 〚tₜ〛ₜ(vₜ)
** Equivalence checking
Our equivalence-checking algorithm $\EquivTEX S {C_S} {C_T} G$ is
exactly a decision procedure for the provability of the judgment
$(\EquivTEX S {C_S} {C_T} G)$, defined below.
* Correctness of the algorithm
Running a program tₛ or its translation 〚tₛ〛 against an input vₛ
produces a result /r/ in the following way:
| ( 〚tₛ〛ₛ(vₛ) ≡ Cₛ(vₛ) ) → r
Likewise
| ( 〚tₜ〛ₜ(vₜ) ≡ Cₜ(vₜ) ) → r
| tₜ(vₜ) → r
| result r ::= guard list * (Match blackbox \vert{} NoMatch \vert{} Absurd)
| guard ::= blackbox.
Having defined equivalence between two inputs of which one is
expressed in the source language and the other in the target language,
vₛ ≃ vₜ, we can define the equivalence between a couple of programs or
a couple of decision trees
| tₛ ≃ tₜ := ∀vₛ≃vₜ, tₛ(vₛ) = tₜ(vₜ)
| Cₛ ≃ Cₜ := ∀vₛ≃vₜ, Cₛ(vₛ) = Cₜ(vₜ)
The result of the proposed equivalence algorithm is /Yes/ or /No(vₛ,
vₜ)/. In particular, in the negative case, vₛ and vₜ are a couple of
possible counter examples for which the decision trees produce a
different result.
In the presence of guards we can say that two results are
equivalent modulo the guards queue, written /r₁ ≃gs r₂/, when:
| (gs₁, r₁) ≃gs (gs₂, r₂) ⇔ (gs₁, r₁) = (gs₂ ++ gs, r₂)
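A direct reading of this definition as OCaml (a sketch of ours) checks
that the two final results coincide and that gs₂ is a prefix of gs₁:
\begin{verbatim}
(* Sketch (ours): (gs1, r1) is equivalent to (gs2, r2) modulo guards
   when r1 = r2 and gs1 = gs2 ++ gs for some suffix gs, i.e. gs2 is a
   prefix of gs1. *)
let rec is_prefix prefix l =
  match prefix, l with
  | [], _ -> true
  | x :: p, y :: l' -> x = y && is_prefix p l'
  | _ :: _, [] -> false

let equiv_modulo_guards (gs1, r1) (gs2, r2) =
  r1 = r2 && is_prefix gs2 gs1
\end{verbatim}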
We say that Cₜ covers the input space /S/, written
/covers(Cₜ, S)/ when every value vₛ∈S is a valid input to the
decision tree Cₜ. (TODO: rephrase)
Given an input space /S/ and a couple of decision trees, where
the target decision tree Cₜ covers the input space /S/, we say that
the two decision trees are equivalent when:
| equiv(S, Cₛ, Cₜ, gs) = Yes ∧ covers(Cₜ, S) → ∀vₛ≃vₜ ∈ S, Cₛ(vₛ) ≃gs Cₜ(vₜ)
Similarly we say that a couple of decision trees in the presence of
an input space /S/ are /not/ equivalent when:
| equiv(S, Cₛ, Cₜ, gs) = No(vₛ,vₜ) ∧ covers(Cₜ, S) → vₛ≃vₜ ∈ S ∧ Cₛ(vₛ) ≠gs Cₜ(vₜ)
Corollary: For a full input space /S/, that is the universe of the
target program we say:
| equiv(S, 〚tₛ〛ₛ, 〚tₜ〛ₜ, ∅) = Yes ⇔ tₛ ≃ tₜ
** Proof of equivalence checking
\begin{comment}
TODO: put ^i∈I where needed
\end{comment}
/a/ of the subtree Cᵢ and we define πᵢ' = πᵢ if a≠b else πᵢ∩π. Trimming
a switch node yields the following result:
| Switch(b, (π→Cᵢ)^{i∈I})_{/a→π} := Switch(b, (π'ᵢ→C_{i/a→π})^{i∈I})
\begin{comment}
TODO: understand how to properly align lists
check that every list is aligned
\end{comment}
\begin{comment}
Actually in the proof.org file I transcribed:
e. Unreachable → ⊥
This is not correct because you don't have Unreachable nodes in target decision trees
\end{comment}
For the trimming lemma we have to prove that running the value vₜ against
the decision tree Cₜ is the same as running vₜ against the tree
C_{trim} that is the result of the trimming operation on Cₜ
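The trimming operation itself can be sketched in OCaml as follows (ours;
the tree type mirrors the target-tree sketch above, the domain type is
kept abstract, and `inter` is a caller-supplied intersection on domains):
\begin{verbatim}
(* Sketch (ours) of trimming C_{t/a->pi}: branches of a switch on the
   same accessor a get their domain intersected with pi; switches on
   other accessors are traversed unchanged. *)
type ('a, 'd, 'bb) ttree =
  | Leaf of 'bb
  | Failure
  | Switch of 'a * ('d * ('a, 'd, 'bb) ttree) list * ('a, 'd, 'bb) ttree option

let rec trim ~inter a pi t =
  match t with
  | Leaf _ | Failure -> t
  | Switch (b, branches, fallback) ->
      let restrict pi_i = if a = b then inter pi_i pi else pi_i in
      Switch (b,
              List.map (fun (pi_i, c) -> (restrict pi_i, trim ~inter a pi c))
                branches,
              Option.map (trim ~inter a pi) fallback)
\end{verbatim}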
In the case where the algorithm returns No we have at least a couple
of counter example values vₛ and vₜ for which the two decision trees
output a different result.
| equiv(S, Cₛ, Cₜ) = No(vₛ,vₜ) and cover(Cₜ, S) → vₛ ≃ vₜ∈S ∧ Cₛ(vₛ) ≠ Cₜ(vₜ)
We define the following
| Forall(Yes) = Yes
| Forall(Yes::l) = Forall(l)
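Read as OCaml (a sketch of ours), Forall folds the list of sub-answers of
the recursive equivalence checks and propagates the first counterexample:
\begin{verbatim}
(* Sketch (ours): Yes only if every sub-check answered Yes; otherwise
   the first counterexample pair is returned. *)
type ('vs, 'vt) answer = Yes | No of 'vs * 'vt

let rec forall = function
  | [] -> Yes
  | Yes :: rest -> forall rest
  | (No _ as counterexample) :: _ -> counterexample
\end{verbatim}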
| vₛ≃vₜ∈S ∧ Cₛ(vₛ)≠Cₜ(vₜ)
we can say that
| equiv(Sᵢ, Cₛ, Cₜᵢ) = No(vₛ, vₜ) for some minimal k∈I
4. When we have a Switch on the right we define π_{fallback} as the domain of
values not covered by the union of the constructors kᵢ
| π_{fallback} = ¬\bigcup\limits_{i∈I}π(kᵢ)
The algorithm proceeds by trimming
| equiv(S, Switch(a, (kᵢ → Cₛᵢ)^{i∈I}, C_{sf}), Cₜ) :=
| Forall(equiv( S∩(a→π(kᵢ)^{i∈I}), Cₛᵢ, C_{t/a→π(kᵢ)})^{i∈I} +++ equiv(S∩(a→π_{fallback}), Cₛ, C_{t/a→π_{fallback}}))
The statement still holds and we show this by first analyzing the
/Yes/ case:
| Forall(equiv( S∩(a→π(kᵢ)^{i∈I}), Cₛᵢ, C_{t/a→π(kᵢ)})^{i∈I}) = Yes