Revision 267

CSL17/tech-report/main.tex (revision 267)
68 68
\appendix
69 69

  
70 70
%%\input{pv-theories}	
71
\input{appendix-sequent-calculus}
71
%\input{appendix-sequent-calculus}
72 72

  
73 73

  
74 74
\end{document}
CSL17/tech-report/arithmetic.tex (revision 267)
26 26

  
27 27
Namely, in addition to the usual axioms, our $\basic$ includes the following:\footnote{Later some of these will be redundant: for instance $\safe (u \times x)$ and $\safe (u \smsh v)$ are consequences of $\Sigma^\safe_i$-$\pind$, but $\safe (x + y)$ is not, since both inputs are safe and so we cannot induct.}
28 28
\begin{enumerate}[(a)]
29
	\item $\forall u^\normal. \safe(u) $
29
	\item\label{ax:a} $\forall u^\normal. \safe(u) $
30 30
	\item $\safe (0) $
31 31
	\item $\forall x^\safe . \safe (\succ{} x) $
32 32
	\item $\forall u^\safe .\safe(\hlf{u})$
33 33
	\item $\forall x^\safe, y^\safe . \safe(x+y)$
34 34
	\item $\forall u^\normal, x^\safe . \safe(u\times x) $
35 35
	\item $\forall u^\normal , v^\normal . \safe (u \smsh v)$
36
	\item $\forall u^\normal .\safe(|u|)$
36
	\item\label{ax:h} $\forall u^\normal .\safe(|u|)$
37 37
\end{enumerate}
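For illustration, here is an informal sketch (not part of the axiomatisation, and assuming the defining equations of $\times$ over the binary successors are available in $\basic$) of the footnote's claim that $\safe(u \times x)$ becomes redundant: taking $A(u) \dfn \safe(u \times x)$ for a fixed $x^\safe$ and applying $\Sigma^\safe_i$-$\pind$ on $u$, the base case follows from $\safe(0)$, and the inductive steps follow from
\[
(\succ{0} u) \times x = (u\times x) + (u \times x)
\qquad
(\succ{1} u) \times x = ((u\times x) + (u \times x)) + x
\]
together with the axiom $\forall x^\safe , y^\safe . \safe(x+y)$.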
38 38
%\[
39 39
%\begin{array}{l}
......
50 50
%\forall u^\normal .\safe(|x|)
51 51
%\end{array}
52 52
%\]
53
\patrick{Actually I guess that we could replace the last one by $\forall x^\normal .\safe(|x|)$, as $|x|$ has smaller size as $x$? But I am not sure this would be needed anywhere. }
53
%\patrick{Actually I guess that we could replace the last one by $\forall x^\normal .\safe(|x|)$, as $|x|$ has smaller size than $x$? But I am not sure this would be needed anywhere. }
54 54

  
55 55
Notice in particular that we have $\normal \subseteq \safe$, by axiom \eqref{ax:a}.
56 56
%
......
211 211
%
212 212
%\anupam{In induction,for inductive cases, need $u\neq 0$ for $\succ 0$ case.}
213 213

  
214
\anupam{Need vector $\vec y^\safe$?}
214
%\anupam{Need vector $\vec y^\safe$?}
215 215

  
216 216
\begin{remark}
217 217
Notice that $\rais$ looks similar to the $K$ rule from the calculus for the modal logic $S4$ (which subsumes necessitation), and indeed we believe there is a way to present these results in such a framework.
......
219 219
	\end{remark}
220 220

  
221 221

  
222
Let us consider some basic examples of theorems of our theories.
222 223
\begin{example}
223 224
	[Some basic theorems]
224 225
	We give proofs in $\arith^1$ of the following statements, which are sometimes included as basic axioms:
225
	\[
226
	\begin{array}{l}
227
	\forall x^\safe . 0 \neq \succ{} (x) \\
228
	\forall x^\safe , y^\safe . (\succ{} x = \succ{} y \cimp x = y) \\
229
	%\forall x^\safe . (x = 0 \cor \exists y^\safe.\  x = \succ{} y   )  \\        %% version 1: Anupam
230
	%\forall u^\normal . (u = 0 \cor \exists v^\normal.\  u = \succ{} v  )  %%  version 2: Patrick
231
	\forall u^\normal . (u = 0 \cor \exists y^\safe.  u = \succ{} v  )  \\        %% version 3: after discussion
232
	\forall u^\normal ,x^\safe . u \cdot \succ{} x = u + ux
233
	\end{array}
234
	\]
235
	\todo{Proof: Patrick? The final two should require PIND.}
226
	\begin{enumerate}[(i)]
227
		\item\label{item:i} $\forall x^\safe . 0 \neq \succ{} (x)$
228
		\item\label{item:ii} $\forall x^\safe , y^\safe . (\succ{} x = \succ{} y \cimp x = y) $
229
		\item\label{item:iii} $\forall u^\normal . (u = 0 \cor \exists y^\safe.  u = \succ{} y  ) $
230
		\item\label{item:iv} $\forall u^\normal ,x^\safe . u \cdot \succ{} x = u + ux$
231
	\end{enumerate}
232
%	\[
233
%	\begin{array}{l}
234
%	\forall x^\safe . 0 \neq \succ{} (x) \\
235
%	\forall x^\safe , y^\safe . (\succ{} x = \succ{} y \cimp x = y) \\
236
%	%\forall x^\safe . (x = 0 \cor \exists y^\safe.\  x = \succ{} y   )  \\        %% version 1: Anupam
237
%	%\forall u^\normal . (u = 0 \cor \exists v^\normal.\  u = \succ{} v  )  %%  version 2: Patrick
238
%	\forall u^\normal . (u = 0 \cor \exists y^\safe.  u = \succ{} v  )  \\        %% version 3: after discussion
239
%	\forall u^\normal ,x^\safe . u \cdot \succ{} x = u + ux
240
%	\end{array}
241
%	\]
242
%	\todo{Proof: Patrick? The final two should require PIND.}
236 243
	
237
	\patrick{Actually statement 4 is nearly trivial from axiom 27 (for which I corrected the sorts). Maybe you meant something else?
238
		\anupam{I actually changed those sorts from the submitted version: I really think axiom 27 should be $\forall x^{\safe}, y^{\normal}.  x\cdot(\succ{}y)=(x\cdot y)+x$, i.e.\ with the recursion on the normal variable. That way we can do induction on it. Statement 4 is meant to state the recursive property for the safe argument, which we can prove by induction on the normal argument, $u$.}
239
	
240
	For statement 3 I replaced the quantifications $\safe$ by $\normal$, otherwise I don't know how to prove it. But I think we need the stronger statement with $\safe$, so probably we should add it as an additional axiom.}
241
\anupam{Rather, I do not know how you would prove the current version: we cannot induct on $\exists v^\normal$ from axiom 27 as I state it. For the previous version I would just do induction on $u$ in $(u = 0 \cor \exists x^\safe.\  u = \succ{} x )$. }
244
%	\patrick{Actually statement 4 is nearly trivial from axiom 27 (for which I corrected the sorts). Maybe you meant something else?
245
%		\anupam{I actually changed those sorts from the submitted version: I really think axiom 27 should be $\forall x^{\safe}, y^{\normal}.  x\cdot(\succ{}y)=(x\cdot y)+x$, i.e.\ with the recursion on the normal variable. That way we can do induction on it. Statement 4 is meant to state the recursive property for the safe argument, which we can prove by induction on the normal argument, $u$.}
246
%	
247
%	For statement 3 I replaced the quantifications $\safe$ by $\normal$, otherwise I don't know how to prove it. But I think we need the stronger statement with $\safe$, so probably we should add it as an additional axiom.}
248
%\anupam{Rather, I do not know how you would prove the current version: we cannot induct on $\exists v^\normal$ from axiom 27 as I state it. For the previous version I would just do induction on $u$ in $(u = 0 \cor \exists x^\safe.\  u = \succ{} x )$. }
242 249
	\end{example}
243 250
\begin{proof}
244
\begin{itemize}
245
 \item For the first statement, assume  $\succ{} (x)=0$ in order to derive a contradiction. From axiom \ref{axiom1} we deduce $x \leq x$, and so by transitivity (axiom \ref{axiom<=transitivity}) we have $x\leq 0$. So axiom \ref{axiom3} gives us $x=0$, and so $x=\succ{} (x)$. Finally with axiom  \ref{axiom2} we conclude with a contradiction.
246
 \item For statement 2 assume $\succ{} x= \succ{} y$ and, in view of a contradiction, $x\neq y$. Let us apply axiom \ref{axiom6} and assume we are in the case where $x\leq y$ (the case $y\leq x$ is symmetric). By axiom \ref{axiom4}, using the assumption $x\neq y$ we deduce $\succ{} x \leq y$. So as $\succ{} x = \succ{} y$ we can derive that $\succ{} y \leq y$. So by using axiom \ref{axiom1} we obtain that $\succ{} y = y$, hence axiom \ref{axiom2} gives us a contradiction, so we are done.
247
 \item For statement 3 we proceed by induction on $u$ on the formula %$A(u)= (u=0) \cup \exists^{\normal} v. u=\succ{} v$.
248
 $A(u)= (u=0) \cup \exists^{\safe} v. u=\succ{} v$. Note that this formula is in $\Sigma^{\safe}_1$, so we can do induction on it in  $\arith^1$. 
249
   The formula $A(0)$ obviously holds. Let us assume $A(u)$ and prove $A(\succ{0} u)$ (the case for $A(\succ{1} u)$ will be similar).  By $A(u)$ we deduce that either $u=0$, in which case we also have $\succ{0} u=0$, hence  $A(\succ{0} u)$ also holds, or there is a $y$ such that $u=\succ{} y$. In this latter case we have $\succ{0} u=2\cdot(\succ{} y)=2 v +2$ (by axiom \ref{axiom29}), hence have $\succ{0} u=\succ{} (\succ{} (2\cdot y))$, and so $A(\succ{0} u)$ holds.
250
 
251
 By induction we conclude that $\forall^{\normal} u. A(u)$ holds.
252
\item Statement 4:
253
%For statement 4 we just use axiom  \ref{axiom27} followed by the $+$ commutativity axiom \ref{axiom21}.
254
 Let $u$ be in $\normal$ and $x$ in $\safe$, and try to prove $u\succ{} x=u+ux$. We have $\succ{} x=x+1$ by axioms \ref{axiom23} and \ref{axiom22} , so $u\succ{} x= u(x+1)$, so from axiom \ref{axiom29} we deduce $ u\succ{} x= u(x+1)=ux+u$, and by axiom \ref{axiom21} we conclude that $ u\succ{} x=u+ux$.
255
\end{itemize}
251
	For \eqref{item:i}, assume $\succ{} (x)=0$ in order to derive a contradiction. From axiom \ref{axiom1} we deduce $x \leq \succ{}(x)$, and so, since $\succ{}(x)=0$, by transitivity (axiom \ref{axiom<=transitivity}) we have $x\leq 0$. So axiom \ref{axiom3} gives us $x=0$, and so $x=\succ{} (x)$. Finally, axiom \ref{axiom2} yields a contradiction.
252
	
253
	For \eqref{item:ii}, assume $\succ{} x= \succ{} y$ and, towards a contradiction, $x\neq y$. Let us apply axiom \ref{axiom6} and assume we are in the case where $x\leq y$ (the case $y\leq x$ is symmetric). By axiom \ref{axiom4}, using the assumption $x\neq y$, we deduce $\succ{} x \leq y$. Since $\succ{} x = \succ{} y$, we derive $\succ{} y \leq y$, so by axiom \ref{axiom1} we obtain $\succ{} y = y$, and axiom \ref{axiom2} gives us a contradiction.
254
	
255
	For \eqref{item:iii} we proceed by induction on $u$ on the formula %$A(u)= (u=0) \cup \exists^{\normal} v. u=\succ{} v$.
256
	$A(u)= (u=0) \cor \exists^{\safe} v. u=\succ{} v$. Note that this formula is in $\Sigma^{\safe}_1$, so we can do induction on it in $\arith^1$. 
257
	The formula $A(0)$ obviously holds. Let us assume $A(u)$ and prove $A(\succ{0} u)$ (the case of $A(\succ{1} u)$ is similar).  By $A(u)$ we deduce that either $u=0$, in which case we also have $\succ{0} u=0$, hence  $A(\succ{0} u)$ also holds, or there is a $y$ such that $u=\succ{} y$. In this latter case we have $\succ{0} u=2\cdot(\succ{} y)=2 y +2$ (by axiom \ref{axiom29}), hence $\succ{0} u=\succ{} (\succ{} (2\cdot y))$, and so $A(\succ{0} u)$ holds.
258
	By induction we conclude that $\forall^{\normal} u. A(u)$ holds.
259
	
260
	 For \eqref{item:iv}, let $u$ be in $\normal$ and $x$ in $\safe$; we prove $u\succ{} x=u+ux$. We have $\succ{} x=x+1$ by axioms \ref{axiom23} and \ref{axiom22}, so $u\succ{} x= u(x+1)$; from axiom \ref{axiom29} we deduce $u\succ{} x= u(x+1)=ux+u$, and by axiom \ref{axiom21} (commutativity of $+$) we conclude that $u\succ{} x=u+ux$.	
256 261
\end{proof}
257 262
It is often useful for us to work with \emph{length-induction}, which is equivalent to polynomial induction and well known from bounded arithmetic:
258 263
\begin{proposition}
......
485 490
	where $\Gamma \seqar \Delta$ contains only $\Sigma^\safe_i$ formulae for which the sorting $(\vec u ;\vec x)$ is compatible.
486 491
\end{theorem}
487 492

  
488
Strictly speaking, we must alter some of the sequent rules a little to arrive at this normal form. For instance the $\pind$ rule would have $\normal(\vec u)$ in its lower sequent rather than $\normal (t(\vec u))$. The latter is a consequence of the former already in $\basic$.
489
The proof of this result also relies on a heavy use of the structural rules, contraction and weakening, to ensure that we always have a complete and compatible sorting of variables on the LHS of a sequent. This is similar to what is done in \cite{OstrinWainer05} where they use a $G3$ style calculus to manage such structural manipulations.
493
Strictly speaking, we must alter some of the sequent rules a little to arrive at this normal form. For instance the $\pind$ rule would have $\normal(\vec u)$ in its lower sequent rather than $\normal (t(\vec u))$:
494
\[
495
\small
496
\vliinf{}{}{ \normal(\vec u) , \safe (\vec x) , \Gamma , A(0) \seqar A(t(\vec u)), \Delta }{ \normal(u), \normal (\vec u) , \safe (\vec x) , \Gamma, A(u) \seqar A(\succ{0} u) , \Delta }{ \normal(u) , \normal(\vec u ) , \safe (\vec x) , \Gamma, A(u) \seqar A(\succ{1} u) , \Delta  }
497
\] 
498
Notice that $\normal(t(\vec u))$ is a consequence of $\normal (\vec u)$ already in $\basic$, due to the axioms \eqref{ax:a}-\eqref{ax:h}.
499
That the variables of $t$ can be assumed to occur already in the premisses is due to weakening.
500
Indeed, the proof of this result relies on heavy use of the structural rules, contraction and weakening, to ensure that we always have a complete and compatible sorting of variables on the LHS of a sequent. This is similar to what is done in \cite{OstrinWainer05}, where a $G3$ style calculus is used to manage such structural manipulations.
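In particular, after this preprocessing every sequent occurring in such a proof may be assumed to be of the form
\[
\normal (\vec u), \safe (\vec x) , \Gamma \seqar \Delta
\]
over its free variables $\vec u ; \vec x$, with $\Gamma \seqar \Delta$ consisting of $\Sigma^\safe_i$ formulae for which the sorting $(\vec u ; \vec x)$ is compatible; this is the shape we rely on in the soundness argument.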
490 501

  
491
As we mentioned, the fact that only $\Sigma^\safe_i$ formulae occur is due to the free-cut elimination result for first-order calculi \cite{Takeuti87,Cook:2010:LFP:1734064}, which gives a form of proof where every $\cut$ step has one of its cut formulae `immediately' below a non-logical step. Again, we have to adapt the $\rais$ rule a little to achieve our result, due to the fact that it has a $\exists x^\normal$ in its lower sequent. For this we consider a merge of $\rais$ and $\cut$, which allows us to directly cut the upper sequent of $\rais$ against a sequent of the form $\normal(u), A(u), \Gamma \seqar \Delta$.
502
As we mentioned, the fact that only $\Sigma^\safe_i$ formulae occur is due to the free-cut elimination result for first-order calculi \cite{Takeuti87,Cook:2010:LFP:1734064}, which gives a form of proof where every $\cut$ step has one of its cut formulae `immediately' below a non-logical step. Again, we have to adapt the $\rais$ rule a little to achieve our result, due to the fact that it has an $\exists x^\normal$ in its lower sequent. For this we consider a merge of $\rais$ and $\cut$, which allows us to directly cut the upper sequent of $\rais$ against a sequent of the form $\normal(u), A(u), \Gamma \seqar \Delta$:
503
\[
504
\vliinf{\rais\cut}{}{\normal (\vec u) , \Gamma \seqar \Delta}{\normal (\vec u) \seqar \exists x^\safe . A(x)}{\normal (u) , A(u) , \Gamma \seqar \Delta}
505
\]
492 506

  
493 507
Finally, as is usual in bounded arithmetic, we use distinguished rules for our relativised quantifiers \cite{Buss86book}, although we use ones that satisfy the aforementioned constraints. For instance, we include the following rules; the remaining ones are similar:
494 508
\[
CSL17/tech-report/completeness.tex (revision 267)
6 6
	\label{thm:completeness}
7 7
	For every $\mubci{i-1}$ program $f(\vec u ; \vec x)$ (which is in $\fphi i$), there is a $\Sigma^{\safe}_i$ formula $A_f(\vec u, \vec x, y)$ such that $\arith^i$ proves $\forall^{\normal} \vec u, \forall^{\safe} \vec x, \exists^{\safe} ! y. A_f(\vec u , \vec x , y )$ and $\Nat \models \forall \vec u , \vec x. A_f(\vec u , \vec x , f(\vec u ; \vec x))$.
8 8
\end{theorem}
9
%\begin{proof}[Proof sketch]
10
% The proof proceeds by induction on the construction of $\mubci{i-1}$ program $f(\vec u ; \vec x)$ to construct a suitable formula $A_f(\vec u, \vec x)$ .  The case of initial functions is easy, by using the formulas we gave in Sect. \ref{sect:graphsbasicfunctions}. The case of safe composition is also simple, and makes use of the $\rais$ rule and simply combines previously defined graphs by \emph{modus ponens} (or, equivalently, $\cut$) and logical steps. 
11
% Naturally, the interesting cases are predicative recursion and predicative minimisation. 
12
% 
13
% For predicative recursion, the basic idea is to follow the classical simulation of primitive recursion in arithmetic, adapted to recursion on notation, by encoding the sequence of recursive calls as an integer $w$. 
14
% The main technical difficulty is the representation of (de)coding functions $\beta (k, x)$ (``the $k$th element of the sequence $x$'') and $\langle x,y \rangle$ (``the pair $(x,y)$'').
15
% For this we represent instead the length-bounded variants of these functions given towards the end of Sect.~\ref{sect:prelims}, relying on the bounded minimisation and polychecking results, Lemmas~\ref{lem:bounded-minimisation} and \ref{lem:polychecking}, to find appropriate length bounds.
16
%%  which is sorted in a suitable way so that the formula $A_f$  is in the same class $\Sigma^{\safe}_i$ as $A_g$,  $A_{h_0}$ and $A_{h_1}$.  This can be obtained by defining  the predicate $\beta (k, w, y)$ with $k$ in $\normal$ and $w$, $y$ in $\safe$.
17
%
18
% 
19
% For predicative minimisation, assume $f$ is defined as  $\succ 1 \mu x .( g(\vec u ; \vec x , x) =_2 0)$ (or $0$ if no such $x$ exists).
20
%  By induction hypothesis we have a $\Sigma^{\safe}_{i-1}$ formula $A_g (\vec u , \vec x , x , y)$  satisfying the property. We then define $A_f(\vec u ; \vec x , z)$ as the following $\Sigma^{\safe}_{i}$ formula:
21
%\[
22
%\begin{array}{rl}
23
%&\left(
24
%z=0 \  \cand \ \forall x^\safe , y^\safe . (A_g (\vec u , \vec x , x, y) \cimp y=_2 1)
25
%\right) \\
26
%\cor & \left(
27
%\begin{array}{ll}
28
%z\neq 0 
29
%& \cand\   \forall y^\safe . (A_g (\vec u , \vec x , p z , y) \cimp y=_2 0 ) \\
30
%& \cand\ \forall x^\safe < p z . \forall y^\safe . (A_g (\vec u , \vec x , x , y) \cimp y=_2 1) 
31
%\end{array}
32
%\right)
33
%\end{array}
34
%\]
35
%As for the analogous bounded arithmetic theory $S^i_2$ (\cite{Buss86book}, Thm 20, p. 58),  $\arith^i$ can prove a \emph{minimisation} principle for $\Sigma^\safe_{i-1}$ formulas, allowing us to extract the \emph{least} witness of a safe existential, which we apply to $\exists x^\safe , y^\safe .  (A_g (\vec u ,\vec x , x , y) \cand y=_2 0)$. From here we are able to readily prove the totality of $A_f(\vec u ; \vec x , z)$, notably relying crucially on classical reasoning.
36
%\end{proof}
9
\begin{proof}[Proof sketch]
10
 The proof proceeds by induction on the construction of the $\mubci{i-1}$ program $f(\vec u ; \vec x)$, constructing a suitable formula $A_f(\vec u, \vec x, y)$. The case of initial functions is easy, using the formulas we gave in Sect.~\ref{sect:graphsbasicfunctions}. The case of safe composition is also simple: it makes use of the $\rais$ rule and simply combines previously defined graphs by \emph{modus ponens} (or, equivalently, $\cut$) and logical steps. 
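 For instance (a sketch of one such combination), if $f(\vec u ; \vec x) \dfn g( \vec h(\vec u;) ; \vec h' (\vec u ; \vec x) )$ then one may take $A_f (\vec u , \vec x , z)$ to be
 \[
 \exists \vec v^\normal . \exists \vec y^\safe .
 \left(
 \bigwedge\limits_i A_{h_i} (\vec u , v_i)
 \wedge
 \bigwedge\limits_j A_{h_j'} (\vec u ; \vec x , y_j)
 \wedge
 A_g ( \vec v , \vec y , z )
 \right)
 \]
 where the existence of the normal witnesses $\vec v$ is obtained from the totality of the $A_{h_i}$ by $\rais$.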
11
 Naturally, the interesting cases are predicative recursion and predicative minimisation. 
12
 
13
 For predicative recursion, the basic idea is to follow the classical simulation of primitive recursion in arithmetic, adapted to recursion on notation, by encoding the sequence of recursive calls as an integer $w$. 
14
 The main technical difficulty is the representation of (de)coding functions $\beta (k, x)$ (``the $k$th element of the sequence $x$'') and $\langle x,y \rangle$ (``the pair $(x,y)$'').
15
 For this we represent instead the length-bounded variants of these functions given towards the end of Sect.~\ref{sect:prelims}, relying on the bounded minimisation and polychecking results, Lemmas~\ref{lem:bounded-minimisation} and \ref{lem:polychecking}, to find appropriate length bounds.
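 Concretely, if $f$ is defined by predicative recursion on notation from $g$, $h_0$ and $h_1$, then the graph formula $A_f (u ,\vec u ; \vec x , y)$ may be taken of the shape
 \[
 \exists w^\safe . \left(
 \begin{array}{ll}
 & \exists^{\safe} y_0 . ( A_g (\vec u , \vec x , y_0) \cand \beta(0, w , y_0) ) \cand \beta(|u|, w , y ) \\
 \cand & \forall^{\normal}  k < |u| . \exists^{\safe} y_k , y_{k+1} . ( \beta (k, w, y_k) \cand \beta (k+1 , w, y_{k+1})  \cand B (u , \vec u ; \vec x , y_k , y_{k+1}) )
 \end{array}
 \right)
 \]
 where $B$ states that $y_{k+1}$ is obtained from $y_k$ by $h_0$ or $h_1$, according to the $(k+1)$th bit of $u$; thus $w$ encodes the sequence of values of $f$ on the prefixes of $u$.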
16
%  which is sorted in a suitable way so that the formula $A_f$  is in the same class $\Sigma^{\safe}_i$ as $A_g$,  $A_{h_0}$ and $A_{h_1}$.  This can be obtained by defining  the predicate $\beta (k, w, y)$ with $k$ in $\normal$ and $w$, $y$ in $\safe$.
37 17

  
38
\begin{proof}
39
	We proceed by structural induction on a $\mubc^{i-1} $ program, dealing with each case in the proceeding paragraphs.
40
	
41
	The property is easily verified for the class of initial functions of  $\mubci{i-1}$: constant, projections, (binary) successors, predecessor, conditional, as already shown in Sect. \ref{sect:graphsbasicfunctions}. Now let us examine the three constructions: predicative minimisation, predicative (safe) recursion and composition. 
42
	\paragraph*{Predicative minimisation}
43
	Suppose $f(\vec u ; \vec x)$ is defined by predicative minimisation from $g$ (we then denote $f$ as as $\mu x^{+1} . g(\vec u ; \vec x , x) =_2 0$). 
44
	By definition $g$ is in $\mubci{i-2}$, and so by the inductive hypothesis there is a $\Sigma^{\safe}_{i-1}$ formula $A_g (\vec u , \vec x , x , y)$ computing the graph of $g$ such that,
45
	\[
46
	\arith^i \proves \forall \vec u^\normal . \forall \vec x^\safe , x^\safe . \exists ! y^\safe . A_g(\vec u , \vec x , x , y)
47
	\]
48
	Let us define $A_f(\vec u ; \vec x , z)$ as:
49
	\[
50
	\begin{array}{rl}
51
	&\left(
52
	z=0 \  \cand \ \forall x^\safe , y^\safe . (A_g (\vec u , \vec x , x, y) \cimp y=_2 1)
53
	\right) \\
54
	\cor & \left(
55
	\begin{array}{ll}
56
	%z\neq 0
57
	z=\succ{1} p z 
58
	& \cand\   \forall y^\safe . (A_g (\vec u , \vec x , p z , y) \cimp y=_2 0 ) \\
59
	& \cand\ \forall x^\safe < p z . \forall y^\safe . (A_g (\vec u , \vec x , x , y) \cimp y=_2 1) 
60
	\end{array}
61
	\right)
62
	\end{array}
63
	\]
64
	Notice that $A_f$ is $\Pi^{\safe}_{i-1}$, since $A_g$ is $\Sigma^{\safe}_{i-1}$ and occurs only in negative context above, with additional safe universal quantifiers occurring in positive context.
65
	In particular this means $A_f$ is $\Sigma^{\safe}_i$.
66
	
67
	Now, to prove totality of $A_f$, we rely on $\Sigma^\safe_{i-1}$-minimisation, which is a consequence of $\cpind{\Sigma^\safe_i}$:
68
	
18
 
19
 For predicative minimisation, assume $f$ is defined as  $\succ 1 \mu x .( g(\vec u ; \vec x , x) =_2 0)$ (or $0$ if no such $x$ exists).
20
  By induction hypothesis we have a $\Sigma^{\safe}_{i-1}$ formula $A_g (\vec u , \vec x , x , y)$  satisfying the property. We then define $A_f(\vec u ; \vec x , z)$ as the following $\Sigma^{\safe}_{i}$ formula:
21
\[
22
\begin{array}{rl}
23
&\left(
24
z=0 \  \cand \ \forall x^\safe , y^\safe . (A_g (\vec u , \vec x , x, y) \cimp y=_2 1)
25
\right) \\
26
\cor & \left(
27
\begin{array}{ll}
28
z\neq 0 
29
& \cand\   \forall y^\safe . (A_g (\vec u , \vec x , p z , y) \cimp y=_2 0 ) \\
30
& \cand\ \forall x^\safe < p z . \forall y^\safe . (A_g (\vec u , \vec x , x , y) \cimp y=_2 1) 
31
\end{array}
32
\right)
33
\end{array}
34
\]
35
As in the analogous bounded arithmetic theory $S^i_2$ (\cite{Buss86book}, Thm 20, p. 58),  $\arith^i$ can prove a \emph{minimisation} principle for $\Sigma^\safe_{i-1}$ formulas, allowing us to extract the \emph{least} witness of a safe existential; we prove this in the lemma below. We apply minimisation to $\exists x^\safe , y^\safe .  (A_g (\vec u ,\vec x , x , y) \cand y=_2 0)$. From here we are able to readily prove the totality of $A_f(\vec u ; \vec x , z)$, arguing by cases and relying crucially on classical reasoning.
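In slightly more detail: with $\vec u \in \normal$ and $\vec x \in \safe$ fixed, either $\exists x^\safe , y^\safe . (A_g (\vec u , \vec x , x , y) \cand y =_2 0)$ holds, in which case the minimisation principle yields a least such $x$ and we may take $z = \succ{1} x$, verifying the second disjunct of $A_f$; or it fails, in which case $\forall x^\safe , y^\safe . (A_g (\vec u , \vec x , x , y) \cimp y =_2 1)$ and we may take $z = 0$, verifying the first disjunct. Uniqueness of $z$ is then easily checked.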
36
\end{proof}
37

  
69 38
	\begin{lemma}
70 39
		[Minimisation]
71
		$\arith^i \proves \cmin{\Sigma^\safe_{i-1}}$.	
40
		For any $\Sigma^\safe_{i-1}$ formula $A$, $\arith^i \proves \exists x^\safe . A(x) \cimp \exists x^\safe . ( A(x) \cand \forall y^\safe \leq x . (A(y) \cimp y=x) ) $.	
72 41
	\end{lemma}
73 42
	% see Thm 20 p. 58 in Buss' book.
74 43
	%\begin{proof}
......
76 45
	\begin{proof}
77 46
		This lemma is proved by the same method as the analogous result in the bounded arithmetic $S_2^{i}$ (see \cite{Buss86book}, Thm 20, p. 58).
78 47
		
79
		Let $A(x)$ be a  $\Sigma^\safe_{i-1}$ formula, with $x$ in $\safe$. We define the formula $B(a,b,c)$ as:
48
		%Let $A(x)$ be a  $\Sigma^\safe_{i-1}$ formula, with $x$ in $\safe$. 
49
		We define the formula $B(a,b,c)$ as:
80 50
		$$ \forall x^{\safe}. (x < |a| \moins b \cimp \zerobit(x,c)) \cand \forall y^{\safe}<c. \neg A(y) \cand \exists y^{\safe} < 2^{|a|\moins b}.A(c+y)$$
81 51
		where $a$ is in $\normal$, $b$ in $\normal$ and $c$ in $\safe$.  $B(a,b,c)$ is in $\Sigma^\safe_{i}$.
82 52
		
......
102 72
		$$(A(a) \cand a \neq 0 ) \cimp \exists x^{\safe } \leq a.B(a,|a|,x).$$
103 73
		But $B(a,|a|,x)$ implies $(\forall y^{\safe}<x. \neg A(y))\cand A(x)$, so we have proven:
104 74
		$$(A(a) \cand a \neq 0 ) \cimp (\exists x^{\safe } \leq a. (\forall y^{\safe}<x. \neg A(y))\cand A(x))$$
105
		which is the $\Sigma_{i-1}^{\safe}$ axiom for $A$.
75
	%	which is the $\Sigma_{i-1}^{\safe}$ axiom for $A$.
76
	as required (if the witness for $\exists x^\safe . A(x)$ is $0$ then it is trivially minimal; otherwise the implication above applies).
106 77
	\end{proof}
107
	
108
	Now, working in $\arith^i$, let $\vec u \in \normal , \vec x \in \safe$ and let us prove:
109
	\[
110
	\exists !z^\safe  . A_f(\vec u ; \vec x , z)
111
	\]
112
	Suppose that $\exists x^\safe , y^\safe .  (A_g (\vec u ,\vec x , x , y) \cand y=_2 0)$.
113
	We can apply minimisation due to the lemma above to find the least $x\in \safe$ such that $\exists y^\safe .  (A_g (\vec u ,\vec x , x , y) \cand y=_2 0)$, and we set $z = \succ 1 x$. So $x= p z$. 
114
	%\todo{verify $z\neq 0$ disjunct.} 
115
	%Then $z \neq 0$ holds. 
116
	Then $z=\succ{1} p z$ holds.
117
	Moreover we had  $\exists ! y^\safe . A_g(\vec u , \vec x , x , y)$. So we deduce that
118
	$\forall y^\safe . (A_g (\vec u , \vec x , p z , y) \cimp y=_2 0 ) $. Finally, as $p z$ is the least element such that
119
	$\exists y^\safe .  (A_g (\vec u ,\vec x , p z , y) \cand y=_2 0)$, we deduce 
120
	$\ \forall x^\safe < p z . \forall y^\safe . (A_g (\vec u , \vec x , x , y) \cimp y=_2 1) $. We conclude that the second member of the disjunction
121
	$A_f(\vec u ; \vec x , z)$ is proven.  
122
	
123
	Otherwise, we have that $\forall x^\safe , y^\safe . (A_g (\vec u , \vec x , x, y) \cimp y=_2 1)$, so we can set $z=0$ and the first member of the disjunction $A_f(\vec u ; \vec x , z)$ is proven.  
124
	
125
	So we have proven $\exists z^\safe  . A_f(\vec u ; \vec x , z)$, and unicity can be easily verified.
126
	
127
	\paragraph*{Predicative recursion on notation}
128
	
129
	\anupam{Assume access to the following predicates (makes completeness easier, soundness will be okay):
130
		\begin{itemize}
131
			%	\item $\pair x y z $ . ``$z$ is the sequence that appends $y$ to the sequence $x$''
132
			\item $\pair x y z$. ``$z$ is the sequence that prepends $x$ to the sequence $y$''
133
			\item $\beta (i; x ,y)$. ``The $i$th element of the sequence $x$ is $y$.''
134
		\end{itemize}
135
	}
136
	\patrick{I also assume access to the following predicates:
137
		\begin{itemize}
138
			%   \item $\zerobit (u,k)$ (resp. $\onebit(u,k)$). " The $k$th bit of $u$ is 0 (resp. 1)"
139
			%   \item $\pref(k,x,y)$. "The prefix of $x$ (as a binary string) of length $k$ is $y$" 
140
			\item $\addtosequence(w,y,w')$. "$w'$ represents the sequence obtained by adding $y$ at the end of the sequence represented by $w$". Here we are referring to sequences which can be decoded with predicate $\beta$.
141
		\end{itemize}}
142
		In the following we will use the predicates $\zerobit (u,k)$, $\onebit(u,k)$, $\pref(k,x,y)$ which have been defined in Sect. \ref{sect:graphsbasicfunctions}.
143
		
144
		Suppose that $f$ is defined by predicative recursion on notation:
145
		\[
146
		\begin{array}{rcl}
147
		f(0 , \vec u ; \vec x) & \dfn & g(\vec u ; \vec x) \\
148
		f(\succ i u, \vec u ; \vec x) & \dfn & h_i( u , \vec u ; \vec x , f(u , \vec u ; \vec x))
149
		\end{array}
150
		\]
151
		
152
		\anupam{using $\beta(i,x,y)$ predicate for sequences: ``$i$th element of $x$ is $y$''. Provably total in $\arith^1$.}
153
		
154
		Suppose we have $\Sigma^\safe_i$ formulae $A_g (\vec u ; \vec x,y)$ and $A_{h_i} (u , \vec u ; \vec x , y , z)$ computing the graphs of $g$ and $h_i$ respectively, provably total in $\arith^i$.
155
		We define $A_f (u ,\vec u ; \vec x , y)$ as,
156
		\[
157
		\exists w^\safe . \left(
158
		\begin{array}{ll}
159
		& 
160
		%Seq(z) \cand 
161
		\exists^{\safe} y_0 . ( A_g (\vec u , \vec x , y_0) \cand \beta(0, w , y_0) ) \cand \beta(|u|, w,y ) \\
162
		%\cand & \forall k < |u| . \exists y_k , y_{k+1} . ( \beta (k, w, y_k) \cand \beta (k+1 ,w, y_{k+1})  \cand A_{h_i} (u , \vec u ; \vec x , y_k , y_{k+1}) )\\
163
		\cand & \forall^{\normal}  k < |u| . \exists^{\safe} y_k , y_{k+1} . ( \beta (k, w, y_k) \cand \beta (k+1 ,w, y_{k+1})  \cand B (u , \vec u ; \vec x , y_k , y_{k+1}) )
164
		\end{array}
165
		\right)
166
		\]
167
		where 
168
		\[
169
		B (u , \vec u ; \vec x , y_k , y_{k+1}) = \left(
170
		\begin{array}{ll}
171
		& \zerobit(u,k+1) \cimp  \exists v .(\pref(k,u,v)  \cand A_{h_0}(v,\vec u ; \vec x, y_k, y_{k+1}) )\\
172
		\cand &  \onebit(u,k+1) \cimp  \exists v .(\pref(k,u,v)  \cand A_{h_1}(v,\vec u ; \vec x, y_k, y_{k+1}) )
173
		\end{array}
174
		\right)
175
		\]
176
		
177
		%which is $\Sigma^\safe_i$ by inspection, and indeed defines the graph of $f$.
178
		
179
		To show totality, let $\vec u \in \normal, \vec x \in \safe$ and proceed by induction on $u \in \normal$.
180
		The base case, when $u=0$, is immediate from the totality of $A_g$, so for the inductive case we need to show:
181
		\[
182
		\exists y^\safe . A_f (u , \vec u ; \vec x , y) 
183
		\quad \seqar \quad
184
		\exists z^\safe . A_f (s_i u, \vec u ; \vec x , z)
185
		\]
186
		
187
		So let us assume $\exists y^\safe . A_f (u , \vec u ; \vec x , y) $. Then there exists $w$ such that $\safe(w)$ and 
188
		$A_f (u , \vec u ; \vec x , w) $.
189
		
190
		We know that there exists a $z$ such that $A_{h_i}(u,\vec u ; \vec x, y, z)$. Let then $w'$ be such that
191
		$\addtosequence(w,z,w')$. Observe also that we have $\pref(|u|,s_i u,u)$
192
		
193
		So we have, for $k=|u|$:
194
		
195
		$$  \beta (k, w', y) \cand \beta (k+1 ,w', z)  \cand B (u , \vec u ; \vec x , y , z).$$
196
		
197
		and we can conclude that
198
		\[
199
		\exists w'^\safe . \left(
200
		\begin{array}{ll}
201
		& 
202
		%Seq(z) \cand 
203
		\exists y_0 . ( A_g (\vec u , \vec x , y_0) \cand \beta(0, w' , y_0) ) \cand \beta(|u|+1, w',z ) \\
204
		\cand & \forall k < |u|+1 . \exists y_k , y_{k+1} . ( \beta (k, w, y_k) \cand \beta (k+1 ,w, y_{k+1})  \cand B (u , \vec u ; \vec x , y_k , y_{k+1}) )
205
		\end{array}
206
		\right)
207
		\]
208
		So $\exists z^{\safe} . A_f (s_i u, \vec u ; \vec x , z)$ has been proven. So by induction we have proven $\forall^{\normal} u,  
209
		\forall^{\normal} \vec u, \exists^{\safe} y. A_f (u ,\vec u ; \vec x , y)$. Moreover one can also check the unicity of $y$, and so we have proved the required formula. 
210
		
211
		\anupam{here need to `add' element to the computation sequence. Need to do this earlier in the paper.}
212
		
213
		\anupam{for inductive cases, need $u\neq 0$ for $\succ 0$ case.}
214
		
215
		\paragraph*{Safe composition}
216
		Now suppose that $f$ is defined by safe composition:
217
		\[
218
		f(\vec u ; \vec x) \quad \dfn \quad g( \vec h(\vec u;) ; \vec h' (\vec u ; \vec x) )
219
		\]
220
		
221
		By the inductive hypothesis, let us suppose that we have $\Sigma^\safe_i $ definitions $A_g , A_{h_i} , A_{h_j'} $ of the graphs of $g , h_i , h_j'$ respectively, which are provably total etc.
222
		In particular, by Raising, we have that $\forall \vec u^\normal . \exists v^\normal . A_{h_i} (\vec u , v)$.
223
		
224
		We define $A_f (\vec u , \vec x , z)$ defining the graph of $f$ as follows:
225
		\[
226
		\exists \vec v^\normal . \exists \vec y^\safe .  
227
		\left(  
228
		\bigwedge\limits_i A_{h_i} (\vec u , v_i)
229
		\wedge
230
		\bigwedge\limits_j A_{h_j'} (\vec u ; \vec x , y_j)
231
		\wedge
232
		A_g ( \vec v , \vec y , z ) 
233
		\right)
234
		\]
235
		The provable totality of $A_f$ follows from simple first-order reasoning, mostly cuts and basic quantifier manipulations.
236
		
237
		\todo{elaborate}
238
		
239
		The proof of Thm \ref{thm:completeness} is thus completed.
240
\end{proof}
241 78

  
79
%\begin{proof}
80
%	We proceed by structural induction on a $\mubc^{i-1} $ program, dealing with each case in the proceeding paragraphs.
81
%	
82
%	The property is easily verified for the class of initial functions of  $\mubci{i-1}$: constant, projections, (binary) successors, predecessor, conditional, as already shown in Sect. \ref{sect:graphsbasicfunctions}. Now let us examine the three constructions: predicative minimisation, predicative (safe) recursion and composition. 
83
%	\paragraph*{Predicative minimisation}
84
%	Suppose $f(\vec u ; \vec x)$ is defined by predicative minimisation from $g$ (we then denote $f$ as as $\mu x^{+1} . g(\vec u ; \vec x , x) =_2 0$). 
85
%	By definition $g$ is in $\mubci{i-2}$, and so by the inductive hypothesis there is a $\Sigma^{\safe}_{i-1}$ formula $A_g (\vec u , \vec x , x , y)$ computing the graph of $g$ such that,
86
%	\[
87
%	\arith^i \proves \forall \vec u^\normal . \forall \vec x^\safe , x^\safe . \exists ! y^\safe . A_g(\vec u , \vec x , x , y)
88
%	\]
89
%	Let us define $A_f(\vec u ; \vec x , z)$ as:
90
%	\[
91
%	\begin{array}{rl}
92
%	&\left(
93
%	z=0 \  \cand \ \forall x^\safe , y^\safe . (A_g (\vec u , \vec x , x, y) \cimp y=_2 1)
94
%	\right) \\
95
%	\cor & \left(
96
%	\begin{array}{ll}
97
%	%z\neq 0
98
%	z=\succ{1} p z 
99
%	& \cand\   \forall y^\safe . (A_g (\vec u , \vec x , p z , y) \cimp y=_2 0 ) \\
100
%	& \cand\ \forall x^\safe < p z . \forall y^\safe . (A_g (\vec u , \vec x , x , y) \cimp y=_2 1) 
101
%	\end{array}
102
%	\right)
103
%	\end{array}
104
%	\]
105
%	Notice that $A_f$ is $\Pi^{\safe}_{i-1}$, since $A_g$ is $\Sigma^{\safe}_{i-1}$ and occurs only in negative context above, with additional safe universal quantifiers occurring in positive context.
106
%	In particular this means $A_f$ is $\Sigma^{\safe}_i$.
107
%	
108
%	Now, to prove totality of $A_f$, we rely on $\Sigma^\safe_{i-1}$-minimisation, which is a consequence of $\cpind{\Sigma^\safe_i}$:
109
%	
110
%	\begin{lemma}
111
%		[Minimisation]
112
%		$\arith^i \proves \cmin{\Sigma^\safe_{i-1}}$.	
113
%	\end{lemma}
114
%	% see Thm 20 p. 58 in Buss' book.
115
%	%\begin{proof}
116
%	%\end{proof}
117
%	\begin{proof}
118
%		This Lemma is proved by using the same method as for the proof of the analogous result in the bounded arithmetic $S_2^{i}$ (see \cite{Buss86book}, Thm 20, p. 58).
119
%		
120
%		Let $A(x)$ be a  $\Sigma^\safe_{i-1}$ formula, with $x$ in $\safe$. We define the formula $B(a,b,c)$ as:
121
%		$$ \forall x^{\safe}. (x < |a| \moins b \cimp \zerobit(x,c)) \cand \forall y^{\safe}<c. \neg A(y) \cand \exists y^{\safe} < 2^{|a|\moins b}.A(c+y)$$
122
%		where $a$ is in $\normal$, $b$ in $\normal$ and $c$ in $\safe$.  $B(a,b,c)$ is in $\Sigma^\safe_{i}$.
123
%		
124
%		The intuitive idea of the proof is to observe that, if $A(a)$ is true for $a\neq 0$, then  $\exists x^{\safe}\leq a. B(a,b,x)$ holds for $b=0$, and to try to prove it for $b=|a|$, by using a length induction on $b$.
125
%		
126
%		First, one can prove:
127
%		$$ (A(a) \cand a \neq 0) \cimp B(a,0,0).$$ 
128
%		And thus:
129
%		$$ (A(a) \cand a \neq 0) \cimp \exists x^{\safe}\leq a .B(a,0,x).$$ 
130
%		We then can check that the two following statements are provable:
131
%		$$
132
%		\begin{array}{rcl}
133
%		(b<|a| \cand B(a,b,c)\cand \exists y^{\safe}<2^{|a| \moins (b+1)}.A(c+y)) &\cimp& B(a,\succ{} b,c)\\
134
%		(b<|a| \cand B(a,b,c)\cand \forall y^{\safe}<2^{|a| \moins(b+1)}.  A(c+y)) &\cimp & B(a,\succ{} b, c+2^{|a| \moins (b+1)})
135
%		\end{array}
136
%		$$
137
%		Moreover we have: $A(a) \cand B(a,b,c) \cimp c\leq a$.
138
%		From these three statements we deduce:
139
%		$$(A(a) \cand a \neq 0 \cand b<|a| \cand \exists x^{\safe} \leq a. B(a,b,x)) \cimp \exists x^{\safe } \leq a.B(a,\succ{} b,x).$$
140
%		Note that here we have used the fact that we are in classical logic.
141
%		
142
%		The formula $\exists x\leq a. B(a,b,c)$ is in $\Sigma^{\safe}_{i}$, so by $\Sigma^{\safe}_{i}$-LIND on the formula $\exists x\leq a. B(a,b,c)$, with the variable $b$ which is in $\normal$,  we obtain:
143
%		$$(A(a) \cand a \neq 0 ) \cimp \exists x^{\safe } \leq a.B(a,|a|,x).$$
144
%		But $B(a,|a|,x)$ implies $(\forall y^{\safe}<x. \neg A(y))\cand A(x)$, so we have proven:
145
%		$$(A(a) \cand a \neq 0 ) \cimp (\exists x^{\safe } \leq a. (\forall y^{\safe}<x. \neg A(y))\cand A(x))$$
146
%		which is the $\Sigma_{i-1}^{\safe}$ axiom for $A$.
147
%	\end{proof}
148
%	
149
%	Now, working in $\arith^i$, let $\vec u \in \normal , \vec x \in \safe$ and let us prove:
150
%	\[
151
%	\exists !z^\safe  . A_f(\vec u ; \vec x , z)
152
%	\]
153
%	Suppose that $\exists x^\safe , y^\safe .  (A_g (\vec u ,\vec x , x , y) \cand y=_2 0)$.
154
%	We can apply minimisation due to the lemma above to find the least $x\in \safe$ such that $\exists y^\safe .  (A_g (\vec u ,\vec x , x , y) \cand y=_2 0)$, and we set $z = \succ 1 x$. So $x= p z$. 
155
%	%\todo{verify $z\neq 0$ disjunct.} 
156
%	%Then $z \neq 0$ holds. 
157
%	Then $z=\succ{1} p z$ holds.
158
%	Moreover we had  $\exists ! y^\safe . A_g(\vec u , \vec x , x , y)$. So we deduce that
159
%	$\forall y^\safe . (A_g (\vec u , \vec x , p z , y) \cimp y=_2 0 ) $. Finally, as $p z$ is the least element such that
160
%	$\exists y^\safe .  (A_g (\vec u ,\vec x , p z , y) \cand y=_2 0)$, we deduce 
161
%	$\ \forall x^\safe < p z . \forall y^\safe . (A_g (\vec u , \vec x , x , y) \cimp y=_2 1) $. We conclude that the second member of the disjunction
162
%	$A_f(\vec u ; \vec x , z)$ is proven.  
163
%	
164
%	Otherwise, we have that $\forall x^\safe , y^\safe . (A_g (\vec u , \vec x , x, y) \cimp y=_2 1)$, so we can set $z=0$ and the first member of the disjunction $A_f(\vec u ; \vec x , z)$ is proven.  
165
%	
166
%	So we have proven $\exists z^\safe  . A_f(\vec u ; \vec x , z)$, and unicity can be easily verified.
167
%	
168
%	\paragraph*{Predicative recursion on notation}
169
%	
170
%	\anupam{Assume access to the following predicates (makes completeness easier, soundness will be okay):
171
%		\begin{itemize}
172
%			%	\item $\pair x y z $ . ``$z$ is the sequence that appends $y$ to the sequence $x$''
173
%			\item $\pair x y z$. ``$z$ is the sequence that prepends $x$ to the sequence $y$''
174
%			\item $\beta (i; x ,y)$. ``The $i$th element of the sequence $x$ is $y$.''
175
%		\end{itemize}
176
%	}
177
%	\patrick{I also assume access to the following predicates:
178
%		\begin{itemize}
179
%			%   \item $\zerobit (u,k)$ (resp. $\onebit(u,k)$). " The $k$th bit of $u$ is 0 (resp. 1)"
180
%			%   \item $\pref(k,x,y)$. "The prefix of $x$ (as a binary string) of length $k$ is $y$" 
181
%			\item $\addtosequence(w,y,w')$. "$w'$ represents the sequence obtained by adding $y$ at the end of the sequence represented by $w$". Here we are referring to sequences which can be decoded with predicate $\beta$.
182
%		\end{itemize}}
183
%		In the following we will use the predicates $\zerobit (u,k)$, $\onebit(u,k)$, $\pref(k,x,y)$ which have been defined in Sect. \ref{sect:graphsbasicfunctions}.
184
%		
185
%		Suppose that $f$ is defined by predicative recursion on notation:
186
%		\[
187
%		\begin{array}{rcl}
188
%		f(0 , \vec u ; \vec x) & \dfn & g(\vec u ; \vec x) \\
189
%		f(\succ i u, \vec u ; \vec x) & \dfn & h_i( u , \vec u ; \vec x , f(u , \vec u ; \vec x))
190
%		\end{array}
191
%		\]
192
%		
193
%		\anupam{using $\beta(i,x,y)$ predicate for sequences: ``$i$th element of $x$ is $y$''. Provably total in $\arith^1$.}
194
%		
195
%		Suppose we have $\Sigma^\safe_i$ formulae $A_g (\vec u ; \vec x,y)$ and $A_{h_i} (u , \vec u ; \vec x , y , z)$ computing the graphs of $g$ and $h_i$ respectively, provably total in $\arith^i$.
196
%		We define $A_f (u ,\vec u ; \vec x , y)$ as,
197
%		\[
198
%		\exists w^\safe . \left(
199
%		\begin{array}{ll}
200
%		& 
201
%		%Seq(z) \cand 
202
%		\exists^{\safe} y_0 . ( A_g (\vec u , \vec x , y_0) \cand \beta(0, w , y_0) ) \cand \beta(|u|, w,y ) \\
203
%		%\cand & \forall k < |u| . \exists y_k , y_{k+1} . ( \beta (k, w, y_k) \cand \beta (k+1 ,w, y_{k+1})  \cand A_{h_i} (u , \vec u ; \vec x , y_k , y_{k+1}) )\\
204
%		\cand & \forall^{\normal}  k < |u| . \exists^{\safe} y_k , y_{k+1} . ( \beta (k, w, y_k) \cand \beta (k+1 ,w, y_{k+1})  \cand B (u , \vec u ; \vec x , y_k , y_{k+1}) )
205
%		\end{array}
206
%		\right)
207
%		\]
208
%		where 
209
%		\[
210
%		B (u , \vec u ; \vec x , y_k , y_{k+1}) = \left(
211
%		\begin{array}{ll}
212
%		& \zerobit(u,k+1) \cimp  \exists v .(\pref(k,u,v)  \cand A_{h_0}(v,\vec u ; \vec x, y_k, y_{k+1}) )\\
213
%		\cand &  \onebit(u,k+1) \cimp  \exists v .(\pref(k,u,v)  \cand A_{h_1}(v,\vec u ; \vec x, y_k, y_{k+1}) )
214
%		\end{array}
215
%		\right)
216
%		\]
217
%		
218
%		%which is $\Sigma^\safe_i$ by inspection, and indeed defines the graph of $f$.
219
%		
220
%		To show totality, let $\vec u \in \normal, \vec x \in \safe$ and proceed by induction on $u \in \normal$.
221
%		The base case, when $u=0$, is immediate from the totality of $A_g$, so for the inductive case we need to show:
222
%		\[
223
%		\exists y^\safe . A_f (u , \vec u ; \vec x , y) 
224
%		\quad \seqar \quad
225
%		\exists z^\safe . A_f (s_i u, \vec u ; \vec x , z)
226
%		\]
227
%		
228
%		So let us assume $\exists y^\safe . A_f (u , \vec u ; \vec x , y) $. Then there exists $w$ such that $\safe(w)$ and 
229
%		$A_f (u , \vec u ; \vec x , w) $.
230
%		
231
%		We know that there exists a $z$ such that $A_{h_i}(u,\vec u ; \vec x, y, z)$. Let then $w'$ be such that
232
%		$\addtosequence(w,z,w')$. Observe also that we have $\pref(|u|,s_i u,u)$
233
%		
234
%		So we have, for $k=|u|$:
235
%		
236
%		$$  \beta (k, w', y) \cand \beta (k+1 ,w', z)  \cand B (u , \vec u ; \vec x , y , z).$$
237
%		
238
%		and we can conclude that
239
%		\[
240
%		\exists w'^\safe . \left(
241
%		\begin{array}{ll}
242
%		& 
243
%		%Seq(z) \cand 
244
%		\exists y_0 . ( A_g (\vec u , \vec x , y_0) \cand \beta(0, w' , y_0) ) \cand \beta(|u|+1, w',z ) \\
245
%		\cand & \forall k < |u|+1 . \exists y_k , y_{k+1} . ( \beta (k, w, y_k) \cand \beta (k+1 ,w, y_{k+1})  \cand B (u , \vec u ; \vec x , y_k , y_{k+1}) )
246
%		\end{array}
247
%		\right)
248
%		\]
249
%		So $\exists z^{\safe} . A_f (s_i u, \vec u ; \vec x , z)$ has been proven. So by induction we have proven $\forall^{\normal} u,  
250
%		\forall^{\normal} \vec u, \exists^{\safe} y. A_f (u ,\vec u ; \vec x , y)$. Moreover one can also check the unicity of $y$, and so we have proved the required formula. 
251
%		
252
%		\anupam{here need to `add' element to the computation sequence. Need to do this earlier in the paper.}
253
%		
254
%		\anupam{for inductive cases, need $u\neq 0$ for $\succ 0$ case.}
255
%		
256
%		\paragraph*{Safe composition}
257
%		Now suppose that $f$ is defined by safe composition:
258
%		\[
259
%		f(\vec u ; \vec x) \quad \dfn \quad g( \vec h(\vec u;) ; \vec h' (\vec u ; \vec x) )
260
%		\]
261
%		
262
%		By the inductive hypothesis, let us suppose that we have $\Sigma^\safe_i $ definitions $A_g , A_{h_i} , A_{h_j'} $ of the graphs of $g , h_i , h_j'$ respectively, which are provably total etc.
263
%		In particular, by Raising, we have that $\forall \vec u^\normal . \exists v^\normal . A_{h_i} (\vec u , v)$.
264
%		
265
%		We define $A_f (\vec u , \vec x , z)$ defining the graph of $f$ as follows:
266
%		\[
267
%		\exists \vec v^\normal . \exists \vec y^\safe .  
268
%		\left(  
269
%		\bigwedge\limits_i A_{h_i} (\vec u , v_i)
270
%		\wedge
271
%		\bigwedge\limits_j A_{h_j'} (\vec u ; \vec x , y_j)
272
%		\wedge
273
%		A_g ( \vec v , \vec y , z ) 
274
%		\right)
275
%		\]
276
%		The provable totality of $A_f$ follows from simple first-order reasoning, mostly cuts and basic quantifier manipulations.
277
%		
278
%		\todo{elaborate}
279
%		
280
%		The proof of Thm \ref{thm:completeness} is thus completed.
281
%\end{proof}
282
%
283
%
242 284

  
243 285

  
244 286

  
......
258 300

  
259 301

  
260 302

  
261

  
CSL17/tech-report/soundness.tex (revision 267)
47 47
In the presence of a compatible sorting, we may easily define functions that \emph{evaluate} safe formulae in $\mubc$:
48 48

  
49 49

  
50
\begin{proposition}
51
Given a $\Sigma^\safe_i$ formula $A$ and compatible sorting $(\vec u; \vec x)$ of its variables, there is a $\mubci{i}$ program $\charfn{\vec u ;\vec x}{A} (l, \vec u ; \vec x)$ computing the characteristic function of $A (\vec u \mode l ; \vec x \mode l)$.
52
\end{proposition}
53

  
54 50
\begin{definition}
55 51
	[Length bounded characteristic functions]
56 52
	We define $\mubci{}$ programs $\charfn{}{A} (l , \vec u; \vec x)$, parametrised by a formula $A$ and a compatible typing $(\vec u ; \vec x)$ of its variables, as follows.
......
85 81
\end{definition}
86 82

  
87 83

  
84
\begin{proposition}
85
Given a $\Sigma^\safe_i$ formula $A$ and compatible sorting $(\vec u; \vec x)$ of its variables, 
86
$\charfn{\vec u ;\vec x}{A} (l, \vec u ; \vec x)$ is in $\mubci{i}$ and computes the characteristic function of $A (\vec u \mode l ; \vec x \mode l)$.
87
\end{proposition}
88

  
89

  
90

  
88 91
We will use the programs $\charfn{}{}$ in the witness functions we define below.
89 92
Let us write $\charfn{}{i}$ to denote the class of functions $\charfn{}{A}$ for $A \in \Sigma^\safe_{i}$.
90 93
For the notion of bounding polynomial below we are a little informal with bounds, using `big-oh' notation, since it suffices for them to be `sufficiently large'.
......
96 99
	For a $\Sigma^\safe_{i}$ formula $A$ with a compatible sorting $(\vec u ; \vec x)$, we define the \emph{length-bounded witness function} $\wit{\vec u ; \vec x}{A} (l, \vec u ; \vec x , w)$ in $\bc (\charfn{}{i-1})$ and its \emph{bounding polynomial} $b_A (l)$ as follows:
97 100
	\begin{itemize}
98 101
		\item If $A$ is $\Pi^\safe_{i-1}$ then $\wit{\vec u ; \vec x}{A} (l, \vec u ; \vec x , w) \dfn \charfn{\vec u ; \vec x}{A} (l, \vec u ; \vec x )$ and we set $b_A (l) = 1$.
99
		\item If $A$ is $B \cor C$ then we may set $b_A = O(b_B + b_C)$ and define $		\wit{\vec u ; \vec x}{A} (l, \vec u ;\vec x , w) \dfn		\orfn ( ; \wit{\vec u ; \vec x}{B} (l, \vec u ; \vec x , \beta (b_B (l) , 0 ; w ) ) , \wit{\vec u ; \vec x}{C} (l, \vec u ; \vec x , \beta (b_C (l) , 0 ; w ) )  )$.
102
		\item If $A$ is $B \cor C$ then we may set $|b_A| = O(|b_B| + |b_C|)$ and define $		\wit{\vec u ; \vec x}{A} (l, \vec u ;\vec x , w) \dfn		\orfn ( ; \wit{\vec u ; \vec x}{B} (l, \vec u ; \vec x , \beta (b_B (l) , 0 ; w ) ) , \wit{\vec u ; \vec x}{C} (l, \vec u ; \vec x , \beta (b_C (l) , 1 ; w ) )  )$.
100 103
%		\[
101 104
%		\wit{\vec u ; \vec x}{A} (l, \vec u ;\vec x , w) 
102 105
%		\quad \dfn \quad
......
121 124
		%appealing to Lemma~\ref{lem:sharply-bounded-recursion}, 
122 125
		and 
123 126
		we set 
124
		$b_A = O(b_{B(t)}^2 )$.
127
		$|b_A| = O(|b_{B(t)}|^2 )$.
125 128
		\item Similarly if $A$ is $\exists u^\normal \leq |t(\vec u;)|. A'(u)$, but with $\exists u \leq |t|$ in place of $\forall u \leq |t|$.
126 129
		\item If $A$ is $\exists x^\safe . B(x) $ then
127 130
		\(
......
129 132
		\dfn 
130 133
		\wit{\vec u ; \vec x , x}{B(x)} ( l, \vec u ; \vec x , \beta( b_{B} (l) , 0;w ) , \beta (q(l) , 1 ;w ))
131 134
		\)
132
		where $q$ is obtained by the polychecking and bounded minimisation properties, Lemmas~\ref{lem:polychecking} and \ref{lem:bounded-minimisation}, for $\wit{\vec u ; \vec x , x}{B(x)}$.
133
		We may set $b_A = O(b_B + q )$.
135
		where $q$ is obtained by the polychecking and bounded minimisation properties,\footnote{Here let us assume that $q$ is formulated as a corresponding quasipolynomial in $l$ as opposed to a polynomial in $|l|$, as in Lemma~\ref{lem:bounded-minimisation}.} Lemmas~\ref{lem:polychecking} and \ref{lem:bounded-minimisation}, for $\wit{\vec u ; \vec x , x}{B(x)}$.
136
		We may set $|b_A| = O(|b_B | + |q|)$.
134 137
	\end{itemize}
135 138
%	\[
136 139
%	\begin{array}{rcl}
......
151 154
\end{definition}
152 155

  
153 156

  
154

  
157
From Lemmas~\ref{lem:polychecking} and \ref{lem:bounded-minimisation} we have:
155 158
\begin{proposition}
156 159
	\label{prop:wit-rfn}
157 160
	If, for some $w$, $\wit{\vec u ; \vec x}{A} (l, \vec u ; \vec x,w) =1$, then $A (\vec u \mode l ; \vec x \mode l)$ is true.
......
177 180
	\begin{array}{rl}
178 181
&	\bigwedge\limits_{A \in \Gamma} \wit{\vec u ; \vec x}{ A} (l, \vec u ; \vec x , w_A \mode b_A(l)) =1 \\
179 182
\noalign{\medskip}
180
\implies & 	\bigvee\limits_{B\in \Delta} \wit{\vec u ; \vec x}{B} (l, \vec u ; \vec x , f^\pi_B((\vec u ; \vec x )\mode l, \vec w \mode p(l))) = 1
183
\implies & 	\bigvee\limits_{B\in \Delta} \wit{\vec u ; \vec x}{B} (l, \vec u ; \vec x , f^\pi_B((\vec u ; \vec x )\mode l, \vec w \mode p^\pi(l))) = 1
181 184
	\end{array}
182 185
	\]
183
	for some polynomial $p$.
186
	for some polynomial $p^\pi$.
184 187
%	\anupam{Need $\vec w \mode p(l)$ for some $p$.}
185 188
%	\anupam{$l$ may occur freely in the programs $f^\pi_B$}
186 189
\end{lemma}
187 190

  
188 191

  
189 192

  
190
For the implication above, let us simply refer to the LHS as $\Wit{\vec u ; \vec x}{\Gamma} (l , \vec u ; \vec x , \vec w)$ and the RHS as $\Wit{\vec u ; \vec x}{ \Delta} (l, \vec u ; \vec x , \vec w')$, with $\vec w'$ in place of $\vec f( \cdots )$, which is a slight abuse of notation: we assume that LHS and RHS are clear from context.
193
For the implication above, let us simply refer to the LHS as $\Wit{\vec u ; \vec x}{\Gamma} (l , \vec u ; \vec x , \vec w)$ and the RHS as $\Wit{\vec u ; \vec x}{ \Delta} (l, \vec u ; \vec x , \vec f^\pi ((\vec u ; \vec x )\mode l, \vec w \mode p^\pi(l))  )$, which is a slight abuse of notation: we assume that LHS and RHS are clear from context.
194
Also let us call $p^\pi$ the \emph{modulus} of $f^\pi$ with respect to $l$.
191 195

  
192 196
\begin{proof}
193 197
	Since the proof is in typed variable normal form we have that each line of the proof is of the same form, i.e.\ $\normal (\vec u), \safe (\vec x) , \Gamma \seqar \Delta$ over free variables $\vec u ; \vec x$.
......
202 206
Notice that, since $\pi$ is in De Morgan form, we have that $A$ is atomic ($s\leq t$) and so, in particular, $\Pi^\safe_{i-1}$.
203 207
So we can simply set the witness for both $A$ and $\cnot A$ to $0$.
204 208
Namely, if $\vec f (\vec u ; \vec x , \vec w , w )$ is obtained by the inductive hypothesis, then we may set $f^\pi_B ( \vec u ; \vec x , \vec w) \dfn f_B (\vec u ;\vec x , \vec w , 0)$ for $B\in \Delta$, and $f^\pi_A (\vec u ; \vec x , \vec w) \dfn 0$. 
205
The bounding polynomial remains the same. 
209
The modulus $p^\pi$ remains the same as that of the inductive hypothesis. 
210

  
211
Left negation is similar, relying on a dummy argument.
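For instance (one natural reading, since this case is not spelled out here): when $\cnot A$ is introduced on the LHS, the corresponding witness argument carries no information, so for each $B \in \Delta$ we may take $f^\pi_B (\vec u ; \vec x , \vec w , w) \dfn f_B (\vec u ; \vec x , \vec w)$, simply ignoring the dummy argument $w$.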
206 212
	
207 213
	\paragraph*{Logical rules}
208
	Pairing, depairing. Need length-boundedness.
209
	
210
	If we have a left conjunction step:
214
%	Pairing, depairing. Need length-boundedness.
215
	Suppose $\pi$ ends with a 
216
	left conjunction step:
211 217
	\[
212 218
	\vlinf{\lefrul{\cand}}{}{ \normal (\vec u ), \safe (\vec x) , A\cand B , \Gamma \seqar \Delta }{ \normal (\vec u ), \safe (\vec x) , A, B , \Gamma \seqar \Delta}
213 219
	\]
......
219 225
	\]
220 226
	for some polynomial $p$.
221 227
	%
222
	We define $\vec f^\pi (\vec u ; \vec x , w , \vec w) \dfn \vec f (\vec u ; \vec x , \beta (p(l),0; w) , \beta(p(l),1;w),\vec w )$ and, by the bounding polynomial for pairing, it suffices to set $p^\pi = O(p)$.
228
	We define $\vec f^\pi (\vec u ; \vec x , w , \vec w) \dfn \vec f (\vec u ; \vec x , \beta (p(l),0; w) , \beta(p(l),1;w),\vec w )$ and, by the bounding polynomial for pairing, it suffices to set $|p^\pi| = O(|p|)$.
223 229
	
230
		Right disjunction step:
231
		\[
232
		\vlinf{\rigrul{\cor}}{}{ \normal (\vec u ), \safe (\vec x) , \Gamma \seqar \Delta , A \cor B}{ \normal (\vec u ), \safe (\vec x) , \Gamma \seqar \Delta, A, B }
233
		\]
234
		$\vec f^\pi_\Delta$ remains the same as that of premiss.
235
		Let $f_A, f_B$ be the functions corresponding to $A$ and $B$ in the premiss, so that:
236
		\[
237
		\Wit{\vec u ; \vec x}{\Gamma} (l, \vec u ; \vec x , \vec w)
238
		\quad \implies \quad
239
		\Wit{\vec u ; \vec x}{\Delta} (l, \vec u ; \vec x , f_C ( (\vec u ; \vec x) \mode l , \vec w \mode p(l) ))		
240
		\]
241
		for $C = A,B$ and for some $p$, such that $f_A , f_B$ are bounded by $q(l)$ (again by IH).
242
		We define $f^\pi_{A\cor B} (\vec u ; \vec x, \vec w) \dfn \pair{q(l)}{f_A (\vec u ; \vec x, \vec w)}{f_B (\vec u ; \vec x, \vec w)}$.
224 243
	
225
	Right disjunction step:
226
	\[
227
	\vlinf{\rigrul{\cor}}{}{ \normal (\vec u ), \safe (\vec x) , \Gamma \seqar \Delta , A \cor B}{ \normal (\vec u ), \safe (\vec x) , \Gamma \seqar \Delta, A, B }
228
	\]
229
	$\vec f^\pi_\Delta$ remains the same as that of premiss.
230
	Let $f_A, f_B$ be the functions corresponding to $A$ and $B$ in the premiss, so that:
231
	\[
232
	\Wit{\vec u ; \vec x}{\Gamma} (l, \vec u ; \vec x , \vec w)
233
	\quad \implies \quad
234
	\Wit{\vec u ; \vec x}{\Delta} (l, \vec u ; \vec x , f_C ( (\vec u ; \vec x) \mode l , \vec w \mode p(l) ))		
235
	\]
236
	for $C = A,B$ and for some $p$, such that $f_A , f_B$ are bounded by $q(l)$ (again by IH).
237
	We define $f^\pi_{A\cor B} (\vec u ; \vec x, \vec w) \dfn \pair{q(l)}{f_A ((\vec u ; \vec x, \vec w))}{f_B ((\vec u ; \vec x, \vec w))}$.
244
	The other logical cases use a similar argument.
245
	
246

  
238 247
	\paragraph*{Quantifiers}
239
	\anupam{Do $\exists$-right and $\forall$-right, left rules are symmetric.}
240
	
241
	
242
	
243
	Sharply bounded quantifiers are generalised versions of logical rules.
248
	If $\pi$ ends with a sharply bounded quantifier step, then we use a similar argument to that for logical rules. 
249
	For instance, suppose $\pi$ ends with a $\rigrul{|\forall|}$:
244 250
	\[
245
	\vlinf{|\forall|}{}{\normal (\vec u) , \safe (\vec x) , \Gamma \seqar  \Delta , \forall u^\normal \leq |t(\vec u;)| . A(u) }{ \normal(u) , \normal (\vec u) , \safe (\vec x) , u \leq |t(\vec u;)| , \Gamma \seqar  \Delta, A(u)  }
251
	\vlinf{\rigrul{|\forall|}}{}{\normal (\vec u) , \safe (\vec x) , \Gamma \seqar  \Delta , \forall u^\normal \leq |t(\vec u;)| . A(u) }{ \normal(u) , \normal (\vec u) , \safe (\vec x) , u \leq |t(\vec u;)| , \Gamma \seqar  \Delta, A(u)  }
246 252
	\]
247 253
	By the inductive hypothesis we have functions $\vec f(u , \vec u ; \vec x , w , \vec w)$ such that:
248 254
	\[
......
256 262
	
257 263
	We set $f^\pi_{\forall u^\normal \leq t . A} (\vec u ; \vec x , \vec w) \dfn F(q(|l|), t(\vec u;), \vec u ; \vec x , 0, \vec w  )$.
258 264
	
265
		\anupam{Do $\exists$-right and $\forall$-right, left rules are symmetric.}
259 266
	
260 267
	Right existential:
261 268
	\[
......
265 272
	
266 273
	
267 274
	\paragraph*{Contraction}
268
	Left contraction simply duplicates an argument, whereas right contraction requires a conditional on a $\Sigma^\safe_i$ formula.
275
	Left contraction simply duplicates an argument,\footnote{We ignore here the cases when contraction is on a $\normal (u) $ or $\safe (x)$ formula, treating these formulae as forming a set rather than a multiset.} whereas right contraction requires a conditional on a $\Sigma^\safe_i$ formula.
276
	This is the reason why we need the witness function itself to be encoded in $\mubc$ rather than simply using a predicate.
269 277
	
278
	For the sake of example, suppose $\pi$ ends with a right contraction step:
270 279
	\[
271 280
	\vlinf{\cntr}{}{\normal (\vec u) , \safe (\vec x) , \Gamma \seqar \Delta , A }{\normal (\vec u) , \safe (\vec x) , \Gamma \seqar \Delta , A , A}
272 281
	\]
273
	
274
	$\vec f^\pi_\Delta$ remains the same as that of premiss. Let $f_0 ,f_1$ correspond to the two copies of $A$ in the premiss.
275
	We define:
282
	By the inductive hypothesis we have functions $\vec f_\Delta , f_0, f_1$ corresponding to the RHS $\Delta , A ,A$.
283
	We may define $\vec f^\pi_\Delta = \vec f_\Delta$ and:
276 284
	\[
277 285
	f^\pi_A ( \vec u ; \vec x , \vec w  )
278 286
	\quad \dfn \quad
