\documentclass[11pt]{article}
\begin{document}
\title{Indefiniteness}
\author{Randall Holmes\thanks{e-mail: \tt holmes@math.idbsu.edu}
\\
Math. Dept., Boise State Univ.}
\date{}
\maketitle

\begin{abstract}
The general topic of {\em undefined terms} was discussed and the basic
approaches were outlined.
\end{abstract}

\section{Approaches}

\subsection{History}

  Russell's theory of descriptions, formulated in response
to the views of Meinong, stands at the beginning of the topic.

\subsection{Alternate Logics}
  Systems of free logic with an existence
predicate (described in a preprint {\em Undefinedness} of Feferman)
implement the logic of Russell's description operator, or,
alternatively, can implement the approach of Meinong which Russell was
trying to discredit.  In either approach, the range of quantifiers is
restricted to the objects which exist; in a ``Russellian'' approach free
variables also range only over existing objects, whereas in a
``Meinongian'' approach free variables range over all objects, existent
or otherwise.

These logics are good at handling partial functions.  The PF logic of
the IMPS theorem prover is a (Russellian) logic of this kind.

The common characteristic of these logics is that rules for
well-formedness of terms can be kept simple, but well-formed terms do
not necessarily denote, and an existence predicate is available to
capture this information.
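As a concrete illustration of this style (a sketch only, not the logic of any particular prover), a partial term can be modeled as a value that may fail to denote, with an explicit existence predicate testing definedness; the names \verb|div| and \verb|E| are invented for the example.

```python
# Sketch of a free-logic treatment of partial terms: a term either
# denotes a value or fails to denote (modeled by None), and an
# existence predicate E records which of the two is the case.

def div(x, y):
    """Partial division: the term x/y fails to denote when y == 0."""
    return None if y == 0 else x / y

def E(term):
    """Existence predicate: does the term denote?"""
    return term is not None

# Well-formedness is kept simple: both terms below are well-formed,
# but only the first denotes.
t1 = div(1, 2)
t2 = div(1, 0)
assert E(t1) and not E(t2)
```

In a Russellian reading, quantifiers (and free variables) would range only over values for which \verb|E| holds; in a Meinongian reading, free variables could also stand for the nondenoting case.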

\subsection{Typing Approaches}
  In this family of approaches, one adopts a
type system designed to keep {\em undefined terms} ill-formed.  This
approach appears to require subtyping (for example, if we are defining
division on the reals, we need the type of nonzero reals for the
typing conditions).  This would appear to lead to complex, even
undecidable type schemes; this might not be a problem if one were
already committed (as in some constructive type theories) to a very
complex typing scheme.
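The subtyping idea can be made concrete in a toy sketch (the type name is invented): division is only well-typed when its divisor has already been checked into a ``nonzero real'' subtype, so the undefined term can never be formed in the first place.

```python
# Sketch of the typing approach: make division total by requiring its
# divisor to inhabit a subtype of nonzero reals.  The nonzero check
# happens once, at the subtype's constructor, so div itself is total.

class NonZeroReal:
    def __init__(self, value: float):
        if value == 0:
            raise TypeError("0 is not a NonZeroReal")
        self.value = value

def div(x: float, y: NonZeroReal) -> float:
    # Total: y.value is guaranteed nonzero by the subtype invariant.
    return x / y.value

assert div(1.0, NonZeroReal(2.0)) == 0.5
# div(1.0, NonZeroReal(0.0)) is rejected before div is ever reached.
```

The cost shows up exactly where the text says: every use site must discharge the subtyping condition, and for less trivial conditions this pushes well-formedness checking toward full theorem proving.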

The PF logic of IMPS is not a scheme of this kind, though it is the
logic of a type theory; in PF logic, one has well-formed terms which
do not denote and an existence predicate (this distinction was brought
out in the discussion).

\subsection{Default Value Approaches}
 In these approaches, well-formedness
of terms is kept simple and all terms are treated as denoting.  Terms
which would normally be regarded as undefined are assigned ``default
values''.  A spectrum of approaches exists, ranging from assigning the
same default value to every {\em undefined term} (or the same default
value to every undefined term of each individual type) to an approach
in which one carefully avoids providing any information about default
values except that they exist.  One tries to choose default values so
as to keep theorems simple, but also keep them familiar-looking; it is
probably a good idea to avoid having too many surprising theorems
resulting from one's default convention.  An example given by Boyer is
a convention used by Morse in a treatment of von Neumann--Bernays--G{\"o}del
set theory (with proper classes) in which the universal class (which
is a first-class object but not an element of any set) is used as the
default value in all cases; this works surprisingly well in many
cases.
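The simplest point on this spectrum can be sketched as follows (the default value 0 is an arbitrary choice for the example): every term denotes, so no side conditions are needed, but the convention buys some surprising theorems.

```python
# Sketch of a default-value treatment of division: undefined cases
# receive a single fixed default value (0 here, chosen arbitrarily).

DEFAULT = 0

def ddiv(x, y):
    """Total division: undefined cases are assigned the default value."""
    return DEFAULT if y == 0 else x / y

# Every term denotes, so equational reasoning needs no definedness
# side conditions, but the convention yields surprising theorems:
assert ddiv(1, 0) == 0      # "1/0 = 0"
assert ddiv(0, 0) == 0      # "0/0 = 0"
assert ddiv(6, 3) == 2.0    # ordinary cases are unaffected
```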

These approaches do not interact well with the use of pointwise
operations on partial functions (for example, consider the sum of the
functions $x \mapsto 1/x$ and the constant 1 on the one hand and the
function $x \mapsto (1/x)+1$ on the other; suppose that {\em undefined values}
are set to 0; these two functions have different values at 0 and so
are distinct).  This suggests the use of a special error value for
undefined values of functions and restricting one's attention to
``strict'' functions (those which send the error value to the error
value); such an approach would culminate in something like the
semantics of Scott's models of the $\lambda$-calculus.  Notice that
Morse's default value approach to set theory described above also
handles this correctly.
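Both phenomena can be checked in a small sketch (representations are invented for the example): functions are represented by their graphs over a finite sample of points, application outside the domain returns the default 0, and the pointwise sum is formed only where both summands are defined.

```python
# Sketch of the mismatch between default values and pointwise sums,
# over a finite sample of points standing in for the reals.

DOM = [-2, -1, 0, 1, 2]

def apply(f, x, default=0):
    return f.get(x, default)

def pointwise_sum(f, g):
    # Defined only where both summands are defined.
    return {x: f[x] + g[x] for x in f if x in g}

inv = {x: 1 / x for x in DOM if x != 0}    # x |-> 1/x, undefined at 0
one = {x: 1 for x in DOM}                  # the constant function 1

s = pointwise_sum(inv, one)                # (1/x) + 1 as a pointwise sum
h = {x: apply(inv, x) + 1 for x in DOM}    # x |-> (1/x) + 1, default used inside

assert apply(s, 0) == 0   # 0 is outside dom(s), so the default is returned
assert apply(h, 0) == 1   # the inner default 0, plus 1: the functions differ

# With a special error value and strict operations, the two agree again:
ERR = object()

def strict_add(a, b):
    return ERR if a is ERR or b is ERR else a + b

def apply_err(f, x):
    return f.get(x, ERR)

s2 = {x: strict_add(apply_err(inv, x), apply_err(one, x)) for x in DOM}
h2 = {x: strict_add(apply_err(inv, x), 1) for x in DOM}
assert s2 == h2           # both take the error value exactly at 0
```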

\section{Relation to QED}

Different approaches to undefined terms create an obstruction to
exchange of information between theorem provers which has nothing to
do with underlying mathematical issues (there are likely to be other
such obstructions arising from differences of mathematical convention
rather than content).

The character of different provers may dictate which approach is most
convenient: for example, in the prover which Holmes is working on, an
equational prover without a built-in type system or an easy way to
express side conditions, a default value approach was most natural.
Thus, a commitment to a single approach throughout QED seems
undesirable.

Holmes suggested the design of a menu of standard approaches to
undefined terms and a set of protocols for translation between these
approaches.  An example: translation from a type-based approach to an
existence-predicate-based approach should be easy.
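To suggest how mechanical such a protocol can be, here is a sketch (all names invented) of one direction of a different but analogous pair: converting a partial function in an existence-predicate representation into a default-value representation.

```python
# Sketch of one direction of a translation protocol between two
# conventions for undefined terms: None-as-nondenoting (existence
# predicate style) versus a fixed default value (default-value style).

DEFAULT = 0

def to_default_style(f):
    """Existence-predicate style -> default-value style."""
    def g(x):
        v = f(x)
        return DEFAULT if v is None else v
    return g

def partial_inv(x):
    return None if x == 0 else 1 / x   # does not denote at 0

total_inv = to_default_style(partial_inv)
assert total_inv(0) == 0
assert total_inv(2) == 0.5
```

Note that the reverse direction loses information (a genuine value equal to the default cannot be told apart from an undefined case), which is why such protocols, and not merely ad hoc conversions, are worth designing.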
\end{document}