Published by Cambridge University Press, 1999, ISBN 0 521 63107 6.
http://www.dcs.qmw.ac.uk/~pt/Practical_Foundations
This summary explains which parts of the book are original research or novelties of exposition. The entire text is now accessible on the Web in an approximate HTML translation, so if you are reading the HTML version of this document you can go directly to the parts of the text to which it refers.
This book is officially due to be published in mid-May, but it is already in stock, and is available direct from the publishers at £50 (inclusive of overland postage and packing). Please contact
Richard Knott, email: email@example.com
fax: +44 1223 315 052 tel: +44 1223 325 916 (but other methods are preferable)
snail: Cambridge University Press, The Edinburgh Building, Shaftesbury Road, Cambridge, CB2 2RU, UK
with your address and credit card number. (£2.50 extra for airmail.)
The first three chapters will be essential reading for the design of courses in discrete mathematics and reasoning, especially for the ``box method'' of proof taught successfully to first-year informatics students. The middle chapters, from Chapter IV onwards, are an introduction to categorical logic. Fluent translations are provided between the formal languages, showing how to write vernacular proofs that are sound in formal logic.
A later chapter gives a new approach to term algebras, induction and recursion, which have hitherto been treated either naïvely or with set theory. The last two chapters prove in detail the equivalence of types and categories, in particular between generalised algebraic theories and categories with display maps.
Students and teachers of computing, mathematics and philosophy will find this book both readable and of lasting value as a reference work.
The book is an account of the foundations of mathematics (algebra) and theoretical computer science, from a modern constructive viewpoint. It is intended, amongst other things, to provide a bridge between the use of logic underlying mathematical constructions and its abstract study in disciplines such as the lambda calculus, predicate calculus, type theory and universal algebra. It began as the prerequisites for another book (Domains and Duality), so it contains very little domain theory itself, but it does treat the fundamental issues of the semantics of programming languages.
Mathematical and computing issues are interwoven. For example the classifying category for an algebraic theory is defined as a declarative programming language, which is in turn illustrated by the solution of cubic equations.
Category theory plays a major role in the book, but the abstract concepts are introduced on a ``need to know'' basis. Emphasis is placed on how functoriality, naturality, uniqueness, universality and pullback-stability carry the force of the constructions and properties (such as invariance under substitution), rather than allowing the reader to think that these are merely bureaucratic side conditions. Wherever possible, the poset analogues of categorical results are given first.
Excluded Middle is avoided, being largely irrelevant to algebra and category theory (these are often done ``inside a topos''). However, just as it is easier to teach a baby to swim before it has learnt the fear of water, so it is simpler to be constructive from the beginning with naive set theory than to recover it later with Kripke models or Grothendieck toposes.
Every result has been taken apart, sharpened, polished and re-assembled, so most of the sections contain material which is in some way original, and much of it could have been published individually. However much of the value of the book is that it deliberately blurs distinctions between disciplines, resolving numerous apparent conflicts of viewpoint which would never have been considered so thoroughly if they had been treated separately.
Our subject should be concerned with the basic idioms of argument and construction in mathematics and programming, and seek to explain these (as fundamental physics does) in terms of more general and basic phenomena. This is ``discrete math for grown-ups.''
A moderate form of the logicist thesis is established in the tradition from Weierstrass to Bourbaki: that mathematical discussions consist of the manipulation of assertions built using ∧, ∨, ⇒, ∀ and ∃. We shall show how the way in which mathematicians (and programmers), rather than logicians, conduct such discussions really does correspond to a certain semi-formal system of proof in (intuitionistic) predicate calculus. The working mathematician who is aware of this correspondence will be more likely to make valid arguments that others are able to follow. Automated deduction is still in its infancy, but such awareness may also be expected to help with the computer-assisted construction, verification and dissemination of proofs.
One of the more absurd claims of extreme logicism was the reduction of the natural numbers to the predicate calculus. Now we have a richer view of what constitutes logic, based on a powerful analogy between types and propositions. In classical logic, as in classical physics, particles enact a logical script, but neither they nor the stage on which they perform are permanently altered by the experience. In the modern view, matter and its activity are created together, and are interchangeable (the observer also affects the experiment by the strength of the meta-logic).
This analogy, which also makes algebra, induction and recursion part of logic, is a structural part of this book, in that we always treat the simpler propositional or order-theoretic version of a phenomenon as well as the type or categorical form.
Besides this and the classical symmetry between ∧ and ∨ and between ∀ and ∃, the modern rules of logic exhibit one between introduction (proof, element) and elimination (consequence, function). These rules are part of an even wider picture, being examples of adjunctions.
This suggests a new understanding of foundations apart from the mere codification of mathematics in logical scripture. When the connectives and quantifiers have been characterised as (universal) properties of mathematical structures, we can ask what other structures admit these properties. Doing this for coproducts in particular reveals rather a lot of the elementary theory of algebra and topology. We also look for function spaces and universal quantifiers among topological spaces.
A large part of this book is category theory, but that is because for many applications this seems to me to be the most efficient heuristic tool for investigating structure, and comparing it in different examples. Plainly we all write mathematics in a symbolic fashion, so there is a need for fluent translations that render symbols and diagrams as part of the same language. However, it is not enough to see recursion as an example of the adjoint functor theorem, or the propositions-as-types analogy as a reflection of bicategories. We must also contrast the examples and give a full categorical account of symbolic notions like structural recursion.
You should not regard this book as yet another foundational prescription. I have deliberately not given any special status to any particular formal system, whether ancient or modern, because I regard them as the vehicles of meaning, not its cargo. I actually believe the (moderate) logicist thesis less than most mathematicians do. This book is not intended to be synthetic, but analytic - to ask what mathematics requires of logic, with a view to starting again from scratch.
The occasional anecdotes are not meant to be authoritative history. They are there to remind us that mathematics is a human activity, which is always done in some social and historical context, and to encourage the reader to trace its roots. The dates emphasise quite how late logic arrived on the mathematical scene. The footnotes are for colleagues, not general readers, and there are theses to be written à propos of the exercises.
The first three chapters should be accessible to final-year undergraduates in mathematics and informatics; lecturers will be able to select appropriate sections themselves, but should warn students about parenthetical material. Most of the book is addressed to graduate students; some sections and the last two chapters are research material. I hope, however, that every reader will find interesting topics throughout the book.
Chapter IV and the following chapters provide a course on category theory. Chapter III is valuable as a prelude since it contains many of the results (the adjoint functor theorem, for example) in much simpler form.
Sections 1.1-5 (not necessarily in that order), 2.3, 2.4, 2.7, 2.8, 3.1-5 and 4.1-5 provide a course on the semantics of the λ-calculus.
For imperative languages, take Sections 1.4, 1.5 and 4.1-6, together with later material.
An advanced course on type theory would use Chapter IV as a basis, followed by the later chapters, giving semantic and syntactic points of view on similar issues.
Other chapters and sections discuss topics in symbolic logic from the point of view of category theory.
The substitution lemma (extended to include weakening) plays a much more important role in this book than is usual in accounts of syntax, being used as the equations in a sketch for the classifying category in Section 4.3 and later. Our notation for substitution is [a/x]*t, since this is pullback along the morphism [a/x]. An explicit notation, x̂*, is also introduced for weakening.
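The operation [a/x]*t can be sketched concretely. The following is a minimal illustration (not from the book) of capture-avoiding substitution for untyped λ-terms encoded as tagged tuples; all names are illustrative.

```python
def subst(t, x, a):
    """Replace the free occurrences of variable x in term t by term a."""
    tag = t[0]
    if tag == "var":
        return a if t[1] == x else t
    if tag == "app":
        return ("app", subst(t[1], x, a), subst(t[2], x, a))
    # lambda abstraction: respect binding and avoid variable capture
    y, body = t[1], t[2]
    if y == x:                         # x is bound here: nothing free below
        return t
    if y in free_vars(a):              # rename the binder to avoid capture
        z = fresh(y, free_vars(a) | free_vars(body))
        body = subst(body, y, ("var", z))
        y = z
    return ("lam", y, subst(body, x, a))

def free_vars(t):
    if t[0] == "var":
        return {t[1]}
    if t[0] == "app":
        return free_vars(t[1]) | free_vars(t[2])
    return free_vars(t[2]) - {t[1]}

def fresh(y, avoid):
    while y in avoid:
        y += "'"
    return y
```

Substituting y for x under a binder λy forces the binder to be renamed, which is exactly the side condition that the substitution lemma keeps track of.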
Laws as reduction rules; definitions of confluence and normalisation (stated for the λ-calculus in Section 2.3 and proved later). Transitive closure, zig-zags, connected components of graphs.
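These order-theoretic notions have a direct computational reading. A minimal sketch (the dictionary encoding of the step relation is an assumption of this example) of reflexive-transitive closure and of connected components under zig-zags:

```python
def closure(steps, start):
    """Everything reachable from start by zero or more steps."""
    seen, todo = {start}, [start]
    while todo:
        t = todo.pop()
        for u in steps.get(t, []):
            if u not in seen:
                seen.add(u)
                todo.append(u)
    return seen

def components(steps):
    """Connected components: equivalence classes under zig-zags of steps."""
    undirected = {}
    for t, us in steps.items():         # symmetrise the relation
        for u in us:
            undirected.setdefault(t, set()).add(u)
            undirected.setdefault(u, set()).add(t)
    as_lists = {k: list(vs) for k, vs in undirected.items()}
    comps, assigned = [], set()
    for v in undirected:
        if v not in assigned:
            comp = closure(as_lists, v)
            comps.append(comp)
            assigned |= comp
    return comps
```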
A novel example, the Lineland Army (which is in fact the free monoid-with-a-monad), shows how the proofs of confluence and of associativity of composition of normal forms amount to the same thing.
Russell's theory of descriptions, using ``the'' when a predicate has at most one witness, which need not exist. Extending this usage to uniqueness up to isomorphism, for objects such as products that are defined by universal properties.
Partial and total functional relations (following on from the theory of descriptions). Arity, source and target: everything is typed. Relational algebra and terminology, including idempotents.
A novel account of the phrase `there exists', which both asserts the quantified predicate and opens its elimination box; the box is open-ended. Hilbert's epsilon is unnecessary.
The scope rules for boxes; the distributive and Frobenius laws; invariance under substitution (a recurring theme throughout the book).
Idioms for definition and `without loss of generality'.
Historical discussion: Frege, Hilbert, Jaśkowski, Gentzen, Fitch.
Model theory: a ``three sides of a square'' treatment of logic. Comments on teaching logic.
Box-proof heuristics: an explicit recipe for constructing uniform proofs in the box method, based on teaching first year computer science students.
Intuitionism: Ockham, Brouwer, Kolmogorov, Gödel, Hilbert. The axiom of choice; history; dependent choice.
Logic in a topos; generic objects; syntactically constructed worlds.
Similar constructions for unions, etc. The difference between the singleton set and the singleton subset.
A construction of X+Y ⊂ P(X)×P(Y).
Discussion of notation for comprehension, subsets, parametric sets.
Historical comments on the Zermelo-Fraenkel axioms: extensionality and replacement are not trivial, foundation is unnatural; cardinality; the Skolem paradox; the Wiener-Kuratowski pair formula.
Correctness proofs for recursive programs fitting this paradigm. (One example is Gauss's second proof of the algebraic completeness of C.)
Simple or primitive induction, course-of-values or complete induction.
Classical idioms: minimal counterexamples, descending chains, historical comments (including Fermat's example), König's tree lemma, termination of recursive programs. Proof trees.
Complexity measures such as loop variants are based on the fact that functions which preserve the relation reflect its well-foundedness. Applications, such as to weak normalisation for the simply typed lambda calculus. Initial segments; lexicographic and other products.
Zero, successor, nil and cons as introduction rules; primitive recursion as the elimination rule. Predecessor and pattern matching. Monoid actions. Recursion invariants.
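These rules have a direct computational sketch: zero and successor introduce the naturals, and primitive recursion is the unique map out of them. The function names below are illustrative, not the book's.

```python
def nat_rec(z, s, n):
    """Primitive recursion: nat_rec(z, s, 0) = z,
       nat_rec(z, s, n+1) = s(n, nat_rec(z, s, n))."""
    r = z
    for k in range(n):
        r = s(k, r)
    return r

# Predecessor, famously definable by primitive recursion:
def pred(n):
    return nat_rec(0, lambda k, _: k, n)

# Addition by recursion on the first argument:
def add(m, n):
    return nat_rec(n, lambda _, r: r + 1, m)
```

Pattern matching on zero/successor is recovered by discarding the recursive result, as `pred` does.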
The type of propositions; definability of the quantifiers and connectives; Leibniz' principle.
Cantor's diagonalisation theorem for the powerset; Galileo's comments on extending ideas from the finite to the infinite.
Second order polymorphic lambda calculus (System F). Impredicativity; universal properties; Poincaré and Weyl's objections. Answering them by distinguishing between specification and implementation.
Representation of orders by subset-inclusion; presheaves and the Yoneda lemma for posets; quotient poset of a preorder.
Preservation and creation of meets (categorical terminology). Yoneda embedding preserves meets but freely adds joins. Diagrams; equivalent joins; cofinality.
Semilattices; lattices; the distributive law and normal forms.
The poset of partial functions; lifting; information order; the ``Tarski'' fixed point theorem; other fixed point results.
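The ``Tarski'' fixed point theorem has a computational shadow on finite lattices: iterating a monotone operator from the bottom element reaches the least fixed point. A minimal sketch on a finite powerset lattice, with a reachability operator as the (illustrative) example:

```python
def lfp(f, bottom=frozenset()):
    """Least fixed point of a monotone operator on a finite powerset,
       by iteration from the bottom element."""
    x = bottom
    while True:
        y = f(x)
        if y == x:
            return x
        x = y

# Illustrative monotone operator: one-step reachability from node 1.
edges = {1: [2], 2: [3], 3: [3], 4: [1]}
def step(s):
    return frozenset({1}) | frozenset(v for u in s for v in edges.get(u, []))

reachable = lfp(step)   # the least set containing 1 and closed under edges
```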
The Scott topology in terms of closed subsets first, then open subsets; intuitionistic disparity between these; the Sierpiński space; finitely generated (compact) elements; algebraic dcpos.
Historical discussion of Scott's thesis: Ershov, Myhill-Shepherdson, Rice-Shapiro, Strachey. Plotkin: PCF, computational adequacy, parallel or, sequentiality, full abstraction. Cartesian closed categories of domains and spaces; L-domains; Kelley. Synthetic domain theory.
This is an original account of the induction scheme for a system of closure conditions - the propositional analogue of structural recursion for types. It is more general than well founded induction (Section 2.5), as it is non-deterministic. Many examples of induction arise in this way, but are often coerced into induction on natural numbers (rank), even though no numbers actually occur in the problem. Park induction deduces properties of fixed points of Scott-continuous operations from the base case and the operation; this is generalised to monotone operations on condition that the property is Scott-continuous, i.e. the subset that it defines is closed under directed joins. An exercise relates this to Zermelo's proof of well-ordering.
The covariant ones are modal operators, □ and ◊. S4 logic, where R is an order relation. Transitive closure; simpler proofs for Section 2.6. Total and partial correctness of programs; the core of a subgroup. Quantifiers, when R is a function.
The contravariant adjunctions are Galois connections. Specialisation order, `enough'. Examples: separable presheaves, Dedekind cuts, Leibniz' principle, points in an open subset, convergent sequences in a closed subset, annihilator subspaces, centraliser subgroups, number fields.
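Any relation R ⊆ X×Y induces such a contravariant Galois connection (a polarity) between the powersets, which a short sketch can make concrete; divisibility is the illustrative relation here.

```python
def polar(R, A, Y):
    """A ↦ {y in Y | a R y for every a in A}  (antitone)."""
    return frozenset(y for y in Y if all((a, y) in R for a in A))

def copolar(R, B, X):
    """B ↦ {x in X | x R b for every b in B}  (antitone)."""
    return frozenset(x for x in X if all((x, b) in R for b in B))

# Illustrative relation: divisibility on {1,...,6}.
X = Y = frozenset(range(1, 7))
R = {(x, y) for x in X for y in Y if y % x == 0}
```

The composite of the two maps is a closure operation, as the test below checks for the set {2, 3}, whose polar is the common multiples {6} and whose closure is the divisors of 6.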
Adding and preserving joins (developed, initially, without the usual requirement of stability under meets); coverages (Grothendieck topologies); canonical; subcanonical. Examples: ideals, Smyth powerdomain. Joins that are stable under meet; nuclei; Tychonov product.
Summary of how the remainder of the book generalises from propositions to types.
Several proofs of Tarski's fixed point theorem, including Pataraia's intuitionistic proofs for ipos. Well founded elements of a lattice equipped with a successor operation (cf. well founded coalgebras).
Gluing (taken up again later).
In algebra and topology it is easy to see why groups and homomorphisms, or spaces and continuous functions, form a category, so why is it so difficult for type theory? In fact this difficulty is the result of making a head-on attack on the equivalence between lambda calculus and categories as two accounts of the notion of function, and of equating objects with types.
But categories do not axiomatise functions: they encapsulate associativity. What is the fundamental associative operation in type theory? It is (composition of) substitution, and (as we like to use lots of variables together) the objects are not types but contexts.
The category of contexts and substitutions (a phrase chosen to echo the mathematical examples) is generated by an elementary sketch (Sections 4.2-4.3) whose equations are defined by the substitution lemma in Section 1.1. In type theory, more precisely the sequent calculus, the generating morphisms are called cut and weakening.
The same sketch defines the direct declarative programming language, including assignments. This sketch has a covariant imperative action on states, and a contravariant action that is equivalent to Floyd-Hoare logic.
By going via a sketch, the construction of the classifying category of a language in a certain fragment of logic avoids the technical difficulties that always arise from combining recursive and associative definitions. The objects and generating morphisms of the sketch are given directly by the syntactic classes of contexts, types and terms in the language.
Having set this construction up at the algebraic level, we may easily apply it to any fragment of logic. The syntax is already a category, so we may ask straight away about any universal properties that hold inside it. In particular, a later section makes the connection very simply between the universal property of the exponential in category theory and the syntax for function-abstraction and function-types, with the beta and eta laws, in the lambda calculus.
The construction of the sketch is extended to generalised algebraic theories (dependent types), and then the universal properties and type-theoretic rules for the quantifiers are investigated, in the final chapters.
Returning to take a more critical look at the usual mathematical examples of categories, we notice that the ``obviousness'' that models and homomorphisms form categories is a peculiarity of (simply-typed) algebra: there is no single compelling notion of morphism for models of first order theories in general, or even for spaces or generalised algebras such as categories. In fact, the traditional motivation for the notion of category - collecting all the models and then trying to recover the theory - is as much a ``three sides of a square'' approach to logic as is classical model theory (Section 1.6).
The well known shortcomings of traditional category theory - the size problems, and the lack of extension from algebra to first order logic - disappear as soon as we recognise the classifying category (of contexts and substitutions) as the primary notion, and the category of models as a secondary one with limited applicability.
Elementary sketches; unary algebraic theories. Free theories; lists; paths through oriented graphs; regular grammars; automata; polyhedra. Commutative diagrams; Freyd's punctures.
Models; clones; free categories; soundness; completeness.
Canonical elementary language, which others call the ``internal'' language, though it is not internal in the sense of ``internal category''; I really prefer the word ``proper'' (in the sense of the French ``propre''), but this is actually the same as the ``canonical (Grothendieck) topology'' of Section 3.9.
Example: the solution of cubic equations by radicals.
Contravariant logical interpretation; Floyd-Hoare logic; pre-, post- and mid-conditions; weakest preconditions and substitution.
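The weakest precondition of an assignment is exactly substitution into the postcondition, which a small sketch can make concrete. States as dictionaries and the function names below are assumptions of this example, not the book's notation.

```python
def wp_assign(var, expr, post):
    """wp(var := expr, post): the postcondition with expr substituted
       for var, i.e. post evaluated in the updated state."""
    def pre(state):
        new_state = dict(state)
        new_state[var] = expr(state)   # evaluate the expression now
        return post(new_state)
    return pre

# Hoare triple {P} x := x+1 {x > 0}: the weakest such P is x+1 > 0.
post = lambda s: s["x"] > 0
pre = wp_assign("x", lambda s: s["x"] + 1, post)
```

The contravariance is visible in the types: the program maps states forward, while `wp_assign` maps predicates backward.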
The substitution lemma and normal form theorem; in reverse: compilation; discussion of discard (which appeared in Floyd's paper).
The construction of the classifying category from an elementary sketch based on the substitution lemma: see above.
Display maps; terms as sections; discussion of the use of variables; open α-equivalence.
Classifying category for a unary theory, i.e. the free category on a sketch.
The force of functoriality: non-examples, isomorphism-invariants.
Full and faithful; essentially surjective; full and wide subcategories; equivalence functors; examples. Misleading uses of ``forgetful''.
Replete functors and subcategories, the definition being chosen to characterise those functors for which the strict and pseudo-pullbacks against any functor are weakly equivalent.
The quantified variable that occurs in universal properties defining right adjoints is, throughout the book, written as Γ (Gamma) - rather than a randomly chosen letter - to emphasise the connection with (unspecified) contexts in type theory. Θ (Theta) is used for left adjoints.
The terminal object; global elements; local (generalised) elements = terms in an unspecified context Γ; examples from recursion theory showing the difference; class of generators; well pointed (weak and strong senses); concrete.
Uniqueness up to unique isomorphism.
Binary products; examples; preservation and creation; left-associated multiple products.
Discussion of language in the use of the existence of products; the need or otherwise for Choice.
Universal properties define functors.
All universal properties are terminal objects; discussion of simple categorical constructions by analogy with modular programming.
Examples: Horn theories; magma; context-free languages; internal monoids and groups; rings with modules; generators and relations. Non-examples: lists, fields, projective planes.
Semantics of expressions; Scott's semantic brackets; Yacc. The classifying category and its universal property; completeness.
Raw cartesian closed structure in a category; naturality of abstraction with respect to substitution; Currying; interpretation. The beta- and eta-rules; universal property; exponential transposition; discussion of notation.
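Exponential transposition is exactly Currying, which a two-line sketch makes concrete; `mul` is an illustrative binary map.

```python
def curry(f):
    """Transpose A x B -> C into A -> (B -> C)."""
    return lambda a: lambda b: f(a, b)

def uncurry(g):
    """The inverse transposition; applying it to curry(f) recovers f,
       which is the beta/eta content of the universal property."""
    return lambda a, b: g(a)(b)

mul = lambda a, b: a * b
```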
Examples: Heyting semilattices; C-monoids; Set, Pos, Dcpo. A ``four-point plan'' is given for verifying that such a category is cartesian closed, summing up the calculations from Section 3.5.
Sets of solutions of algebraic equations; conjunction; unique existential quantification; essentially algebraic theories; discussion (cf. the chapters on dependent types).
Special subobjects; class of supports (dominion); classifier (dominance); Examples: open, upper and recursively enumerable subsets.
By giving the Floyd-Hoare mid-conditions first-class status, the category defined by a programming language looks increasingly like the category of sets and functions used in mathematics. This section develops partial functions in terms of subobjects and applies them to if then else, which is shown later to have the same categorical characterisation as disjoint unions of sets.
Definition; codiagonal; initial object; empty set. Disjoint union; Boolean type; variant fields; exceptions; overloading.
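The universal property of the disjoint union can be sketched with tagged values: given f : X → Z and g : Y → Z, the copair [f, g] : X+Y → Z is the unique map commuting with the two injections. The tuple encoding is an assumption of this example.

```python
def inl(x):
    """Left injection X -> X+Y."""
    return ("inl", x)

def inr(y):
    """Right injection Y -> X+Y."""
    return ("inr", y)

def copair(f, g):
    """Case analysis [f, g]: the mediating map out of the coproduct."""
    def h(v):
        tag, val = v
        return f(val) if tag == "inl" else g(val)
    return h

# The codiagonal [id, id] : X+X -> X just forgets the tag:
codiagonal = copair(lambda x: x, lambda x: x)
```

Variant fields and exceptions are idiomatic uses of exactly this case analysis.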
Abelian categories; zero map; biproduct or direct sum; non-distributivity. CMon-enriched (additive) categories; homological algebra; exactness. Limit-colimit coincidences.
Stone duality: the modularity or distributivity properties of congruence lattices in certain categories of algebras suggests that they should be seen as the opposites of categories of spaces. Coproducts in CRng and Frm as tensor products.
Van Kampen's theorem: the importance of free coproducts beyond symbolic logic.
Distributivity; binary sums in type theory; strict initial object. Extensive categories; examples; equivalence with stable disjoint coproducts. Disjunctive theories; exceptions.
Effective quotients in Set and Mod(L), where L is a finitary simply-typed algebraic theory; the need for Choice in the infinitary case; projectivity.
Constructing general coequalisers in Set; simpler construction by duality in CSLat.
Image factorisation into regular epis and monos; examples: algebras, separable field extensions, subspace and quotient topologies, dense subspaces, cofinal maps; non-examples because regular epis or monos do not compose.
Composition, cancellation and pullback properties. Example of a prefactorisation system in a poset. Special adjoint functor theorem.
Examples and counterexamples: algebras, regular rings, compact Hausdorff spaces, posets. A regular category need not have coequalisers other than of congruences, and those that it does have need not be stable.
Regular epis compose in a regular category, giving image factorisation. Four notions of surjective function.
Relational algebra: the need for pullback-stability; allegories.
Unions; Heyting implication; division allegories; Lambek calculus. Pretoposes: the elementary properties of Set.
Equationally free, parsable and free algebras; the Peano and Lawvere axioms; Lambek and Smyth-Lehman lemmas. Lists and streams.
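The Lawvere-style characterisation of lists has a direct sketch: nil and cons introduce, and the fold (the unique algebra map out of the initial algebra) eliminates. The examples `length` and `append` below are illustrative instances, not the book's.

```python
def foldr(nil, cons, xs):
    """The unique map to the algebra (nil, cons):
       foldr(nil, cons, []) = nil,
       foldr(nil, cons, [x] + rest) = cons(x, foldr(nil, cons, rest))."""
    r = nil
    for x in reversed(xs):
        r = cons(x, r)
    return r

length = lambda xs: foldr(0, lambda _, n: n + 1, xs)
append = lambda xs, ys: foldr(ys, lambda x, r: [x] + r, xs)
```

Uniqueness is the useful half of the property: any two functions satisfying the same nil/cons equations must agree, which is how list inductions are discharged.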
Infinitary conjunction and disjunction; validity in models of first order theories; strictness analysis.
Construction of (equationally) free algebras using powersets of lists.
Recursive covers; examples: Rubik's cube, provability, PERs, Collatz' problem.
Variables; many-sorted theories; the free category on a graph; Polish notation; free algebras for finitary theories; formation rules in type theories.
Discussion of infinitary theories with laws and the need for Choice; counterexample to the existence of constructive proofs for provable propositions; cf. Example 2.4.8.
Each exit state of the loop corresponds to a component of the transition graph with a fixed point. The idea of the coequaliser is to get (mathematical) access to this component and its properties via the kernel of the coequaliser. The result depends on the exactness properties of a pretopos, plus pullback-stability of the equivalence closure of a relation. The argument actually goes back to Frege's account of transitive closure in the Begriffsschrift, whilst the categorical property is Freyd's characterisation of N in terms of a coproduct and a coequaliser. Freyd's allegories are also used to extend the result to pretoposes that do not have all N-indexed unions.
The Floyd-Hoare correctness rules are justified on this basis. An exercise shows, categorically, how to use an accumulator to reduce unary recursion (with at most one sub-call) to tail recursion.
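The accumulator idiom can be sketched concretely: a unary recursion (at most one sub-call) becomes a tail recursion, which a loop then runs in constant stack space. The summation function is an illustrative example, not the one in the exercise.

```python
def sum_to(n):
    """Unary recursion: exactly one sub-call, with work after it."""
    return 0 if n == 0 else n + sum_to(n - 1)

def sum_to_acc(n, acc=0):
    """Tail-recursive version: the pending work is carried in acc,
       so the tail call becomes a plain loop."""
    while n > 0:
        acc, n = acc + n, n - 1
    return acc
```

The loop invariant `acc + sum_to(n) = sum_to(original n)` is exactly the mid-condition that justifies the transformation.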
Co-universal properties: limits, exponentials, indiscriminate (indiscrete) structure, streams. Symmetric adjunctions: dual spaces, C-monoids, the Lineland Army. Two-sided adjunctions: splitting idempotents, Gp ⊂ Mon.
Classifying categories; universal property: models of a theory in a category C with appropriate structure correspond to functors Cn_L → C that preserve this structure. The analogy with polynomials; difficulties: structure defined by universal properties rather than being chosen; what is a language morphism?; the axiom of replacement. Historical comments on classifying toposes.
An original treatment of the way in which each part of the anatomy of an adjunction (including the correspondence, unit, co-unit and triangle laws) corresponds to some part of the type-theoretic rules. The difference in idiom between initial algebras and left adjoints. Adjoints in 2-categories; comments on how much commoner (and less remarkable) adjunctions are than novices suppose.
Definition; algebras; the Kleisli and Eilenberg-Moore categories; examples: Rel ⊂ CSLat, vector spaces, domains and lift-algebras. Creating limits and (some) colimits; contractible coequalisers; Beck's theorem; Paré's theorem for the contravariant powerset. Applications: finitary theories; Linton; Kock-Zöberlein; transfinite recursion; Moggi; Beck homology.
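Kleisli composition can be sketched with a ``maybe''-style monad, using None for failure; this encoding and the function names are illustrative.

```python
def kleisli(f, g):
    """Compose f : A -> T(B) with g : B -> T(C) in the Kleisli category
       of the maybe monad, where failure (None) propagates."""
    def h(a):
        b = f(a)
        return None if b is None else g(b)
    return h

# Two illustrative partial operations:
safe_recip = lambda x: None if x == 0 else 1 / x
safe_sqrt = lambda x: None if x < 0 else x ** 0.5
```

Associativity of `kleisli` is what makes pipelines of partial operations well defined regardless of bracketing.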
Encoding operations; finite product sketches; interpretations; Horn theories (Section 3.9).
Subcanonical, weakly canonical and strongly canonical languages; how categorical equivalences preserve structure; conservativity; the equivalence theorem. This is an example of a situation that is obviously an adjunction, apart from naturality of the co-unit, which is the issue to be proved.
Proving conservativity syntactically (by normalisation) for products, sums and exponentials.
One reason for my attitude is that my own interest was originally not in pure categorical logic, but the interpretation of polymorphic lambda calculi in categories of domains. These typically do not have all of even the basic structure (finite limits) that Bénabou-Lawvere categorical logic demanded. There is, however, a well defined, albeit slightly different question: how much of a language (for instance with dependent products) can one soundly interpret in a given category? This question is of the same form as asking what function-types (exponentials) exist in a category: there is a scheme of universal properties, and we want to know which instances of them are satisfied. Fibrations rightly ask that these properties be parametric, and invariant under substitution, but these conditions translate exactly into pullback-stability of the relevant universal properties.
In this way, later sections demonstrate the precise translations between the type-theoretic rules (in the style of, say, Martin-Löf) and universal properties.
Overview of classes of display maps, fibrations, the syntactic quantifier formation rules, the indirect rules (∀I and ∃E) as adjunctions, substitution and the Beck-Chevalley condition, and the recursive definition of the interpretation functor.
However, the traditional categorical examples employ categories of models, whereas our explanation is in terms of syntax (classifying categories or categories of contexts and substitutions), so a result is sketched that shows how bifibrations of models arise from fibrations of theories. Restriction and induction of modules for rings arise in this way, as do the standard notions of variable domain, where x £ y gives rise to an embedding-projection pair between D[x] and D[y].
The terms prone and supine are introduced for the forward and backward orthogonals of vertical maps; prone maps are elsewhere called cartesian.
The fibrational technology was first introduced to describe topological group extensions. Discrete and geometric examples of split and non-split extensions are given. The cocycle condition and first cohomology group are recovered from their categorical versions.
As I have said, I came into this subject from domain theory, though I am now interested in categories with similar ``deficiencies'' constructed in different ways. In such categories, the semantic calculations are already rather difficult, perhaps involving non-trivial lattice theory or topology. For this reason, it is essential that the universal properties to be tested for the interpretation of type theory be as simple as it is possible to make them.
The literal type-theoretic formulation involves a certain complication, which turns out exactly to cancel the complication involving the Beck-Chevalley condition, with the effect that the strict definition is equivalent to a type-theoretically naïve one that also appears in the categorical literature from the 1970s with the name ``exponentiability''.
The last half of the section shows that an even more drastic simplification can be made, with the result that the Beck-Chevalley condition does the work for us, instead of being an extra burden, as it is with the notion of hyperdoctrine. The calculation of general dependent products can be reduced to (certain pullbacks and) so-called partial products, which are slightly more complicated than function-spaces, but not much. They first appeared, in geometric topology, in 1965, and, as people gradually wake up to their importance, more and more idioms of construction are being recognised as examples.
For higher-order logic, we distinguish between the name of a type or proposition and its substance, setting out the rules for the type of such names. When the eta (extensionality) law holds, proofs are anonymous and comprehension is available, this type is a support-classifier or dominance, and a topos-theoretic subobject classifier when all monos are classified by propositions.
The eta rule conflicts with other customs in type- and category theory, so the extent of its relevance to mathematics is discussed.
An internal submodel is a type of types in the sense of the previous section. Russell's paradox shows that this cannot be the entire model, if the model is to have equality and function-types in the traditional mathematical sense, but such things are possible in categories of domains.
The construction of submodels in set theory and domain theory is not possible within the Zermelo or topos axioms, but needs (part of) the axiom of replacement. Set theory obfuscates this axiom particularly badly, but it can be regarded as saying that we may form set-indexed unions of sets.
However, the axiom of replacement has a more profound significance in logic than this. In type theory, (part of) it is Martin-Löf's notion of universe. The force lies, not in the existence of a term model, but in the recursive construction of its interpretation. This is illustrated by the amazing power of the gluing construction. As an original piece of research, we also give a categorical formulation of it, as a way of saying that transfinite iterates of functors exist, based on the categorical theory of ordinals developed earlier.