List of all papers published at RTA with their abstracts


2015

Port Graphs, Rules and Strategies for Dynamic Data Analytics - Extended Abstract (Invited Talk)
Hélène Kirchner

In the context of understanding, planning and anticipating the behaviour of complex systems, such as biological networks or social networks, this paper proposes port graphs, rules and strategies, combined in strategic rewrite programs, as foundational ingredients for interactive and visual programming and shows how they can contribute to dynamic data analytics.

Matching Logic - Extended Abstract (Invited Talk)
Grigore Roşu

This paper presents matching logic, a first-order logic (FOL)

Executable Formal Models in Rewriting Logic (Invited Talk)
Carolyn Talcott

Formal executable models provide a means to gain insights into the behavior of complex distributed systems. Ideas can be prototyped and assurance gained by carrying out analyses at different levels of fidelity: searching for desirable or undesirable behaviors, determining effects of perturbing the system, and eventually investing effort to carry out formal proofs of key properties. This modeling approach applies to a wide range of systems, including a variety of protocols and networked cyber-physical systems. It is also emerging as an important tool in understanding many different aspects of biological systems.

Certification of Complexity Proofs using CeTA
Martin Avanzini, Christian Sternagel, René Thiemann

Nowadays certification is widely employed by automated termination tools for term rewriting, where certifiers support most available techniques. In complexity analysis, the situation is quite different. Although tools support certification in principle, current certifiers implement only the most basic technique,

Dismatching and Local Disunification in EL
Franz Baader, Stefan Borgwardt, Barbara Morawska

Unification in Description Logics has been introduced as a means to detect redundancies in ontologies. We try to extend the known decidability results for unification in the Description Logic EL to disunification since negative constraints on unifiers can be used to avoid unwanted unifiers. While decidability of the solvability of general EL-disunification problems remains an open problem, we obtain NP-completeness results for two interesting special cases: dismatching problems, where one side of each negative constraint

Nominal Anti-Unification
Alexander Baumgartner, Temur Kutsia, Jordi Levy, Mateu Villaret

We study nominal anti-unification, which is concerned with computing

A faithful encoding of programmable strategies into term rewriting systems
Horatiu Cirstea, Serguei Lenglet, Pierre-Étienne Moreau

Rewriting is a formalism widely used in computer science and

Presenting a Category Modulo a Rewriting System
Florence Clerc, Samuel Mimram

Presentations of categories are a well-known algebraic tool to provide descriptions of categories by means of generators, for objects and morphisms, and relations on morphisms. Here we generalize this notion in order to consider situations where the objects are considered modulo an equivalence relation (in the spirit of rewriting modulo), which is described by equational generators. When those form a convergent (abstract) rewriting system on objects, there are three very natural constructions that can be used to define the category which is described by the presentation: one is based on restricting to objects which are normal forms, one consists in turning equational generators into identities (i.e. considering a quotient category), and one consists in formally adding inverses to equational generators (i.e. localizing the category). We show that, under suitable coherence conditions on the presentation, the three constructions coincide, thus generalizing celebrated results on presentations of groups. We illustrate our techniques on a non-trivial example, and hint at a generalization for 2-categories.

Confluence of nearly orthogonal infinitary term rewriting systems
Łukasz Czajka

We give a relatively simple coinductive proof of confluence, modulo

No complete linear term rewriting system for propositional logic
Anupam Das, Lutz Straßburger

Recently it has been observed that the set of all sound linear inference rules in propositional logic is already coNP-complete, i.e. that every Boolean tautology can be written as a (left- and right-) linear rewrite rule. This raises the question of whether there is a rewriting system on linear terms of propositional logic that is sound and complete for the set of all such rewrite rules. We show in this paper that, as long as reduction steps are polynomial-time decidable, such a rewriting system does not exist unless coNP=NP.
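Soundness of a linear inference can be checked mechanically on small instances. As a toy illustration (invented here, not taken from the paper), the well-known medial implication (p ∧ q) ∨ (r ∧ s) ⊃ (p ∨ r) ∧ (q ∨ s) is linear, since each variable occurs exactly once on each side, and its soundness can be verified by exhaustive truth tables:

```python
from itertools import product

# Medial: (p and q) or (r and s)  entails  (p or r) and (q or s).
# Each variable occurs exactly once on each side, so the rule is linear.
def lhs(p, q, r, s):
    return (p and q) or (r and s)

def rhs(p, q, r, s):
    return (p or r) and (q or s)

# Soundness: every valuation satisfying the left side satisfies the right.
sound = all(not lhs(*v) or rhs(*v) for v in product([False, True], repeat=4))
print(sound)  # True
```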

A Coinductive Framework for Infinitary Rewriting and Equational Reasoning
Jörg Endrullis, Helle Hvid Hansen, Dimitri Hendriks, Andrew Polonsky, Alexandra Silva

We present a coinductive framework for defining infinitary analogues of equational reasoning and rewriting in a uniform way. The setup captures rewrite sequences of arbitrary ordinal length, but it has neither the need for ordinals nor for metric convergence. This makes the framework especially suitable for formalizations in theorem provers.

Proving non-termination by finite automata
Jörg Endrullis, Hans Zantema

A new technique is presented to prove non-termination of term rewriting. The basic idea is to find a non-empty regular language of terms that is closed under rewriting and does not contain normal forms. It is automated by representing the language by a tree automaton with a fixed number of states, and expressing the mentioned requirements in a SAT formula. Satisfiability of this formula implies non-termination. Our approach succeeds for many examples where all earlier techniques fail, for instance for the S-rule from combinatory logic.
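The shape of such a non-termination certificate can be sketched on a much simpler toy instance than the tree-automata setting of the paper (the rule and language below are invented for illustration): for the string rule a → aa, the regular language a+ is closed under rewriting and contains no normal forms, which witnesses non-termination:

```python
import re

rule_lhs, rule_rhs = "a", "aa"   # the (obviously non-terminating) rule a -> aa
in_lang = lambda s: re.fullmatch(r"a+", s) is not None

def one_step(s):
    # all results of rewriting a single occurrence of the left-hand side
    return [s[:i] + rule_rhs + s[i + len(rule_lhs):]
            for i in range(len(s)) if s.startswith(rule_lhs, i)]

sample = ["a" * n for n in range(1, 8)]              # a sample of the language
no_normal_forms = all(one_step(s) for s in sample)   # every word has a redex
closed = all(in_lang(t) for s in sample for t in one_step(s))
print(no_normal_forms and closed)  # True
```

The paper replaces this sampling by an exact check: the language is represented by a tree automaton with a fixed number of states, and closure under rewriting and absence of normal forms are encoded in a SAT formula.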

Reachability Analysis of Innermost Rewriting
Thomas Genet, Yann Salmon

We consider the problem of inferring a grammar describing the output of a functional program given a grammar describing its input. Solutions to this problem are helpful for detecting bugs or proving safety properties of functional programs, and several rewriting tools exist for solving this problem. However, known grammar inference techniques are not able to take evaluation strategies of the program into account. This yields very imprecise results when the evaluation strategy matters. In this work, we adapt the Tree Automata Completion algorithm to accurately approximate the set of

Network Rewriting II: Bi- and Hopf Algebras
Lars Hellström

Bialgebras and their specialisation Hopf algebras are algebraic

Leftmost Outermost Revisited
Nao Hirokawa, Aart Middeldorp, Georg Moser

We present an elementary proof of the classical result that the

Conditional Complexity
Cynthia Kop, Aart Middeldorp, Thomas Sternagel

We propose a notion of complexity for oriented conditional term rewrite systems. This notion is realistic in the sense that it measures not only successful computations but also partial computations that result in a failed rule application. A transformation to unconditional context-sensitive rewrite systems is

Constructing Orthogonal Designs in Powers of Two: Gröbner Bases Meet Equational Unification
Ilias Kotsireas, Temur Kutsia, Dimitris E. Simos

In the past few decades, design theory has grown to encompass a wide variety of research directions. It comes as no surprise that applications in coding theory and communications continue to arise,

Improving Automatic Confluence Analysis of Rewrite Systems by Redundant Rules
Julian Nagele, Bertram Felgenhauer, Aart Middeldorp

We describe how to utilize redundant rewrite rules, i.e., rules that can be simulated by other rules, when (dis)proving confluence of term rewrite systems. We demonstrate how automatic confluence provers benefit from the addition as well as the removal of redundant rules. Due to their simplicity, our transformations were easy to formalize in a proof assistant and are thus amenable to certification. Experimental results show the surprising gain in power.
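The underlying notion can be sketched on a toy string rewriting system (an invented example, not one from the paper's experiments): a rule l → r is redundant when l rewrites to r using the remaining rules, which a bounded breadth-first search can confirm:

```python
from collections import deque

def reachable(src, dst, rules, max_steps=10):
    """Bounded search: does src rewrite to dst using the given rules?"""
    seen, frontier = {src}, deque([(src, 0)])
    while frontier:
        s, d = frontier.popleft()
        if s == dst:
            return True
        if d == max_steps:
            continue
        for lhs, rhs in rules:
            for i in range(len(s) - len(lhs) + 1):
                if s[i:i + len(lhs)] == lhs:
                    t = s[:i] + rhs + s[i + len(lhs):]
                    if t not in seen:
                        seen.add(t)
                        frontier.append((t, d + 1))
    return False

base = [("ab", "ba"), ("ba", "c")]
# A third rule  ab -> c  would be redundant: ab -> ba -> c with the base rules.
print(reachable("ab", "c", base))  # True
```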

Certified Rule Labeling
Julian Nagele, Harald Zankl

The rule labeling heuristic aims to establish confluence of (left-)linear term rewrite systems via decreasing diagrams. We present a formalization of a confluence criterion based on the interplay of relative termination and the rule labeling in the theorem prover Isabelle. Moreover, we report on the integration of this result into the certifier CeTA, facilitating the checking of confluence certificates based on decreasing diagrams for the first time. The power of the method is illustrated by an experimental evaluation on a (standard) collection of confluence problems.

Transforming Cycle Rewriting into String Rewriting
David Sabel, Hans Zantema

We present new techniques to prove termination of cycle rewriting, that is, string rewriting on cycles, which are strings in which the start and end are connected. Our main technique is to transform cycle rewriting into string rewriting and then apply state-of-the-art techniques to prove termination of the string rewrite system.
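The difference between the two notions can be sketched in a few lines of Python (a toy illustration, not the transformation of the paper): representing a cycle as a string up to rotation, a rule may match across the point where the end wraps around to the start:

```python
def rotations(s):
    return {s[i:] + s[:i] for i in range(len(s))}

def cycle_rewrites(cycle, lhs, rhs):
    """All one-step cycle rewrites: apply  lhs -> rhs  in some rotation."""
    results = set()
    for rot in rotations(cycle):
        for i in range(len(rot) - len(lhs) + 1):
            if rot[i:i + len(lhs)] == lhs:
                results.add(rot[:i] + rhs + rot[i + len(lhs):])
    return results

# As a plain string, "ba" has no redex for  ab -> c,
# but as a cycle it does, via the rotation "ab":
print(cycle_rewrites("ba", "ab", "c"))  # {'c'}
```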

Confluence of Orthogonal Nominal Rewriting Systems Revisited
Takaki Suzuki, Kentaro Kikuchi, Takahito Aoto, Yoshihito Toyama

Nominal rewriting systems (Fernandez, Gabbay, Mackie, 2004;

Matrix Interpretations on Polyhedral Domains
Johannes Waldmann

We refine matrix interpretations for proving termination and complexity bounds of term rewrite systems by restricting them to domains that satisfy a system of linear inequalities. Admissibility of such a restriction is shown by certificates whose validity can be expressed as a constraint program. This refinement is orthogonal to other features of matrix interpretations (complexity bounds, dependency pairs), but can be used to improve complexity bounds, and we discuss its relation with the usable rules criterion. We present an implementation and experiments.
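A one-dimensional instance of the method (a linear interpretation, i.e., 1×1 matrices; a deliberately simplified sketch rather than the paper's polyhedral refinement) already conveys the idea for the single string rule ab → ba, with the interpretations below chosen for this example:

```python
# Interpret a(x) = 2x and b(x) = x + 1 over the domain x >= 0.
# Then [ab](x) = a(b(x)) = 2x + 2  >  2x + 1 = b(a(x)) = [ba](x),
# so every application of  ab -> ba  strictly decreases the value,
# and the rule terminates on this admissible domain.
a = lambda x: 2 * x
b = lambda x: x + 1

decreases = all(a(b(x)) > b(a(x)) for x in range(100))  # sample the domain
print(decreases)  # True
```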

Inferring Lower Bounds for Runtime Complexity
Florian Frohn, Jürgen Giesl, Jera Hensel, Cornelius Aschermann, Thomas Ströder

We present the first approach to deduce lower bounds for innermost runtime complexity of term rewrite systems (TRSs) automatically. Inferring lower runtime bounds is useful to detect bugs and to complement existing techniques that compute upper complexity

A Simple and Efficient Step Towards Type-Correct XSLT Transformations
Markus Lepper, Baltasar Trancón y Widemann

XSLT 1.0 is a standardized functional programming language and widely used for defining transformations on XML models and documents, in many areas of industry and publishing. The problem of XSLT type checking is to verify that a given transformation, when applied to an input which conforms to a given structure definition, e.g. an XML DTD, will always produce an output which adheres to a second structure definition. This problem is known to be undecidable for the full range of XSLT and document structure definition languages. Either one or both of them must be significantly restricted, or only approximations can be calculated.

DynSem: A DSL for Dynamic Semantics Specification
Vlad Vergu, Pierre Neron, Eelco Visser

The formal semantics of a programming language and its implementation are typically separately defined, with the risk of

2014

Process Types as a Descriptive Tool for Interaction
Kohei Honda, Nobuko Yoshida, Martin Berger

We demonstrate a tight relationship between linearly typed π-calculi and typed λ-calculi by giving a type-preserving translation from the call-by-value λμ-calculus into a typed π-calculus. The λμ-calculus has a particularly simple representation as typed mobile processes. The target calculus is a simple variant of the linear π-calculus. We establish full abstraction up to maximally consistent observational congruences in source and target calculi using techniques from game semantics and process calculi.

Unnesting of Copatterns
Anton Setzer, Andreas Abel, Brigitte Pientka, David Thibodeau

Inductive data such as finite lists and trees can elegantly be defined by constructors which allow programmers to analyze and manipulate finite data via pattern matching. Dually, coinductive data such as streams can be defined by observations such as head and tail and programmers can synthesize infinite data via copattern matching. This leads to a symmetric language where finite and infinite data can be nested. In this paper, we compile nested pattern and copattern matching into a core language which only supports simple non-nested (co)pattern matching. This core language may serve as an intermediate language of a compiler. We show that this translation is conservative, i.e. the multi-step reduction relation in both languages coincides for terms of the original language. Furthermore, we show that the translation preserves strong and weak normalisation: a term of the original language is strongly/weakly normalising in one language if and only if it is so in the other. In the proof we develop more general criteria which guarantee that extensions of abstract reduction systems are conservative and preserve strong or weak normalisation.
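The flavour of definition by observations can be mimicked in an ordinary untyped language (a loose Python analogy, invented here; the paper works in a typed core calculus with (co)pattern matching): a stream is given by its head and tail observations, with the tail delayed so that the data may be infinite:

```python
class Stream:
    """Coinductive-style stream: defined by the observations head and tail."""
    def __init__(self, head, tail_fn):
        self.head = head
        self._tail_fn = tail_fn        # delayed, so streams can be infinite

    @property
    def tail(self):
        return self._tail_fn()

def from_(n):
    # the stream  n, n+1, n+2, ...  synthesized by its observations
    return Stream(n, lambda: from_(n + 1))

s = from_(0)
print(s.head, s.tail.head, s.tail.tail.head)  # 0 1 2
```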

Proving Confluence of Term Rewriting Systems via Persistency and Decreasing Diagrams
Takahito Aoto, Yoshihito Toyama, Kazumasa Uchida

The decreasing diagrams technique (van Oostrom, 1994) has been successfully used to prove confluence of rewrite systems in various ways; using rule-labelling (van Oostrom, 2008), it can also be applied directly to prove confluence of some linear term rewriting systems (TRSs) automatically. Some efforts for extending the rule-labelling are known, but non-left-linear TRSs remain beyond their scope. We give two methods for automatically proving confluence of non-(left-)linear TRSs with the rule-labelling. The key idea of our methods is to combine the decreasing diagrams technique with persistency of confluence (Aoto & Toyama, 1997).

Predicate Abstraction of Rewrite Theories
Kyungmin Bae, José Meseguer

For an infinite-state concurrent system S with a set AP of state predicates, its predicate abstraction defines a finite-state system whose states are subsets of AP, and its transitions s → s' are witnessed by concrete transitions between states in S satisfying the respective sets of predicates s and s'. Since it is not always possible to find such witnesses, an over-approximation adding extra transitions is often used. For systems S described by formal specifications, predicate abstractions are typically built using various automated deduction techniques. This paper presents a new method—based on rewriting, semantic unification, and variant narrowing—to automatically generate a predicate abstraction when the formal specification of S is given by a conditional rewrite theory. The method is illustrated with concrete examples showing that it naturally supports abstraction refinement and is quite accurate, i.e., it can produce abstractions not needing over-approximations.

Unification and Logarithmic Space
Clément Aubert, Marc Bagnol

We present an algebraic characterization of the complexity classes Logspace and NLogspace, using an algebra with a composition law based on unification. This new bridge between unification and complexity classes is inspired by proof theory, and more specifically by linear logic and Geometry of Interaction.

Ramsey Theorem as an Intuitionistic Property of Well Founded Relations
Stefano Berardi, Silvia Steila

Ramsey Theorem for pairs is a combinatorial result that cannot be intuitionistically proved. In this paper we present a new form of Ramsey Theorem for pairs we call the H-closure Theorem. H-closure is a property of well-founded relations that is intuitionistically provable, informative, and simple to use in intuitionistic proofs. Using our intuitionistic version of Ramsey Theorem we intuitionistically prove the Termination Theorem by Podelski and Rybalchenko. This theorem concerns an algorithm inferring termination for while-programs, and was originally proved from the classical Ramsey Theorem, then intuitionistically, but using an intuitionistic version of Ramsey Theorem different from ours. Our long-term goal is to extract effective bounds for the while-programs from the proof of the Termination Theorem, and our new intuitionistic version of Ramsey Theorem is designed for this goal.

A Model of Countable Nondeterminism in Guarded Type Theory
Ales Bizjak, Lars Birkedal, Marino Miculan

We show how to construct a logical relation for countable nondeterminism in a guarded type theory, corresponding to the internal logic of the topos Sh ω1 of sheaves over ω1. In contrast to earlier work on abstract step-indexed models, we not only construct the logical relations in the guarded type theory, but also give an internal proof of the adequacy of the model with respect to standard contextual equivalence. To state and prove adequacy of the logical relation, we introduce a new propositional modality. In connection with this modality we show why it is necessary to work in the logic of Sh ω1.

Cut Admissibility by Saturation
Guillaume Burel

Deduction modulo is a framework in which theories are integrated into proof systems such as natural deduction or sequent calculus by presenting them using rewriting rules. When only terms are rewritten, cut admissibility in those systems is equivalent to the confluence of the rewriting system, as shown by Dowek, RTA 2003, LNCS 2706. This is no longer true when considering rewriting rules involving propositions. In this paper, we show that, in the same way that it is possible to recover confluence using Knuth-Bendix completion, one can regain cut admissibility in the general case using standard saturation techniques. This work relies on a view of proposition rewriting rules as oriented clauses, like term rewriting rules can be seen as oriented equations. This also leads us to introduce an extension of deduction modulo with conditional term rewriting rules.

Automatic Evaluation of Context-Free Grammars (System Description)
Carles Creus, Guillem Godoy

We implement an online judge for context-free grammars. Our system contains a list of problems describing formal languages, and asking for grammars generating them. A submitted proposal grammar receives a verdict of acceptance or rejection depending on whether the judge determines that it is equivalent to the reference solution grammar provided by the problem setter. Since equivalence of context-free grammars is an undecidable problem, we consider a maximum length ℓ and only test equivalence of the generated languages up to words of length ℓ. This length restriction is very often sufficient for the well-meant submissions. Since this restricted problem is still NP-complete, we design and implement methods based on hashing, SAT, and automata that perform well in practice.
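The bounded test can be sketched as follows (a simplified illustration with an invented grammar encoding, not the judge's actual hashing/SAT/automata machinery): enumerate all words of length at most ℓ generated by each grammar and compare the two sets:

```python
def words_up_to(grammar, start, ell):
    """All words of length <= ell of a CFG given as a dict
    nonterminal -> list of productions (tuples of symbols).
    Assumes the grammar has no epsilon/unit cycles."""
    words, frontier = set(), {(start,)}
    while frontier:
        new = set()
        for form in frontier:
            nts = [i for i, sym in enumerate(form) if sym in grammar]
            if not nts:                     # terminal-only sentential form
                words.add("".join(form))
                continue
            i = nts[0]                      # expand the leftmost nonterminal
            for prod in grammar[form[i]]:
                expanded = form[:i] + prod + form[i + 1:]
                # prune forms whose terminal content already exceeds ell
                if sum(len(s) for s in expanded if s not in grammar) <= ell:
                    new.add(expanded)
        frontier = new
    return words

# Two presentations of { a^n b^n : n >= 0 }:
g1 = {"S": [(), ("a", "S", "b")]}
g2 = {"T": [(), ("U",)], "U": [("a", "T", "b")]}
print(words_up_to(g1, "S", 4) == words_up_to(g2, "T", 4))  # True
```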

Tree Automata with Height Constraints between Brothers
Carles Creus, Guillem Godoy

We define the tree automata with height constraints between brothers (TACBBH). Constraints of equalities and inequalities between the heights of siblings, which restrict the applicability of the rules, are allowed in TACBBH. These constraints make it possible to express natural tree languages such as complete or balanced (e.g., AVL) trees. We prove decidability of emptiness and finiteness for TACBBH, and also for a more general class that additionally allows combining equality and disequality constraints between brothers.

A Coinductive Confluence Proof for Infinitary Lambda-Calculus
Łukasz Czajka

We give a coinductive proof of confluence, up to equivalence of root-active subterms, of infinitary lambda-calculus. We also show confluence of Böhm reduction (with respect to root-active terms) in infinitary lambda-calculus. In contrast to previous proofs, our proof makes heavy use of coinduction and does not employ the notion of descendants.

An Implicit Characterization of the Polynomial-Time Decidable Sets by Cons-Free Rewriting
Daniel de Carvalho, Jakob Grue Simonsen

We define the class of constrained cons-free rewriting systems and show that this class characterizes P, the set of languages decidable in polynomial time on a deterministic Turing machine. The main novelty of the characterization is that it allows very liberal properties of term rewriting, in particular non-deterministic evaluation: no reduction strategy is enforced, and systems are allowed to be non-confluent.

Preciseness of Subtyping on Intersection and Union Types
Mariangiola Dezani-Ciancaglini, Silvia Ghilezan

The notion of subtyping has gained an important role both in theoretical and applicative domains: in lambda and concurrent calculi as well as in programming languages. The soundness and the completeness, together referred to as the preciseness of subtyping, can be considered from two different points of view: denotational and operational. The former preciseness is based on the denotation of a type which is a mathematical object that describes the meaning of the type in accordance with the denotations of other expressions from the language. The latter preciseness has been recently developed with respect to type safety, i.e. the safe replacement of a term of a smaller type when a term of a bigger type is expected.

Abstract Datatypes for Real Numbers in Type Theory
Martín Hötzel Escardó, Alex Simpson

We propose adding to type theory an abstract datatype for a closed interval of real numbers, providing a representation-independent approach to programming with real numbers. The abstract datatype requires only function types and a natural numbers type for its formulation, and so can be added to any type theory that extends Gödel's System T. Our main result establishes that programming with the abstract datatype is equivalent in power to programming intensionally with representations of real numbers. We also consider representing arbitrary real numbers using a mantissa-exponent representation in which the mantissa is taken from the abstract interval.

Self Types for Dependently Typed Lambda Encodings
Peng Fu, Aaron Stump

We revisit lambda encodings of data, proposing new solutions to several old problems, in particular dependent elimination with lambda encodings. We start with a type-assignment form of the Calculus of Constructions, restricted recursive definitions and Miquel's implicit product. We add a type construct ιx.T, called a self type, which allows T to refer to the subject of typing. We show how the resulting System S with this novel form of dependency supports dependent elimination with lambda encodings, including induction principles. Strong normalization of S is established by defining an erasure from S to a version of Fω with positive recursive type definitions, which we analyze. We also prove type preservation for S.

First-Order Formative Rules
Carsten Fuhs, Cynthia Kop

This paper discusses the method of formative rules for first-order term rewriting, which was previously defined for a higher-order setting. Dual to the well-known usable rules, formative rules allow dropping some of the term constraints that need to be solved during a termination proof. Compared to the higher-order definition, the first-order setting allows for significant improvements of the technique.

Automated Complexity Analysis Based on Context-Sensitive Rewriting
Nao Hirokawa, Georg Moser

In this paper we present a simple technique for analysing the runtime complexity of rewrite systems. In complexity analysis many techniques are based on reduction orders. We show how the monotonicity condition for orders can be weakened by using the notion of context-sensitive rewriting. The presented technique is very easy to implement, even in a modular setting, and has been integrated in the Tyrolean Complexity Tool. We provide ample experimental data for assessing the viability of our method.

Amortised Resource Analysis and Typed Polynomial Interpretations
Martin Hofmann, Georg Moser

We introduce a novel resource analysis for typed term rewrite systems based on a potential-based type system. This type system gives rise to polynomial bounds on the innermost runtime complexity. We relate the thus obtained amortised resource analysis to polynomial interpretations and obtain the perhaps surprising result that whenever a rewrite system R can be well-typed, then there exists a polynomial interpretation that orients R. For this we adequately adapt the standard notion of polynomial interpretations to the typed setting.

Confluence by Critical Pair Analysis
Jiaxiang Liu, Nachum Dershowitz, Jean-Pierre Jouannaud

Knuth and Bendix showed that confluence of a terminating first-order rewrite system can be reduced to the joinability of its finitely many critical pairs. We show that this is still true of a rewrite system RT ∪ RNT such that RT is terminating and RNT is a left-linear, rank non-increasing, possibly non-terminating rewrite system. Confluence can then be reduced to the joinability of the critical pairs of RT and to the existence of decreasing diagrams for the critical pairs of RT inside RNT as well as for the rigid parallel critical pairs of RNT.
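In the simpler setting of string rewriting, critical pairs have a compact description (a toy sketch, invented here and restricted to suffix/prefix overlaps; the paper works with term rewriting): a pair arises wherever a proper suffix of one left-hand side equals a prefix of another:

```python
def critical_pairs(rules):
    """Critical pairs of a string rewriting system, for overlaps where a
    proper suffix of l1 equals a prefix of l2 (containments omitted)."""
    pairs = set()
    for l1, r1 in rules:
        for l2, r2 in rules:
            for k in range(1, min(len(l1), len(l2))):
                if l1[-k:] == l2[:k]:
                    left = r1 + l2[k:]       # rewrite the l1-part of l1+l2[k:]
                    right = l1[:-k] + r2     # rewrite the l2-part
                    pairs.add((left, right))
    return pairs

# The rules  aa -> b  and  ab -> c  overlap in the string "aab",
# which rewrites both to "bb" and to "ac":
pairs = critical_pairs([("aa", "b"), ("ab", "c")])
print(("bb", "ac") in pairs)  # True
```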

Proof Terms for Infinitary Rewriting
Carlos Lombardi, Alejandro Ríos, Roel de Vrijer

We generalize the notion of proof term to the realm of transfinite reduction. Proof terms represent reductions in the first-order term format, thereby facilitating their formal analysis. Transfinite reductions can be faithfully represented as infinitary proof terms, unique up to infinitary associativity. We use proof terms to define equivalence of transfinite reductions on the basis of permutation equations. A proof of the compression property via proof terms is presented, which establishes permutation equivalence between the original and the compressed reductions.

Construction of Retractile Proof Structures
Roberto Maieli

In this work we present a paradigm of focusing proof search based on an incremental construction of retractile (i.e., correct or sequentializable) proof structures of the pure (unit-free) multiplicative and additive fragment of linear logic. The correctness of proof construction steps (or expansion steps) is ensured by means of a system of graph retraction rules; this graph rewriting system is shown to be convergent, that is, terminating and confluent. Moreover, the proposed proof construction follows an optimal (indeed, parsimonious) retraction strategy that, at each expansion step, only needs to consider (abstract) graphs that are "smaller" (w.r.t. size) than the starting proof structures.

Local States in String Diagrams
Paul-André Melliès

We establish that the local state monad introduced by Plotkin and Power is a monad with graded arities in the category [Inj,Set]. From this, we deduce that the local state monad is associated to a graded Lawvere theory which is presented by generators and relations, depicted in the graphical language of string diagrams.

Reduction System for Extensional Lambda-mu Calculus
Koji Nakazawa, Tomoharu Nagai

The Λμ-calculus is an extension of Parigot's λμ-calculus. For the untyped Λμ-calculus, Saurin proved some fundamental properties such as the standardization and the separation theorem. Nakazawa and Katsumata gave extensional models, called stream models, in which terms are represented as functions on streams. This paper introduces a conservative extension of the Λμ-calculus, called Λμcons, from which the open term model is straightforwardly constructed as a stream model, and for which we can define a reduction system satisfying several fundamental properties such as confluence, subject reduction, and strong normalization.

The Structural Theory of Pure Type Systems
Cody Roux, Floris van Doorn

We investigate possible extensions of arbitrary given Pure Type Systems with additional sorts and rules which preserve the normalization property. In particular we identify the following interesting extensions: the disjoint union P+Q of two PTSs P and Q, the PTS ∀P.Q which intuitively captures the "Q-logic of P-terms" and poly P which intuitively denotes the predicative polymorphism extension of P.

Applicative May- and Should-Simulation in the Call-by-Value Lambda Calculus with AMB
Manfred Schmidt-Schauß, David Sabel

Motivated by the question whether sound and expressive applicative similarities for program calculi with should-convergence exist, this paper investigates expressive applicative similarities for the untyped call-by-value lambda-calculus extended with McCarthy's ambiguous choice operator amb. Soundness of the applicative similarities w.r.t. contextual equivalence based on may- and should-convergence is proved by adapting Howe's method to should-convergence. As usual for nondeterministic calculi, similarity is not complete w.r.t. contextual equivalence which requires a rather complex counter example as a witness. Also the call-by-value lambda-calculus with the weaker nondeterministic construct erratic choice is analyzed and sound applicative similarities are provided. This justifies the expectation that also for more expressive and call-by-need higher-order calculi there are sound and powerful similarities for should-convergence.

Implicational Relevance Logic is 2-ExpTime-Complete
Sylvain Schmitz

We show that provability in the implicational fragment of relevance logic is complete for doubly exponential time, using reductions to and from coverability in branching vector addition systems.

Near Semi-rings and Lambda Calculus
Rick Statman

A connection between lambda calculus and the algebra of near semi-rings is discussed. Among the results is the following completeness theorem.

All-Path Reachability Logic
Andrei Ştefănescu, Ştefan Ciobâcă, Radu Mereuta, Brandon Michael Moore, Traian Florin Şerbănută, Grigore Roşu

This paper presents a language-independent proof system for reachability properties of programs written in non-deterministic (e.g. concurrent) languages, referred to as all-path reachability logic. It derives partial-correctness properties with all-path semantics (a state satisfying a given precondition reaches states satisfying a given postcondition on all terminating execution paths). The proof system takes as axioms any unconditional operational semantics, and is sound (partially correct) and (relatively) complete, independent of the object language; the soundness has also been mechanized (Coq). This approach is implemented in a tool for semantics-based verification as part of the K framework.

Formalizing Monotone Algebras for Certification of Termination and Complexity Proofs
Christian Sternagel, René Thiemann

Monotone algebras are frequently used to generate reduction orders in automated termination and complexity proofs. To be able to certify these proofs, we formalized several kinds of interpretations in the proof assistant Isabelle/HOL. We report on our integration of matrix interpretations, arctic interpretations, and nonlinear polynomial interpretations over various domains, including the reals.

Conditional Confluence (System Description)
Thomas Sternagel, Aart Middeldorp

This paper describes the Conditional Confluence tool, a fully automatic confluence checker for first-order conditional term rewrite systems. The tool implements various confluence criteria that have been proposed in the literature. A simple technique is presented to test conditional critical pairs for infeasibility, which makes conditional confluence criteria more useful. Detailed experimental data is presented.

Nagoya Termination Tool
Akihisa Yamada, Keiichirou Kusakari, Toshiki Sakabe

This paper describes the implementation and techniques of the Nagoya Termination Tool, a termination prover for term rewrite systems. The main features of the tool are: the first implementation of the weighted path order which subsumes most of the existing reduction pairs, and the efficiency due to the strong cooperation with external SMT solvers. We present some new ideas that contribute to the efficiency and power of the tool.

Termination of Cycle Rewriting
Hans Zantema, Barbara König, Harrie Jan Sander Bruggink

String rewriting can not only be applied on strings, but also on cycles and even on general graphs. In this paper we investigate termination of string rewriting applied on cycles, denoted cycle rewriting for short, which is a strictly stronger requirement than termination on strings. Most techniques for proving termination of string rewriting fail for proving termination of cycle rewriting, but match bounds and some variants of matrix interpretations can be applied. Furthermore, we show how any terminating string rewriting system can be transformed into a terminating cycle rewriting system, preserving derivational complexity.
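
A hedged illustration of the string/cycle distinction (an example sketch, not taken from the paper; all function names are hypothetical): the single rule ba → ab terminates as a string rewrite system, but viewed as a cycle rewrite system it loops, because a rule may match any rotation of the cyclic word.

```python
def string_step(s, lhs, rhs):
    """Apply one string rewrite step, or return None if lhs does not occur."""
    i = s.find(lhs)
    return None if i < 0 else s[:i] + rhs + s[i + len(lhs):]

def cycle_steps(c, lhs, rhs):
    """All successor cycles of c: lhs may match at any rotation.
    Simplifying assumption: len(lhs) <= len(c)."""
    n, doubled, out = len(c), c + c, set()
    for i in range(n):
        if doubled[i:i + len(lhs)] == lhs:
            # replace the matched occurrence, keep the rest of the period
            out.add(rhs + doubled[i + len(lhs):i + n])
    return out

# The rule ba -> ab terminates on strings ...
print(string_step("ba", "ba", "ab"))  # 'ab'
print(string_step("ab", "ba", "ab"))  # None: normal form reached
# ... but the cycle 'ab' (whose rotation 'ba' matches) rewrites to itself:
print(cycle_steps("ab", "ba", "ab"))  # {'ab'}
```

Doubling the word is a standard trick for matching under rotation; the sketch deliberately ignores rules longer than the cycle.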

2013

Linear Logic and Strong Normalization
Beniamino Accattoli

Strong normalization for linear logic requires elaborate rewriting techniques. In this paper we give a new presentation of MELL proof nets, without any commutative cut-elimination rule. We show how this feature induces a compact and simple proof of strong normalization, via reducibility candidates. It is the first proof of strong normalization for MELL which does not rely on any form of confluence, and so it smoothly scales up to full linear logic. Moreover, it is an axiomatic proof, since, more generally, it holds for every set of rewriting rules satisfying three very natural requirements with respect to substitution, commutation with promotion, full composition, and Kesner's IE property. The insight indeed comes from the theory of explicit substitutions, and from looking at the exponentials as a substitution device.

A Combination Framework for Complexity
Martin Avanzini, Georg Moser

In this paper we present a combination framework for the automated polynomial complexity analysis of term rewrite systems. The framework covers both derivational and runtime complexity analysis, and is employed as theoretical foundation in the automated complexity tool TCT. We present generalisations of powerful complexity techniques, notably a generalisation of complexity pairs and (weak) dependency pairs. Finally, we also present a novel technique, called dependency graph decomposition, that in the dependency pair setting greatly increases modularity.

Tyrolean Complexity Tool: Features and Usage
Martin Avanzini, Georg Moser

The Tyrolean Complexity Tool, TCT for short, is an open source complexity analyser for term rewrite systems. Our tool TCT features a majority of the known techniques for the automated characterisation of polynomial complexity of rewrite systems and can investigate derivational and runtime complexity, for full and innermost rewriting. This system description outlines features and provides a short introduction to the usage of TCT.

Abstract Logical Model Checking of Infinite-State Systems Using Narrowing
Kyungmin Bae, Santiago Escobar, José Meseguer

A concurrent system can be naturally specified as a rewrite theory R = (Σ, E, R) where states are elements of the initial algebra of terms modulo E and concurrent transitions are axiomatized by the rewrite rules R. Under simple conditions, narrowing with rules R modulo equations E can be used to symbolically represent the system's state space by means of terms with logical variables. We call this symbolic representation a "logical state space" and it can also be used for model checking verification of LTL properties. Since in general such a logical state space can be infinite, we propose several abstraction techniques for obtaining either an over-approximation or an under-approximation of the logical state space: (i) a folding abstraction that collapses patterns into more general ones, (ii) an easy-to-check method to define (bisimilar) equational abstractions, and (iii) an iterated bounded model checking method that can detect if a logical state space within a given bound is complete. We also show that folding abstractions can be faithful for safety LTL properties, so that they do not generate any spurious counterexamples. These abstraction methods can be used in combination and, as we illustrate with examples, can be effective in making the logical state space finite. We have implemented these techniques in the Maude system, providing the first narrowing-based LTL model checker we are aware of.

Compression of Rewriting Systems for Termination Analysis
Alexander Bau, Markus Lohrey, Eric Nöth, Johannes Waldmann

We adapt the TreeRePair tree compression algorithm and use it as an intermediate step in proving termination of term rewriting systems. We introduce a cost function that approximates the size of constraint systems that specify compatibility of matrix interpretations. We show how to integrate the compression algorithm with the Dependency Pairs transformation. Experiments show that compression reduces running times of constraint solvers, and thus improves the power of automated termination provers.

A Variant of Higher-Order Anti-Unification
Alexander Baumgartner, Temur Kutsia, Jordi Levy, Mateu Villaret

We present a rule-based, Huet-style anti-unification algorithm for simply-typed lambda-terms in eta-long beta-normal form, which computes a least general higher-order pattern generalization. For a pair of arbitrary terms of the same type, such a generalization always exists and is unique modulo alpha-equivalence and variable renaming. The algorithm computes it in cubic time within linear space. It has been implemented and the code is freely available.
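
The first-order core of anti-unification can be sketched briefly; the paper's algorithm handles simply-typed lambda-terms and higher-order patterns, which this illustrative Python fragment (all names hypothetical) does not attempt:

```python
def lgg(s, t, store):
    """Least general generalization of two first-order terms.
    A term is either an atom (str) or ('f', (subterm, ...))."""
    if s == t:
        return s
    if (isinstance(s, tuple) and isinstance(t, tuple)
            and s[0] == t[0] and len(s[1]) == len(t[1])):
        # same head symbol: generalize argument-wise
        return (s[0], tuple(lgg(a, b, store) for a, b in zip(s[1], t[1])))
    # clash: introduce a generalization variable, reusing it for repeated
    # disagreement pairs so the result is *least* general
    if (s, t) not in store:
        store[(s, t)] = f"X{len(store)}"
    return store[(s, t)]

# lgg of f(a, g(a)) and f(b, g(b)) is f(X0, g(X0)), not f(X0, g(X1))
g = lgg(("f", ("a", ("g", ("a",)))), ("f", ("b", ("g", ("b",)))), {})
print(g)  # ('f', ('X0', ('g', ('X0',))))
```

The shared `store` is what makes the generalization least general: the two clashes on the pair (a, b) map to the same variable.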

Over-approximating Descendants by Synchronized Tree Languages
Yohan Boichut, Jacques Chabin, Pierre Réty

Over-approximating the descendants (successors) of an initial set of terms by a rewrite system is used in verification. The success of such verification methods depends on the quality of the approximation. To get better approximations, we use non-regular languages. We present a procedure that always terminates and that computes over-approximations of descendants, using synchronized tree-(tuple) languages expressed by logic programs.

Unifying Nominal Unification
Christophe Calvès

Nominal unification is proven to be quadratic in time and space. This was shown by two different approaches, both inspired by the Paterson-Wegman linear unification algorithm, but dramatically different in the way nominal and first-order constraints are dealt with. To handle nominal constraints, Levy and Villaret introduced the notion of replacings while Calvès and Fernández use permutations and sets of atoms. To deal with structural constraints, the former use multi-equations in a way similar to the Martelli-Montanari algorithm while the latter mimics Paterson-Wegman. In this paper we abstract over these two approaches and generalize them into the notion of modality, highlighting the general ideas behind nominal unification. We show that replacings and environments are in fact isomorphic. This isomorphism is of prime importance to prove intricate properties on both sides and is a step further towards the real complexity of nominal unification.

Rewriting with Linear Inferences in Propositional Logic
Anupam Das

Linear inferences are sound implications of propositional logic where each variable appears exactly once in the premiss and conclusion. We consider a specific set of these inferences, MS, first studied by Straßburger, corresponding to the logical rules in deep inference proof theory. Despite previous results characterising the individual rules of MS, we show that there is no polynomial-time characterisation of MS, assuming that integers cannot be factorised in polynomial time. We also examine the length of rewrite paths in an extended system MSU that also has unit equations, utilising a notion dubbed trivialisation to reduce the case with units to the case without, amongst other observations on MS-rewriting and the set of linear inferences in general.

Proof Orders for Decreasing Diagrams
Bertram Felgenhauer, Vincent van Oostrom

We present and compare some well-founded proof orders for decreasing diagrams. These proof orders order a conversion above another conversion if the latter is obtained by filling any peak in the former by a (locally) decreasing diagram. Therefore each such proof order entails the decreasing diagrams technique for proving confluence. The proof orders differ with respect to monotonicity and complexity. Our results are developed in the setting of involutive monoids. We extend these results to obtain a decreasing diagrams technique for confluence modulo.

Decidable structures between Church-style and Curry-style
Ken-etsu Fujita, Aleksy Schubert

It is well-known that the type-checking and type-inference problems are undecidable for second order lambda-calculus in Curry-style, although those for Church-style are decidable. What causes the differences in decidability and undecidability on the problems? We examine crucial conditions on terms for the (un)decidability property from the viewpoint of partially typed terms, and what kinds of type annotations are essential for (un)decidability of type-related problems. It is revealed that there exists an intermediate structure of second order lambda-terms, called a style of hole-application, between Church-style and Curry-style, such that the type-related problems are decidable under the structure. We also extend this idea to the omega-order polymorphic calculus F-omega, and show that the type-checking and type-inference problems then become undecidable.

Expressibility in the Lambda Calculus with Mu
Clemens Grabmayer, Jan Rochel

We address a problem connected to the unfolding semantics of functional programming languages: give a useful characterization of those infinite lambda-terms that are lambda-letrec-expressible in the sense that they arise as infinite unfoldings of terms in lambda-letrec, the lambda-calculus with letrec. We provide two characterizations, using concepts we introduce for infinite lambda-terms: regularity, strong regularity, and binding-capturing chains.

A Homotopical Completion Procedure with Applications to Coherence of Monoids
Yves Guiraud, Philippe Malbos, Samuel Mimram

One of the most used algorithms in rewriting theory is the Knuth-Bendix completion procedure which starts from a terminating rewriting system and iteratively adds rules to it, trying to produce an equivalent convergent rewriting system. It is in particular used to study presentations of monoids, since normal forms of the rewriting system provide canonical representatives of words modulo the congruence generated by the rules. Here, we are interested in extending this procedure in order to retrieve information about the low-dimensional homotopy properties of a monoid. We therefore consider the notion of coherent presentation, which is a generalization of rewriting systems that keeps track of the cells generated by confluence diagrams. We extend the Knuth-Bendix completion procedure to this setting, resulting in a homotopical completion procedure. It is based on a generalization of Tietze transformations, which are operations that can be iteratively applied to relate any two presentations of the same monoid. We also explain how these transformations can be used to remove useless generators, rules, or confluence diagrams in a coherent presentation, thus leading to a homotopical reduction procedure. Finally, we apply these techniques to the study of some examples coming from representation theory, to compute minimal coherent presentations for them: braid, plactic and Chinese monoids.

Extending Abramsky's Lazy Lambda Calculus: (Non)-Conservativity of Embeddings
Manfred Schmidt-Schauß, Elena Machkasova, David Sabel

Our motivation is the question whether the lazy lambda calculus, a pure lambda calculus with the leftmost outermost rewriting strategy, considered under observational semantics, or extensions thereof, are an adequate model for semantic equivalences in real-world purely functional programming languages, in particular for a pure core language of Haskell. We explore several extensions of the lazy lambda calculus: addition of a seq-operator, addition of data constructors and case-expressions, and their combination, focusing on conservativity of these extensions. In addition to untyped calculi, we study their monomorphically and polymorphically typed versions. For most of the extensions we obtain non-conservativity which we prove by providing counterexamples. However, we prove conservativity of the extension by data constructors and case in the monomorphically typed scenario.

Algorithms for Extended Alpha-Equivalence and Complexity
Manfred Schmidt-Schauß, Conrad Rau, David Sabel

Equality of expressions in lambda-calculi, higher-order programming languages, higher-order programming calculi and process calculi is defined as alpha-equivalence. Permutability of bindings in let-constructs and structural congruence axioms extend alpha-equivalence. We analyse these extended alpha-equivalences and show that there are calculi with polynomial time algorithms, that a multiple-binding "let" may make alpha-equivalence as hard as finding graph-isomorphisms, and that the replication operator in the pi-calculus may lead to an EXPSPACE-hard alpha-equivalence problem.

Unification Modulo Nonnested Recursion Schemes via Anchored Semi-Unification
Gert Smolka, Tobias Tebbi

A recursion scheme is an orthogonal rewriting system with rules of the form f(x1,...,xn) → s. We consider terms to be equivalent if they rewrite to the same redex-free possibly infinite term after infinitary rewriting. For the restriction to the nonnested case, where nested redexes are forbidden, we prove the existence of principal unifiers modulo scheme equivalence. We give an algorithm computing principal unifiers by reducing the problem to a novel fragment of semi-unification we call anchored semi-unification. For anchored semi-unification, we develop a decision algorithm that returns a principal semi-unifier in the positive case.

Formalizing Knuth-Bendix Orders and Knuth-Bendix Completion
Christian Sternagel, René Thiemann

We present extensions of our Isabelle Formalization of Rewriting that cover two historically related concepts: the Knuth-Bendix order and the Knuth-Bendix completion procedure. The former, besides being the first development of its kind in a proof assistant, is based on a generalized version of the Knuth-Bendix order. We compare our version to variants from the literature and show all properties required to certify termination proofs of TRSs. The latter comprises the formalization of important facts that are related to completion, like Birkhoff's theorem, the critical pair theorem, and a soundness proof of completion, showing that the strict encompassment condition is superfluous for finite runs. As a result, we are able to certify completion proofs.

Automatic Decidability: A Schematic Calculus for Theories with Counting Operators
Elena Tushkanova, Christophe Ringeissen, Alain Giorgetti, Olga Kouchnarenko

Many verification problems can be reduced to a satisfiability problem modulo theories. For building satisfiability procedures the rewriting-based approach uses a general calculus for equational reasoning named paramodulation. Schematic paramodulation, in turn, provides means to reason on the derivations computed by paramodulation. Until now, schematic paramodulation was only studied for standard paramodulation. We present a schematic paramodulation calculus modulo a fragment of arithmetic, namely the theory of Integer Offsets. This new schematic calculus is used to prove the decidability of the satisfiability problem for some theories equipped with counting operators. We illustrate our theoretical contribution on theories representing extensions of classical data structures, e.g., lists and records. An implementation within the rewriting-based Maude system constitutes a practical contribution. It enables automatic decidability proofs for theories of practical use.

Normalized Completion Revisited
Sarah Winkler, Aart Middeldorp

Normalized completion (Marché 1996) is a widely applicable and efficient technique for completion modulo theories. If successful, a normalized completion procedure computes a rewrite system that allows one to decide the validity problem using normalized rewriting. In this paper we consider a slightly simplified inference system for finite normalized completion runs. We prove correctness, show faithfulness of critical pair criteria in our setting, and propose a different notion of normalizing pairs. We then show how normalized completion procedures can benefit from AC-termination tools instead of relying on a fixed AC-compatible reduction order. We outline our implementation of this approach in the completion tool mkbtt and present experimental results, including new completions.

Beyond Peano Arithmetic - Automatically Proving Termination of the Goodstein Sequence
Sarah Winkler, Harald Zankl, Aart Middeldorp

Kirby and Paris (1982) proved in a celebrated paper that a theorem of Goodstein (1944) cannot be established in Peano (1889) arithmetic. We present an encoding of Goodstein's theorem as a termination problem of a finite rewrite system. Using a novel implementation of ordinal interpretations, we are able to automatically prove termination of this system, resulting in the first automatic termination proof for a system whose derivational complexity is not multiple recursive. Our method can also cope with the encoding by Touzet (1998) of the battle of Hercules and Hydra, yet another system which has been out of reach for automated tools, until now.

Confluence by Decreasing Diagrams - Formalized
Harald Zankl

This paper presents a formalization of decreasing diagrams in the theorem prover Isabelle. It discusses mechanical proofs showing that any locally decreasing abstract rewrite system is confluent. The valley and the conversion version of decreasing diagrams are considered.

2012

An Abstract Factorization Theorem for Explicit Substitutions
Beniamino Accattoli

We study a simple form of standardization, here called factorization, for explicit substitutions calculi, i.e. lambda-calculi where beta-reduction is decomposed in various rules. These calculi, despite being non-terminating and non-orthogonal, have a key feature: each rule terminates when considered separately. It is well-known that the study of rewriting properties simplifies in presence of termination (e.g. confluence reduces to local confluence). This remark is exploited to develop an abstract theorem deducing factorization from some axioms on local diagrams. The axioms are simple and easy to check, in particular they do not mention residuals. The abstract theorem is then applied to some explicit substitution calculi related to Proof-Nets. We show how to recover standardization by levels, we model both call-by-name and call-by-value calculi and we characterize linear head reduction via a factorization theorem for a linear calculus of substitutions.

On the Invariance of the Unitary Cost Model for Head Reduction
Beniamino Accattoli, Ugo Dal Lago

The lambda-calculus is a widely accepted computational model of higher-order functional programs, yet there is no direct and universally accepted cost model for it. As a consequence, the computational difficulty of reducing lambda-terms to their normal form is typically studied by reasoning on concrete implementation algorithms. In this paper, we show that when head reduction is the underlying dynamics, the unitary cost model is indeed invariant. This improves on known results, which only deal with weak (call-by-value or call-by-name) reduction. Invariance is proved by way of a linear calculus of explicit substitutions, which makes it possible to decompose any head reduction step in the lambda-calculus into more elementary substitution steps, thus making the combinatorics of head-reduction easier to reason about. The technique is also a promising tool to attack what we see as the main open problem, namely understanding for which normalizing strategies the unitary cost model is invariant, if any.

A Term Rewriting System for Kuratowski's Closure-Complement Problem
Osama Al-Hassani, Quratul-ain Mahesar, Claudio Sacerdoti Coen, Volker Sorge

We present a term rewriting system to solve a class of open problems that are generalisations of Kuratowski's closure-complement theorem. The problems are concerned with finding the number of distinct sets that can be obtained by applying combinations of axiomatically defined set operators. While the original problem considers only closure and complement of a topological space as operators, it can be generalised by adding operators and varying axiomatisation. We model these axioms as rewrite rules and construct a rewriting system that allows us to close some so far open variants of Kuratowski's problem by analysing several million inference steps on a typical personal computer.

Term Rewriting Systems as Topological Dynamical Systems
Søren Bjerg Andersen, Jakob Grue Simonsen

Topological dynamics is, roughly, the study of phenomena related to iterations of continuous maps from a metric space to itself. We show how the rewrite relation in term rewriting gives rise to dynamical systems in two distinct, natural ways: (A) One in which any deterministic rewriting strategy induces a dynamical system on the set of finite and infinite terms endowed with the usual metric, and (B) one in which the unconstrained rewriting relation induces a dynamical system on sets of sets of terms, specifically the set of compact subsets of the set of finite and infinite terms endowed with the Hausdorff metric. For both approaches, we give sufficient criteria for the induced systems to be well-defined dynamical systems and for (A) we demonstrate how the classic topological invariant called topological entropy turns out to be much less useful in the setting of term rewriting systems than in symbolic dynamics.

Infinitary Term Graph Rewriting is Simple, Sound and Complete
Patrick Bahr

Based on a simple metric and a simple partial order on term graphs, we develop two infinitary calculi of term graph rewriting. We show that, similarly to infinitary term rewriting, the partial order formalisation yields a conservative extension of the metric formalisation of the calculus. By showing that the resulting calculi simulate the corresponding well-established infinitary calculi of term rewriting in a sound and complete manner, we argue for the appropriateness of our approach to capture the notion of infinitary term graph rewriting.

Axiomatic Sharing-via-Labelling
Thibaut Balabonski

A judicious use of labelled terms makes it possible to bring together the simplicity of term rewriting and the sharing power of graph rewriting: this has been known for twenty years in the particular case of orthogonal first-order systems. The present paper introduces a concise and easily usable axiomatic presentation of sharing-via-labelling techniques that applies to higher-order term rewriting as well as to non-orthogonal term rewriting. This provides a general framework for the sharing of subterms and keeps the formalism as simple as term rewriting.

On the Decidability Status of Reachability and Coverability in Graph Transformation Systems
Nathalie Bertrand, Giorgio Delzanno, Barbara König, Arnaud Sangnier, Jan Stückrath

We study decidability issues for reachability problems in graph transformation systems, a powerful infinite-state model. For a fixed initial configuration, we consider reachability of an entirely specified configuration and of a configuration that satisfies a given pattern (coverability). The former is a fundamental problem for any computational model, the latter is strictly related to verification of safety properties in which the pattern specifies an infinite set of bad configurations. In this paper we reformulate results obtained, e.g., for context-free graph grammars and concurrency models, such as Petri nets, in the more general setting of graph transformation systems and study new results for classes of models obtained by adding constraints on the form of reduction rules.

Normalisation for Dynamic Pattern Calculi
Eduardo Bonelli, Delia Kesner, Carlos Lombardi, Alejandro Ríos

The Pure Pattern Calculus (PPC) extends the lambda-calculus, as well as the family of algebraic pattern calculi, with first-class patterns; that is, patterns can be passed as arguments, evaluated and returned as results. The notion of matching failure of the PPC not only provides a mechanism to define functions by pattern matching on cases but also supplies PPC with parallel-or-like, non-sequential behaviour. Therefore, devising normalising strategies for PPC to obtain well-behaved implementations turns out to be challenging. This paper focuses on normalising reduction strategies for PPC. We define a (multistep) strategy and show that it is normalising. The strategy generalises the leftmost-outermost strategy for lambda-calculus and is strictly finer than parallel-outermost. The normalisation proof is based on the notion of necessary set of redexes, a generalisation of the notion of needed redex encompassing non-sequential reduction systems.

A Semantic Proof that Reducibility Candidates entail Cut Elimination
Denis Cousineau, Olivier Hermant

Two main lines have been adopted to prove the cut elimination theorem: the syntactic one, that studies the process of reducing cuts, and the semantic one, that consists in interpreting a sequent in some algebra and extracting from this interpretation a cut-free proof of this very sequent. A link between those two methods was exhibited by studying, in a semantic way, syntactic tools that allow one to prove (strong) normalization of proof-terms, namely reducibility candidates. In the case of deduction modulo, a framework combining deduction and rewriting rules in which theories like Zermelo set theory and higher order logic can be expressed, this is obtained by constructing a reducibility candidates valued model. The existence of such a pre-model for a theory entails strong normalization of its proof-terms and, by the usual syntactic argument, the cut elimination property. In this paper, we strengthen this bridge between syntactic and semantic methods, by providing a full semantic proof that the existence of a pre-model entails the cut elimination property for the considered theory in deduction modulo. We first define a new simplified variant of reducibility candidates à la Girard, that is sufficient to prove weak normalization of proof-terms (and therefore the cut elimination property). Then we build, from some model valued on the pre-Heyting algebra of those WN reducibility candidates, a regular model valued on a Heyting algebra on which we apply the usual soundness/strong completeness argument. Finally, we discuss further extensions of this new method towards normalization by evaluation techniques that commonly use Kripke semantics.

One-context Unification with STG-Compressed Terms is in NP
Carles Creus, Adrià Gascón, Guillem Godoy

One-context unification is an extension of first-order term unification in which a variable of arity one standing for a context may occur in the input terms. This problem arises in areas like program analysis, term rewriting and XML processing and is known to be solvable in nondeterministic polynomial time. We prove that this problem can be solved in nondeterministic polynomial time also when the input is compressed using Singleton Tree Grammars (STG's). STG's are a grammar-based compression method for terms that generalizes the directed acyclic graph representation. They have been recently considered as an efficient in-memory representation for large terms, since several operations on terms can be performed efficiently on their STG representation without a prior decompression.

Deciding Confluence of Ground Term Rewrite Systems in Cubic Time
Bertram Felgenhauer

It is well known that the confluence property of ground term rewrite systems (ground TRSs) is decidable in polynomial time. For an efficient implementation, the degree of this polynomial is of great interest. The best complexity bound in the literature is given by Comon, Godoy and Nieuwenhuis (2001), who describe an O(n^5) algorithm, where n is the size of the ground TRS. In this paper we improve this bound to O(n^3). The algorithm has been implemented in the confluence tool CSI.

Polynomial Interpretations for Higher-Order Rewriting
Carsten Fuhs, Cynthia Kop

The termination method of weakly monotonic algebras, which has been defined for higher-order rewriting in the HRS formalism, offers a lot of power, but has seen little use in recent years. We adapt and extend this method to the alternative formalism of algebraic functional systems, where the simply-typed lambda-calculus is combined with algebraic reduction. Using this theory, we define higher-order polynomial interpretations, and show how the implementation challenges of this technique can be tackled. A full implementation is provided in the termination tool Wanda.

On Soundness Conditions for Unraveling Deterministic Conditional Rewrite Systems
Karl Gmeiner, Bernhard Gramlich, Felix Schernhammer

We study (un)soundness of transformations of conditional term rewriting systems (CTRSs) into unconditional term rewriting systems (TRSs). The focus here is on analyzing (un)soundness of so-called unravelings, the most basic and natural class of such transformations. We extend our previous analysis from normal 1-CTRSs to the more general class of deterministic CTRSs (DCTRSs) where extra variables in right-hand sides of rules are allowed to a certain extent. We prove that the previous soundness results based on weak left-linearity and on right-linearity can be extended from normal 1-CTRSs to DCTRSs. Counterexamples show that such an extension to DCTRSs does not work for the previous criteria which were based on confluence and on non-erasingness, not even for right-stable systems. Yet, we prove weaker versions of soundness criteria based on confluence and on non-erasingness. Finally, we compare our approach and results with other recently established soundness criteria for unraveling DCTRSs.

Reinterpreting Compression in Infinitary Rewriting
Jeroen Ketema

Departing from a computational interpretation of compression in infinitary rewriting, we view compression as a degenerate case of standardisation. The change in perspective comes about via two observations: (a) no compression property can be recovered for non-left-linear systems and (b) some standardisation procedures, as a "side-effect", yield compressed reductions.

Finite Models vs Tree Automata in Safety Verification
Alexei Lisitsa

In this paper we deal with verification of safety properties of term-rewriting systems. The verification problem is translated to a purely logical problem of finding a finite countermodel for a first-order formula, which is further resolved by a generic finite model finding procedure. A finite countermodel produced during successful verification provides a concise description of the system invariant sufficient to demonstrate a specific safety property. We show the relative completeness of this approach with respect to the tree automata completion technique. On a set of examples taken from the literature we demonstrate the efficiency of the finite model finding approach as well as its explanatory power.

Triangulation in Rewriting
Vincent van Oostrom, Hans Zantema

We introduce a process, dubbed triangulation, turning any rewrite relation into a confluent one. It is more direct than usual completion, in the sense that objects connected by a peak are directly related rather than their normal forms. We investigate conditions under which this process preserves desirable properties such as termination.

Turing-Completeness of Polymorphic Stream Equation Systems
Christian Sattler, Florent Balestrieri

Polymorphic stream functions operate on the structure of streams, infinite sequences of elements, without inspection of the contained data, having to work on all streams over all signatures uniformly. A natural, yet restrictive class of polymorphic stream functions comprises those definable by a system of equations using only stream constructors and destructors and recursive calls. Using methods reminiscent of prior results in the field, we first show this class consists of exactly the computable polymorphic stream functions. Using much more intricate techniques, our main result states this holds true even for unary equations free of mutual recursion, yielding an elegant model of Turing-completeness in a severely restricted environment and allowing us to recover previous complexity results in a much more restricted setting.

Matching of Compressed Patterns with Character-Variables
Manfred Schmidt-Schauß

We consider the problem of finding an instance of a string pattern s in a given string under compression by straight-line programs (SLPs). The variables of the string pattern can be instantiated by single characters. This is a generalisation of fully compressed pattern matching, which is the task of finding a compressed string in another compressed string and is known to have a polynomial-time algorithm. We mainly investigate patterns s that are linear in the variables, i.e. variables occur at most once in s, also known as partial words. We show that fully compressed pattern matching with linear patterns can be performed in polynomial time. A polynomial-sized representation of all matches and all substitutions is also computed. In addition, a related algorithm is given that computes all periods of a compressed linear pattern in polynomial time. A technical key result on the structure of partial words shows that an overlap of h+2 copies of a partial word w with at most h holes implies that w is strongly periodic.
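To illustrate why SLP-compressed strings can be queried without decompression, the sketch below computes the length and individual characters of the derived string directly on the grammar. The encoding and names are illustrative, not taken from the paper.

```python
# An SLP is a grammar where each nonterminal has exactly one rule: either a
# single character, or the concatenation (left, right) of two nonterminals.
def slp_length(rules, sym, memo=None):
    """Length of the string derived from `sym`, without decompressing."""
    if memo is None:
        memo = {}
    if sym not in memo:
        rhs = rules[sym]
        if isinstance(rhs, str):      # terminal rule: a single character
            memo[sym] = 1
        else:                         # concatenation rule
            left, right = rhs
            memo[sym] = slp_length(rules, left, memo) + slp_length(rules, right, memo)
    return memo[sym]

def slp_char(rules, sym, i):
    """The i-th character of the derived string, walking one root-to-leaf path."""
    rhs = rules[sym]
    while not isinstance(rhs, str):
        left, right = rhs
        n = slp_length(rules, left)
        if i < n:                     # position falls in the left part
            rhs = rules[left]
        else:                         # otherwise shift into the right part
            i -= n
            rhs = rules[right]
    return rhs

# X2 derives "ab", X3 derives "abab", X4 derives "ababab"
rules = {"Xa": "a", "Xb": "b",
         "X2": ("Xa", "Xb"), "X3": ("X2", "X2"), "X4": ("X3", "X2")}
print(slp_length(rules, "X4"))   # 6
print(slp_char(rules, "X4", 4))  # a
```

Both queries run in time proportional to the grammar, even when the derived string is exponentially longer.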

Meaningless Sets in Infinitary Combinatory Logic
Paula Severi, Fer-Jan de Vries

In this paper we study meaningless sets in infinitary combinatory logic. So far only a handful of meaningless sets were known. We show that there are uncountably many meaningless sets. As an application to the semantics of finite combinatory logics, we show that there exist uncountably many combinatory algebras that are not lambda algebras. We also study ways of weakening the axioms of meaningless sets to get, not only sufficient, but also necessary conditions for having confluence and normalisation.

A Rewriting Framework for Activities Subject to Regulations
Max I. Kanovich, Tajana Ban Kirigin, Vivek Nigam, Andre Scedrov, Carolyn L. Talcott, Ranko Perovic

Activities such as clinical investigations or financial processes are subject to regulations to ensure quality of results and avoid negative consequences. Regulations may be imposed by multiple governmental agencies as well as by institutional policies and protocols. Due to the complexity of both regulations and activities there is great potential for violation due to human error, misunderstanding, or even intent. Executable formal models of regulations, protocols, and activities can form the foundation for automated assistants to aid planning, monitoring, and compliance checking. We propose a model based on multiset rewriting where time is discrete and is specified by timestamps attached to facts. Actions, as well as initial, goal and critical states may be constrained by means of relative time constraints. Moreover, actions may have non-deterministic effects, that is, they may have different outcomes whenever applied. We demonstrate how specifications in our model can be straightforwardly mapped to the rewriting logic language Maude, and how one can use existing techniques to improve performance. Finally, we also determine the complexity of the plan compliance problem, that is, finding a plan that leads from an initial state to a desired goal state without reaching any undesired critical state. We consider all actions to be balanced, that is, their pre and post-conditions have the same number of facts. Under this assumption on actions, we show that the plan compliance problem is PSPACE-complete when all actions have only deterministic effects and is EXPTIME-complete when actions may have non-deterministic effects.

Semantic Evaluation, Intersection Types and Complexity of Simply Typed Lambda Calculus
Kazushige Terui

Consider the following problem: given a simply typed lambda term of Boolean type and of order r, does it normalize to "true"? A related problem is: given a term M of word type and of order r together with a finite automaton D, does D accept the word represented by the normal form of M? We prove that these problems are n-EXPTIME complete for r=2n+2, and n-EXPSPACE complete for r=2n+3. While the hardness part is relatively easy, the membership part is not so obvious; in particular, simply applying beta reduction does not work. Some preceding works employ semantic evaluation in the category of sets and functions, but it is not efficient enough for our purpose. We present an algorithm for the above type of problem that is a fine blend of beta reduction, Krivine abstract machine and semantic evaluation in a category based on preorders and order ideals, also known as the Scott model of linear logic. The semantic evaluation can also be presented as intersection type checking.
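For readers unfamiliar with the Krivine abstract machine that the paper blends with semantic evaluation, a minimal textbook version for untyped terms in de Bruijn notation might look as follows. This is a generic sketch of call-by-name weak head reduction, not the paper's refined algorithm.

```python
# Terms: ("var", n) with de Bruijn index n, ("lam", body), ("app", f, a).
# The machine state is (term, env, stack); env and stack hold closures
# (term, env). It computes the weak head normal form of a closed term.
def krivine(term):
    env, stack = [], []
    while True:
        tag = term[0]
        if tag == "app":                  # push the argument as a closure
            _, f, a = term
            stack.append((a, env))
            term = f
        elif tag == "lam" and stack:      # bind the top closure, enter body
            clo = stack.pop()
            env = [clo] + env
            term = term[1]
        elif tag == "var":                # look up and enter the closure
            term, env = env[term[1]]
        else:                             # lambda with empty stack: WHNF
            return term

I = ("lam", ("var", 0))                   # \x. x
K = ("lam", ("lam", ("var", 1)))          # \x. \y. x
# (\x. \y. x) I (\w. w) reduces to I
print(krivine(("app", ("app", K, I), ("lam", ("var", 0)))))  # ('lam', ('var', 0))
```

Note that the environment shares closures rather than substituting eagerly, which is the point of contact with the sharing-sensitive evaluation discussed in the abstract.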

On the Formalization of Termination Techniques based on Multiset Orderings
René Thiemann, Guillaume Allais, Julian Nagele

Multiset orderings are a key ingredient in certain termination techniques like the recursive path ordering and a variant of size-change termination. In order to integrate these techniques in a certifier for termination proofs, we have added them to the Isabelle Formalization of Rewriting. To this end, it was required to extend the existing formalization on multiset orderings towards a generalized multiset ordering. Afterwards, the soundness proofs of both techniques have been established, although only after fixing some definitions. Concerning efficiency, it is known that the search for suitable parameters for both techniques is NP-hard. We show that checking the correct application of the techniques--where all parameters are provided--is also NP-hard, since the problem of deciding the generalized multiset ordering is NP-hard.
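The basic Dershowitz-Manna multiset extension underlying these techniques can be sketched as follows; the generalized multiset ordering treated in the paper is more involved, and this covers only the standard case with a given strict order.

```python
from collections import Counter

def multiset_gt(gt, M, N):
    """True iff multiset M is greater than N in the extension of strict order gt."""
    m, n = Counter(M), Counter(N)
    common = m & n                 # cancel shared occurrences first
    m, n = m - common, n - common
    if not m:
        return False               # nothing left on M's side to dominate N
    # every remaining element of N must be below some remaining element of M
    return all(any(gt(x, y) for x in m) for y in n)

gt = lambda a, b: a > b            # natural order on integers
print(multiset_gt(gt, [5, 3, 1, 1], [4, 3, 1]))   # True: 5 covers 4, rest cancels
print(multiset_gt(gt, [2, 2], [2, 3]))            # False: nothing covers 3
```

With a fixed base order this check is cheap; the NP-hardness result in the abstract concerns the generalized ordering, where the comparison itself involves choices.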

2011

FAST: An Efficient Decision Procedure for Deduction and Static Equivalence
Bruno Conchinha, David A. Basin, Carlos Caleiro

Message deducibility and static equivalence are central problems in symbolic security protocol analysis. We present FAST, an efficient decision procedure for these problems under subterm-convergent equational theories. FAST is a C++ implementation of an improved version of the algorithm presented in our previous work. This algorithm has a better asymptotic complexity than other algorithms implemented by existing tools for the same task, and FAST's optimizations further improve these complexity results. We describe here the main ideas of our implementation and compare its performance with competing tools. The results show that our implementation is significantly faster: for many examples, FAST terminates in under a second, whereas other tools take several minutes.

Automated Certified Proofs with CiME3
Évelyne Contejean, Pierre Courtieu, Julien Forest, Olivier Pons, Xavier Urbain

We present the rewriting toolkit CiME3. Amongst other original features, this version enjoys two kinds of engines: one to handle and discover proofs of various properties of rewriting systems, and one to generate Coq scripts from proof traces given in certification problem format in order to certify them with a skeptical proof assistant like Coq. Thus, these features open the way for using CiME3 to add automation to proofs of termination or confluence in a formal development in the Coq proof assistant.

Variants, Unification, Narrowing, and Symbolic Reachability in Maude 2.6
Francisco Durán, Steven Eker, Santiago Escobar, José Meseguer, Carolyn L. Talcott

This paper introduces some novel features of Maude 2.6 focusing on the variants of a term. Given an equational theory (Σ,Ax∪E), the E,Ax-variants of a term t are understood as the set of all pairs consisting of a substitution σ and the E,Ax-canonical form of tσ. The equational theory (Ax∪E) has the finite variant property if there is a finite set of most general variants. We have added support in Maude 2.6 for: (i) order-sorted unification modulo associativity, commutativity and identity, (ii) variant generation, (iii) order-sorted unification modulo finite variant theories, and (iv) narrowing-based symbolic reachability modulo finite variant theories. We also explain how these features have a number of interesting applications in areas such as unification theory, cryptographic protocol verification, business processes, and proofs of termination, confluence and coherence.

Termination Analysis of C Programs Using Compiler Intermediate Languages
Stephan Falke, Deepak Kapur, Carsten Sinz

Modeling the semantics of programming languages like C for the automated termination analysis of programs is a challenge if complete coverage of all language features should be achieved. On the other hand, low-level intermediate languages that occur during the compilation of C programs to machine code have a much simpler semantics since most of the intricacies of C are taken care of by the compiler frontend. It is thus a promising approach to use these intermediate languages for the automated termination analysis of C programs. In this paper we present the tool KITTeL based on this approach. For this, programs in the compiler intermediate language are translated into term rewrite systems (TRSs), and the termination proof itself is then performed on the automatically generated TRS. An evaluation on a large collection of C programs shows the effectiveness and practicality of KITTeL on "typical" examples.

First-Order Unification on Compressed Terms
Adrià Gascón, Sebastian Maneth, Lander Ramos

Singleton Tree Grammars (STGs) have recently drawn considerable attention. They generalize the sharing of subtrees known from DAGs to sharing of connected subgraphs. This makes it possible to obtain smaller in-memory representations of trees than with DAGs. In the past years some important tree algorithms were proved to perform efficiently (without decompression) over STGs; e.g., type checking, equivalence checking, and unification. We present a tool that implements an extension of the unification algorithm for STGs. This algorithm makes extensive use of equivalence checking. For the latter we implemented two variants, the classical exact one and a recent randomized one. Our experiments show that the randomized algorithm performs better. The running times are also compared to those of unification over uncompressed trees.

Anagopos: A Reduction Graph Visualizer for Term Rewriting and Lambda Calculus
Niels Bjørn Bugge Grathwohl, Jeroen Ketema, Jens Duelund Pallesen, Jakob Grue Simonsen

We present Anagopos, an open source tool for visualizing reduction graphs of terms in lambda calculus and term rewriting. Anagopos allows step-by-step generation of reduction graphs under six different graph drawing algorithms. We provide ample examples of graphs drawn with the tool.

Maximal Completion
Dominik Klein, Nao Hirokawa

Given an equational system, completion procedures compute an equivalent and complete (terminating and confluent) term rewrite system. We present a very simple and efficient completion procedure, which is based on MaxSAT solving. Experiments show that the procedure is comparable to recent powerful completion tools.

CRSX - Combinatory Reduction Systems with Extensions
Kristoffer Høgsbro Rose

Combinatory Reduction Systems with Extensions (CRSX) is a system available from http://crsx.sourceforge.net and characterized by the following properties:
- Higher-order rewriting engine based on pure Combinatory Reduction Systems with full strong reduction (but no specified reduction strategy).
- Rule and term syntax based on lambda-calculus and term rewriting conventions including Unicode support.
- Strict checking and declaration requirements to avoid idiosyncratic errors in rewrite rules.
- Interpreter implemented in Java 5 and usable stand-alone as well as from an Eclipse plugin (under development).
- Includes a custom parser generator (front-end to JavaCC parser generator) designed to ease parsing directly into higher-order abstract syntax (as well as permitting the use of custom syntax in rules files).
- Experimental (and evolving) sort system to help rule management.
- Compiler from (well-sorted deterministic subset of) CRSX to stand-alone C code.

A Reduction-Preserving Completion for Proving Confluence of Non-Terminating Term Rewriting Systems
Takahito Aoto, Yoshihito Toyama

We give a method to prove confluence of term rewriting systems that contain non-terminating rewrite rules such as commutativity and associativity. Usually, confluence of term rewriting systems containing such rules is proved by treating them as equational term rewriting systems and considering E-critical pairs and/or termination modulo E. In contrast, our method is based solely on usual critical pairs and usual termination. We first present confluence criteria for term rewriting systems whose rewrite rules can be partitioned into a terminating part and a possibly non-terminating part. We then give a reduction-preserving completion procedure so that the applicability of the criteria is enhanced. In contrast to the well-known Knuth-Bendix completion procedure, which preserves the equivalence relation of the system, our completion procedure preserves the reduction relation of the system, by which confluence of the original system is inferred from that of the completed system.

Natural Inductive Theorems for Higher-Order Rewriting
Takahito Aoto, Toshiyuki Yamada, Yuki Chiba

The notion of inductive theorems is well-established in first-order term rewriting. In higher-order term rewriting, in contrast, it is not straightforward to extend this notion because of extensionality (Meinke, 1992). When extending the term rewriting based program transformation of Chiba et al. (2005) to higher-order term rewriting, we need extensibility, a property stating that inductive theorems are preserved by adding new functions via macros. In this paper, we propose and study a new notion of inductive theorems for higher-order rewriting, natural inductive theorems. This allows us to incorporate properties such as extensionality and extensibility, based on simply typed S-expression rewriting (Yamada, 2001).

A Path Order for Rewrite Systems that Compute Exponential Time Functions
Martin Avanzini, Naohi Eguchi, Georg Moser

In this paper we present a new path order for rewrite systems, the exponential path order EPO*. If a term rewrite system is compatible with EPO*, then the runtime complexity of this rewrite system is bounded from above by an exponential function. Furthermore, the class of functions computed by rewrite systems compatible with EPO* equals the class of functions computable in exponential time on a Turing machine.

Modes of Convergence for Term Graph Rewriting
Patrick Bahr

Term graph rewriting provides a simple mechanism to finitely represent restricted forms of infinitary term rewriting. The correspondence between infinitary term rewriting and term graph rewriting has been studied to some extent. However, this endeavour is impaired by the lack of an appropriate counterpart of infinitary rewriting on the side of term graphs. We aim to fill this gap by devising two modes of convergence, based on a partial order and a metric on term graphs, respectively. The structures thus obtained generalise corresponding modes of convergence that are usually studied in infinitary term rewriting. We argue that this yields a common framework in which both term rewriting and term graph rewriting can be studied. In order to substantiate our claim, we compare convergence on term graphs and on terms. In particular, we show that the resulting infinitary calculi of term graph rewriting exhibit the same correspondence as we know it from term rewriting: Convergence via the partial order is a conservative extension of the metric convergence.

Modular Termination Proofs of Recursive Java Bytecode Programs by Term Rewriting
Marc Brockschmidt, Carsten Otto, Jürgen Giesl

In earlier work we presented an approach to prove termination of non-recursive Java Bytecode (JBC) programs automatically. Here, JBC programs are first transformed to finite termination graphs which represent all possible runs of the program. Afterwards, the termination graphs are translated to term rewrite systems (TRSs) such that termination of the resulting TRSs implies termination of the original JBC programs. So in this way, existing techniques and tools from term rewriting can be used to prove termination of JBC automatically. In this paper, we improve this approach substantially in two ways: (1) We extend it in order to also analyze recursive JBC programs. To this end, one has to represent call stacks of arbitrary size. (2) To handle JBC programs with several methods, we modularize our approach in order to re-use termination graphs and TRSs for the separate methods and to prove termination of the resulting TRS in a modular way. We implemented our approach in the tool AProVE. Our experiments show that the new contributions increase the power of termination analysis for JBC significantly.

Rewriting-based Quantifier-free Interpolation for a Theory of Arrays
Roberto Bruttomesso, Silvio Ghilardi, Silvio Ranise

The use of interpolants in model checking is becoming an enabling technology to allow fast and robust verification of hardware and software. The application of encodings based on the theory of arrays, however, is limited by the impossibility of deriving quantifier-free interpolants in general. In this paper, we show that, with a minor extension to the theory of arrays, it is possible to obtain quantifier-free interpolants. We prove this by designing an interpolating procedure, based on solving equations between array updates. Rewriting techniques are used in the key steps of the solver and its proof of correctness. To the best of our knowledge, this is the first successful attempt of computing quantifier-free interpolants for a theory of arrays.

Improved Functional Flow and Reachability Analyses Using Indexed Linear Tree Grammars
Jonathan Kochems, Luke C.-H. Ong

The collecting semantics of a program defines the strongest static property of interest. We study the analysis of the collecting semantics of higher-order functional programs, cast as left-linear term rewriting systems. The analysis generalises functional flow analysis and the reachability problem for term rewriting systems, which are both undecidable. We present an algorithm that uses indexed linear tree grammars (ILTGs) both to describe the input set and compute the set that approximates the collecting semantics. ILTGs are equi-expressive with pushdown tree automata, and so, strictly more expressive than regular tree grammars. Our result can be seen as a refinement of Jones and Andersen's procedure, which uses regular tree grammars. The main technical innovation of our algorithm is the use of indices to capture (sets of) substitutions, thus enabling a more precise binding analysis than afforded by regular grammars. We give a simple proof of termination and soundness, and demonstrate that our method is more accurate than other approaches to functional flow and reachability analyses in the literature.

Higher Order Dependency Pairs for Algebraic Functional Systems
Cynthia Kop, Femke van Raamsdonk

We extend the termination method using dynamic dependency pairs to higher order rewriting systems with beta as a rewrite step, also called Algebraic Functional Systems (AFSs). We introduce a variation of usable rules, and use monotone algebras to solve the constraints generated by dependency pairs. This approach differs in several respects from those dealing with higher order rewriting modulo beta (e.g. HRSs).

Anti-Unification for Unranked Terms and Hedges
Temur Kutsia, Jordi Levy, Mateu Villaret

We study anti-unification for unranked terms and hedges that may contain term and hedge variables. The anti-unification problem of two hedges s1 and s2 is concerned with finding their generalization, a hedge q such that both s1 and s2 are instances of q under some substitutions. Hedge variables help to fill in gaps in generalizations, while term variables abstract single (sub)terms with different top function symbols. First, we design a complete and minimal algorithm to compute least general generalizations. Then, we improve the efficiency of the algorithm by restricting possible alternatives permitted in the generalizations. The restrictions are imposed with the help of a rigidity function that is a parameter in the improved algorithm and selects certain common subsequences from the hedges to be generalized. Finally, we indicate a possible application of the algorithm in software engineering.
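For intuition, the classical first-order (ranked) special case of anti-unification, Plotkin's least general generalization, can be sketched as follows; the term encoding and variable names are illustrative, and the unranked hedge setting of the paper adds the complications discussed above.

```python
# Terms are (function_symbol, args) tuples; generalization variables are strings.
def lgg(s, t, store):
    """Least general generalization of first-order terms s and t.

    `store` maps mismatched pairs to variables, so repeated disagreements
    like (a, b) occurring twice are generalized by the *same* variable."""
    if isinstance(s, tuple) and isinstance(t, tuple) and \
       s[0] == t[0] and len(s[1]) == len(t[1]):
        # same head symbol and arity: keep it, generalize argument-wise
        return (s[0], tuple(lgg(u, v, store) for u, v in zip(s[1], t[1])))
    if (s, t) not in store:            # mismatch: introduce a fresh variable
        store[(s, t)] = f"X{len(store) + 1}"
    return store[(s, t)]

f = lambda *args: ("f", args)
a, b = ("a", ()), ("b", ())
# lgg of f(a, a) and f(b, b) is f(X1, X1): both mismatches are the pair (a, b)
print(lgg(f(a, a), f(b, b), {}))   # ('f', ('X1', 'X1'))
```

Reusing one variable per mismatch pair is what makes the result *least* general: f(X1, X1) is more specific than f(X1, X2).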

Termination Proofs in the Dependency Pair Framework May Induce Multiple Recursive Derivational Complexity
Georg Moser, Andreas Schnabl

We study the complexity of rewrite systems shown terminating via the dependency pair framework using processors for reduction pairs, dependency graphs, or the subterm criterion. The complexity of such systems is bounded by a multiple recursive function, provided the complexity induced by the employed base techniques is at most multiple recursive. Moreover this upper bound is tight.

Revisiting Matrix Interpretations for Proving Termination of Term Rewriting
Friedrich Neurauter, Aart Middeldorp

Matrix interpretations are a powerful technique for proving termination of term rewrite systems, which is based on the well-known paradigm of interpreting terms into a domain equipped with a suitable well-founded order, such that every rewrite step causes a strict decrease. Traditionally, one uses vectors of non-negative numbers as domain, where two vectors are in the order relation if there is a strict decrease in the respective first components and a weak decrease in all other components. In this paper, we study various alternative well-founded orders on vectors of non-negative numbers based on vector norms and compare the resulting variants of matrix interpretations to each other and to the traditional approach. These comparisons are mainly theoretical in nature. We do, however, also identify one of these variants as a proper generalization of traditional matrix interpretations as a stand-alone termination method, which has the additional advantage that it gives rise to a more powerful implementation.
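The traditional orientation condition can be checked concretely on a single rule. The sketch below verifies a matrix interpretation for f(f(x)) -> f(x); the particular F and c are an illustrative choice, not taken from the paper.

```python
# Interpret [f](x) = F x + c over 2-dimensional vectors of naturals.
def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def mat_vec(A, v):
    return [sum(A[i][k] * v[k] for k in range(len(v))) for i in range(len(A))]

F = [[1, 1],
     [0, 0]]
c = [1, 1]

# [f(f(x))] = F^2 x + (F c + c)   versus   [f(x)] = F x + c
lhs_mat = mat_mul(F, F)
lhs_const = [u + v for u, v in zip(mat_vec(F, c), c)]

# Traditional criterion: componentwise weak decrease of matrix and constant
# parts, plus a strict decrease in the first component of the constants.
weak = all(lhs_mat[i][j] >= F[i][j] for i in range(2) for j in range(2)) \
       and all(x >= y for x, y in zip(lhs_const, c))
strict = lhs_const[0] > c[0]
print(weak and strict)             # True: the rule is oriented strictly
```

Here F^2 = F and the constant part grows from [1, 1] to [3, 1], so every application of the rule strictly decreases the first vector component.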

Soundness of Unravelings for Deterministic Conditional Term Rewriting Systems via Ultra-Properties Related to Linearity
Naoki Nishida, Masahiko Sakai, Toshiki Sakabe

Unravelings are transformations from a conditional term rewriting system (CTRS, for short) over an original signature into an unconditional term rewriting system (TRS, for short) over an extended signature. They are not sound for every CTRS w.r.t. reduction, although they are complete w.r.t. reduction. Here, soundness w.r.t. reduction means that every reduction sequence of the corresponding unraveled TRS whose initial and final terms are over the original signature can be simulated by the reduction of the original CTRS. In this paper, we show that an optimized variant of Ohlebusch's unraveling for deterministic CTRSs is sound w.r.t. reduction if the corresponding unraveled TRSs are left-linear or both right-linear and non-erasing. We also show that soundness of the variant implies that of Ohlebusch's unraveling.

Program Inversion for Tail Recursive Functions
Naoki Nishida, Germán Vidal

Program inversion is a fundamental problem that has been addressed in many different programming settings and applications. In the context of term rewriting, several methods already exist for computing the inverse of an injective function. These methods, however, usually return non-terminating inverted functions when the considered function is tail recursive. In this paper, we propose a direct and intuitive approach to the inversion of tail recursive functions. Our new technique is able to produce good results even without the use of an additional post-processing of determinization or completion. Moreover, when combined with a traditional approach to program inversion, it constitutes a promising approach to define a general method for program inversion. Our experimental results confirm that the new technique compares well with previous approaches.

Refinement Types as Higher-Order Dependency Pairs
Cody Roux

Refinement types are a well-studied manner of performing in-depth analysis on functional programs. The dependency pair method is a very powerful method used to prove termination of rewrite systems; however its extension to higher-order rewrite systems is still the subject of active research. We observe that a variant of refinement types allows us to express a form of higher-order dependency pair method: from the rewrite system labeled with typing information, we build a type-level approximated dependency graph, and describe a type level embedding preorder. We describe a syntactic termination criterion involving the graph and the preorder, which generalizes the simple projection criterion of Middeldorp and Hirokawa, and prove our main result: if the graph passes the criterion, then every well-typed term is strongly normalizing.

Weakening the Axiom of Overlap in Infinitary Lambda Calculus
Paula Severi, Fer-Jan de Vries

In this paper we present a set of necessary and sufficient conditions on a set of lambda terms to serve as the set of meaningless terms in an infinitary bottom extension of lambda calculus. So far only a set of sufficient conditions was known for choosing a suitable set of meaningless terms to make this construction produce confluent extensions. The conditions covered the three main known examples of sets of meaningless terms. However, the much later construction of many more examples of sets of meaningless terms satisfying the sufficient conditions renewed the interest in the necessity question and led us to reconsider the old conditions. The key idea in this paper is an alternative solution for solving the overlap between beta reduction and bottom reduction. This allows us to reformulate the Axiom of Overlap, which now determines together with the other conditions a larger class of sets of meaningless terms. We show that the reformulated conditions are not only sufficient but also necessary for obtaining a confluent and normalizing infinitary lambda beta bottom calculus. As an interesting consequence of the necessity proof we obtain for infinitary lambda calculus with beta and bottom reduction that confluence implies normalization.

Modular and Certified Semantic Labeling and Unlabeling
Christian Sternagel, René Thiemann

Semantic labeling is a powerful transformation technique to prove termination of term rewrite systems. The dual technique is unlabeling. For unlabeling it is essential to drop the so-called decreasing rules which sometimes have to be added when applying semantic labeling. We indicate two problems concerning unlabeling and present our solutions. The first problem is that currently unlabeling cannot be applied as a modular step, since the decreasing rules are determined by a semantic labeling step which may have taken place much earlier. To this end, we give an implicit definition of decreasing rules that does not depend on any knowledge about preceding labelings. The second problem is that unlabeling is in general unsound. To solve this issue, we introduce the notion of extended termination problems. Moreover, we show how existing termination techniques can be lifted to operate on extended termination problems. All our proofs have been formalized in Isabelle/HOL as part of the IsaFoR/CeTA project.

Type Preservation as a Confluence Problem
Aaron Stump, Garrin Kimmell, Roba El Haj Omar

This paper begins with recent work by Kuan, MacQueen, and Findler, which shows how standard type systems, such as the simply typed lambda calculus, can be viewed as abstract reduction systems operating on terms. The central idea is to think of the process of typing a term as the computation of an abstract value for that term. The standard metatheoretic property of type preservation can then be seen as a confluence problem involving the concrete and abstract operational semantics, viewed as abstract reduction systems (ARSs). In this paper, we build on the work of Kuan et al. by showing how modern ARS theory, in particular the theory of decreasing diagrams, can be used to establish type preservation via confluence. We illustrate this idea through several examples of solving such problems using decreasing diagrams. We also consider how automated tools for the analysis of term-rewriting systems can be applied in testing type preservation.

Left-linear Bounded TRSs are Inverse Recognizability Preserving
Irène Durand, Marc Sylvestre

Bounded rewriting for linear term rewriting systems has been defined in (I. Durand, G. Sénizergues, M. Sylvestre, Termination of linear bounded term rewriting systems, Proceedings of the 21st International Conference on Rewriting Techniques and Applications) as a restriction of the usual notion of rewriting. Here we extend this notion to the whole class of left-linear term rewriting systems, and we show that bounded rewriting is effectively inverse-recognizability preserving. The bounded class (BO) is, by definition, the set of left-linear systems for which every derivation can be replaced by a bottom-up derivation. The class BO strictly contains several classes of systems which were already known to be inverse-recognizability preserving: the left-linear growing systems, and the inverse right-linear finite-path overlapping systems.

Labelings for Decreasing Diagrams
Harald Zankl, Bertram Felgenhauer, Aart Middeldorp

This paper is concerned with automating the decreasing diagrams technique of van Oostrom for establishing confluence of term rewrite systems. We study abstract criteria that allow labelings to be combined lexicographically to show local diagrams decreasing. This approach has two immediate benefits. First, it allows labelings for linear rewrite systems to be used also for left-linear ones, provided some mild conditions are satisfied. Second, it admits an incremental method for proving confluence which subsumes recent developments in automating decreasing diagrams. The techniques proposed in the paper have been implemented and experimental results demonstrate how, e.g., the rule labeling benefits from our contributions.

Proving Equality of Streams Automatically
Hans Zantema, Jörg Endrullis

Streams are infinite sequences over a given data type. A stream specification is a set of equations intended to define a stream. In this paper we focus on equality of streams; more precisely, for a given set of equations two stream terms are said to be equal if they are equal in every model satisfying the given equations. We investigate techniques for proving equality of streams suitable for automation. Apart from techniques that were already available in the tool CIRC from Lucanu and Rosu, we also exploit well-definedness of streams, typically proved by proving productivity. Moreover, our approach is not restricted to a behavioral input format and does not require termination. We present a tool Streambox that can prove equality of a wide range of examples fully automatically.
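Equality of streams can never be established by finite testing, but comparing prefixes is a cheap way to hunt for counterexamples before attempting a proof. The two specifications of the Thue-Morse stream below are illustrative examples, not taken from the paper.

```python
from itertools import islice

def morse():
    """Thue-Morse stream via bit-count parity: t(n) = popcount(n) mod 2."""
    n = 0
    while True:
        yield bin(n).count("1") % 2
        n += 1

def morse_by_doubling():
    """Same stream via prefix doubling: the next block is the complement."""
    seq = [0]
    i = 0
    while True:
        if i == len(seq):
            seq = seq + [1 - x for x in seq]
        yield seq[i]
        i += 1

def prefix_equal(s, t, n):
    """Compare the first n elements of two streams (refutes, never proves)."""
    return list(islice(s, n)) == list(islice(t, n))

print(prefix_equal(morse(), morse_by_doubling(), 64))  # True
```

A tool like the one described must instead reason on the equations themselves, e.g. via circular coinduction, since no finite prefix check is conclusive.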

2010

Automated Confluence Proof by Decreasing Diagrams based on Rule-Labelling
Takahito Aoto

The decreasing diagrams technique (van Oostrom, 1994) can be widely applied to prove confluence of rewrite systems. To apply the decreasing diagrams technique directly to prove confluence of rewrite systems, the rule-labelling heuristic has been proposed by van Oostrom (2008). We show how constraints for ensuring confluence of term rewriting systems, constructed on the basis of the rule-labelling heuristic, are encoded as linear arithmetic constraints whose satisfiability can be checked by external SMT solvers. We point out an additional constraint, omitted in (van Oostrom, 2008), that is needed to guarantee the soundness of confluence proofs based on the rule-labelling heuristic extended to deal with non-right-linear rules. We also present several extensions of the rule-labelling heuristic by which the applicability of the technique is enlarged.

Higher-Order (Non-)Modularity
Claus Appel, Vincent van Oostrom, Jakob Grue Simonsen

We show that, contrary to the situation in first-order term rewriting, almost none of the usual properties of rewriting are modular for higher-order rewriting, irrespective of the higher-order rewriting format. We show that for the particular format of simply typed applicative term rewriting systems modularity of confluence, normalization, and termination can be recovered by imposing suitable linearity constraints.

Closing the Gap Between Runtime Complexity and Polytime Computability
Martin Avanzini, Georg Moser

In earlier work, we have shown that for confluent TRSs, innermost polynomial runtime complexity induces polytime computability of the functions defined. In this paper, we generalise this result to full rewriting; for this we exploit graph rewriting. We give a new proof of the adequacy of graph rewriting for full rewriting that allows for a precise control of the resources copied. In sum, we completely describe an implementation of rewriting on a Turing machine (TM for short). We show that the runtime complexity of the TRS and the runtime complexity of the TM are polynomially related. Our result strengthens the evidence that the complexity of a rewrite system is truthfully represented by the length of derivations. Moreover, our result allows the classification of nondeterministic polytime computation based on runtime complexity analysis of rewrite systems.

Abstract Models of Transfinite Reductions
Patrick Bahr

We investigate transfinite reductions in abstract reduction systems. To this end, we study two abstract models for transfinite reductions: a metric model generalising the usual metric approach to infinitary term rewriting and a novel partial order model. For both models we distinguish between a weak and a strong variant of convergence as known from infinitary term rewriting. Furthermore, we introduce an axiomatic model of reductions that is general enough to cover all of these models of transfinite reductions as well as the ordinary model of finite reductions. It is shown that, in this unifying axiomatic model, many basic relations between termination and confluence properties known from finite reductions still hold. The introduced models are applied to term rewriting but also to term graph rewriting. We can show that for both term rewriting and term graph rewriting the partial order model forms a conservative extension of the metric model.

Partial Order Infinitary Term Rewriting and Böhm Trees
Patrick Bahr

We investigate an alternative model of infinitary term rewriting. Instead of a metric, a partial order on terms is employed to formalise (strong) convergence. We compare this partial order convergence of orthogonal term rewriting systems to the usual metric convergence of the corresponding Böhm extensions. The Böhm extension of a term rewriting system contains additional rules to equate so-called root-active terms. The core result we present is that reachability w.r.t. partial order convergence coincides with reachability w.r.t. metric convergence in the Böhm extension. This result is used to show that, unlike in the metric model, orthogonal systems are infinitarily confluent and infinitarily normalising in the partial order model. Moreover, we obtain, as in the metric model, a compression lemma. A corollary of this lemma is that reachability w.r.t. partial order convergence is a conservative extension of reachability w.r.t. metric convergence.

Unique Normal Forms in Infinitary Weakly Orthogonal Rewriting
Jörg Endrullis, Clemens Grabmayer, Dimitri Hendriks, Jan Willem Klop, Vincent van Oostrom

We present some contributions to the theory of infinitary rewriting for weakly orthogonal term rewrite systems, in which critical pairs may occur provided they are trivial. We show that the infinitary unique normal form property (UNinf) fails by a simple example of a weakly orthogonal TRS with two collapsing rules. By translating this example, we show that UNinf also fails for the infinitary lambda-beta-eta-calculus. As positive results we obtain the following: Infinitary confluence, and hence UNinf, holds for weakly orthogonal TRSs that do not contain collapsing rules. To this end we refine the compression lemma. Furthermore, we consider the triangle and diamond properties for infinitary developments in weakly orthogonal TRSs, by refining an earlier cluster-analysis for the finite case.

The Undecidability of Type Related Problems in Type-free Style System F
Ken-etsu Fujita, Aleksy Schubert

We consider here a number of variations on System F that are predicative second-order systems whose terms are intermediate between the Curry style and the Church style. The terms here contain the information on where the universal quantifier elimination and introduction in the type inference process must take place, which is similar to Church forms. However, they omit the information on which types are involved in the rules, which is similar to Curry forms. In this paper we prove the undecidability of the type-checking, type inference and typability problems for the system. Moreover, the proof works for the predicative version of the system with finitely stratified polymorphic types. The result includes bounds on Leivant's level numbers for the types used in the instances leading to undecidability.

On (Un)Soundness of Unravelings
Karl Gmeiner, Bernhard Gramlich, Felix Schernhammer

We revisit (un)soundness of transformations of conditional into unconditional rewrite systems. The focus here is on so-called unravelings, the most simple and natural kind of such transformations, for the class of normal conditional systems without extra variables. By a systematic and thorough study of existing counterexamples and of the potential sources of unsoundness we obtain several new positive and negative results. In particular, we prove the following new results: Confluence, non-erasingness and weak left-linearity (of a given conditional system) each guarantee soundness of the unraveled version w.r.t. the original one. The latter result substantially extends the only known sufficient criterion for soundness, namely left-linearity. Furthermore, by means of counterexamples we refute various other tempting conjectures about sufficient conditions for soundness.

A Proof Calculus Which Reduces Syntactic Bureaucracy
Alessio Guglielmi, Tom Gundersen, Michel Parigot

In usual proof systems, like the sequent calculus, only a very limited way of combining proofs is available through the tree structure. We present in this paper a logic-independent proof calculus, where proofs can be freely composed by connectives, and prove its basic properties. The main advantage of this proof calculus is that it makes it possible to avoid certain types of syntactic bureaucracy inherent in all usual proof systems, in particular the sequent calculus. Proofs in this system closely reflect their atomic flow, which traces the behaviour of atoms through structural rules. The general definition is illustrated by the standard deep-inference system for propositional logic, for which there are known rewriting techniques that achieve cut elimination based only on the information in atomic flows.

A Rewriting Logic Semantics Approach to Modular Program Analysis
Mark Hills, Grigore Roşu

The K framework, based on rewriting logic semantics, provides a powerful logic for defining the semantics of programming languages. While most work in this area has focused on defining an evaluation semantics for a language, it is also possible to define an abstract semantics that can be used for program analysis. Using the SILF language (Hills, Serbanuta and Rosu, 2007), this paper describes one technique for defining such a semantics: policy frameworks. In policy frameworks, an analysis-generic, modular framework is first defined for a language. Individual analyses, called policies, are then defined as extensions of this framework, with each policy defining analysis-specific semantic rules and an annotation language which, in combination with support in the language front-end, allows users to annotate program types and functions with information used during program analysis. Standard term rewriting techniques are used to analyze programs by evaluating them in the policy semantics.

Infinitary Rewriting: Foundations Revisited
Stefan Kahrs

Infinitary term rewriting makes it possible to express infinitary terms and infinitary reductions that converge to them. Two notions of transfinite reduction, viewed as binary relations, have been studied in the past: strongly and weakly convergent reductions; in the last decade research has mostly focused on the former. Finitary rewriting has a strong connection to the equational theory of its rule set: if the rewrite system is confluent this (implies consistency of the theory and) gives rise to a semi-decision procedure for the theory, and if the rewrite system is in addition terminating this becomes a decision procedure. This connection is the original reason for the study of these properties in rewriting. For infinitary rewriting there is barely an established notion of an equational theory. The reason this issue is not trivial is that such a theory would need to include some form of ``getting to limits'', and there are different options one can pursue. These options are examined here, as well as several alternatives for the notion of reduction relation and their relationships to these equational theories.

Underspecified computation of normal forms
Alexander Koller, Stefan Thater

We show how to compute readings of ambiguous natural language sentences that are minimal in some way. Formally, we consider the problem of computing, out of a set C of trees and a rewrite system R, those trees in C that cannot be rewritten into a tree in C. We solve the problem for sets of trees that are described by semantic representations typically used in computational linguistics, and a certain class of rewrite systems that we use to approximate entailment, and show how to compute the irreducible trees efficiently by intersecting tree automata. Our algorithm solves the problem of computing weakest readings that has been open for 25 years in computational linguistics.

Order-Sorted Unification with Regular Expression Sorts
Temur Kutsia, Mircea Marin

We extend first-order order-sorted unification by permitting regular expression sorts for variables and in the domains of function symbols. The set of basic sorts is finite. The obtained signature corresponds to a finite bottom-up hedge automaton. The unification problem in such a theory generalizes some known unification problems. Its unification type is infinitary. We give a complete unification procedure and prove decidability.

An Efficient Nominal Unification Algorithm
Jordi Levy, Mateu Villaret

Nominal Unification is an extension of first-order unification where terms can contain binders and unification is performed modulo alpha-equivalence. Here we prove that the existence of nominal unifiers can be decided in quadratic time. First, we linearly reduce nominal unification problems to a sequence of freshness constraints and equalities between atoms, modulo a permutation, using ideas of Paterson and Wegman for first-order unification. Second, we prove that solvability of these reduced problems can be checked in quadratic time. Finally, we point out how, using ideas of Brown and Tarjan for unbalanced merging, we could solve these reduced problems more efficiently.

Computing Critical Pairs in 2-Dimensional Rewriting Systems
Samuel Mimram

Rewriting systems on words are very useful in the study of monoids. In good cases, they give finite presentations of the monoids, allowing their manipulation by a computer. Even better, when the presentation is confluent and terminating, they provide one with a notion of canonical representative for the elements of the presented monoid. Polygraphs are a higher-dimensional generalization of this notion of presentation, from the setting of monoids to the much more general setting of n-categories. Here, we are interested in proving confluence for polygraphs presenting 2-categories, which can be seen as a generalization of term rewriting systems. For this purpose, we propose an adaptation of the usual algorithm for computing critical pairs. Interestingly, this framework is much richer than term rewriting systems and requires the elaboration of a new theoretical framework for representing critical pairs, based on contexts in compact 2-categories.

Polynomial Interpretations over the Reals do not Subsume Polynomial Interpretations over the Integers
Friedrich Neurauter, Aart Middeldorp

Polynomial interpretations are a useful technique for proving termination of term rewrite systems. They come in various flavors: polynomial interpretations with real, rational and integer coefficients. In 2006, Lucas proved that there are rewrite systems that can be shown polynomially terminating by polynomial interpretations with real (algebraic) coefficients, but cannot be shown polynomially terminating using polynomials with rational coefficients only. He also proved a similar theorem with respect to the use of rational coefficients versus integer coefficients. In this paper we show that polynomial interpretations with real or rational coefficients do not subsume polynomial interpretations with integer coefficients, contrary to what is commonly believed. We further show that polynomial interpretations with real coefficients subsume polynomial interpretations with rational coefficients.
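
To give the flavor of the technique (an illustrative sketch with hypothetical names, not this paper's construction), a polynomial interpretation orients a rule when the interpreted left-hand side is strictly greater than the right-hand side for all argument values. For the addition TRS plus(0,y) -> y, plus(s(x),y) -> s(plus(x,y)) with [0] = 1, [s](x) = x + 1, [plus](x,y) = 2x + y, the sketch below only samples argument values, which does not constitute a proof:

```python
# Interpretation over the naturals (hypothetical example):
# [0] = 1, [s](x) = x + 1, [plus](x, y) = 2*x + y
ZERO = 1
s = lambda x: x + 1
plus = lambda x, y: 2 * x + y

def oriented(lhs, rhs, samples=range(5)):
    # strict decrease on all sampled argument values (a sanity check,
    # not a termination proof, which must cover all naturals)
    return all(lhs(x, y) > rhs(x, y) for x in samples for y in samples)

rule1 = oriented(lambda x, y: plus(ZERO, y), lambda x, y: y)
rule2 = oriented(lambda x, y: plus(s(x), y), lambda x, y: s(plus(x, y)))
print(rule1 and rule2)  # True: both rules strictly decrease on the samples
```

A real termination prover discharges the inequalities symbolically, e.g. [plus(s(x),y)] = 2x + y + 2 > 2x + y + 1 = [s(plus(x,y))] for all x, y >= 0.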

Automated Termination Analysis of Java Bytecode by Term Rewriting
Carsten Otto, Marc Brockschmidt, Christian von Essen, Jürgen Giesl

We present an automated approach to prove termination of Java Bytecode (JBC) programs by automatically transforming them to term rewrite systems (TRSs). In this way, the numerous techniques and tools developed for TRS termination can now be used for imperative object-oriented languages like Java, which can be compiled into JBC.

Declarative Debugging of Missing Answers for Maude
Adrián Riesco, Alberto Verdejo, Narciso Martí-Oliet

Declarative debugging is a semi-automatic technique that starts from an incorrect computation and locates a program fragment responsible for the error by building a tree representing this computation and guiding the user through it to find the error. Membership equational logic (MEL) is an equational logic that in addition to equations allows the statement of membership axioms characterizing the elements of a sort. Rewriting logic is a logic of change that extends MEL by adding rewrite rules, which correspond to transitions between states and can be nondeterministic. In this paper we propose a calculus that allows us to infer normal forms and least sorts with the equational part, and sets of reachable terms through rules. We use an abbreviation of the proof trees computed with this calculus to build appropriate debugging trees for missing answers (results that are erroneous because they are incomplete), whose adequacy for debugging is proved. Using these trees we have implemented a declarative debugger for Maude, a high-performance system based on rewriting logic, whose use is illustrated with an example.

Simulation in the Call-by-Need Lambda-Calculus with letrec
Manfred Schmidt-Schauß, David Sabel, Elena Machkasova

This paper shows the equivalence of applicative similarity and contextual approximation, and hence also of bisimilarity and contextual equivalence, in the deterministic call-by-need lambda calculus with letrec. Bisimilarity simplifies equivalence proofs in the calculus and opens a way for more convenient correctness proofs for program transformations. Although this property may be a natural one to expect, to the best of our knowledge, this paper is the first one providing a proof. The proof technique is to transfer the contextual approximation into Abramsky's lazy lambda calculus by a fully abstract and surjective translation. This also shows that the natural embedding of Abramsky's lazy lambda calculus into the call-by-need lambda calculus with letrec is an isomorphism between the respective term-models. We show that the equivalence property proven in this paper transfers to a call-by-need letrec calculus developed by Ariola and Felleisen.

Weak Convergence and Uniform Normalization in Infinitary Rewriting
Jakob Grue Simonsen

We study infinitary term rewriting systems containing finitely many rules. For these, we show that if a weakly convergent reduction is not strongly convergent, it contains a term that reduces to itself in one step (but the step itself need not be part of the reduction). Using this result, we prove the starkly surprising result that for any orthogonal system with finitely many rules, the system is weakly normalizing under weak convergence iff it is strongly normalizing under weak convergence iff it is weakly normalizing under strong convergence iff it is strongly normalizing under strong convergence. As further corollaries, we derive a number of new results for weakly convergent rewriting: Systems with finitely many rules enjoy unique normal forms, and acyclic orthogonal systems are confluent. Our results suggest that it may be possible to recover some of the positive results for strongly convergent rewriting in the setting of weak convergence, if systems with finitely many rules are considered. Finally, we give a number of counterexamples showing failure of most of the results when infinite sets of rules are allowed.

Certified Subterm Criterion and Certified Usable Rules
Christian Sternagel, René Thiemann

In this paper we present our formalization of two important termination techniques for term rewrite systems: the subterm criterion and the reduction pair processor in combination with usable rules. For both techniques we developed executable check functions in the theorem prover Isabelle/HOL which can certify the correct application of these techniques in some given termination proof. As there are several variants of usable rules we designed our check function in such a way that it accepts all known variants, even those which are not explicitly spelled out in previous papers. We integrated our formalization in the publicly available IsaFoR-library. This led to a significant increase in the power of CeTA, the corresponding certified termination proof checker that is extracted from IsaFoR.

Termination of linear bounded term rewriting systems
Irène Durand, Géraud Sénizergues, Marc Sylvestre

For the whole class of linear term rewriting systems and for each integer k, we define k-bounded rewriting as a restriction of the usual notion of rewriting. We show that the k-bounded uniform termination, the k-bounded termination, the inverse k-bounded uniform termination, and the inverse k-bounded termination problems are decidable. The k-bounded class (BO(k)) is, by definition, the set of linear systems for which every derivation can be replaced by a k-bounded derivation. In general, for BO(k) systems, the uniform (respectively inverse uniform) k-bounded termination problem is not equivalent to the uniform (resp. inverse uniform) termination problem, and the k-bounded (respectively inverse k-bounded) termination problem is not equivalent to the termination (respectively inverse termination) problem. This leads us to define more restricted classes for which these problems are equivalent: the classes BOLP(k) of k-bounded systems that have the length preservation property. By definition, a system is BOLP(k) if every derivation of length n can be replaced by a k-bounded derivation of length n. We define the class BOLP of bounded systems that have the length preservation property as the union of all the BOLP(k) classes. The class BOLP contains (strictly) several already known classes of systems: the inverse left-basic semi-Thue systems, the linear growing term rewriting systems, the inverse Linear-Finite-Path-Ordering systems, the strongly bottom-up systems.

Polynomially Bounded Matrix Interpretations
Johannes Waldmann

Matrix interpretations can be used to bound the derivational complexity of rewrite systems. We present a criterion that completely characterizes matrix interpretations that are polynomially bounded. It includes the method of upper triangular interpretations as a special case, and we prove that the inclusion is strict. The criterion can be expressed as a finite domain constraint system. It translates to a Boolean constraint system with a size that is polynomial in the dimension of the interpretation. We report on performance of an implementation.
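
The phenomenon the criterion captures can be observed numerically (an illustrative sketch under the simplifying assumption of a unit diagonal, not the paper's full characterization): for an upper-triangular matrix whose diagonal entries are at most 1, the entries of M^n grow only polynomially in n, rather than exponentially:

```python
def matmul(a, b):
    # plain list-of-lists matrix multiplication
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

M = [[1, 2],
     [0, 1]]          # upper triangular, unit diagonal

P = M
for _ in range(9):    # compute M^10
    P = matmul(P, M)

print(P)  # [[1, 20], [0, 1]]: the off-diagonal entry grows linearly in n
```

By contrast, any matrix with a diagonal entry greater than 1, such as [[2, 0], [0, 1]], has entries of M^n growing exponentially, which yields only an exponential derivational complexity bound.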

Optimizing mkbTT
Sarah Winkler, Haruhiko Sato, Aart Middeldorp, Masahito Kurihara

We describe performance enhancements that have been added to mkbTT, a modern completion tool combining multi-completion with the use of termination tools.

Modular Complexity Analysis via Relative Complexity
Harald Zankl, Martin Korp

In this paper we introduce a modular framework which allows us to infer (feasible) upper bounds on the (derivational) complexity of term rewrite systems by combining different criteria. All current investigations to analyze the derivational complexity are based on a single termination proof, possibly preceded by transformations. We prove that the modular framework is strictly more powerful than the conventional setting. Furthermore, the results have been implemented and experiments show significant gains in power.

Proving Productivity in Infinite Data Structures
Hans Zantema, Matthias Raffelsieper

For a general class of infinite data structures including streams, binary trees, and the combination of finite and infinite lists, we investigate the notion of productivity. This generalizes stream productivity. We develop a general technique to prove productivity based on proving context-sensitive termination, by which the power of present termination tools can be exploited. In order to treat cases where the approach does not apply directly, we develop transformations extending the power of the basic approach. We present a tool combining these ingredients that can prove productivity of a wide range of examples fully automatically.

2009

Loops under Strategies
René Thiemann, Christian Sternagel

Most techniques to automatically disprove termination of term rewrite systems search for a loop. Whereas a loop implies non-termination for full rewriting, this is not necessarily the case if one considers rewriting under strategies. Therefore, in this paper we first generalize the notion of a loop to a loop under a given strategy. In a second step we present two novel decision procedures to check whether a given loop is a context-sensitive or an outermost loop. We implemented and successfully evaluated our method in the termination prover.

Proving Termination of Integer Term Rewriting
Carsten Fuhs, Jürgen Giesl, Martin Plücker, Peter Schneider-Kamp, Stephan Falke

When using rewrite techniques for termination analysis of programs, a main problem is pre-defined data types like integers. We extend term rewriting by built-in integers and adapt the dependency pair framework to prove termination of integer term rewriting automatically.

Dependency Pairs and Polynomial Path Orders
Martin Avanzini, Georg Moser

We show how polynomial path orders can be employed efficiently in conjunction with weak innermost dependency pairs to automatically certify polynomial runtime complexity of term rewrite systems and the polytime computability of the functions computed. The established techniques have been implemented and we provide ample experimental data to assess the new method.

Unique Normalization for Shallow TRS
Guillem Godoy, Florent Jacquemard

Computation with a term rewrite system (TRS) consists in the application of its rules from a given starting term until a normal form is reached, which is considered the result of the computation. The unique normalization (UN) property for a TRS R states that any starting term can reach at most one normal form when R is used, i.e. that the computation with R is unique.
We study the decidability of this property for classes of TRS defined by syntactic restrictions such as linearity (variables can occur only once in each side of the rules), flatness (sides of the rules have height at most one) and shallowness (variables occur at depth at most one in the rules).
We prove that UN is decidable in polynomial time for shallow and linear TRS, using tree automata techniques. This result is very near to the limits of decidability, since this property is known undecidable even for very restricted classes like right-ground TRS, flat TRS and also right-flat and linear TRS. We also show that UN is even undecidable for flat and right-linear TRS. The latter result is in contrast with the fact that many other natural properties like reachability, termination, confluence, weak normalization, etc. are decidable for this class of TRS.

The Existential Fragment of the One-Step Parallel Rewriting Theory
Aleksy Schubert

It is known that the first-order theory with a single predicate → that denotes a one-step rewriting reduction on terms is undecidable already for formulae with an ∃∀ prefix. Several decidability results exist for the fragment of the theory in which the formulae start with the ∃ prefix only. This paper considers a similar fragment for a predicate →p which denotes the parallel one-step rewriting reduction. We show that the first-order theory of →p is undecidable already for formulae with an ∃^7 prefix (seven existential quantifiers) and left-linear rewrite systems.

Proving Confluence of Term Rewriting Systems Automatically
Takahito Aoto, Junichi Yoshida, Yoshihito Toyama

We have developed an automated confluence prover for term rewriting systems (TRSs). This paper presents theoretical and technical ingredients that have been used in our prover. A distinctive feature of our prover is incorporation of several divide-and-conquer criteria such as those for commutative (Toyama, 1988), layer-preserving (Ohlebusch, 1994) and persistent (Aoto & Toyama, 1997) combinations. For a TRS to which direct confluence criteria do not apply, the prover decomposes it into components and tries to apply direct confluence criteria to each component. Then the prover combines these results to infer the (non-)confluence of the whole system. To the best of our knowledge, an automated confluence prover based on such an approach has been unknown.

A Proof Theoretic Analysis of Intruder Theories
Alwen Tiu, Rajeev Goré

We consider the problem of intruder deduction in security protocol analysis: that is, deciding whether a given message M can be deduced from a set of messages Γ under the theory of blind signatures and arbitrary convergent equational theories modulo associativity and commutativity (AC) of certain binary operators. The traditional formulations of intruder deduction are usually given in natural-deduction-like systems and proving decidability requires significant effort in showing that the rules are "local" in some sense. By using the well-known translation between natural deduction and sequent calculus, we recast the intruder deduction problem as proof search in sequent calculus, in which locality is immediate. Using standard proof theoretic methods, such as permutability of rules and cut elimination, we show that the intruder deduction problem can be reduced, in polynomial time, to the elementary deduction problems, which amounts to solving certain equations in the underlying individual equational theories. We further show that this result extends to combinations of disjoint AC-convergent theories whereby the decidability of intruder deduction under the combined theory reduces to the decidability of elementary deduction in each constituent theory. Although various researchers have reported similar results for individual cases, our work shows that these results can be obtained using a systematic and uniform methodology based on the sequent calculus.

Flat and One-Variable Clauses for Single Blind Copying Protocols: The XOR Case
Helmut Seidl, Kumar Neeraj Verma

In cryptographic protocols with the single blind copying restriction, at most one piece of unknown data is allowed to be copied in each step of the protocol. The secrecy problem for such protocols can be modeled as the satisfiability problem for the class of first-order Horn clauses called flat and one-variable Horn clauses, and is known to be DEXPTIME-complete. We show that when an XOR operator is additionally present, then the secrecy problem is decidable in 3-EXPTIME. We also note that replacing XOR by the theory of associativity-commutativity or by the theory of Abelian groups, or removing some of the syntactic restrictions on the clauses, leads to undecidability.

Protocol Security and Algebraic Properties: Decision Results for a Bounded Number of Sessions
Sergiu Bursuc, Hubert Comon-Lundh

We consider the problem of deciding the security of cryptographic protocols for a bounded number of sessions, taking into account some algebraic properties of the security primitives, for instance Abelian group properties. We propose a general method for deriving decision algorithms, splitting the task into four properties of the rewriting system describing the intruder capabilities: locality, conservativity, finite variant property and decidability of one-step deducibility constraints. We illustrate this method on a non-trivial example, combining several Abelian group properties, exponentiation and a homomorphism, showing a decidability result for this combination.

YAPA: A Generic Tool for Computing Intruder Knowledge
Mathieu Baudet, Véronique Cortier, Stéphanie Delaune

Reasoning about the knowledge of an attacker is a necessary step in many formal analyses of security protocols. In the framework of the applied pi calculus, as in similar languages based on equational logics, knowledge is typically expressed by two relations: deducibility and static equivalence. Several decision procedures have been proposed for these relations under a variety of equational theories. However, each theory has its particular algorithm, and none has been implemented so far.
We provide a generic procedure for deducibility and static equivalence that takes as input any convergent rewrite system. We show that our algorithm covers all the existing decision procedures for convergent theories. We also provide an efficient implementation, and compare it briefly with the more general tool ProVerif.

Well-Definedness of Streams by Termination
Hans Zantema

Streams are infinite sequences over a given data type. A stream specification is a set of equations intended to define a stream. We propose a transformation from such a stream specification to a TRS in such a way that termination of the resulting TRS implies that the stream specification admits a unique solution. As a consequence, proving such well-definedness of several interesting stream specifications can be done fully automatically using present powerful tools for proving TRS termination.

Modularity of Convergence in Infinitary Rewriting
Stefan Kahrs

Properties of Term Rewriting Systems are called modular iff they are preserved under disjoint union, i.e. when combining two Term Rewriting Systems with disjoint signatures. Convergence is the property of Infinitary Term Rewriting Systems that all reduction sequences converge to a limit. Strong Convergence requires in addition that no redex position in a reduction sequence is used infinitely often.
In this paper it is shown that Strong Convergence is a modular property, lifting a restriction from a known result by Simonsen, and that Convergence is modular for non-collapsing Infinitary Term Rewriting Systems.

A Heterogeneous Pushout Approach to Term-Graph Transformation
Dominique Duval, Rachid Echahed, Frédéric Prost

We address the problem of cyclic termgraph rewriting. We propose a new framework where rewrite rules are tuples of the form (L,R,τ,σ) such that L and R are termgraphs representing the left-hand and the right-hand sides of the rule, τ is a mapping from the nodes of L to those of R and σ is a partial function from nodes of R to nodes of L. The mapping τ describes how incident edges of the nodes in L are connected in R; it is not required to be a graph morphism as in classical algebraic approaches to graph transformation. The role of σ is to indicate the parts of L to be cloned (copied). Furthermore, we introduce a notion of heterogeneous pushout and define rewrite steps as heterogeneous pushouts in a given category. Among the features of the proposed rewrite systems, we quote the ability to perform local and global redirection of pointers, addition and deletion of nodes as well as cloning and collapsing substructures.

An Explicit Framework for Interaction Nets
Marc de Falco

Interaction nets are a graphical formalism inspired by Linear Logic proof-nets, often used to study higher-order rewriting, e.g. β-reduction. Traditional presentations of interaction nets are based on graph theory and rely on its elementary properties. We give here a more explicit presentation based on notions borrowed from Girard's Geometry of Interaction: interaction nets are presented as partial permutations, and a composition of nets, the gluing, is derived from the execution formula. We then define contexts and reduction as the context closure of rules. We prove strong confluence of the reduction within our framework and show how interaction nets can be viewed as the quotient of some generalized proof-nets.

Dual Calculus with Inductive and Coinductive Types
Daisuke Kimura, Makoto Tatsuta

This paper gives an extension of Dual Calculus by introducing inductive types and coinductive types. The same duality as in Dual Calculus is shown to hold in the new system; that is, we define an involution for the new system and prove that it preserves both typing and reduction. The duality between inductive types and coinductive types is shown by the existence of the involution that maps an inductive type and a coinductive type to each other. Strong normalization in this system is also proved. First, strong normalization in second-order Dual Calculus is shown by translating it into second-order symmetric lambda calculus. Next, strong normalization in Dual Calculus with inductive and coinductive types is proved by translating it into second-order Dual Calculus.

Comparing Böhm-Like Trees
Jeroen Ketema

Extending the infinitary rewriting definition of Böhm-like trees to infinitary Combinatory Reduction Systems (iCRSs), we show that each Böhm-like tree defined by means of infinitary rewriting can also be defined by means of a direct approximant function. In addition, we show that counterexamples exist to the reverse implication.
This paper extends earlier unpublished work from the author's Ph.D. thesis [1].

The Derivational Complexity Induced by the Dependency Pair Method
Georg Moser, Andreas Schnabl

We study the derivational complexity induced by the (basic) dependency pair method. Suppose the derivational complexity induced by a termination method is closed under elementary functions. We show that the derivational complexity induced by the dependency pair method based on this termination technique is the same as for the direct technique. Therefore, the derivational complexity induced by the dependency pair method based on lexicographic path orders or multiset path orders is multiple recursive or primitive recursive, respectively. Moreover, for the dependency pair method based on Knuth-Bendix orders, we obtain that the derivational complexity function is majorised by the Ackermann function. These characterisations are essentially optimal.

Local Termination
Jörg Endrullis, Roel C. de Vrijer, Johannes Waldmann

The characterization of termination using well-founded monotone algebras has been a milestone on the way to automated termination techniques, of which we have seen an extensive development over the past years. Both the semantic characterization and most known termination methods are concerned with global termination, uniformly of all the terms of a term rewriting system (TRS). In this paper we consider local termination, of specific sets of terms within a given TRS.
The principal goal of this paper is to generalize the semantic characterization of global termination to local termination. This is made possible by allowing the well-founded monotone algebras to be partial. We show that our results can be applied in the development of techniques for proving local termination. We give several examples, among them a verifiable characterization of the terminating S-terms in CL.

VMTL - A Modular Termination Laboratory
Felix Schernhammer, Bernhard Gramlich

The automated analysis of termination of term rewriting systems (TRSs) has drawn a lot of attention in the scientific community during the last decades and many different methods and approaches have been developed for this purpose. We present VMTL (Vienna Modular Termination Laboratory), a tool implementing some of the most recent and powerful algorithms for termination analysis of TRSs, while providing an open interface that allows users to easily plug in new algorithms in a modular fashion according to the widely adopted dependency pair framework. Apart from modular extensibility, VMTL focuses on analyzing the termination behaviour of conditional term rewriting systems (CTRSs). Using one of the latest transformational techniques, the resulting restricted termination problems (for unconditional context-sensitive TRSs) are processed with dedicated algorithms.

Tyrolean Termination Tool 2
Martin Korp, Christian Sternagel, Harald Zankl, Aart Middeldorp

This paper describes the second edition of the Tyrolean Termination Tool - a fully automatic termination analyzer for first-order term rewrite systems. The main features of this tool are its (non-)termination proving power, its speed, its flexibility due to a strategy language, and the fact that the source code of the whole project is freely available. The clean design, together with a stand-alone OCaml library for term rewriting, makes it a perfect starting point for other tools concerned with rewriting, as well as for experimental implementations of new termination methods.

From Outermost to Context-Sensitive Rewriting
Jörg Endrullis, Dimitri Hendriks

We define a transformation from term rewriting systems (TRSs) to context-sensitive TRSs in such a way that termination of the target system implies outermost termination of the original system. For the class of left-linear TRSs the transformation is complete. Thereby state-of-the-art termination methods and automated termination provers for context-sensitive rewriting become available for proving termination of outermost rewriting. The translation has been implemented in Jambox, making it the most successful tool in the outermost rewriting category of the last edition of the annual termination competition.

A Fully Abstract Semantics for Constructor Systems
Francisco Javier López-Fraguas, Juan Rodríguez-Hortalá, Jaime Sánchez-Hernández

Constructor-based term rewriting systems are a useful subclass of TRSs, in particular for programming purposes. In this kind of system, constructors determine a universe of values, which are the expected output of computations. It would then be natural to think of a semantics associating each expression with the set of its reachable values. Somewhat surprisingly, the resulting semantics has poor properties, for it is neither compositional nor fully abstract when non-confluent systems are considered. In this paper we propose a novel semantics for expressions in constructor systems which is compositional and fully abstract (with respect to sensible observation functions, in particular the set of reachable values of an expression), and can therefore serve as an appropriate basis for semantics-based analysis or manipulation of this kind of rewrite system.

The Π02-Completeness of Most of the Properties of Rewriting Systems You Care About (and Productivity)
Jakob Grue Simonsen

Most of the standard pleasant properties of term rewriting systems are undecidable; to wit: local confluence, confluence, normalization, termination, and completeness.
Mere undecidability is insufficient to rule out a number of possibly useful properties: For instance, if the set of normalizing term rewriting systems were recursively enumerable, there would be a program yielding "yes" in finite time if applied to any normalizing term rewriting system.
The contribution of this paper is to show that (the uniform version of) each property in the list above (as well as the property of being a productive specification of a stream) is complete for the class Π02. Thus, there is neither a program that enumerates the set of rewriting systems enjoying any one of these properties, nor one that enumerates the set of systems that do not.
For normalization and termination we show both the ordinary versions and the ground versions (where rules may contain variables, but only ground terms may be rewritten) to be Π02-complete. For local confluence, confluence, and completeness, we show the ground versions to be Π02-complete.

Unification in the Description Logic EL
Franz Baader, Barbara Morawska

The Description Logic EL has recently drawn considerable attention since, on the one hand, important inference problems such as the subsumption problem are polynomial. On the other hand, EL is used to define large biomedical ontologies. Unification in Description Logics has been proposed as a novel inference service that can, for example, be used to detect redundancies in ontologies. The main result of this paper is that unification in EL is decidable. More precisely, EL-unification is NP-complete, and thus has the same complexity as EL-matching. We also show that, w.r.t. the unification type, EL is less well-behaved: it is of type zero, which in particular implies that there are unification problems that have no finite complete set of unifiers.

Unification with Singleton Tree Grammars
Adrià Gascón, Guillem Godoy, Manfred Schmidt-Schauß

First-order term unification is an essential concept in areas like functional and logic programming, automated deduction, deductive databases, artificial intelligence, information retrieval, and compiler design. We build upon recent developments in general grammar-based compression mechanisms for terms, which are more general than dags, and investigate algorithms for first-order unification of compressed terms.
We prove that the first-order unification of compressed terms is decidable in polynomial time, and also that a compressed representation of the most general unifier can be computed in polynomial time.
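For background, here is a minimal sketch of classical first-order unification on plain, uncompressed terms (a Robinson-style algorithm, not the paper's polynomial-time procedure on grammar-compressed terms). Terms are strings for variables or `(symbol, args)` tuples; all names are illustrative.

```python
# Terms: a variable is a string, a function application is (symbol, [args]).

def unify(s, t, subst=None):
    """Return a most general unifier as a dict, or None if none exists."""
    if subst is None:
        subst = {}
    s, t = walk(s, subst), walk(t, subst)
    if s == t:
        return subst
    if isinstance(s, str):
        return bind(s, t, subst)
    if isinstance(t, str):
        return bind(t, s, subst)
    f, sargs = s
    g, targs = t
    if f != g or len(sargs) != len(targs):
        return None  # clash of function symbols or arities
    for a, b in zip(sargs, targs):
        subst = unify(a, b, subst)
        if subst is None:
            return None
    return subst

def walk(term, subst):
    """Follow variable bindings until a non-bound term is reached."""
    while isinstance(term, str) and term in subst:
        term = subst[term]
    return term

def occurs(var, term, subst):
    term = walk(term, subst)
    if term == var:
        return True
    if isinstance(term, tuple):
        return any(occurs(var, a, subst) for a in term[1])
    return False

def bind(var, term, subst):
    if occurs(var, term, subst):  # occurs check keeps unifiers finite
        return None
    return {**subst, var: term}

# f(x, g(y)) unifies with f(g(z), x) via x -> g(z), y -> z:
mgu = unify(("f", ["x", ("g", ["y"])]), ("f", [("g", ["z"]), "x"]))
print(mgu)  # {'x': ('g', ['z']), 'y': 'z'}
```

Note that the naive occurs check above is one source of exponential behaviour that dag- and grammar-based representations, as in the paper, are designed to avoid.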

Unification and Narrowing in Maude 2.4
Manuel Clavel, Francisco Durán, Steven Eker, Santiago Escobar, Patrick Lincoln, Narciso Martí-Oliet, José Meseguer, Carolyn L. Talcott

Maude is a high-performance reflective language and system supporting both equational and rewriting logic specification and programming for a wide range of applications, and has a relatively large worldwide user and open-source developer base. This paper introduces novel features of Maude 2.4 including support for unification and narrowing. Unification is supported in Core Maude, the core rewriting engine of Maude, with commands and metalevel functions for order-sorted unification modulo some frequently occurring equational axioms. Narrowing is currently supported in its Full Maude extension. We also give a brief summary of the most important features of Maude 2.4 that were not part of Maude 2.0 and earlier releases. These features include communication with external objects, a new implementation of its module algebra, and new predefined libraries. We also review some new Maude applications.

2008

Modular Termination of Basic Narrowing
María Alpuente, Santiago Escobar, José Iborra

Basic narrowing is a restricted form of narrowing which constrains narrowing steps to a set of non-blocked (or basic) positions. Basic narrowing has a number of important applications including equational unification in canonical theories. Another application is analyzing termination of narrowing by checking the termination of basic narrowing, as done in pioneering work by Hullot. In this work, we study the modularity of termination of basic narrowing in hierarchical combinations of TRSs, including a generalization of proper extensions with shared subsystem. This provides new algorithmic criteria to prove termination of basic narrowing.

Linear-algebraic lambda-calculus: higher-order, encodings, and confluence.
Pablo Arrighi, Gilles Dowek

We introduce a minimal language combining higher-order computation and linear algebra. This language extends the λ-calculus with the possibility to make arbitrary linear combinations of terms α.t + β.u. We describe how to "execute" this language in terms of a few rewrite rules, and justify them through the two fundamental requirements that the language be a language of linear operators, and that it be higher-order. We mention the perspectives of this work in the field of quantum computation, whose circuits we show can be easily encoded in the calculus. Finally, we prove the confluence of the calculus, which is our main result.

Term-Graph Rewriting Via Explicit Paths
Emilie Balland, Pierre-Étienne Moreau

The notion of path is classical in graph theory but not directly used in the term rewriting community. The main idea of this work is to raise the notion of path to the level of first-order terms, i.e. paths become part of the terms and not just meta-information about them. These paths are represented by words of integers (positive or negative) and are interpreted as relative addresses in terms. In this way, paths can also be seen as a generalization of the classical notion of position for first-order terms, and are inspired by de Bruijn indices.
In this paper, we define an original framework called Referenced Term Rewriting where paths are used to represent pointers between subterms. Using this approach, any term-graph rewriting system can be simulated using a term rewrite-based environment.
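The idea of paths as words of integers can be illustrated with a small hypothetical sketch (the concrete encoding in the paper may differ): positive integers descend into a child, negative integers undo a descent, and the relative address between two positions drops their common prefix.

```python
# Positions are lists of child indices, as usual in term rewriting.
# A relative path from p to q first walks up out of p (negative entries)
# and then down into q (positive entries).

def relative_path(p, q):
    """Relative address from position p to position q."""
    i = 0
    while i < len(p) and i < len(q) and p[i] == q[i]:
        i += 1  # skip the common prefix
    # -k undoes a descent into child k; then descend along the rest of q
    return [-k for k in reversed(p[i:])] + q[i:]

# From the subterm at position [1, 2] to the one at [1, 3, 1]:
print(relative_path([1, 2], [1, 3, 1]))  # [-2, 3, 1]
```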

Finer Is Better: Abstraction Refinement for Rewriting Approximations
Yohan Boichut, Roméo Courbis, Pierre-Cyrille Héam, Olga Kouchnarenko

Term rewriting systems are now commonly used as a modeling language for programs or systems. On such rewriting-based models, reachability analysis, i.e. proving or disproving that a given term is reachable from a set of input terms, provides an efficient verification technique. For disproving reachability (i.e. proving non-reachability of a term) on non-terminating and non-confluent rewriting models, Knuth-Bendix completion and other usual rewriting techniques do not apply. Using the tree automaton completion technique, it has been shown that the non-reachability of a term t can be established by computing an over-approximation of the set of reachable terms and proving that t is not in the over-approximation. However, when the term t is in the approximation, nothing can be said.
In this paper, we improve this approach as follows: given a term t, we try to compute an over-approximation which does not contain t, using an approximation refinement that we propose. If the approximation refinement fails, then t is a reachable term. This semi-algorithm has been prototyped in the Timbuk tool. We present some experiments with this prototype showing the interest of such an approach for verification on rewriting models.

A Needed Rewriting Strategy for Data-Structures with Pointers
Rachid Echahed, Nicolas Peltier

We propose a reduction strategy for systems of rewrite rules operating on term-graphs. These term-graphs are intended to encode pointer-based data-structures that are commonly used in programming, with cycles and sharing. We show that this reduction strategy is optimal w.r.t. a given dependency schema, which intuitively encodes the "interferences" among the nodes in the term-graphs. We provide a new way of computing such dependency schemata.

Effectively Checking the Finite Variant Property
Santiago Escobar, José Meseguer, Ralf Sasse

An equational theory decomposed into a set B of equational axioms and a set Δ of rewrite rules has the finite variant (FV) property in the sense of Comon-Lundh and Delaune iff for each term t there is a finite set {t1,...,tn} of →Δ,B-normalized instances of t so that any instance of t normalizes to an instance of some ti modulo B. This is a very useful property for cryptographic protocol analysis, and for solving both unification and disunification problems. Yet, at present the property has to be established by hand, giving a separate mathematical proof for each given theory: no checking algorithms seem to be known. In this paper we give both a necessary and a sufficient condition for FV from which we derive an algorithm ensuring the sufficient condition, and thus FV. This algorithm can check automatically a number of examples of FV known in the literature.

Dependency Pairs for Rewriting with Built-In Numbers and Semantic Data Structures
Stephan Falke, Deepak Kapur

This paper defines an expressive class of constrained equational rewrite systems that supports the use of semantic data structures (e.g., sets or multisets) and contains built-in numbers, thus extending our previous work presented at CADE 2007 [6]. These rewrite systems, which are based on normalized rewriting on constructor terms, allow the specification of algorithms in a natural and elegant way. Built-in numbers are helpful for this since numbers are a primitive data type in every programming language. We develop a dependency pair framework for these rewrite systems, resulting in a flexible and powerful method for showing termination that can be automated effectively. Various powerful techniques are developed within this framework, including a subterm criterion and reduction pairs that need to consider only subsets of the rules and equations. It is well-known from the dependency pair framework for ordinary rewriting that these techniques are often crucial for a successful automatic termination proof. Termination of a large collection of examples can be established using the presented techniques.

Maximal Termination
Carsten Fuhs, Jürgen Giesl, Aart Middeldorp, Peter Schneider-Kamp, René Thiemann, Harald Zankl

We present a new approach for termination proofs that uses polynomial interpretations (with possibly negative coefficients) together with the "maximum" function. To obtain a powerful automatic method, we solve two main challenges: (1) We show how to adapt the latest developments in the dependency pair framework to our setting. (2) We show how to automate the search for such interpretations by integrating "max" into recent SAT-based methods for polynomial interpretations. Experimental results support our approach.

Usable Rules for Context-Sensitive Rewrite Systems
Raúl Gutiérrez, Salvador Lucas, Xavier Urbain

Recently, the dependency pairs (DP) approach has been generalized to context-sensitive rewriting (CSR). Although the context-sensitive dependency pairs (CS-DP) approach provides a very good basis for proving termination of CSR, the current developments basically correspond to a ten-year-old DP approach. Thus, adapting the recently introduced dependency pair techniques to obtain a more powerful approach becomes an important issue. In this direction, usable rules are one of the most interesting and powerful notions. Usable rules have indeed been investigated in connection with proofs of innermost termination of CSR. However, the existing results apply to a quite restricted class of systems. In this paper, we introduce a notion of usable rules that can be used in proofs of termination of CSR with arbitrary systems. Our benchmarks show that the performance of the CS-DP approach is much better when such usable rules are considered in proofs of termination of CSR.

Combining Equational Tree Automata over AC and ACI Theories
Joe Hendrix, Hitoshi Ohsaki

In this paper, we study combining equational tree automata in two different senses: (1) whether decidability results about equational tree automata over disjoint theories E1 and E2 imply similar decidability results in the combined theory E1∪E2 ; (2) checking emptiness of a language obtained from the Boolean combination of regular equational tree languages. We present a negative result for the first problem. Specifically, we show that the intersection-emptiness problem for tree automata over a theory containing at least one AC symbol, one ACI symbol, and 4 constants is undecidable despite being decidable if either the AC or ACI symbol is removed. Our result shows that decidability of intersection-emptiness is a non-modular property even for the union of disjoint theories. Our second contribution is to show a decidability result which implies the decidability of two open problems: (1) If idempotence is treated as a rule f(x,x) → x rather than an equation f(x,x) = x, is it decidable whether an AC tree automata accepts an idempotent normal form? (2) If E contains a single ACI symbol and arbitrary free symbols, is emptiness decidable for a Boolean combination of regular E -tree languages?

Closure of Hedge-Automata Languages by Hedge Rewriting
Florent Jacquemard, Michaël Rusinowitch

We consider rewriting systems for unranked ordered terms, i.e. trees where the number of successors of a node is not determined by its label, and is not a priori bounded. The rewriting systems are defined such that variables in the rewrite rules can be substituted by hedges (sequences of terms) instead of just terms. Consequently, this notion of rewriting subsumes both standard term rewriting and word rewriting.
We investigate some preservation properties for two classes of languages of unranked ordered terms under this generalization of term rewriting. The considered classes include languages of hedge automata (HA) and some extension (called CF-HA) with context-free languages in transitions, instead of regular languages.
In particular, we show that the set of unranked terms reachable from a given HA language, using a so-called inverse context-free rewrite system, is a HA language. The proof, based on a HA completion procedure, reuses and combines known techniques with non-trivial adaptations. Moreover, we prove, with different techniques, that the closure of CF-HA languages with respect to restricted context-free rewrite systems, the symmetric case of the above rewrite systems, is a CF-HA language. As a consequence, the problems of ground reachability and regular hedge model checking are decidable in both cases. We give several counterexamples showing that we cannot relax the restrictions.

On Normalisation of Infinitary Combinatory Reduction Systems
Jeroen Ketema

For fully-extended, orthogonal infinitary Combinatory Reduction Systems, we prove that terms with perpetual reductions starting from them do not have (head) normal forms. Using this, we show that
1. needed reduction strategies are normalising for fully-extended, orthogonal infinitary Combinatory Reduction Systems, and that
2. weak and strong normalisation coincide for such systems as a whole and, in case reductions are non-erasing, also for terms.

Innermost Reachability and Context Sensitive Reachability Properties Are Decidable for Linear Right-Shallow Term Rewriting Systems
Yoshiharu Kojima, Masahiko Sakai

The reachability problem is to decide, for two given terms s and t and a term rewriting system R, whether t is reachable from s by R. Since this problem is undecidable in general, effort has been devoted to finding subclasses of term rewriting systems in which reachability is decidable. However, few decidability results exist for the innermost reduction strategy or for context-sensitive rewriting.
In this paper, we show that innermost reachability and context-sensitive reachability are decidable for linear right-shallow term rewriting systems. Our approach is based on the tree automata technique that is commonly used for the analysis of reachability and related properties.

Arctic Termination ...Below Zero
Adam Koprowski, Johannes Waldmann

We introduce the arctic matrix method for automatically proving termination of term rewriting. We use vectors and matrices over the arctic semiring: the natural numbers extended with -∞, with the operations "max" and "plus". This extends the matrix method for term rewriting and the arctic matrix method for string rewriting. In combination with the dependency pairs transformation, this allows for some conceptually simple termination proofs in cases where only much more involved proofs were known before. We further generalize to arctic numbers "below zero": the integers extended with -∞. This makes it possible to treat some termination problems with symbols that require a predecessor semantics. The contents of the paper have been formally verified in the Coq proof assistant, and the formalization has been contributed to the CoLoR library of certified termination techniques. This allows formal verification of termination proofs using the arctic matrix method. We also report on experiments with an implementation of this method which, compared to results from 2007, outperforms TPA (winner of the certified termination competition for term rewriting) and, in the string rewriting category, is as powerful as Matchbox was, but now with all proofs certified.
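The arctic (max-plus) semiring the abstract describes is easy to sketch. The following hypothetical Python fragment shows the semiring operations and the induced matrix product, which is the building block of arctic matrix interpretations; it is an illustration, not the paper's Coq-verified implementation.

```python
# Arctic semiring: carrier N ∪ {-∞}, "addition" is max, "multiplication"
# is ordinary plus. -∞ is the semiring zero, 0 is the semiring one.

NEG_INF = float("-inf")  # stands for -∞

def arctic_add(a, b):
    return max(a, b)

def arctic_mul(a, b):
    # -∞ is absorbing for "plus": -∞ + x = -∞
    if a == NEG_INF or b == NEG_INF:
        return NEG_INF
    return a + b

def arctic_matmul(A, B):
    """Matrix product over the arctic semiring (max of entrywise sums)."""
    n, m, p = len(A), len(B), len(B[0])
    return [
        [max(arctic_mul(A[i][k], B[k][j]) for k in range(m)) for j in range(p)]
        for i in range(n)
    ]

A = [[0, NEG_INF], [1, 2]]
B = [[3, 0], [NEG_INF, 1]]
print(arctic_matmul(A, B))  # [[3, 0], [4, 3]]
```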

Logics and Automata for Totally Ordered Trees
Marco Kuhlmann, Joachim Niehren

A totally ordered tree is a tree equipped with an additional total order on its nodes. It provides a formal model for data that comes with both a hierarchical and a sequential structure; one example for such data are natural language sentences, where a sequential structure is given by word order, and a hierarchical structure is given by grammatical relations between words. In this paper, we study monadic second-order logic (MSO) for totally ordered terms. We show that the MSO satisfiability problem of unrestricted structures is undecidable, but give a decision procedure for practically relevant sub-classes, based on tree automata.

Diagram Rewriting for Orthogonal Matrices: A Study of Critical Peaks
Yves Lafont, Pierre Rannou

Orthogonal diagrams represent decompositions of isometries of Rn into symmetries and rotations. A convergent (that is, noetherian and confluent) rewrite system for this structure was introduced by the first author. One of the rules is similar to the Yang-Baxter equation. It involves a map h : ]0, π[3 → ]0, π[3.
In order to obtain the algebraic properties of h, we study the confluence of critical peaks (or critical pairs) for our rewrite system. For that purpose, we introduce parametric diagrams describing the calculation of angles of rotations generated by rewriting. In particular, one of those properties is related to the tetrahedron equation (also called Zamolodchikov equation).

Nominal Unification from a Higher-Order Perspective
Jordi Levy, Mateu Villaret

Nominal Logic is an extension of first-order logic with equality, name-binding, name-swapping, and freshness of names. In contrast to higher-order logic, bound variables are treated as atoms, and only free variables are proper unknowns in nominal unification. This allows "variable capture", breaking a fundamental principle of the lambda-calculus. Despite this difference, nominal unification can be seen from a higher-order perspective. From this view, we show that nominal unification can be reduced to a particular fragment of higher-order unification problems: higher-order pattern unification. This reduction proves that nominal unification can be decided in quadratic deterministic time.

Functional-Logic Graph Parser Combinators
Steffen Mazanek, Mark Minas

Parser combinators are a popular technique among functional programmers for writing parsers. They allow the definition of parsers for string languages in a manner quite similar to BNF rules. In recent papers we have shown that the combinator approach is also beneficial for graph parsing. However, we have noted as well that certain graph languages are difficult to describe in a purely functional way.
In this paper we demonstrate that functional-logic languages can be used to conveniently implement graph parsers. To this end, we provide a direct mapping from hyperedge replacement grammars to graph parsers. As in the string setting, our combinators closely reflect the building blocks of this grammar formalism. Finally, we show by example that our framework is strictly more powerful than hyperedge replacement grammars.
We make heavy use of key features of both the functional and the logic programming approach: Higher-order functions allow the treatment of parsers as first class citizens. Non-determinism and logical variables are beneficial for dealing with errors and incomplete information. Parsers can even be applied backwards and thus be used as generators or for graph completion.

Proving Quadratic Derivational Complexities Using Context Dependent Interpretations
Georg Moser, Andreas Schnabl

In this paper we study context dependent interpretations, a semantic termination method extending interpretations over the natural numbers, introduced by Hofbauer. We present two subclasses of context dependent interpretations and establish tight upper bounds on the induced derivational complexities. In particular we delineate a class of interpretations that induces quadratic derivational complexity. Furthermore, we present an algorithm for mechanically proving termination of rewrite systems with context dependent interpretations. This algorithm has been implemented and we present ample numerical data for the assessment of the viability of the method.

Tree Automata for Non-linear Arithmetic
Naoki Kobayashi, Hitoshi Ohsaki

Tree automata modulo associativity and commutativity axioms, called AC tree automata, accept trees by iterating the transition modulo equational reasoning. The class of languages accepted by monotone AC tree automata is known to include the solution set of the inequality x×y≥z , which implies that the class properly includes the AC closure of regular tree languages. In the paper, we characterize more precisely the expressiveness of monotone AC tree automata, based on the observation that, in addition to polynomials, a class of exponential constraints (called monotone exponential Diophantine inequalities) can be expressed by monotone AC tree automata with a minimal signature. Moreover, we show that a class of arithmetic logic consisting of monotone exponential Diophantine inequalities is definable by monotone AC tree automata. The results presented in the paper are obtained by applying our novel tree automata technique, called linearly bounded projection.

Confluence by Decreasing Diagrams
Vincent van Oostrom

The decreasing diagrams technique is a complete method to reduce confluence of a rewrite relation to local confluence. Whereas previous presentations have focused on proving that the technique is correct, here we focus on applicability. We present a simple but powerful generalisation of the technique, requiring peaks to be closed only by conversions instead of valleys, which is demonstrated to further ease applicability.

A Finite Simulation Method in a Non-deterministic Call-by-Need Lambda-Calculus with Letrec, Constructors, and Case
Manfred Schmidt-Schauß, Elena Machkasova

The paper proposes a variation of simulation for checking and proving contextual equivalence in a non-deterministic call-by-need lambda-calculus with constructors, case, seq, and a letrec with cyclic dependencies. It also proposes a novel method to prove its correctness. The calculus' semantics is based on a small-step rewrite semantics and on may-convergence. The cyclic nature of letrec bindings, as well as non-determinism, makes known approaches to prove that simulation implies contextual preorder, such as Howe's proof technique, inapplicable in this setting. The basic technique for the simulation as well as the correctness proof is called pre-evaluation, which computes a set of answers for every closed expression. If simulation succeeds in finite computation depth, then it is guaranteed to show contextual preorder of expressions.

Root-Labeling
Christian Sternagel, Aart Middeldorp

In 2006 Jambox, a termination prover developed by Endrullis, surprised the termination community by winning the string rewriting division and almost beating AProVE in the term rewriting division of the international termination competition. The success of Jambox for strings is partly due to a very special case of semantic labeling. In this paper we integrate this technique, which we call root-labeling, into the dependency pair framework. The result is a simple processor with whose help TTT2 surprised the termination community in 2007 by producing the first automatically generated termination proof of a string rewrite system with non-primitive recursive complexity (Touzet, 1998). Unlike many other recent termination methods, the root-labeling processor is trivial to automate and completely unsuitable for producing human-readable proofs.

Combining Rewriting with Noetherian Induction to Reason on Non-orientable Equalities
Sorin Stratulat

We propose a new (Noetherian) induction schema to reason on equalities and show how to integrate it into implicit induction-based inference systems. Non-orientable conjectures of the form lhs = rhs and their instances can be soundly used as induction hypotheses in rewrite operations. It covers the most important rewriting-based induction proof techniques: i) term rewriting induction if lhs = rhs is orientable, ii) enhanced rewriting induction if lhs and rhs are comparable, iii) ordered rewriting induction if the instances of lhs = rhs are orientable, and iv) relaxed rewriting induction if the instances of lhs = rhs are not comparable. In practice, it helps to automate the (rewrite-based) reasoning on a larger class of non-orientable equalities, such as permutative and associativity equalities.

Deciding Innermost Loops
René Thiemann, Jürgen Giesl, Peter Schneider-Kamp

We present the first method to disprove innermost termination of term rewrite systems automatically. To this end, we first develop a suitable notion of an innermost loop. Second, we show how to detect innermost loops: One can start with any technique amenable to find loops. Then our novel procedure can be applied to decide whether a given loop is an innermost loop. We implemented and successfully evaluated our method in the termination prover AProVE.

Termination Proof of S-Expression Rewriting Systems with Recursive Path Relations
Yoshihito Toyama

S-expression rewriting systems were proposed by the author (RTA 2004) for termination analysis of Lisp-like untyped higher-order functional programs. This paper presents a short and direct proof for the fact that every finite S-expression rewriting system is terminating if it is compatible with a recursive path relation with status. By considering well-founded binary relations instead of well-founded orders, we give a much simpler proof than the one depending on Kruskal's tree theorem.
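The recursive path relations with status mentioned above belong to the family of recursive path orders. As a purely illustrative aside (not taken from the paper), a ground lexicographic path order over an assumed precedence can be sketched in a few lines; the term encoding and the precedence table are inventions of this sketch:

```python
# Toy lexicographic path order (LPO) on ground first-order terms, a
# classical relative of the recursive path relations discussed above.
# A term is a pair (symbol, list_of_subterms); the precedence maps
# symbols to integers (higher = greater). Equal heads are assumed to
# have equal arity. Illustrative sketch only.

def lpo_gt(s, t, prec):
    """Return True if s >_lpo t for the given precedence."""
    f, ss = s
    g, ts = t
    # (1) subterm case: some argument of s is >= t
    if any(si == t or lpo_gt(si, t, prec) for si in ss):
        return True
    # (2) head of s is greater in the precedence:
    #     s must dominate every argument of t
    if prec[f] > prec[g]:
        return all(lpo_gt(s, tj, prec) for tj in ts)
    # (3) equal heads: compare argument lists lexicographically,
    #     and s must still dominate every argument of t
    if f == g:
        for si, ti in zip(ss, ts):
            if si == ti:
                continue
            return lpo_gt(si, ti, prec) and \
                   all(lpo_gt(s, tj, prec) for tj in ts)
    return False

a = ('a', [])
fa = ('f', [a])
gfa = ('g', [fa])
prec = {'f': 2, 'g': 1, 'a': 0}
print(lpo_gt(fa, a, prec))    # True: a is a subterm of f(a)
print(lpo_gt(gfa, fa, prec))  # True: f(a) is a subterm of g(f(a))
print(lpo_gt(fa, fa, prec))   # False: the order is irreflexive
```

Well-foundedness of such path orders is classically obtained via Kruskal's tree theorem; the point of the paper is that for S-expression rewriting a much simpler argument suffices.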

Encoding the Pure Lambda Calculus into Hierarchical Graph Rewriting
Kazunori Ueda

Fine-grained reformulation of the lambda calculus is expected to solve several difficulties with the notion of substitutions - definition, implementation and cost properties. However, previous attempts including those using explicit substitutions and those using Interaction Nets were not ideally simple when it came to the encoding of the pure (as opposed to weak) lambda calculus. This paper presents a novel, fine-grained, and highly asynchronous encoding of the pure lambda calculus using LMNtal, a hierarchical graph rewriting language, and discusses its properties. The major strength of the encoding is that it is significantly simpler than previous encodings, making it promising as an alternative formulation, rather than just the encoding, of the pure lambda calculus. The membrane construct of LMNtal plays an essential role in encoding colored tokens and operations on them. The encoding has been tested using the publicly available LMNtal implementation.

Revisiting Cut-Elimination: One Difficult Proof Is Really a Proof
Christian Urban, Bozhi Zhu

Powerful proof techniques, such as logical relation arguments, have been developed for establishing the strong normalisation property of term- rewriting systems. The first author used such a logical relation argument to establish strong normalising for a cut-elimination procedure in classical logic. He presented a rather complicated, but informal, proof establishing this property. The difficulties in this proof arise from a quite subtle substitution operation, which implements proof transformation that permute cuts over other inference rules. We have formalised this proof in the theorem prover Isabelle/HOL using the Nominal Datatype Package, closely following the informal proof given by the first author in his PhD-thesis. In the process, we identified and resolved a gap in one central lemma and a number of smaller problems in others. We also needed to make one informal definition rigorous. We thus show that the original proof is indeed a proof and that present automated proving technology is adequate for formalising such difficult proofs.

Reduction Under Substitution
Jörg Endrullis, Roel C. de Vrijer

The Reduction-under-Substitution Lemma (RuS), due to van Daalen [Daa80], provides an answer to the following question concerning the lambda calculus: given a reduction M[x:=L] →* N, what can we say about the contribution of the substitution to the result N? It is related to a not very well-known lemma that was conjectured by Barendregt in the early 70's, addressing the similar question as to the contribution of the argument M in a reduction FM →* N. The origin of Barendregt's Lemma lies in undefinability proofs, whereas van Daalen's interest came from its application to the so-called Square Brackets Lemma, which is used in proofs of strong normalization.
In this paper we compare various forms of RuS. We strengthen RuS to multiple substitution and context filling and show how it can be used to give short and perspicuous proofs of undefinability results. Most of these are known as consequences of Berry's Sequentiality Theorem, but some fall outside its scope. We show that RuS can also be used to prove the sequentiality theorem itself. To that purpose we give a further adaptation of RuS, now also involving "bottom" reduction rules, sending unsolvable terms to a bottom element and in the limit producing Böhm trees.

Normalization of Infinite Terms
Hans Zantema

We investigate the property SN, the natural concept related to termination when term rewriting is applied to infinite terms. It turns out that this property can be fully characterized by a variant of monotone algebras equipped with a metric. A fruitful special case is obtained when the algebra is finite, as the required metric properties then come for free. It turns out that the matrix method can be applied to find proofs of SN based on these observations. In this way SN can be proved fully automatically for some interesting examples related to combinatory logic.

2007

Intruders with Caps
Siva Anantharaman, Paliath Narendran, Michaël Rusinowitch

In the analysis of cryptographic protocols, a treacherous set of terms is one from which an intruder can get access to what was intended to be secret, by adding on to the top of a sequence of elements of this set, a cap formed of symbols legally part of his/her knowledge. In this paper, we give sufficient conditions on the rewrite system modeling the intruder's abilities, such as using encryption and decryption functions, to ensure that it is decidable if such caps exist. The following classes of intruder systems are studied: linear, dwindling, Δ-strong, and optimally reducing; and depending on the class considered, the cap problem ("find a cap for a given set of terms") is shown respectively to be in P, NP-complete, decidable, and undecidable.

Tom: Piggybacking Rewriting on Java
Emilie Balland, Paul Brauner, Radu Kopetz, Pierre-Étienne Moreau, Antoine Reilles

We present the Tom language that extends Java with the purpose of providing high level constructs inspired by the rewriting community. Tom furnishes a bridge between a general purpose language and higher level specifications that use rewriting. This approach was motivated by the promotion of rewriting techniques and their integration in large scale applications. Powerful matching capabilities along with a rich strategy language are among Tom's strong points, making it easy to use and competitive with other rule based languages.

Rewriting Approximations for Fast Prototyping of Static Analyzers
Yohan Boichut, Thomas Genet, Thomas P. Jensen, Luka Le Roux

This paper shows how to construct static analyzers using tree automata and rewriting techniques. Starting from a term rewriting system representing the operational semantics of the target programming language and given a program to analyze, we automatically construct an over-approximation of the set of reachable terms, i.e. of the program states that can be reached. The approach enables fast prototyping of static analyzers because modifying the analysis simply amounts to changing the set of rewrite rules defining the approximation. A salient feature of this approach is that the approximation is correct by construction and hence does not require an explicit correctness proof. To illustrate the framework proposed here on a realistic programming language we instantiate it with the Java Virtual Machine semantics and perform class analysis on Java bytecode programs.

Determining Unify-Stable Presentations
Thierry Boy de la Tour, Mnacho Echenim

The class of equational theories defined by so-called unify-stable presentations was recently introduced, as well as a complete and terminating unification algorithm modulo any such theory. However, two equivalent presentations may have a different status, one being unify-stable and the other not. The problem of deciding whether an equational theory admits a unify-stable presentation or not thus remained open. We show that this problem is decidable and that we can compute a unify-stable presentation for any theory, provided one exists. We also provide a fairly efficient algorithm for such a task, and conclude by proving that deciding whether a theory admits a unify-stable presentation and computing such a presentation are problems in the Luks equivalence class.

Confluence of Pattern-Based Calculi
Horatiu Cirstea, Germain Faure

Different pattern calculi integrate the functional mechanisms from the λ-calculus and the matching capabilities from rewriting. Several approaches are used to obtain the confluence but in practice the proof methods share the same structure and each variation on the way pattern-abstractions are applied needs another proof of confluence.
We propose here a generic confluence proof where the way pattern-abstractions are applied is axiomatized. Intuitively, the conditions guarantee that the matching is stable by substitution and by reduction.
We show that our approach directly applies to different pattern calculi, namely the lambda calculus with patterns, the pure pattern calculus and the rewriting calculus. We also characterize a class of matching algorithms and consequently of pattern-calculi that are not confluent.

A Simple Proof That Super-Consistency Implies Cut Elimination
Gilles Dowek, Olivier Hermant

We give a simple and direct proof that super-consistency implies cut elimination in deduction modulo. This proof can be seen as a simplification of the proof that super-consistency implies proof normalization. It also takes ideas from the semantic proofs of cut elimination that proceed by proving the completeness of the cut free calculus. In particular, it gives a generalization, to all super-consistent theories, of the notion of V-complex, introduced in the semantic cut elimination proofs for simple type theory.

Bottom-Up Rewriting Is Inverse Recognizability Preserving
Irène Durand, Géraud Sénizergues

For the whole class of linear term rewriting systems, we define bottom-up rewriting, which is a restriction of the usual notion of rewriting. We show that bottom-up rewriting effectively inverse-preserves recognizability and analyze the complexity of the underlying construction. The Bottom-Up class (BU) is, by definition, the set of linear systems for which every derivation can be replaced by a bottom-up derivation. Membership in BU turns out to be undecidable; we are thus led to define more restricted classes: the classes SBU(k), k ∈ N, of Strongly Bottom-Up(k) systems, for which we show that membership is decidable. We define the class of Strongly Bottom-Up systems by SBU = ∪k ∈ N SBU(k). We give a polynomial sufficient condition for a system to be in SBU. The class SBU strictly contains several classes of systems which were already known to inverse-preserve recognizability.

Adjunction for Garbage Collection with Application to Graph Rewriting
Dominique Duval, Rachid Echahed, Frédéric Prost

We investigate garbage collection of unreachable parts of rooted graphs from a categorical point of view. First, we define this task as the right adjoint of an inclusion functor. We also show that garbage collection may be stated via a left adjoint, hence preserving colimits, followed by two right adjoints. These three adjoints cope well with the different phases of a traditional garbage collector. Consequently, our results should naturally help to better formulate graph transformation steps in order to get rid of garbage (unwanted nodes). We illustrate this point on a particular class of graph rewriting systems based on a double pushout approach and featuring edge redirection. Our approach gives a neat rewriting step akin to the one on terms, where garbage never appears in the reduced term.

Non Strict Confluent Rewrite Systems for Data-Structures with Pointers
Rachid Echahed, Nicolas Peltier

We introduce a notion of rewrite rules operating on a particular class of data-structures, represented as (cyclic) term-graphs. All basic transformations are available: node creation/deletion, node relabeling and edge redirections (including global redirections). This allows one to write algorithms handling pointers that cannot be efficiently specified using existing declarative languages. Such rewrite systems are not confluent in general, even if we stick to orthogonal, left-linear rules. In order to ensure unique normal forms, we introduce a notion of priority ordering between the nodes, which allows the programmer to control the normalization of a graph if needed. The use of total priority orderings makes rewriting purely deterministic, which is not always efficient in practice. To overcome this issue, we then show how to define more flexible strategies, which yield shorter derivations and avoid useless rewriting steps (lazy rewriting).

Symbolic Model Checking of Infinite-State Systems Using Narrowing
Santiago Escobar, José Meseguer

Rewriting is a general and expressive way of specifying concurrent systems, where concurrent transitions are axiomatized by rewrite rules. Narrowing is a complete symbolic method for model checking reachability properties. We show that this method can be reinterpreted as a lifting simulation relating the original system and the symbolic system associated to the narrowing transitions. Since the narrowing graph can be infinite, this lifting simulation only gives us a semi-decision procedure for the failure of invariants. However, we propose new methods for folding the narrowing tree that can in practice result in finite systems that symbolically simulate the original system and can be used to algorithmically verify its properties. We also show how both narrowing and folding can be used to symbolically model check systems which, in addition, have state predicates, and therefore correspond to Kripke structures on which ACTL* and LTL formulas can be algorithmically verified using such finite symbolic abstractions.

Delayed Substitutions
José Espírito Santo

This paper investigates an approach to substitution alternative to the implicit treatment of the λ-calculus and the explicit treatment of explicit substitution calculi. In this approach, substitutions are delayed (but not executed) explicitly. We implement this idea with two calculi, one where substitution is a primitive construction of the calculus, the other where substitution is represented by a β-redex. For both calculi, confluence and (preservation of) strong normalisation are proved (the latter fails for a related system due to Revesz, as we show). Applications of delayed substitutions are of a theoretical nature. The strong normalisation result implies strong normalisation for other calculi, like the computational lambda-calculus, lambda-calculi with generalised applications, or calculi of cut-elimination for sequent calculus. We investigate the computational interpretation of cut-elimination in terms of generation, execution, and delaying of substitutions, paying particular attention to how generalised applications improve this interpretation.

Innermost-Reachability and Innermost-Joinability Are Decidable for Shallow Term Rewrite Systems
Guillem Godoy, Eduard Huntingford

Reachability and joinability are central properties of term rewriting. Unfortunately they are undecidable in general, and even for some restricted classes of term rewrite systems, like shallow term rewrite systems (where variables are only allowed to occur at depth 0 or 1 in the terms of the rules).
Innermost rewriting is one of the most studied and used strategies for rewriting, since it corresponds to the "call by value" computation of programming languages. Hence, it is meaningful to study whether reachability and joinability become decidable for a significant class of term rewrite systems under the innermost strategy.
In this paper we show that reachability and joinability are decidable for shallow term rewrite systems assuming that the innermost strategy is used. All of these results are obtained via the definition of the concept of weak normal form, and a construction of a finite representation of all weak normal forms reachable from every constant. For the particular left-linear shallow case and assuming that the maximum arity of the signature is a constant, these results are obtained with polynomial time complexity.

Termination of Rewriting with Right-Flat Rules
Guillem Godoy, Eduard Huntingford, Ashish Tiwari

Termination and innermost termination are shown to be decidable for term rewrite systems whose right-hand side terms are restricted to be shallow (variables occur at depth at most one) and linear. Innermost termination is also shown to be decidable for shallow rewrite systems. In all cases, we show that nontermination implies nontermination starting from flat terms. The proof is completed by using the useful enabling result that, for right shallow rewrite systems, existence of nonterminating derivations starting from a given term is decidable. We also show that termination is undecidable for shallow rewrite systems. For right-shallow systems, general and innermost termination are both undecidable.

Abstract Critical Pairs and Confluence of Arbitrary Binary Relations
Rémy Haemmerlé, François Fages

In a seminal paper, Huet introduced abstract properties of term rewriting systems, and the confluence analysis of terminating term rewriting systems by critical pairs computation. In this paper, we provide an abstract notion of critical pair for arbitrary binary relations and context operators. We show how this notion applies to the confluence analysis of various transition systems, ranging from classical term rewriting systems to production rules with constraints and partial control strategies, such as the Constraint Handling Rules language CHR. Interestingly, we show in all these cases that some classical critical pairs can be disregarded. The crux of these analyses is the ability to compute critical pairs between states built with general context operators, on which a bounded, not necessarily well-founded, ordering is assumed.

On the Completeness of Context-Sensitive Order-Sorted Specifications
Joe Hendrix, José Meseguer

We propose three different notions of completeness for order-sorted equational specifications supporting context-sensitive rewriting modulo axioms relative to a replacement map μ. Our three notions are: (1) a definition of μ-canonical completeness under which μ-canonical forms coincide with canonical forms; (2) a definition of semantic completeness that guarantees that the μ-operational semantics and standard initial algebra semantics are isomorphic; and (3) an appropriate definition of μ-sufficient completeness with respect to a set of constructor symbols. Based on these notions, we use equational tree automata techniques to obtain decision procedures for checking these three kinds of completeness for equational specifications satisfying appropriate requirements such as weak normalization, ground confluence and sort-decreasingness, and left-linearity. The decision procedures are implemented as an extension of the Maude sufficient completeness checker.

KOOL: An Application of Rewriting Logic to Language Prototyping and Analysis
Mark Hills, Grigore Roşu

This paper presents KOOL, a concurrent, dynamic, object-oriented language defined in rewriting logic. KOOL has been designed as an experimental language, with a focus on making the language easy to extend. This is done by taking advantage of the flexibility provided by rewriting logic, which allows for the rapid prototyping of new language features. An example of this process is illustrated by sketching the addition of synchronized methods. KOOL also provides support for program analysis through language extensions and the underlying capabilities of rewriting logic. This support is illustrated with several examples.

Simple Proofs of Characterizing Strong Normalization for Explicit Substitution Calculi
Kentaro Kikuchi

We present a method of lifting to explicit substitution calculi some characterizations of the strongly normalizing terms of λ-calculus by means of intersection type systems. The method is first illustrated by applying to a composition-free calculus of explicit substitutions, yielding a simpler proof than the previous one by Lengrand et al. Then we present a new intersection type system in the style of sequent calculus, and show that it characterizes the strongly normalizing terms of Dyckhoff and Urban's extension of Herbelin's explicit substitution calculus.

Proving Termination of Rewrite Systems Using Bounds
Martin Korp, Aart Middeldorp

The use of automata techniques to prove the termination of string rewrite systems and left-linear term rewrite systems is advocated by Geser et al. in a recent sequence of papers. We extend their work to non-left-linear rewrite systems. The key to this extension is the introduction of so-called raise rules and the use of tree automata that are not quite deterministic. Furthermore, we present negative solutions to two open problems related to string rewrite systems.

Sequence Unification Through Currying
Temur Kutsia, Jordi Levy, Mateu Villaret

Sequence variables play an interesting role in unification and matching when dealing with terms in an unranked signature. Sequence Unification generalizes Word Unification and seems to be appealing for information extraction in XML documents, program transformation, and rule-based programming.
In this work we study a relation between Sequence Unification and another generalization of Word Unification: Context Unification. We introduce a variant of Context Unification, called Left-Hole Context Unification that serves us to reduce Sequence Unification to it: We define a partial currying procedure to translate sequence unification problems into left-hole context unification problems, and prove soundness of the translation. Furthermore, a precise characterization of the shape of the unifiers allows us to easily reduce Left-Hole Context Unification to (the decidable problem of) Word Unification with Regular Constraints, obtaining then a decidability proof for an extension of Sequence Unification.

The Termination Competition
Claude Marché, Hans Zantema

Since 2004, a Termination Competition has been organized every year. This competition has greatly boosted the development of automatic termination tools, as well as the design of new techniques for proving termination. We present the background, results, and conclusions of the first three editions, and discuss perspectives and challenges for the future.

Random Descent
Vincent van Oostrom

We introduce a method for establishing that a reduction strategy is normalising and minimal, or dually, that it is perpetual and maximal, in the setting of abstract rewriting. While being complete, the method reduces these global properties to the verification of local diagrams. We show its usefulness both by giving uniform proofs of some known results and by establishing new ones.

Correctness of Copy in Calculi with Letrec
Manfred Schmidt-Schauß

Call-by-need lambda calculi with letrec provide a rewriting-based operational semantics for (lazy) call-by-name functional languages. These calculi model the sharing behavior during evaluation more closely than let-based calculi that use a fixpoint combinator. However, currently the knowledge about correctness w.r.t. observational equivalence of modifying the sharing in letrec-based calculi is full of gaps. In this paper we develop a new proof method based on a calculus on infinite trees, generalizing the parallel 1-reduction, for showing correctness of instantiation operations. We demonstrate the method in the small calculus LRλ and show that copying at compile-time can be done without restrictions. We also show that the call-by-need and call-by-name strategies are equivalent w.r.t. contextual equivalence. A consequence is correctness of all the transformations like instantiation, inlining, specialization and common subexpression elimination in LRλ. The result for LRλ also gives an answer to unresolved problems in several papers and thus contributes to the knowledge about deterministic calculi with letrec.
The method also works for a calculus with case and constructors, and also with parallel or. We are also confident that the method scales up for proving correctness of copy-related transformations in non-deterministic lambda calculi if restricted to "deterministic" subterms.

A Characterization of Medial as Rewriting Rule
Lutz Straßburger

Medial is an inference rule scheme that appears in various deductive systems based on deep inference. In this paper we investigate the properties of medial as a rewriting rule independently from logic. We present a graph-theoretical criterion for checking whether there exists a medial rewriting path between two formulas. Finally, we return to logic and apply our criterion to give a combinatorial proof of a decomposition theorem, i.e., a proof-theoretical statement about syntax.

The Maximum Length of Mu-Reduction in Lambda Mu-Calculus
Makoto Tatsuta

This paper gives the exact maximum length of mu-reduction and permutative conversion sequences for an untyped term in lambda mu-calculus with disjunction. This number is described by induction on the number of symbols in a term. It is also shown that leftmost short reduction and innermost null reduction produce the longest reduction sequence.

On Linear Combinations of λ-Terms
Lionel Vaux

We define an extension of λ-calculus with linear combinations, endowing the set of terms with the structure of an R-module, where R is a fixed set of scalars. Terms are moreover subject to identities similar to the usual pointwise definition of linear combinations of functions with values in a vector space. We then extend β-reduction to those algebraic λ-terms as follows: at+u reduces to at'+u as soon as the term t reduces to t' and a is a non-zero scalar. We prove that reduction is confluent.
Under the assumption that the set R of scalars is positive (i.e., a sum of scalars is zero iff all of them are zero), we show that this algebraic λ-calculus is a conservative extension of ordinary λ-calculus. On the other hand, we show that if R admits negative elements, then every term reduces to every other term.
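The module structure and the lifted reduction rule can be made concrete in a small sketch (not from the paper): combinations are finite maps from terms to scalars, the module identities merge like terms and drop zero coefficients, and the rule "at+u reduces to at'+u when t reduces to t'" fires on one summand at a time. Terms are opaque atoms here, and the one-step relation `step` is an invented toy:

```python
# Illustrative sketch: linear combinations of terms as finite maps
# term -> scalar, with the module identities (merging like terms,
# dropping zero coefficients) applied eagerly. The one-step rule
# "a*t + u reduces to a*t' + u when t reduces to t' and a != 0" is
# lifted to combinations.

from collections import Counter

def normalize(comb):
    """Merge like terms and drop zero coefficients."""
    return {t: c for t, c in comb.items() if c != 0}

def add(u, v):
    out = Counter(u)
    out.update(v)          # coefficients of like terms are summed
    return normalize(out)

def scale(a, u):
    return normalize({t: a * c for t, c in u.items()})

def one_step(comb, step):
    """All combinations reachable by reducing one summand once.

    step : dict mapping a term to its one-step reduct (toy relation)."""
    results = []
    for t, a in comb.items():
        if t in step and a != 0:
            rest = {s: c for s, c in comb.items() if s != t}
            results.append(add(rest, scale(a, {step[t]: 1})))
    return results

# 2t + u, with t reducing to t': the redex fires under its coefficient.
step = {'t': "t'"}
comb = add(scale(2, {'t': 1}), {'u': 1})
print(one_step(comb, step))  # the single reduct 2t' + u
```

Note that with negative scalars, t + (-1)t normalizes to the zero combination, which hints at why positivity of R matters for conservativity.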

Satisfying KBO Constraints
Harald Zankl, Aart Middeldorp

This paper presents two new approaches to prove termination of rewrite systems with the Knuth-Bendix order efficiently. The constraints for the weight function and for the precedence are encoded in (pseudo-)propositional logic and the resulting formula is tested for satisfiability. Any satisfying assignment represents a weight function and a precedence such that the induced Knuth-Bendix order orients the rules of the encoded rewrite system from left to right.

Termination by Quasi-periodic Interpretations
Hans Zantema, Johannes Waldmann

We present a new method for automatically proving termination of term rewriting and string rewriting. It is based on the well-known idea of interpretation of terms in natural numbers where every rewrite step causes a decrease. In the dependency pair setting only weak monotonicity is required for these interpretations. For these we use quasi-periodic functions. It turns out that then the decreasingness for rules only needs to be checked for finitely many values, which is easy to implement.
Using this technique we automatically prove termination of over ten string rewriting systems in TPDB for which termination was open until now.

2006

Solving Partial Order Constraints for LPO Termination
Michael Codish, Vitaly Lagoon, Peter J. Stuckey

This paper introduces a new kind of propositional encoding for reasoning about partial orders. The symbols in an unspecified partial order are viewed as variables which take integer values and are interpreted as indices in the order. For a partial order statement on n symbols, each index is represented in ⌈log₂ n⌉ propositional variables, and partial order constraints between symbols are modeled on the bit representations. We illustrate the application of our approach to determine LPO termination for term rewrite systems. Experimental results are unequivocal, indicating orders of magnitude speedups in comparison with current implementations for LPO termination. The proposed encoding is general and relevant to other applications which involve propositional reasoning about partial orders.
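The bit-level idea can be sketched as follows (an invention of this note, not the paper's implementation): each of n symbols gets an index encoded in ⌈log₂ n⌉ bits, a strict-order constraint becomes an unsigned comparison on the bit vectors, and, in place of a SAT solver, all bit assignments are brute-forced:

```python
# Illustrative sketch: symbols get indices encoded in ceil(log2 n)
# propositional variables; a constraint s > t becomes a comparison
# circuit on the bit representations. Instead of a SAT solver we
# brute-force all assignments.

from itertools import product
from math import ceil, log2

def gt(xbits, ybits):
    """Unsigned 'greater than' on big-endian bit tuples (the circuit a
    SAT encoding would express clause by clause)."""
    for xb, yb in zip(xbits, ybits):
        if xb != yb:
            return xb > yb
        # bits equal so far: keep comparing lower-order bits
    return False

def find_indices(symbols, gt_constraints):
    """Find an index assignment satisfying all constraints s > t."""
    k = max(1, ceil(log2(len(symbols))))
    for bits in product(product((0, 1), repeat=k), repeat=len(symbols)):
        asg = dict(zip(symbols, bits))
        if all(gt(asg[s], asg[t]) for s, t in gt_constraints):
            return {s: int(''.join(map(str, b)), 2)
                    for s, b in asg.items()}
    return None

# f > g and g > h is satisfiable with 2 bits per symbol ...
print(find_indices(['f', 'g', 'h'], [('f', 'g'), ('g', 'h')]))
# ... while a precedence cycle is not.
print(find_indices(['f', 'g'], [('f', 'g'), ('g', 'f')]))  # None
```

The payoff of the logarithmic encoding is that transitivity and antisymmetry come for free from integer comparison, instead of being axiomatized by clauses over the symbols.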

Computationally Equivalent Elimination of Conditions
Traian Florin Şerbănută, Grigore Roşu

An automatic and easy to implement transformation of conditional term rewrite systems into computationally equivalent unconditional term rewrite systems is presented. No special support is needed from the underlying unconditional rewrite engine. Since unconditional rewriting is more amenable to parallelization, our transformation is expected to lead to efficient concurrent implementations of rewriting.

On the Correctness of Bubbling
Sergio Antoy, Daniel W. Brown, Su-Hui Chiang

Bubbling, a recently introduced graph transformation for functional logic computations, is well-suited for the reduction of redexes with distinct replacements. Unlike backtracking, bubbling preserves operational completeness; unlike copying, it avoids the up-front construction of large contexts of redexes, an expensive and frequently wasteful operation. We recall the notion of bubbling and offer the first proof of its completeness and soundness with respect to rewriting.

Propositional Tree Automata
Joe Hendrix, Hitoshi Ohsaki, Mahesh Viswanathan

In the paper, we introduce a new tree automata framework, called propositional tree automata, capturing the class of tree languages that are closed under an equational theory and Boolean operations. This framework originates in work on developing a sufficient completeness checker for specifications with rewriting modulo an equational theory. Propositional tree automata recognize regular equational tree languages. However, unlike regular equational tree automata, the class of propositional tree automata is closed under Boolean operations. This extra expressiveness does not affect the decidability of the membership problem. This paper also analyzes in detail the emptiness problem for propositional tree automata with associative theories. Though undecidable in general, we present a semi-algorithm for checking emptiness based on machine learning that we have found useful in practice.

Generalizing Newman's Lemma for Left-Linear Rewrite Systems
Bernhard Gramlich, Salvador Lucas

Confluence criteria for non-terminating rewrite systems are known to be rare and notoriously difficult to obtain. Here we prove a new result in this direction. Our main result is a generalized version of Newman's Lemma for left-linear term rewriting systems that does not need a full termination assumption. We discuss its relationships to previous confluence criteria, its restrictions, examples of application as well as open problems. The whole approach is developed in the (more general) framework of context-sensitive rewriting which thus turns out to be useful also for ordinary (context-free) rewriting.

Unions of Equational Monadic Theories
Piotr Hoffman

We investigate the decidability of unions of decidable equational theories. We focus on monadic theories, i.e., theories over signatures with unary symbols only. This allows us to make use of the equivalence between monoid amalgams and unions of monadic theories. We show that if the intersection theory is unitary, then the decidability of the union is guaranteed by the decidability of tensor products. We prove that if the intersection theory is a group or a group with zero, then the union is decidable. Finally, we show that even if the intersection theory is a 3-element monoid and is unitary, the union may be undecidable, but that it is always decidable if the intersection theory is 2-element and unitary. We also show that unions of regular theories, i.e., theories recognizable by finite automata, can be undecidable. However, we prove that they are decidable if the intersection theory is unitary.

Modular Church-Rosser Modulo
Jean-Pierre Jouannaud

In [12], Toyama proved that the union of two confluent term-rewriting systems that share absolutely no function symbols or constants is likewise confluent, a property called modularity. The proof of this beautiful modularity result, technically based on slicing terms into a homogeneous cap and a so-called alien, possibly heterogeneous, substitution, was later substantially simplified in [5,11].
In this paper we present a further simplification of the proof of Toyama's result for confluence, which shows that the crux of the problem lies in two different properties: a cleaning lemma, whose goal is to anticipate the application of collapsing reductions; and a modularity property of ordered completion, which allows pairwise matching of the caps and alien substitutions of two equivalent terms.
We then show that Toyama's modularity result scales up to rewriting modulo equations in all considered cases.

Hierarchical Combination of Intruder Theories
Yannick Chevalier, Michaël Rusinowitch

Recently, automated deduction tools have proved to be very effective for detecting attacks on cryptographic protocols. These analyses can be improved, and more subtle weaknesses found, by modelling the operators employed by protocols more accurately. Several works have shown how to handle a single algebraic operator (associated with a fixed intruder theory) or how to combine several operators satisfying disjoint theories. However, several interesting equational theories, such as exponentiation with an abelian group law for exponents, remain out of the scope of these techniques. This has motivated us to introduce a new notion of hierarchical combination for intruder theories and to show decidability results for the deduction problem in these theories. Under a simple hypothesis, we were able to simplify this deduction problem. This simplification is then applied to prove the decidability of constraint systems w.r.t. an intruder relying on an exponentiation theory.

Feasible Trace Reconstruction for Rewriting Approximations
Yohan Boichut, Thomas Genet

Term Rewriting Systems are now commonly used as a modeling language for programs or systems. On those rewriting-based models, reachability analysis, i.e. proving or disproving that a given term is reachable from a set of input terms, provides an efficient verification technique. For disproving reachability (i.e. proving non-reachability of a term) on non-terminating and non-confluent rewriting models, Knuth-Bendix completion and other usual rewriting techniques do not apply. Using the tree automaton completion technique, it has been shown that the non-reachability of a term t can be established by computing an over-approximation of the set of reachable terms and proving that t is not in the approximation. However, when the term t is in the approximation, nothing can be said. In this paper, we refine this approach and propose a method taking advantage of the approximation to compute a rewriting path to the reachable term when it exists, i.e. to produce a counterexample. The algorithm has been prototyped in the Timbuk tool. We present some experiments with this prototype showing the interest of such an approach w.r.t. verification of rewriting models.

Syntactic Descriptions: A Type System for Solving Matching Equations in the Linear λ-Calculus
Sylvain Salvati


A Terminating and Confluent Linear Lambda Calculus
Yo Ohta, Masahito Hasegawa

We present a rewriting system for the linear lambda calculus corresponding to the {!,-o}-fragment of intuitionistic linear logic. This rewriting system is shown to be strongly normalizing, and Church-Rosser modulo the trivial commuting conversion. Thus it provides a simple decision method for the equational theory of the linear lambda calculus. As an application we prove the strong normalization of the simply typed computational lambda calculus by giving a reduction-preserving translation into the linear lambda calculus.

A Lambda-Calculus with Constructors
Ariel Arbiser, Alexandre Miquel, Alejandro Ríos

We present an extension of the λ(η)-calculus with a case construct that propagates through functions like a head linear substitution, and show that this construction permits to recover the expressiveness of ML-style pattern matching. We then prove that this system enjoys the Church-Rosser property using a semi-automatic 'divide and conquer' technique by which we determine all the pairs of commuting subsystems of the formalism (considering all the possible combinations of the nine primitive reduction rules). Finally, we prove a separation theorem similar to Böhm's theorem for the whole formalism.

Structural Proof Theory as Rewriting
José Espírito Santo, Maria João Frade, Luis Pinto

The multiary version of the λ-calculus with generalized applications integrates smoothly both a fragment of sequent calculus and the system of natural deduction of von Plato. It is equipped with reduction rules (corresponding to cut-elimination/normalisation rules) and permutation rules, typical of sequent calculus and of natural deduction with generalised elimination rules. We argue that this system is a suitable tool for doing structural proof theory as rewriting. As an illustration, we investigate combinations of reduction and permutation rules and whether these combinations induce rewriting systems which are confluent and terminating. In some cases, the combination allows the simulation of non-terminating reduction sequences known from explicit substitution calculi. In other cases, we succeed in capturing interesting classes of derivations as the normal forms w.r.t. well-behaved combinations of rules. We identify six of these "combined" normal forms, among which are two classes, due to Herbelin and Mints, in bijection with normal, ordinary natural deductions. A computational explanation for the variety of "combined" normal forms is the existence of three ways of expressing multiple application in the calculus.

Checking Conservativity of Overloaded Definitions in Higher-Order Logic
Steven Obua

Overloading in the context of higher-order logic has been used for some time now. We define what we mean by Higher-Order Logic with Conservative Overloading (HOLCO). HOLCO captures how overloading is actually applied by the users of Isabelle.
We show that checking whether definitions obey the rules of HOLCO is not even semi-decidable.
The undecidability proof reveals strong ties between our problem and the dependency pair method by Arts and Giesl for proving termination of TRSs, via the notion of an overloading TRS. The dependency graph of overloading TRSs can be computed exactly. We exploit this by providing an algorithm that checks the conservativity of definitions based on the dependency pair method and a simple form of linear polynomial interpretation; the algorithm also uses the strategy of Hirokawa and Middeldorp of recursively calculating the strongly connected components of the dependency graph. The algorithm is powerful enough to deal with all overloaded definitions that the author has encountered so far in practice.
An implementation of this algorithm is available as part of a package that adds conservative overloading to Isabelle. This package also allows delegating the conservativity check to external tools like the Tyrolean Termination Tool or the Automated Program Verification Environment.

Certified Higher-Order Recursive Path Ordering
Adam Koprowski

The paper reports on a formalization of a proof of well-foundedness of the higher-order recursive path ordering (HORPO) in the proof checker Coq. The development is axiom-free and fully constructive. Three substantive parts that could be used also in other developments are the formalizations of the simply-typed lambda calculus, of finite multisets and of the multiset ordering. The Coq code consists of more than 1000 lemmas and 300 definitions.

Dealing with Non-orientable Equations in Rewriting Induction
Takahito Aoto

Rewriting induction (Reddy, 1990) is an automated proof method for inductive theorems of term rewriting systems. Reasoning by rewriting induction is based on noetherian induction on some reduction order. Thus, when the given conjecture is not orientable by the reduction order in use, any proof attempt for that conjecture fails; moreover, conjectures such as a commutativity equation are out of the scope of rewriting induction because they cannot be oriented by any reduction order. In this paper, we give an enhanced rewriting induction which can deal with non-orientable conjectures. We also present an extension intended for incremental use of our enhanced rewriting induction.

TPA: Termination Proved Automatically
Adam Koprowski

TPA is a tool for proving termination of term rewrite systems (TRSs) in a fully automated fashion. The distinctive feature of TPA is the support for relative termination and the use of the technique of semantic labelling with natural numbers. Thanks to the latter, TPA is capable of delivering automated termination proofs for some difficult TRSs for which all other tools fail.

RAPT: A Program Transformation System Based on Term Rewriting
Yuki Chiba, Takahito Aoto

Chiba et al. (2005) proposed a framework of program transformation by template based on term rewriting in which correctness of the transformation is verified automatically. This paper describes RAPT (Rewriting-based Automated Program Transformation system) which implements this framework.

The CL-Atse Protocol Analyser
Mathieu Turuani

This paper presents an overview of the CL-Atse tool, an efficient and versatile automatic analyser for the security of cryptographic protocols. CL-Atse takes as input a protocol specified as a set of rewriting rules (IF format, produced by the AVISPA compiler), and uses rewriting and constraint solving techniques to model all reachable states of the participants and decide if an attack exists w.r.t. the Dolev-Yao intruder. Any state-based security property can be modelled (like secrecy, authentication, fairness, etc.), and the algebraic properties of operators like xor or exponentiation are taken into account with far fewer limitations than other tools, thanks to a complete modular unification algorithm. Moreover, useful constraints like typing, inequalities, or shared sets of knowledge (with set operations like removal, negative tests, etc.) can be analysed.

Slothrop: Knuth-Bendix Completion with a Modern Termination Checker
Ian Wehrman, Aaron Stump, Edwin M. Westbrook

A Knuth-Bendix completion procedure is parametrized by a reduction ordering used to ensure termination of intermediate and resulting rewriting systems. While in principle any reduction ordering can be used, modern completion tools typically implement only Knuth-Bendix and path orderings. Consequently, the theories for which completion can possibly yield a decision procedure are limited to those that can be oriented with a single path order.
In this paper, we present a variant on the Knuth-Bendix completion procedure in which no ordering is assumed. Instead we rely on a modern termination checker to verify termination of rewriting systems. The new method is correct if it terminates; the resulting rewrite system is convergent and equivalent to the input theory. Completions are also not just ground-convergent, but fully convergent. We present an implementation of the new procedure, Slothrop, which automatically obtains such completions for theories that do not admit path orderings.
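
The core deduction step of completion can be sketched on string rewriting, with a length comparison standing in (hypothetically) for the external termination checker that orients equations; the rules below are an invented example, not one from the paper.

```python
# Toy flavor of completion on string rewriting: compute critical pairs from
# overlaps of left-hand sides and, when their normal forms differ, add a new
# rule oriented by length.  Starting rules: ab -> a and ba -> b.

def rewrite(word, rules):
    changed = True
    while changed:
        changed = False
        for l, r in rules:
            i = word.find(l)
            if i >= 0:
                word = word[:i] + r + word[i + len(l):]
                changed = True
    return word

def critical_pairs(rules):
    pairs = []
    for l1, r1 in rules:
        for l2, r2 in rules:
            for k in range(1, len(l1)):
                if l2.startswith(l1[k:]):       # overlap word is l1[:k] + l2
                    tail = l2[len(l1) - k:]
                    pairs.append((r1 + tail, l1[:k] + r2))
    return pairs

rules = [('ab', 'a'), ('ba', 'b')]
for p, q in critical_pairs(rules):
    u, v = rewrite(p, rules), rewrite(q, rules)
    if u != v:                                  # orient the new equation by length
        rules.append((u, v) if len(u) > len(v) else (v, u))
```

One pass deduces the rules aa -> a and bb -> b, after which the system rewrites every word over {a, b} to a single letter.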

Predictive Labeling
Nao Hirokawa, Aart Middeldorp

Semantic labeling is a transformation technique for proving the termination of rewrite systems. The semantic part is given by a quasi-model of the rewrite rules. In this paper we present a variant of semantic labeling in which the quasi-model condition is only demanded for the usable rules induced by the labeling. Our variant is less powerful in theory but may be more useful in practice.

Termination of String Rewriting with Matrix Interpretations
Dieter Hofbauer, Johannes Waldmann

A rewriting system can be shown terminating by an order-preserving mapping into a well-founded domain. We present an instance of this scheme for string rewriting where the domain is a set of square matrices of natural numbers, equipped with a well-founded ordering that is not total. The coefficients of the matrices can be found via a transformation to a boolean satisfiability problem. The matrix method also supports relative termination, thus it fits with the dependency pair method as well. Our implementation is able to automatically solve hard termination problems.
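
A hedged sketch of the method's flavor, using the affine variant of matrix interpretations with hand-picked matrices (our own toy choice, not taken from the paper) to orient the single rule ba → ab, which no weight (dimension-1) interpretation can orient since the rule preserves the multiset of letters.

```python
# Each letter denotes an affine map x |-> Mx + c over N^2; a rule l -> r is
# oriented when the matrices weakly decrease entrywise and the first constant
# strictly drops.  The interpretation below is an illustrative hand choice.

DIM = 2
I = [[1, 0], [0, 1]]

INTERP = {
    'a': (I, [0, 1]),                  # a: x |-> x + (0, 1)
    'b': ([[1, 1], [0, 1]], [0, 0]),   # b: x |-> [[1,1],[0,1]] x
}

def compose(word):
    """Affine map (M, c) of a word, leftmost letter applied outermost."""
    M, c = I, [0, 0]
    for letter in reversed(word):
        Ml, cl = INTERP[letter]
        M = [[sum(Ml[i][k] * M[k][j] for k in range(DIM)) for j in range(DIM)]
             for i in range(DIM)]
        c = [sum(Ml[i][k] * c[k] for k in range(DIM)) + cl[i] for i in range(DIM)]
    return M, c

def oriented(l, r):
    """Matrices weakly decrease entrywise, first constant strictly drops.
    Monotonicity needs nonnegative entries and top-left entries >= 1,
    which holds for the interpretation above."""
    (Ml, cl), (Mr, cr) = compose(l), compose(r)
    return (all(Ml[i][j] >= Mr[i][j] for i in range(DIM) for j in range(DIM))
            and cl[0] > cr[0])
```

The first component of the constant strictly decreases along every rewrite step, e.g. on the sequence bba → bab → abb; the paper's contribution is finding such matrices automatically via a SAT encoding.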

Decidability of Termination for Semi-constructor TRSs, Left-Linear Shallow TRSs and Related Systems
Yi Wang, Masahiko Sakai

We consider several classes of term rewriting systems and prove that termination is decidable for these classes. By showing the cycling property of infinite dependency chains, we prove that termination is decidable for the semi-constructor case, which is a superclass of right-ground TRSs. By analyzing argument propagation cycles in the dependency graph, we show that termination is also decidable for left-linear shallow TRSs. Moreover, we extend these results by combining the two techniques.

Proving Positive Almost Sure Termination Under Strategies
Olivier Bournez, Florent Garnier

At the last RTA, we introduced the notion of probabilistic rewrite systems and gave some conditions entailing termination of those systems within a finite mean number of reduction steps.
Termination was considered under arbitrary unrestricted policies. Policies correspond to strategies for non-probabilistic rewrite systems.
It is often natural or more useful to restrict policies to a subclass. We introduce the notion of positive almost sure termination under strategies, and we provide sufficient criteria to prove termination of a given probabilistic rewrite system under strategies. This is illustrated with several examples.

A Proof of Finite Family Developments for Higher-Order Rewriting Using a Prefix Property
Harrie Jan Sander Bruggink

A prefix property is the property that, given a reduction, the ancestor of a prefix of the target is a prefix of the source. In this paper we prove a prefix property for the class of Higher-Order Rewriting Systems with patterns (HRSs), by reducing it to a similar prefix property of a λ-calculus with explicit substitutions. This prefix property is then used to prove that higher-order rewriting systems enjoy Finite Family Developments. This property states that reductions in which the creation depth of the redexes is bounded are finite, and it is a useful tool for proving various properties of HRSs.

Higher-Order Orderings for Normal Rewriting
Jean-Pierre Jouannaud, Albert Rubio

We extend the termination proof methods based on reduction orderings to higher-order rewriting systems à la Nipkow, using higher-order pattern matching for firing rules, and accommodate any use of eta: as a reduction, as an expansion, or as an equation. As a main novelty, we provide a mechanism for transforming any reduction ordering including beta-reduction, such as the higher-order recursive path ordering, into a reduction ordering for proving termination of rewriting à la Nipkow. Non-trivial examples are carried out.

Bounded Second-Order Unification Is NP-Complete
Jordi Levy, Manfred Schmidt-Schauß, Mateu Villaret

Bounded Second-Order Unification is the problem of deciding, for a given second-order equation and a positive integer m, whether there exists a unifier σ such that, for every second-order variable F, the terms instantiated for F have at most m occurrences of every bound variable.
It is already known that Bounded Second-Order Unification is decidable and NP-hard, whereas general Second-Order Unification is undecidable. We prove that Bounded Second-Order Unification is NP-complete, provided that m is given in unary encoding, by proving that a size-minimal solution can be represented in polynomial space, and then applying a generalization of Plandowski's polynomial algorithm that compares compacted terms in polynomial time.

2005

Generalized Innermost Rewriting
Jaco van de Pol, Hans Zantema

We propose two generalizations of innermost rewriting for which we prove that termination of innermost rewriting is equivalent to termination of generalized innermost rewriting. As a consequence, when rewriting in an arbitrary TRS, certain non-innermost steps may be allowed by which the termination behavior and efficiency are often much better, but never worse, than by only doing innermost rewriting.
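
Plain innermost rewriting, the baseline strategy being generalized here, contracts a redex only after its arguments are in normal form; a minimal sketch on a toy ground TRS for Peano addition (the term representation is our own).

```python
# Innermost normalization of ground terms over the signature
# 0, s/1, add/2, with rules  add(0, y) -> y  and  add(s(x), y) -> s(add(x, y)).
# Terms: '0', ('s', t), ('add', t1, t2).

def normalize(t):
    if isinstance(t, str):                 # the constant '0'
        return t
    f, *args = t
    args = [normalize(a) for a in args]    # innermost: arguments first
    if f == 'add':
        x, y = args
        if x == '0':
            return y                                   # add(0, y) -> y
        if x[0] == 's':                                # add(s(x'), y) -> s(add(x', y))
            return normalize(('s', ('add', x[1], y)))
    return (f, *args)
```

The generalized strategies of the paper relax exactly the "arguments first" restriction enforced by the recursive call above, while provably preserving the termination behavior.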

Orderings for Innermost Termination
Mirtha-Lina Fernández, Guillem Godoy, Albert Rubio

This paper shows that the suitable orderings for proving innermost termination are characterized by innermost parallel monotonicity, IP-monotonicity for short. This property may lead to several innermost-specific orderings. Here, an IP-monotonic version of the Recursive Path Ordering is presented. This variant can be used (directly or as an ingredient of the Dependency Pairs method) for proving innermost termination of term rewrite systems that are not terminating in general.

Leanest Quasi-orderings
Nachum Dershowitz, E. Castedo Ellerman

A convenient method for defining a quasi-ordering, such as those used for proving termination of rewriting, is to choose the minimum of a set of quasi-orderings satisfying some desired traits. Unfortunately, a minimum in terms of set inclusion can be non-existent even when an intuitive "minimum" exists. We suggest an alternative to set inclusion, called "leanness", show that leanness is a partial ordering of quasi-orderings, and provide sufficient conditions for the existence of a "leanest" ordering.

Abstract Modularity
Michael Abbott, Neil Ghani, Christoph Lüth

Modular rewriting seeks criteria under which rewrite systems inherit properties from their smaller subsystems. This divide and conquer methodology is particularly useful for reasoning about large systems where other techniques fail to scale adequately. Research has typically focused on reasoning about the modularity of specific properties for specific ways of combining specific forms of rewriting.
This paper is, we believe, the first to ask a much more general question. Namely, what can be said about modularity independently of the specific form of rewriting, combination and property at hand. A priori there is no reason to believe that anything can actually be said about modularity without reference to the specifics of the particular systems etc. However, this paper shows that, quite surprisingly, much can indeed be said.

Union of Equational Theories: An Algebraic Approach
Piotr Hoffman

We consider the well-known problem of deciding the union of decidable equational theories. We focus on monadic theories, i.e., theories over signatures with unary function symbols only. The equivalence of the category of monadic equational theories and the category of monoids is used. This equivalence facilitates a translation of the considered decidability problem into the word problem in the pushout of monoids which themselves have decidable word problems. Using monoids, existing results on the union of theories are then restated and proved in a succinct way. We then analyze the idea of first guaranteeing that the union is a "jointly conservative" extension and then using this property to show decidability of the union. It is shown that "joint conservativity" is equivalent to the corresponding monoid amalgam being embeddable; this allows one to apply results from amalgamation theory to this problem. We then prove that using this property to show decidability is a more difficult matter: it turns out that even if this property and some additional conditions hold, the problem remains undecidable.

Equivariant Unification
James Cheney

Nominal logic is a variant of first-order logic with special facilities for reasoning about names and binding based on the underlying concepts of swapping and freshness. It serves as the basis of logic programming and term rewriting techniques that provide similar advantages to, but remain simpler than, higher-order logic programming or term rewriting systems. Previous work on nominal rewriting and logic programming has relied on nominal unification, that is, unification up to equality in nominal logic. However, because of nominal logic's equivariance property, these applications require a stronger form of unification, which we call equivariant unification. Unfortunately, equivariant unification and matching are NP-hard decision problems. This paper presents an algorithm for equivariant unification that produces a complete set of finitely many solutions, as well as an NP decision procedure and a version that enumerates solutions one at a time. In addition, we present a polynomial-time algorithm for swapping-free equivariant matching, that is, for matching problems in which the swapping operation does not appear.
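
The swapping operation that underlies nominal logic is easy to state executably (the term representation and names below are illustrative choices of our own); equivariant unification is hard precisely because the swappings themselves may contain unknowns.

```python
# The swapping (a b).t exchanges the two names a and b everywhere in a term.
# Terms are nested tuples, names are strings.  Swapping is an involution.

def swap(a, b, t):
    if isinstance(t, str):
        return b if t == a else a if t == b else t
    return tuple(swap(a, b, s) for s in t)
```

Unlike capture-avoiding renaming, swapping acts uniformly on all occurrences, including binding positions, which is what makes it well-behaved under the equivariance property the abstract refers to.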

Faster Basic Syntactic Mutation with Sorts for Some Separable Equational Theories
Christopher Lynch, Barbara Morawska

Sorting information arises naturally in E-unification problems. This information is used to rule out invalid solutions. We show how to use sorting information to make E-unification procedures more efficient. We illustrate our ideas using Basic Syntactic Mutation. We give classes of problems where E-unification becomes polynomial. We show how E-unification can be separated into a polynomial part and a more complicated part using a specialized algorithm. Our approach is motivated by a real problem arising from Cryptographic Protocol Verification.

Unification in a Class of Permutative Theories
Thierry Boy de la Tour, Mnacho Echenim

It has been proposed in [1] to perform deduction modulo leaf permutative theories, which are notoriously hard to handle directly in equational theorem proving. But unification modulo such theories is a difficult task, not tackled in [1]; a subclass of flat equations has been considered only recently, in [2]. Our emphasis on group theoretic structures led us in [6] to the definition of a more general subclass of leaf permutative theories, the unify-stable theories. They have good semantic and algorithmic properties, which we use here to design a complete unification algorithm.

Dependency Pairs for Simply Typed Term Rewriting
Takahito Aoto, Toshiyuki Yamada

Simply typed term rewriting proposed by Yamada (RTA, 2001) is a framework of higher-order term rewriting without bound variables. In this paper, the dependency pair method of first-order term rewriting introduced by Arts and Giesl (TCS, 2000) is extended in order to show termination of simply typed term rewriting systems. Basic concepts such as dependency pairs and estimated dependency graph in the simply typed term rewriting framework are clarified. The subterm criterion introduced by Hirokawa and Middeldorp (RTA, 2004) is successfully extended to the case where terms of function type are allowed. Finally, an experimental result for a collection of simply typed term rewriting systems is presented. Our method is compared with the direct application of the first-order dependency pair method to a first-order encoding of simply typed term rewriting systems.
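
The first-order construction being extended can be sketched on a toy TRS (the term representation is our own): dependency pairs arise from subterms of right-hand sides whose root is a defined symbol.

```python
# First-order dependency pairs (Arts & Giesl) for a toy TRS for Peano
# addition.  Terms: variables are lowercase strings, applications are tuples.

RULES = [
    (('add', ('0',), 'y'), 'y'),
    (('add', ('s', 'x'), 'y'), ('s', ('add', 'x', 'y'))),
]

def subterms(t):
    yield t
    if isinstance(t, tuple):
        for arg in t[1:]:
            yield from subterms(arg)

def dependency_pairs(rules):
    defined = {l[0] for l, _ in rules}       # root symbols of left-hand sides
    return [((l[0].upper(),) + l[1:], (t[0].upper(),) + t[1:])
            for l, r in rules
            for t in subterms(r)
            if isinstance(t, tuple) and t[0] in defined]
```

For this system the only dependency pair is ADD(s(x), y) → ADD(x, y); the paper clarifies what the analogous notions (and the estimated dependency graph) become when terms of function type are allowed.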

Universal Algebra for Termination of Higher-Order Rewriting
Makoto Hamana

We show that the structures of binding algebras and Σ-monoids by Fiore, Plotkin and Turi are sound and complete models of Klop's Combinatory Reduction Systems (CRSs). These algebraic structures play the same role for CRSs that universal algebra plays for term rewriting systems. Restricting the algebraic structures to ones equipped with well-founded relations, we obtain a complete characterisation of terminating CRSs. We can also naturally extend the characterisation to rewriting on meta-terms by using the notion of Σ-monoids.

Quasi-interpretations and Small Space Bounds
Guillaume Bonfante, Jean-Yves Marion, Jean-Yves Moyen

Quasi-interpretations are a useful tool for controlling the resource usage of term rewriting systems, either time or space. They not only combine well with path orderings and provide characterizations of usual complexity classes, but also give hints on how to optimize the program. Moreover, the existence of a quasi-interpretation is decidable.
In this paper, we present some more characterizations of complexity classes using quasi-interpretations. We mainly focus on small space-bounded complexity classes. On one hand, by restricting quasi-interpretations to sums (that is allowing only affine quasi-interpretations), we obtain a characterization of LinSpace. On the other hand, a strong tiering discipline on programs together with quasi-interpretations yield a characterization of LogSpace.
Lastly, we give two new characterizations of Pspace: in the first, the quasi-interpretation has to be strictly decreasing on each rule and in the second, some linearity constraints are added to the system but no assumption concerning the termination proof is made.

A Sufficient Completeness Reasoning Tool for Partial Specifications
Joe Hendrix, Manuel Clavel, José Meseguer

We present the Maude sufficient completeness tool, which explicitly supports sufficient completeness reasoning for partial conditional specifications having sorts and subsorts and with domains of functions defined by conditional memberships. Our tool consists of two main components: (i) a sufficient completeness analyzer that generates a set of proof obligations which, if discharged, ensures sufficient completeness; and (ii) Maude's inductive theorem prover (ITP), which is used as a backend to try to automatically discharge those proof obligations.

Tyrolean Termination Tool
Nao Hirokawa, Aart Middeldorp

This paper describes the Tyrolean Termination Tool (TTT in the sequel), the successor of the Tsukuba Termination Tool [12]. We describe the differences between the two and explain the new features, some of which are not (yet) available in any other termination tool, in some detail. TTT is a tool for automatically proving termination of rewrite systems based on the dependency pair method of Arts and Giesl [3]. It produces high-quality output and has a convenient web interface. The tool is available at http://cl2-informatik.uibk.ac.at/ttt.
TTT incorporates several new improvements to the dependency pair method. In addition, it is now possible to run the tool in fully automatic mode on a collection of rewrite systems. Moreover, besides ordinary (first-order) rewrite systems, the tool accepts simply-typed applicative rewrite systems which are transformed into ordinary rewrite systems by the recent method of Aoto and Yamada [2].
In the next section we describe the differences between the semi-automatic mode and the Tsukuba Termination Tool. Section 3 describes the fully automatic mode. In Section 4 we show a termination proof of a simply-typed applicative system obtained by TTT. In Section 5 we describe how to input a collection of rewrite systems and how to interpret the resulting output. Some implementation details are given in Section 6. The final section contains a short comparison with other tools for automatically proving termination.

λμ-Calculus and Duality: Call-by-Name and Call-by-Value
Jérôme Rocheteau

Under the extension of Curry-Howard's correspondence to classical logic, Gentzen's NK and LK systems can be seen as syntax-directed systems of simple types respectively for Parigot's λμ-calculus and Curien-Herbelin's λ̄μμ̃-calculus. We aim at showing their computational equivalence. We define translations between these calculi. We prove simulation theorems for an undirected evaluation as well as for call-by-name and call-by-value evaluations.

Reduction in a Linear Lambda-Calculus with Applications to Operational Semantics
Alex K. Simpson

We study beta-reduction in a linear lambda-calculus derived from Abramsky's linear combinatory algebras. Reductions are classified depending on whether the redex is in the computationally active part of a term ("surface" reductions) or whether it is suspended within the body of a thunk ("internal" reductions). If surface reduction is considered on its own then any normalizing term is strongly normalizing. More generally, if a term can be reduced to surface normal form by a combined sequence of surface and internal reductions then every combined reduction sequence from the term contains only finitely many surface reductions. We apply these results to the operational semantics of Lily, a second-order linear lambda-calculus with recursion, introduced by Bierman, Pitts and Russo, for which we give simple proofs that call-by-value, call-by-name and call-by-need contextual equivalences coincide.

Higher-Order Matching in the Linear Lambda Calculus in the Absence of Constants Is NP-Complete
Ryo Yoshinaka

A lambda term is linear if every bound variable occurs exactly once. The same constant may occur more than once in a linear term. It is known that higher-order matching in the linear lambda calculus is NP-complete (de Groote 2000), even if each unknown occurs exactly once (Salvati and de Groote 2003). Salvati and de Groote (2003) also claim that the interpolation problem, a more restricted kind of matching problem which has just one occurrence of just one unknown, is NP-complete in the linear lambda calculus. In this paper, we correct a flaw in Salvati and de Groote's (2003) proof of this claim, and prove that NP-hardness still holds if we exclude constants from problem instances. Thus, multiple occurrences of constants do not play an essential role for NP-hardness of higher-order matching in the linear lambda calculus.

Localized Fairness: A Rewriting Semantics
José Meseguer

Fairness is a rich phenomenon: we have weak and strong fairness, and many different variants of those concepts: transition fairness, object/process fairness, actor fairness, position fairness, and so on, associated with specific models or languages, but lacking a common theoretical framework. This work uses rewriting semantics as a common theoretical framework for fairness. A common thread tying together the different fairness variants is the notion of localization: fairness must often be localized to specific entities in a system. For systems specified as rewrite theories localization can be formalized by making explicit the subset of variables in a rule corresponding to the items that must be localized. In this way, localized fairness becomes a parametric notion, that can be easily specialized to model a very wide range of fairness phenomena. After formalizing these concepts and proving basic results, the paper studies in detail both a relative and an absolute LTL semantics for rewrite theories with localized fairness requirements, and shows that it is always possible to pass from the relative to the absolute semantics by means of a theory transformation. This allows using a standard LTL model checker to check properties under fairness assumptions.

Partial Inversion of Constructor Term Rewriting Systems
Naoki Nishida, Masahiko Sakai, Toshiki Sakabe

Partial-inversion compilers generate programs that compute some unknown inputs of a given program from a given output and the remaining inputs whose values are already known. In this paper, we propose a partial-inversion compiler for constructor term rewriting systems. The compiler automatically generates a conditional term rewriting system and then unravels it into an unconditional system. To improve the efficiency of inverse computation, we show that the innermost strategy suffices to obtain all solutions if the generated system is right-linear.
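As a hedged illustration of the idea (ours, not the paper's compiler): inverting Peano addition with respect to its first argument, given the output and the second argument, yields a program for subtraction. A minimal sketch:

```python
# Illustrative sketch of partial inversion (not the paper's construction).
# Terms are 'z' for zero and ('s', t) for successor.

def add(x, y):
    # Forward system: add(0, y) -> y ; add(s(x), y) -> s(add(x, y))
    if x == 'z':
        return y
    _tag, x1 = x
    return ('s', add(x1, y))

def inv_add_first(out, y):
    # Partially inverted system: given out = add(x, y) and y, recover x.
    # Mirrors the rules: inv(y, y) -> 0 ; inv(s(z), y) -> s(inv(z, y))
    if out == y:
        return 'z'
    if isinstance(out, tuple) and out[0] == 's':
        return ('s', inv_add_first(out[1], y))
    raise ValueError('no solution')

two = ('s', ('s', 'z'))
three = ('s', two)
assert inv_add_first(add(two, three), three) == two
```

The generated inverse is deterministic here because Peano addition is injective in its first argument; in general the inverted system may be non-confluent and return several solutions.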

Natural Narrowing for General Term Rewriting Systems
Santiago Escobar, José Meseguer, Prasanna Thati

For narrowing to be an efficient evaluation mechanism, several lazy narrowing strategies have been proposed, although typically for the restricted case of left-linear constructor systems. These assumptions, while reasonable for functional programming applications, are too restrictive for a much broader range of applications to which narrowing can be fruitfully applied, including applications where rules have a non-equational meaning either as transitions in a concurrent system or as inferences in a logical system. In this paper, we propose an efficient lazy narrowing strategy called natural narrowing which can be applied to general term rewriting systems with no restrictions whatsoever. An important consequence of this generalization is the wide range of applications that can now be efficiently supported by narrowing, such as symbolic model checking and theorem proving.

The Finite Variant Property: How to Get Rid of Some Algebraic Properties
Hubert Comon-Lundh, Stéphanie Delaune

We consider the following problem: Given a term t, a rewrite system R, a finite set of equations E' such that R is E'-convergent, compute finitely many instances of t: t1,...,tn such that, for every substitution σ, there is an index i and a substitution θ such that tσ↓ =E' tiθ (where tσ↓ is the normal form of tσ w.r.t. →E'\R).
The goal of this paper is to give equivalent (resp. sufficient) conditions for the finite variant property and to systematically investigate this property for equational theories that are relevant to security protocol verification. For instance, we prove that the finite variant property holds for Abelian groups and a theory of modular exponentiation, and does not hold for the theory ACUNh (Associativity, Commutativity, Unit, Nilpotence, homomorphism).
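As a hedged, simplified illustration (ours, not taken from the paper): for the exclusive-or theory, take E' = AC and R = { x ⊕ 0 → x, x ⊕ x → 0, x ⊕ (x ⊕ y) → y }. The term t = x ⊕ y then has, among others, the variants

```latex
\begin{align*}
(t\sigma){\downarrow} &= x \oplus y, & \sigma &= \mathit{id},\\
(t\sigma){\downarrow} &= x,          & \sigma &= \{y \mapsto 0\},\\
(t\sigma){\downarrow} &= 0,          & \sigma &= \{y \mapsto x\},\\
(t\sigma){\downarrow} &= z,          & \sigma &= \{y \mapsto x \oplus z\},
\end{align*}
```

and a finite set of such instances covers every substitution in the sense defined above.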

Intruder Deduction for AC-Like Equational Theories with Homomorphisms
Pascal Lafourcade, Denis Lugiez, Ralf Treinen

Cryptographic protocols are small programs which involve a high level of concurrency and which are difficult to analyze by hand. The most successful methods to verify such protocols rely on rewriting techniques and automated deduction in order to implement or mimic the process calculus describing the protocol execution.
We focus on the intruder deduction problem, that is the vulnerability to passive attacks, in the presence of several variants of AC-like axioms (from AC to Abelian groups, including the theory of exclusive or) and homomorphism, which are the most frequent axioms arising in cryptographic protocols. Solutions are known for the cases of exclusive or, of Abelian groups, and of homomorphism alone. In this paper we address the combination of these AC-like theories with the law of homomorphism, which leads to much more complex decision problems.
We prove decidability of the intruder deduction problem in all cases considered. Our decision procedure is in EXPTIME, except for a restricted case in which we have been able to get a PTIME decision procedure using a property of one-counter and pushdown automata.

Proving Positive Almost-Sure Termination
Olivier Bournez, Florent Garnier

In order to extend the modeling capabilities of rewriting systems, it is rather natural to consider that the firing of rules can be subject to some probabilistic laws. Considering rewrite rules subject to probabilities leads to numerous questions about the underlying notions and results.
We focus here on the problem of termination of a set of probabilistic rewrite rules. A probabilistic rewrite system is said to be almost surely terminating if the probability that a derivation leads to a normal form is one. Such a system is said to be positively almost surely terminating if, furthermore, the mean length of a derivation is finite. We provide several results and techniques for proving positive almost-sure termination of a given set of probabilistic rewrite rules. All these techniques subsume classical ones for non-probabilistic systems.
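A standard separating example (ours, not taken from the paper): the single probabilistic rule that rewrites s(x) to x with probability 1/2 and to s(s(x)) with probability 1/2 encodes a symmetric random walk on the naturals. Every derivation reaches the normal form 0 with probability one, so the system is almost surely terminating, but

```latex
\[
  \mathbb{E}\,[\text{length of a derivation starting from } s(0)] \;=\; \infty,
\]
```

so it is not positively almost surely terminating.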

Termination of Single-Threaded One-Rule Semi-Thue Systems
Wojciech Moczydlowski, Alfons Geser

This paper is a contribution to the long standing open problem of uniform termination of Semi-Thue Systems that consist of one rule s → t. McNaughton previously showed that rules incapable of (1) deleting t completely from both sides, (2) deleting t completely from the left, and (3) deleting t completely from the right, have a decidable uniform termination problem. We use a novel approach to show that Premise (2) or, symmetrically, Premise (3), is inessential. Our approach is based on derivations in which every pair of successive steps has an overlap. We call such derivations single-threaded.

On Tree Automata that Certify Termination of Left-Linear Term Rewriting Systems
Alfons Geser, Dieter Hofbauer, Johannes Waldmann, Hans Zantema

We present a new method for proving termination of term rewriting systems automatically. It is a generalization of the match bound method for string rewriting. To prove that a term rewriting system terminates on a given regular language of terms, we first construct an enriched system over a new signature that simulates the original derivations. The enriched system is an infinite system over an infinite signature, but it is locally terminating: every restriction of the enriched system to a finite signature is terminating. We then construct iteratively a finite tree automaton that accepts the enriched given regular language and is closed under rewriting modulo the enriched system. If this procedure stops, then the enriched system is compact: every enriched derivation involves only a finite signature. Therefore, the original system terminates. We present three methods to construct the enrichment: top heights, roof heights, and match heights. Top and roof heights work for left-linear systems, while match heights give a powerful method for linear systems. For linear systems, the method is strengthened further by a forward closure construction. Using these methods, we give examples for automated termination proofs that cannot be obtained by standard methods.

Extending the Explicit Substitution Paradigm
Delia Kesner, Stéphane Lengrand

We present a simple term language with explicit operators for erasure, duplication and substitution enjoying a sound and complete correspondence with the intuitionistic fragment of Linear Logic's Proof Nets. We establish the good operational behaviour of the language by means of some fundamental properties such as confluence, preservation of strong normalisation, strong normalisation of well-typed terms and step by step simulation. This formalism is the first term calculus with explicit substitutions having full composition and preserving strong normalisation.

Arithmetic as a Theory Modulo
Gilles Dowek, Benjamin Werner

We present constructive arithmetic in Deduction modulo with rewrite rules only.

Infinitary Combinatory Reduction Systems
Jeroen Ketema, Jakob Grue Simonsen

We define infinitary combinatory reduction systems (iCRSs). This provides the first extension of infinitary rewriting to higher-order rewriting. We lift two well-known results from infinitary term rewriting systems and infinitary λ-calculus to iCRSs:
1. every reduction sequence in a fully-extended left-linear iCRS is compressible to a reduction sequence of length at most ω, and
2. every complete development of the same set of redexes in an orthogonal iCRS ends in the same term.

Proof-Producing Congruence Closure
Robert Nieuwenhuis, Albert Oliveras

Many applications of congruence closure nowadays require the ability to recover, among the thousands of input equations, the small subset that caused the equivalence of a given pair of terms. For this purpose, here we introduce an incremental congruence closure algorithm that has an additional Explain operation.
First, two variations of union-find data structures with Explain are introduced. Then, these are applied inside a congruence closure algorithm with Explain, where a k-step proof can be recovered in almost optimal time (quasi-linear in k), without increasing the overall O(n log n) runtime of the fastest known congruence closure algorithms.
This non-trivial (ground) equational reasoning result has been quite intensively sought after (see, e.g., [SD99,dMRS04,KS04]), and moreover has important applications to verification.
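To convey the flavour of a union-find with Explain, here is a hedged sketch (ours; it records each asserted equation as a labelled edge and answers Explain by a path search, rather than using the paper's quasi-linear data structures):

```python
from collections import deque

class UnionFindExplain:
    """Union-find that can explain why two elements are equal.
    Illustrative only: each input equation a = b is kept as a labelled
    edge, and explain() returns the equations along one path."""

    def __init__(self):
        self.parent = {}
        self.edges = {}   # node -> list of (neighbour, equation)

    def find(self, x):
        self.parent.setdefault(x, x)
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x

    def union(self, a, b, equation):
        self.edges.setdefault(a, []).append((b, equation))
        self.edges.setdefault(b, []).append((a, equation))
        ra, rb = self.find(a), self.find(b)
        if ra != rb:
            self.parent[ra] = rb

    def explain(self, a, b):
        """Return a subset of input equations proving a = b (BFS path)."""
        if a == b:
            return []
        prev = {a: None}
        queue = deque([a])
        while queue:
            x = queue.popleft()
            for y, eq in self.edges.get(x, []):
                if y not in prev:
                    prev[y] = (x, eq)
                    if y == b:
                        path = []
                        while prev[y] is not None:
                            x, eq = prev[y]
                            path.append(eq)
                            y = x
                        return list(reversed(path))
                    queue.append(y)
        raise ValueError("not equal")

uf = UnionFindExplain()
uf.union("a", "b", "eq1: a=b")
uf.union("c", "d", "eq2: c=d")
uf.union("b", "c", "eq3: b=c")
assert uf.explain("a", "d") == ["eq1: a=b", "eq3: b=c", "eq2: c=d"]
```

The point of the paper is precisely that this naive path search can be replaced by a proof-forest structure answering Explain in time quasi-linear in the proof size, without degrading the O(n log n) congruence closure itself.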

The Algebra of Equality Proofs
Aaron Stump, Li-Yang Tan

Proofs of equalities may be built from assumptions using proof rules for reflexivity, symmetry, and transitivity. Reflexivity is an axiom proving x=x for any x; symmetry is a 1-premise rule taking a proof of x=y and returning a proof of y=x; and transitivity is a 2-premise rule taking proofs of x=y and y=z, and returning a proof of x=z. Define an equivalence relation to hold between proofs iff they prove a theorem in common. The main theoretical result of the paper is that if all assumptions are independent, this equivalence relation is axiomatized by the standard axioms of group theory: reflexivity is the unit of the group, symmetry is the inverse, and transitivity is the multiplication. Using a standard completion of the group axioms, we obtain a rewrite system which puts equality proofs into canonical form. Proofs in this canonical form use the fewest possible assumptions, and a proof can be canonized in linear time using a simple strategy. This result is applied to obtain a simple extension of the union-find algorithm for ground equational reasoning which produces minimal proofs. The time complexity of the original union-find operations is preserved, and minimal proofs are produced in worst-case time O(n^(log₂ 3)), where n is the number of expressions being equated. As a second application, the approach is used to achieve significant performance improvements for the CVC cooperating decision procedure.
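The group view can be made concrete with a small hedged sketch (ours, not the paper's exact rewrite system): a transitivity chain is a word of signed assumptions, symmetry flips the sign, reflexivity is the empty word, and canonization is free-group reduction, cancelling adjacent inverse pairs in one linear pass:

```python
# Illustrative free-group reduction of equality proofs (our encoding).
# A proof is a list of (assumption_name, sign), sign in {+1, -1}.

def canonize(word):
    out = []
    for step in word:
        if out and out[-1][0] == step[0] and out[-1][1] == -step[1]:
            out.pop()          # sym(p); p  ~  refl   (group law g⁻¹·g = 1)
        else:
            out.append(step)
    return out

# trans(p, trans(sym(q), q)) canonizes to just p:
assert canonize([("p", 1), ("q", -1), ("q", 1)]) == [("p", 1)]
# trans(sym(p), p) canonizes to refl (the empty word):
assert canonize([("p", -1), ("p", 1)]) == []
```

The single linear pass matches the abstract's claim that a proof can be canonized in linear time with a simple strategy.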

On Computing Reachability Sets of Process Rewrite Systems
Ahmed Bouajjani, Tayssir Touili

We consider the problem of symbolic reachability analysis of a class of term rewrite systems called Process Rewrite Systems (PRS). A PRS can be seen as the union of two mutually interdependent sets of term rewrite rules: a prefix rewrite system (or, equivalently, a pushdown system), and a multiset rewrite system (or, equivalently, a Petri net). These systems are natural models for multithreaded programs with dynamic creation of concurrent processes and recursive procedure calls. We propose a generic framework based on tree automata that allows combining (finite-state-automata-based) procedures for the reachability analysis of pushdown systems with (linear-arithmetic/semilinear-set-based) procedures for the analysis of Petri nets in order to analyze PRS models. We provide a construction which is parametrized by such procedures and we show that it can be instantiated to (1) derive procedures for constructing the (exact) reachability sets of significant classes of PRS, (2) derive various approximate algorithms, or exact semi-algorithms, for the reachability analysis of PRS obtained by using existing symbolic reachability analysis techniques for Petri nets and counter automata.

Automata and Logics for Unranked and Unordered Trees
Iovka Boneva, Jean-Marc Talbot

In this paper, we consider the monadic second order logic (MSO) and two of its extensions, namely Counting MSO (CMSO) and Presburger MSO (PMSO), interpreted over unranked and unordered trees. We survey classes of tree automata introduced for the logics PMSO and CMSO as well as other related formalisms; we gather results from the literature and sometimes clarify or fill the remaining gaps between those various formalisms. Finally, we complete our study by adapting these classes of automata for capturing precisely the expressiveness of the logic MSO.

2004

A Type-Based Termination Criterion for Dependently-Typed Higher-Order Rewrite Systems
Frédéric Blanqui

Several authors devised type-based termination criteria for ML-like languages allowing non-structural recursive calls. We extend these works to general rewriting and dependent types, hence providing a powerful termination criterion for the combination of rewriting and β-reduction in the Calculus of Constructions.

Termination of S-Expression Rewriting Systems: Lexicographic Path Ordering for Higher-Order Terms
Yoshihito Toyama

This paper expands the termination proof techniques based on the lexicographic path ordering to term rewriting systems over varyadic terms, in which each function symbol may have more than one arity. By removing the deletion property from the usual notion of the embedding relation, we adapt Kruskal's tree theorem to the lexicographic comparison over varyadic terms. The result presented is that finite term rewriting systems over varyadic terms are terminating whenever they are compatible with the lexicographic path order. The ordering is simple, but powerful enough to handle most higher-order rewriting systems without λ-abstraction, expressed as S-expression rewriting systems.
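For reference, here is a hedged sketch of the standard fixed-arity lexicographic path ordering (the paper's varyadic version relaxes the fixed-arity assumption and adapts the embedding relation; the encoding and names below are ours):

```python
# Standard first-order LPO, illustrative only.
# Terms: a str is a variable; a tuple (f, arg1, ..., argn) is an application.
# prec maps each function symbol to an int (higher = greater precedence).

def occurs(x, s):
    if isinstance(s, str):
        return s == x
    return any(occurs(x, a) for a in s[1:])

def lpo_gt(s, t, prec):
    """Does s >_lpo t hold?"""
    if isinstance(t, str):                     # t is a variable
        return s != t and occurs(t, s)
    if isinstance(s, str):                     # a variable is minimal
        return False
    f, s_args = s[0], s[1:]
    g, t_args = t[0], t[1:]
    # (a) some immediate subterm of s is >= t
    if any(si == t or lpo_gt(si, t, prec) for si in s_args):
        return True
    # (b) head of s has greater precedence, and s dominates all subterms of t
    if prec[f] > prec[g]:
        return all(lpo_gt(s, tj, prec) for tj in t_args)
    # (c) equal heads: lexicographic comparison of arguments
    if f == g:
        if not all(lpo_gt(s, tj, prec) for tj in t_args):
            return False
        for si, ti in zip(s_args, t_args):
            if si != ti:
                return lpo_gt(si, ti, prec)
        return len(s_args) > len(t_args)
    return False

# Orients add(s(x), y) -> s(add(x, y)) with precedence add > s:
prec = {"add": 2, "s": 1}
lhs = ("add", ("s", "x"), "y")
rhs = ("s", ("add", "x", "y"))
assert lpo_gt(lhs, rhs, prec)
assert not lpo_gt(rhs, lhs, prec)
```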

Monadic Second-Order Unification Is NP-Complete
Jordi Levy, Manfred Schmidt-Schauß, Mateu Villaret

Monadic Second-Order Unification (MSOU) is Second-Order Unification where all function constants occurring in the equations are unary. Here we prove that the problem of deciding whether a set of monadic equations has a unifier is NP-complete. We also prove that Monadic Second-Order Matching is NP-complete.

A Certified AC Matching Algorithm
Évelyne Contejean

In this paper, we propose a matching algorithm for terms containing some function symbols which can be either free, commutative or associative-commutative. This algorithm is presented by inference rules, and these rules have been formally proven sound, complete, and decreasing in the Coq proof assistant, while the corresponding algorithm is implemented in the CiME system. Moreover, some preparatory work has been done in Coq, such as proving that checking the equality of two terms modulo some commutative and associative-commutative theories is decidable.

Matchbox: A Tool for Match-Bounded String Rewriting
Johannes Waldmann

The program Matchbox implements the exact computation of the set of descendants of a regular language, and of the set of non-terminating strings, with respect to an (inverse) match-bounded string rewriting system. Matchbox can search for proof or disproof of a Boolean combination of match-height properties of a given rewrite system, and some of its transformed variants. This is applied in various ways to search for proofs of termination and non-termination. Matchbox is the first program that delivers automated proofs of termination for some difficult string rewriting systems.

TORPA: Termination of Rewriting Proved Automatically
Hans Zantema

The tool TORPA (Termination of Rewriting Proved Automatically) can be used to prove termination of string rewriting systems (SRSs) fully automatically. The underlying techniques include semantic labelling, polynomial interpretations, recursive path order, the dependency pair method and match bounds of right hand sides of forward closures.

Querying Unranked Trees with Stepwise Tree Automata
Julien Carme, Joachim Niehren, Marc Tommasi

The problem of selecting nodes in unranked trees is the most basic querying problem for XML. We propose stepwise tree automata for querying unranked trees. Stepwise tree automata can express the same monadic queries as monadic Datalog and monadic second-order logic. We prove this result by reduction to the ranked case, via a new systematic correspondence that relates unranked and ranked queries.

A Verification Technique Using Term Rewriting Systems and Abstract Interpretation
Toshinori Takai

Verifying the safety property of a transition system given by a term rewriting system is an undecidable problem. In this paper, we give an abstraction for the problem which is automatically generated from a given TRS by using abstract interpretation. Then we show that there are some cases in which the problem can be decided. We also show a new decidable subclass of term rewriting systems which effectively preserves recognizability.

Rewriting for Fitch Style Natural Deductions
Herman Geuvers, Rob Nederpelt

Logical systems in natural deduction style are usually presented in the Gentzen style. A different definition of natural deduction, that corresponds more closely to proofs in ordinary mathematical practice, is given in [Fitch 1952]. We define precisely a Curry-Howard interpretation that maps Fitch style deductions to simply typed terms, and we analyze why it is not an isomorphism. We then describe three reduction relations on Fitch style natural deductions: one that removes garbage (subproofs that are not needed for the conclusion), one that removes repeats and one that unshares shared subproofs. We also define an equivalence relation that allows interchanging independent steps. We prove that two Fitch deductions are mapped to the same λ-term if and only if they are equal via the congruence closure of the aforementioned relations (the reduction relations plus the equivalence relation). This gives a Curry-Howard isomorphism between equivalence classes of Fitch deductions and simply typed λ-terms. Then we define the notion of cut-elimination on Fitch deductions, which is only possible for deductions that are completely unshared (normal forms of the unsharing reduction). For conciseness, we restrict ourselves in this paper to the implicational fragment of propositional logic, but we believe that our results extend to full first order predicate logic.

Efficient λ-Evaluation with Interaction Nets
Ian Mackie

This paper presents an efficient implementation of the λ-calculus using the graph rewriting formalism of interaction nets. Building upon a series of previous works, we obtain one of the most efficient implementations of this kind to date: outperforming existing interaction net implementations, as well as other approaches. We conclude the paper with extensive testing to demonstrate the capabilities of this evaluator.

Proving Properties of Term Rewrite Systems via Logic Programs
Sébastien Limet, Gernot Salzer

We present a general translation of term rewrite systems (TRS) to logic programs such that basic rewriting derivations become logic deductions. Certain TRS result in so-called cs-programs, which were originally studied in the context of constraint systems and tree tuple languages. By applying decidability and computability results of cs-programs we obtain new classes of TRS that have nice properties like decidability of unification, regular sets of descendants or finite representations of R-unifiers. Our findings generalize former results in the field of term rewriting.

On the Modularity of Confluence in Infinitary Term Rewriting
Jakob Grue Simonsen

We show that, unlike the case in finitary term rewriting, confluence is not a modular property of infinitary term rewriting systems, even when these are non-collapsing. We also give a positive result: two sufficient conditions for the modularity of confluence in the infinitary setting.

MU-TERM: A Tool for Proving Termination of Context-Sensitive Rewriting
Salvador Lucas

Restrictions of rewriting can eventually achieve termination by pruning all infinite rewrite sequences issued from every term. Context-sensitive rewriting (CSR) is an example of such a restriction. In CSR, the replacements in some arguments of the function symbols are permanently forbidden. This paper describes MU-TERM, a tool which can be used to automatically prove termination of CSR. The tool implements the generation of the appropriate orderings for proving termination of CSR by means of polynomial interpretations over the rational numbers. In fact, MU-TERM is the first termination tool which generates term orderings based on such polynomial interpretations. These orderings can also be used, in a number of different ways, for proving termination of ordinary rewriting. Proofs of termination of CSR are also possible via existing transformations to TRSs (without any replacement restriction) which are also implemented in MU-TERM.

Automated Termination Proofs with AProVE
Jürgen Giesl, René Thiemann, Peter Schneider-Kamp, Stephan Falke

We describe the system AProVE, an automated prover to verify (innermost) termination of term rewrite systems (TRSs). For this system, we have developed and implemented efficient algorithms based on classical simplification orders, dependency pairs, and the size-change principle. In particular, it contains many new improvements of the dependency pair approach that make automated termination proving more powerful and efficient. In AProVE, termination proofs can be performed with a user-friendly graphical interface and the system is currently among the most powerful termination provers available.

An Approximation Based Approach to Infinitary Lambda Calculi
Stefan Blom

We explore an alternative for metric limits in the context of infinitary lambda calculus with transfinite reduction sequences. We will show how to use the new approach to get calculi that correspond to the 111, 101 and 001 infinitary lambda calculi of Kennaway et al., which have been proved to correspond to Berarducci trees, Lévy-Longo trees and Böhm trees respectively. We will identify subsets of the sets of meaningless terms of the metric calculi and prove that the approximation based calculi are equivalent to their metric counterparts up to these subsets.

Böhm-Like Trees for Term Rewriting Systems
Jeroen Ketema

In this paper we define Böhm-like trees for term rewriting systems (TRSs). The definition is based on the similarities between the Böhm trees, the Lévy-Longo trees, and the Berarducci trees. That is, the similarities between the Böhm-like trees of the λ-calculus. Given a term t, its tree partially represents the root-stable part of t as created in each maximal fair reduction of t. In addition to defining Böhm-like trees for TRSs, we define a subclass of Böhm-like trees whose members are monotone and continuous.

Inductive Theorems for Higher-Order Rewriting
Takahito Aoto, Toshiyuki Yamada, Yoshihito Toyama

Based on the simply typed term rewriting framework, inductive reasoning in higher-order rewriting is studied. The notion of higher-order inductive theorems is introduced to reflect the higher-order features of simply typed term rewriting. Then the inductionless induction methods of first-order term rewriting are incorporated to verify higher-order inductive theorems. In order to ensure that higher-order inductive theorems are closed under contexts, the notion of higher-order sufficient completeness is introduced. Finally, the decidability of higher-order sufficient completeness is discussed.

The Joinability and Unification Problems for Confluent Semi-constructor TRSs
Ichiro Mitsuhashi, Michio Oyamaguchi, Yoshikatsu Ohta, Toshiyuki Yamada

The unification problem for term rewriting systems (TRSs) is the problem of deciding, for a TRS R and two terms s and t, whether s and t are unifiable modulo R. Mitsuhashi et al. have shown that the problem is decidable for confluent simple TRSs. Here, a TRS is simple if the right-hand side of every rewrite rule is a ground term or a variable. In this paper, we extend this result and show that the unification problem for confluent semi-constructor TRSs is decidable. Here, a semi-constructor TRS is such a TRS that every subterm of the right-hand side of each rewrite rule is ground if its root is a defined symbol. We first show the decidability of joinability for confluent semi-constructor TRSs. Then, using the decision algorithm for joinability, we obtain a unification algorithm for confluent semi-constructor TRSs.

A Visual Environment for Developing Context-Sensitive Term Rewriting Systems
Jacob Matthews, Robert Bruce Findler, Matthew Flatt, Matthias Felleisen

Over the past decade, researchers have found context-sensitive term-rewriting semantics to be powerful and expressive tools for modeling programming languages, particularly in establishing type soundness proofs. Unfortunately, developing such semantics is an error-prone activity. To address that problem, we have designed PLT Redex, an embedded domain-specific language that helps users interactively create and debug context-sensitive term-rewriting systems. We introduce the tool with a series of examples and discuss our experience using it in courses and developing an operational semantics for R5RS Scheme.

2003

Confluence as a Cut Elimination Property
Gilles Dowek

The goal of this note is to compare two notions, one coming from the theory of rewrite systems and the other from proof theory: confluence and cut elimination. We show that to each rewrite system on terms, we can associate a logical system: asymmetric deduction modulo this rewrite system and that the confluence property of the rewrite system is equivalent to the cut elimination property of the associated logical system. This equivalence, however, does not extend to rewrite systems directly rewriting atomic propositions.

Associative-Commutative Rewriting on Large Terms
Steven Eker

We introduce a novel representation for associative-commutative (AC) terms which, for certain important classes of rewrite rules, allows both the AC matching and the AC renormalization steps to be accomplished using time and space that is logarithmic in the size of the flattened AC argument lists involved. This novel representation can be cumbersome for other, more general algorithms and manipulations. Hence, we describe machine efficient techniques for converting to and from a more conventional representation together with a heuristic for deciding at runtime when to convert a term to the new representation. We sketch how our approach can be generalized to order-sorted AC rewriting and to other equational theories. We also present some experimental results using the Maude 2 interpreter.

A Rule-Based Approach for Automated Generation of Kinetic Chemical Mechanisms
Olivier Bournez, Guy-Marie Côme, Valérie Conraud, Hélène Kirchner, Liliana Ibanescu

Several software systems have been developed recently for the automated generation of combustion reactions kinetic mechanisms using different representations of species and reactions and different generation algorithms. In parallel, several software systems based on rewriting have been developed for the easy modeling and prototyping of systems using rules controlled by strategies. This paper presents our current experience in using the rewrite system ELAN for the automated generation of the combustion reactions mechanisms previously implemented in the EXGAS kinetic mechanism generator system. We emphasize the benefits of using rewriting and rule-based programming controlled by strategies for the generation of kinetic mechanisms.

Efficient Reductions with Director Strings
François-Régis Sinot, Maribel Fernández, Ian Mackie

We present a name-free λ-calculus with explicit substitutions based on a generalized notion of director strings: we annotate a term with information about how each substitution should be propagated through the term. We first present a calculus where we can simulate arbitrary β-reduction steps, and then simplify the rules to model the evaluation of functional programs (reduction to weak head normal form). We also show that we can derive the closed reduction strategy (a weak strategy which, in contrast with standard weak strategies, allows certain reductions to take place inside λ-abstractions, thus offering more sharing). Our experimental results confirm that, for large combinator-based terms, our weak evaluation strategies outperform standard evaluators. Moreover, we derive two abstract machines for strong reduction which inherit the efficiency of the weak evaluators.

Rewriting Logic and Probabilities
Olivier Bournez, Mathieu Hoyrup

Rewriting Logic has shown to provide a general and elegant framework for unifying a wide variety of models, including concurrency models and deduction systems. In order to extend the modeling capabilities of rule based languages, it is natural to consider that the firing of rules can be subject to some probabilistic laws. Considering rewrite rules subject to probabilities leads to numerous questions about the underlying notions and results. In this paper, we discuss whether there exists a notion of probabilistic rewrite system with an associated notion of probabilistic rewriting logic.

The Maude 2.0 System
Manuel Clavel, Francisco Durán, Steven Eker, Patrick Lincoln, Narciso Martí-Oliet, José Meseguer, Carolyn L. Talcott

This paper gives an overview of the Maude 2.0 system. We emphasize the full generality with which rewriting logic and membership equational logic are supported, operational semantics issues, the new built-in modules, the more general Full Maude module algebra, the new META-LEVEL module, the LTL model checker, and new implementation techniques yielding substantial performance improvements in rewriting modulo. We also comment on Maude's formal tool environment and on applications.

Diagrams for Meaning Preservation
Joe B. Wells, Detlef Plump, Fairouz Kamareddine

This paper presents an abstract framework and multiple diagram-based methods for proving meaning preservation, i.e., that all rewrite steps of a rewriting system preserve the meaning given by an operational semantics based on a rewriting strategy. While previous rewriting-based methods have generally needed the treated rewriting system as a whole to have such properties as, e.g., confluence, standardization, and/or termination or boundedness of developments, our methods can work when all of these conditions fail, and thus can handle more rewriting systems. We isolate the new lift/project with termination diagram as the key proof idea and show that previous rewriting-based methods (Plotkin's method based on confluence and standardization and Machkasova and Turbak's method based on distinct lift and project properties) implicitly use this diagram. Furthermore, our framework and proof methods help reduce the proof burden substantially by, e.g., supporting separate treatment of partitions of the rewrite steps, needing only elementary diagrams for rewrite step interactions, excluding many rewrite step interactions from consideration, needing weaker termination properties, and providing generic support for using developments in combination with any method.

Expression Reduction Systems with Patterns
Julien Forest, Delia Kesner

We introduce a new higher-order rewriting formalism, called Expression Reduction Systems with Patterns (ERSP), where abstraction is not only allowed on variables but also on nested patterns. These patterns are built by combining standard algebraic patterns with choice constructors used to denote different possible structures allowed for an abstracted argument. In other words, the non-deterministic choice between different rewriting rules which is inherent to classical rewriting formalisms can be lifted here to the level of patterns. We show that confluence holds for a reasonable class of systems and terms.

Residuals in Higher-Order Rewriting
Harrie Jan Sander Bruggink

Residuals have been studied for various forms of rewriting and residual systems have been defined to capture residuals in an abstract setting. In this article we study residuals in orthogonal Pattern Rewriting Systems (PRSs). First, the rewrite relation is defined by means of a higher-order rewriting logic, and proof terms are defined that witness reductions. Then, we have the formal machinery to define a residual operator for PRSs, and we will prove that an orthogonal PRS together with the residual operator mentioned above, is a residual system. As a side-effect, all results of (abstract) residual theory are inherited by orthogonal PRSs, such as confluence, and the notion of permutation equivalence of reductions.

Rewriting UNITY
Adam Granicz, Daniel M. Zimmerman, Jason Hickey

In this paper we describe the implementation of the UNITY formalism as an extension of general-purpose languages and show its translation to C abstract syntax using PHOBOS, our generic front-end in the Mojave compiler. PHOBOS uses term rewriting to define the syntax and semantics of programming languages, and automates their translation to an internal compiler representation. Furthermore, it provides access to formal reasoning capabilities using the integrated MetaPRL theorem prover, through which advanced optimizations and transformations can be implemented or formal proofs derived.

New Decidability Results for Fragments of First-Order Logic and Application to Cryptographic Protocols
Hubert Comon-Lundh, Véronique Cortier

We consider a new extension of the Skolem class for first-order logic and prove its decidability by resolution techniques. We then extend this class including the built-in equational theory of exclusive or. Again, we prove the decidability of the class by resolution techniques. Considering such fragments of first-order logic is motivated by the automatic verification of cryptographic protocols, for an arbitrary number of sessions; the first-order formalization is an approximation of the set of possible traces, for instance relaxing the nonce freshness assumption. As a consequence, we get some new decidability results for the verification of cryptographic protocols with exclusive or.

An E-unification Algorithm for Analyzing Protocols That Use Modular Exponentiation
Deepak Kapur, Paliath Narendran, Lida Wang

Modular multiplication and exponentiation are common operations in modern cryptography. Unification problems with respect to some equational theories that these operations satisfy are investigated. Two different but related equational theories are analyzed. A unification algorithm is given for one of the theories which relies on solving syzygies over multivariate integral polynomials with noncommuting indeterminates. For the other theory, in which the distributivity property of exponentiation over multiplication is assumed, the unifiability problem is shown to be undecidable by adapting a construction developed by one of the authors to reduce Hilbert's 10th problem to the solvability problem for linear equations over semi-rings. A new algorithm for computing strong Gröbner bases of right ideals over the polynomial ring Z<X1, ..., Xn> is proposed; unlike earlier algorithms proposed by Baader as well as by Madlener and Reinert which work only for right admissible term orderings with the boundedness property, this algorithm works for any right admissible term ordering. The algorithms for some of these unification problems are expected to be integrated into Naval Research Lab.'s Protocol Analyzer (NPA), a tool developed by Catherine Meadows, which has been successfully used to analyze cryptographic protocols, particularly emerging standards such as the Internet Engineering Task Force's (IETF) Internet Key Exchange [11] and Group Domain of Interpretation [12] protocols. Techniques from several different fields - particularly symbolic computation (ideal theory and Gröbner basis algorithms) and unification theory - are thus used to address problems arising in state-based cryptographic protocol analysis.

Two-Way Equational Tree Automata for AC-Like Theories: Decidability and Closure Properties
Kumar Neeraj Verma

We study two-way tree automata modulo equational theories. We deal with the theories of Abelian groups (ACUM), idempotent commutative monoids (ACUI), and the theory of exclusive-or (ACUX), as well as some variants including the theory of commutative monoids (ACU). We show that the one-way automata for all these theories are closed under union and intersection, and emptiness is decidable. For two-way automata the situation is more complex. In all these theories except ACUI, we show that two-way automata can be effectively reduced to one-way automata, provided some care is taken in the definition of the so-called push clauses. (The ACUI case is open.) In particular, the two-way automata modulo these theories are closed under union and intersection, and emptiness is decidable. We also note that alternating variants have an undecidable emptiness problem for most theories, contrary to the non-equational case where alternation is essentially harmless.

Rule-Based Analysis of Dimensional Safety
Feng Chen, Grigore Roşu, Ram Prasad Venkatesan

Dimensional safety policy checking is an old topic in software analysis concerned with ensuring that programs do not violate basic principles of units of measurement. Scientific and/or navigation software is routinely dimensional and violations of measurement unit safety policies can hide significant domain-specific errors which are hard or impossible to find otherwise. Dimensional analysis of programs written in conventional programming languages is addressed in this paper. We draw general design principles for dimensional analysis tools and then discuss our prototypes, implemented by rewriting, which include both dynamic and static checkers. Our approach is based on assume/assert annotations of code which are properly interpreted by our tools and ignored by standard compilers/interpreters. The output of our prototypes consists of warnings that list those expressions violating the unit safety policy. These prototypes are implemented in the rewriting system Maude.

On the Complexity of Higher-Order Matching in the Linear lambda-Calculus
Sylvain Salvati, Philippe de Groote

We prove that linear second-order matching in the linear λ-calculus with linear occurrences of the unknowns is NP-complete. This result shows that context matching and second-order matching in the linear λ-calculus are, in fact, two different problems.

XML Schema, Tree Logic and Sheaves Automata
Silvano Dal-Zilio, Denis Lugiez

XML documents, and other forms of semi-structured data, may be roughly described as edge labeled trees; it is therefore natural to use tree automata to reason on them. This idea has already been successfully applied in the context of Document Type Definition (DTD), the simplest standard for defining XML documents validity, but additional work is needed to take into account XML Schema, a more advanced standard, for which regular tree automata are not satisfactory. In this paper, we define a tree logic that directly embeds XML Schema as a plain subset as well as a new class of automata for unranked trees, used to decide this logic, which is well-suited to the processing of XML documents and schemas.

Size-Change Termination for Term Rewriting
René Thiemann, Jürgen Giesl

In [13], a new size-change principle was proposed to verify termination of functional programs automatically. We extend this principle in order to prove termination and innermost termination of arbitrary term rewrite systems (TRSs). Moreover, we compare this approach with existing techniques for termination analysis of TRSs (such as recursive path orderings or dependency pairs). It turns out that the size-change principle on its own fails for many examples that can be handled by standard techniques for rewriting, but there are also TRSs where it succeeds whereas existing rewriting techniques fail. In order to benefit from their respective advantages, we show how to combine the size-change principle with classical orderings and with dependency pairs. In this way, we obtain a new approach for automated termination proofs of TRSs which is more powerful than previous approaches.
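The size-change principle lends itself to a compact implementation. The sketch below is a generic textbook rendition in Python, not the paper's TRS extension: each size-change graph is a set of edges (i, j, strict) relating argument positions of caller and callee, the set of graphs is closed under composition, and every idempotent graph must carry a strictly decreasing self-edge.

```python
from itertools import product

def compose(g1, g2):
    """Compose two size-change graphs: an edge survives if it can be chained."""
    out = {}
    for (i, j, s1), (j2, k, s2) in product(g1, g2):
        if j == j2:
            # keep only the strongest edge for each (i, k)
            out[(i, k)] = out.get((i, k), False) or s1 or s2
    return frozenset((i, k, s) for (i, k), s in out.items())

def sct(graphs):
    """Close under composition; every idempotent graph needs a strict self-edge."""
    closure = set(map(frozenset, graphs))
    changed = True
    while changed:
        changed = False
        for g1, g2 in list(product(closure, closure)):
            g = compose(g1, g2)
            if g not in closure:
                closure.add(g)
                changed = True
    for g in closure:
        if compose(g, g) == g:  # idempotent graph
            if not any(i == k and s for (i, k, s) in g):
                return False
    return True
```

For example, a single recursive call that strictly decreases its first argument yields `sct([{(0, 0, True)}]) == True`, while a call that merely swaps its two arguments, `{(0, 1, False), (1, 0, False)}`, is rejected.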

Monotonic AC-Compatible Semantic Path Orderings
Cristina Borralleras, Albert Rubio

Polynomial interpretations and RPO-like orderings allow one to prove termination of Associative and Commutative (AC-)rewriting by only checking the rules of the given rewrite system. However, these methods have important limitations as termination proving tools.
To overcome these limitations, more powerful methods like the dependency pair method have been extended to the AC-case. Unfortunately, in order to ensure AC-termination, the so-called extended rules, which, in general, are hard to prove, must be added to the rewrite system.
In this paper we present a fully monotonic AC-compatible semantic path ordering. This monotonic AC-ordering defines a new automatable termination proving method for AC-rewriting which does not need to consider extended rules. As a hint of the power of this method, we can easily prove several non-trivial examples appearing in the literature, including one that, to our knowledge, can be handled by no other automatic method.

Relating Derivation Lengths with the Slow-Growing Hierarchy Directly
Georg Moser, Andreas Weiermann

In this article we introduce the notion of a generalized system of fundamental sequences and we define its associated slow-growing hierarchy. We claim that these concepts are genuinely related to the classification of the complexity, i.e., the derivation length, of rewrite systems for which termination is provable by a standard termination ordering. To substantiate this claim, we re-obtain multiple recursive bounds on the derivation length for rewrite systems terminating under lexicographic path ordering, originally established by the second author.

Tsukuba Termination Tool
Nao Hirokawa, Aart Middeldorp

We present a tool for automatically proving termination of first-order rewrite systems. The tool is based on the dependency pair method of Arts and Giesl. It incorporates several new ideas that make the method more efficient. The tool produces high-quality output and has a convenient web interface.
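The dependency pair construction the tool builds on can be sketched as follows (a generic version with an assumed term encoding, not the tool's actual code): terms are variables (plain strings) or tuples (symbol, arg1, ..., argn), and each rule contributes one pair per subterm of its right-hand side whose root is a defined symbol.

```python
def root(t):
    # root symbol of a non-variable term; variables have no root
    return t[0] if isinstance(t, tuple) else None

def subterms(t):
    yield t
    if isinstance(t, tuple):
        for arg in t[1:]:
            yield from subterms(arg)

def dependency_pairs(rules):
    """Rules are (lhs, rhs) pairs; defined symbols are the lhs roots."""
    defined = {root(l) for l, _ in rules}
    pairs = []
    for l, r in rules:
        for s in subterms(r):
            if isinstance(s, tuple) and root(s) in defined:
                # mark the root symbols with '#', as in the DP literature
                pairs.append(((root(l) + "#",) + l[1:],
                              (root(s) + "#",) + s[1:]))
    return pairs
```

On the usual subtraction example, minus(x, 0) → x and minus(s(x), s(y)) → minus(x, y), only the recursive call produces a pair: minus#(s(x), s(y)) → minus#(x, y).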

Liveness in Rewriting
Jürgen Giesl, Hans Zantema

In this paper, we show how the problem of verifying liveness properties is related to termination of term rewrite systems (TRSs). We formalize liveness in the framework of rewriting and present a sound and complete transformation to transform particular liveness problems into TRSs. Then the transformed TRS terminates if and only if the original liveness property holds. This shows that liveness and termination are essentially equivalent. To apply our approach in practice, we introduce a simpler sound transformation which only satisfies the 'only if'-part. By refining existing techniques for proving termination of TRSs we show how liveness properties can be verified automatically. As examples, we prove a liveness property of a waiting line protocol for a network of processes and a liveness property of a protocol on a ring of processes.

Validation of the JavaCard Platform with Implicit Induction Techniques
Gilles Barthe, Sorin Stratulat

The bytecode verifier (BCV), which performs a static analysis to reject potentially insecure programs, is a key security function of the Java(Card) platform. Over the last few years there have been numerous projects to prove formally the correctness of bytecode verification, but relatively little effort has been made to provide methodologies, techniques and tools that help such formalisations. In earlier work, we developed a methodology and a specification environment featuring a neutral mathematical language based on conditional rewriting, which considerably reduce the cost of specifying virtual machines.
In this work, we show that such a neutral mathematical language based on conditional rewriting is also beneficial for performing automatic verifications on the specifications, and illustrate in particular how implicit induction techniques can be used for the validation of the Java(Card) Platform. More precisely, we report on the use of SPIKE, a first-order theorem prover based on implicit induction, to establish the correctness of the BCV. The results are encouraging, as many of the intermediate lemmas required to prove the BCV correct can be proved with SPIKE.

Term Partition for Mathematical Induction
Pascal Urso, Emmanuel Kounalis

A key new concept, term partition, allows one to design a new method for proving theorems whose proof usually requires mathematical induction. A term partition of a term t is a well-defined splitting of t into a pair (a, b) of terms that describes the language of normal forms of the ground instances of t.
If A is a monomorphic set of axioms (rules) and (a, b) is a term partition of t, then the normal form (obtained by using A ) of any ground instance of t can be "divided" into the normal forms (obtained by using A ) of the corresponding ground instances of a and b. Given a conjecture t = s to be checked for inductive validity in the theory of A, a partition (a, b) of t and a partition (c, d) of s is computed. If a = c and b = d, then t = s is an inductive theorem for A .
The method is conceptually different from the classical theorem proving approaches. It allows one to obtain proofs of a large number of conjectures (including non-linear ones) without additional lemmas or generalizations.

Equational Prover of THEOREMA
Temur Kutsia

The equational prover of the Theorema system is described. It is implemented in Mathematica and is designed for unit equalities in first-order or in applicative higher-order form. A (restricted) usage of sequence variables and Mathematica built-in functions is allowed.

Termination of Simply Typed Term Rewriting by Translation and Labelling
Takahito Aoto, Toshiyuki Yamada

Simply typed term rewriting proposed by Yamada (RTA 2001) is a framework of term rewriting allowing higher-order functions. In contrast to the usual higher-order term rewriting frameworks, simply typed term rewriting dispenses with bound variables. This paper presents a method for proving termination of simply typed term rewriting systems (STTRSs, for short). We first give a translation of STTRSs into many-sorted first-order TRSs and show that the termination problem of STTRSs is reduced to that of many-sorted first-order TRSs. Next, we introduce a labelling method which is applied to the first-order TRSs obtained by the translation to facilitate their termination proofs; our labelling employs an extension of semantic labelling where terms are interpreted on a many-sorted algebra.

Rewriting Modulo in Deduction Modulo
Frédéric Blanqui

We study the termination of rewriting modulo a set of equations in the Calculus of Algebraic Constructions, an extension of the Calculus of Constructions with functions and predicates defined by higher-order rewrite rules. In a previous work, we defined general syntactic conditions based on the notion of computability closure for ensuring the termination of the combination of rewriting and β-reduction.
Here, we show that this result is preserved when considering rewriting modulo a set of equations if the equivalence classes generated by these equations are finite, the equations are linear and satisfy general syntactic conditions also based on the notion of computability closure. This includes equations like associativity and commutativity and provides an original treatment of termination modulo equations.

Termination of String Rewriting Rules That Have One Pair of Overlaps
Alfons Geser

This paper presents a partial solution to the long standing open problem whether termination of one-rule string rewriting is decidable. Overlaps between the two sides of the rule play a central role in existing termination criteria. We characterize termination of all one-rule string rewriting systems that have one such overlap at either end. This both completes a result of Kurth and generalizes a result of Shikishima-Tsuji et al.
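The overlaps in question can be computed mechanically. The following sketch finds, for a rule l → r, the positions where a proper non-empty suffix of one side is a prefix of the other; the exact overlap definition used by Kurth and by the paper may differ in details.

```python
def overlaps(l, r):
    """Proper overlaps between the two sides of a string rewriting rule l -> r."""
    def ends_meet(u, v):
        # lengths k such that the last k letters of u equal the first k of v
        return [k for k in range(1, min(len(u), len(v)))
                if u[-k:] == v[:k]]
    return {"l-r": ends_meet(l, r), "r-l": ends_meet(r, l)}
```

For the rule ab → bca, there is one overlap at each end: the suffix "b" of the left-hand side is a prefix of the right-hand side, and the suffix "a" of the right-hand side is a prefix of the left-hand side.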

Environments for Term Rewriting Engines for Free
Mark van den Brand, Pierre-Étienne Moreau, Jurgen J. Vinju

Term rewriting can only be applied if practical implementations of term rewriting engines exist. New rewriting engines are designed and implemented either to experiment with new (theoretical) results or to be able to tackle new application areas. In this paper we present the Meta-Environment: an environment for rapidly implementing the syntax and semantics of term rewriting based formalisms. We provide not only the basic building blocks, but complete interactive programming environments that only need to be instantiated by the details of a new formalism.

A Rewriting Alternative to Reidemeister-Schreier
Neil Ghani, Anne Heyworth

One problem in computational group theory is to find a presentation of the subgroup generated by a set of elements of a group. The Reidemeister-Schreier algorithm was developed in the 1930s and gives a solution based upon enumerative techniques. This, however, means the algorithm can only be applied to finite groups. This paper proposes a rewriting-based alternative to the Reidemeister-Schreier algorithm which has the advantage of being applicable to infinite groups.

Stable Computational Semantics of Conflict-Free Rewrite Systems (Partial Orders with Duplication)
Zurab Khasidashvili, John R. W. Glauert

We study orderings ⊴S on reductions in the style of Lévy reflecting the growth of information w.r.t. (super) stable sets S of 'values' (such as head-normal forms or Böhm-trees). We show that sets of co-initial reductions ordered by ⊴S form finitary ω-algebraic complete lattices, and hence form computation and Scott domains. As a consequence, we obtain a relativized version of the computational semantics proposed by Boudol for term rewriting systems. Furthermore, we give a pure domain-theoretic characterization of the orderings ⊴S in the spirit of Kahn and Plotkin's concrete domains. These constructions are carried out in the framework of Stable Deterministic Residual Structures, which are abstract reduction systems with an axiomatized residual relation on redexes, and which model all orthogonal (or conflict-free) reduction systems as well as many other interesting computation structures.

Recognizing Boolean Closed A-Tree Languages with Membership Conditional Rewriting Mechanism
Hitoshi Ohsaki, Hiroyuki Seki, Toshinori Takai

This paper provides an algorithm to compute the complement of tree languages recognizable with A-TA (tree automata with associativity axioms [16]). Due to this closure property together with the previously obtained results, we know that the class is boolean closed, while keeping recognizability of A-closures of regular tree languages. In the proof of the main result, a new framework of tree automata, called sequence-tree automata, is introduced as a generalization of Lugiez and Dal Zilio's multi-tree automata [14] to the associativity case. It is also shown that recognizable A-tree languages are closed under a one-step rewrite relation in the case of ground A-term rewriting. This result allows us to compute an under-approximation of A-rewrite descendants of recognizable A-tree languages with arbitrary accuracy.

Testing Extended Regular Language Membership Incrementally by Rewriting
Grigore Roşu, Mahesh Viswanathan

In this paper we present lower bounds and rewriting algorithms for testing membership of a word in a regular language described by an extended regular expression. Motivated by intuitions from monitoring and testing, where the words to be tested (execution traces) are typically much longer than the size of the regular expressions (patterns or requirements), and by the fact that in many applications the traces are only available incrementally, on an event by event basis, our algorithms are based on an event-consumption idea: a just arrived event is "consumed" by the regular expression, i.e., the regular expression modifies itself into another expression discarding the event. We present an exponential space lower bound for monitoring extended regular expressions and argue that the presented rewriting-based algorithms, besides their simplicity and elegance, are practical and almost as good as one can hope. We experimented with and evaluated our algorithms in Maude.
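The event-consumption idea is closely related to Brzozowski derivatives: consuming an event a rewrites an expression e into an expression that matches exactly the words w such that e matches aw. The sketch below is an illustration in Python, not the paper's Maude rewriting rules; complement ("not") makes the expressions "extended".

```python
# Regexes as tuples: ("eps",), ("empty",), ("sym", a),
# ("alt", r, s), ("cat", r, s), ("star", r), ("not", r).
EMPTY = ("empty",)  # matches nothing
EPS = ("eps",)      # matches only the empty word

def nullable(r):
    tag = r[0]
    if tag == "eps": return True
    if tag in ("empty", "sym"): return False
    if tag == "alt": return nullable(r[1]) or nullable(r[2])
    if tag == "cat": return nullable(r[1]) and nullable(r[2])
    if tag == "star": return True
    if tag == "not": return not nullable(r[1])

def deriv(r, a):
    """Rewrite r into the expression matching the rest of the word after a."""
    tag = r[0]
    if tag in ("eps", "empty"): return EMPTY
    if tag == "sym": return EPS if r[1] == a else EMPTY
    if tag == "alt": return ("alt", deriv(r[1], a), deriv(r[2], a))
    if tag == "cat":
        d = ("cat", deriv(r[1], a), r[2])
        return ("alt", d, deriv(r[2], a)) if nullable(r[1]) else d
    if tag == "star": return ("cat", deriv(r[1], a), r)
    if tag == "not": return ("not", deriv(r[1], a))

def matches(r, word):
    for a in word:  # consume one event at a time
        r = deriv(r, a)
    return nullable(r)
```

With r encoding a*b, `matches(r, "aaab")` holds and `matches(r, "aba")` does not; the negated pattern `("not", r)` accepts exactly the words r rejects. (Simplifying the intermediate expressions, which this sketch omits, is what keeps monitoring space-efficient in practice.)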

2002

Axiomatic Rewriting Theory VI Residual Theory Revisited
Paul-André Melliès

Residual theory is the algebraic theory of confluence for the λ-calculus, and more generally conflict-free rewriting systems (=without critical pairs). The theory took its modern shape in Lévy's PhD thesis, after Church, Rosser and Curry's seminal steps. There, Lévy introduces a permutation equivalence between rewriting paths, and establishes that among all confluence diagrams P → N ← Q completing a span P ← M → Q, there exists a minimum such one, modulo permutation equivalence. Categorically, the diagram is called a pushout.
In this article, we extend Lévy's residual theory in order to encompass "border-line" rewriting systems, which admit critical pairs but enjoy a strong Church-Rosser property (= existence of pushouts). Typical examples are the associativity rule and the positive braid rewriting systems. Finally, we show that the resulting theory reformulates and clarifies Lévy's optimality theory for the λ-calculus, and its so-called "extraction procedure".

Static Analysis of Modularity of beta-Reduction in the Hyperbalanced lambda-Calculus
Richard Kennaway, Zurab Khasidashvili, Adolfo Piperno

We investigate the degree of parallelism (or modularity) in the hyperbalanced λ-calculus, λH, a subcalculus of λ-calculus containing all simply typable terms (up to a restricted η-expansion). In technical terms, we study the family relation on redexes in λH, and the contribution relation on redex-families, and show that the latter is a forest (as a partial order). This means that hyperbalanced λ-terms allow for maximal possible parallelism in computation. To prove our results, we use and further refine, for the case of hyperbalanced terms, some well known results concerning paths, which allow for static analysis of many fundamental properties of β-reduction.

Exceptions in the Rewriting Calculus
Germain Faure, Claude Kirchner

In the context of the rewriting calculus, we introduce and study an exception mechanism that allows us to express in a simple way rewriting strategies and that is therefore also useful for expressing theorem proving tactics. The proposed exception mechanism is expressed in a confluent calculus which gives the ability to simply express the semantics of the first tactical and to describe in full details the expression of conditional rewriting.

Deriving Focused Lattice Calculi
Georg Struth

We derive rewrite-based ordered resolution calculi for semilattices, distributive lattices and boolean lattices. Using ordered resolution as a metaprocedure, theory axioms are first transformed into independent bases. Focused inference rules are then extracted from inference patterns in refutations. The derivation is guided by mathematical and procedural background knowledge, in particular by ordered chaining calculi for quasiorderings (forgetting the lattice structure), by ordered resolution (forgetting the clause structure) and by Knuth-Bendix completion for non-symmetric transitive relations (forgetting both structures). Conversely, all three calculi are derived and proven complete in a transparent and generic way as special cases of the lattice calculi.

Layered Transducing Term Rewriting System and Its Recognizability Preserving Property
Hiroyuki Seki, Toshinori Takai, Youhei Fujinaka, Yuichi Kaji

A term rewriting system which effectively preserves recognizability (EPR-TRS) has good mathematical properties. In this paper, a new subclass of TRSs, layered transducing TRSs (LT-TRSs), is defined and its recognizability preserving property is discussed. The class of LT-TRSs contains some EPR-TRSs, e.g., f(x) → f(g(x)), which do not belong to any of the known decidable subclasses of EPR-TRSs. Bottom-up linear tree transducers, a well-known computation model in tree language theory, are a special case of LT-TRSs. We present a sufficient condition for an LT-TRS to be an EPR-TRS. Also, some properties of LT-TRSs, including reachability, are shown to be decidable.

Decidability and Closure Properties of Equational Tree Languages
Hitoshi Ohsaki, Toshinori Takai

Equational tree automata provide a powerful tree language framework that facilitates recognizing congruence closures of tree languages. In the paper we show that the emptiness problem for AC-tree automata and the intersection-emptiness problem for regular AC-tree automata, each of which was open in our previous work [20], are decidable, by a straightforward reduction to the reachability problem for ground AC-term rewriting. The newly obtained results generalize decidability of the so-called reachable property problem of Mayr and Rusinowitch [17]. We then discuss complexity issues of AC-tree automata. Moreover, in order to solve some other questions about regular A- and AC-tree automata, we recall the basic connection between word languages and tree languages.

Regular Sets of Descendants by Some Rewrite Strategies
Pierre Réty, Julie Vuotto

For a constructor-based rewrite system R, a regular set of ground terms E, and assuming some additional restrictions, we build a finite tree automaton that recognizes the descendants of E, i.e. the terms issued from E by rewriting, according to innermost, innermost-leftmost, and outermost strategies.

Rewrite Games
Johannes Waldmann

For a terminating rewrite system R, and a ground term t1, two players alternate in doing R-reductions t1 →R t2 →R t3 →R ... That is, player 1 chooses the redex in t1, t3, ..., and player 2 chooses the redex in t2, t4, ... The player who cannot move (because tn is a normal form) loses.
In this note, we propose some challenging problems related to certain rewrite games. In particular, we re-formulate an open problem from combinatorial game theory (do all finite octal games have an ultimately periodic Sprague-Grundy sequence?) as a question about rationality of some tree languages.
We propose to attack this question by methods from set constraint systems, and show some cases where this works directly.
Finally, we present rewrite games from combinatory logic, and their relation to algebraic tree languages.
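Such games are easy to solve exhaustively for small terminating systems. The sketch below uses an assumed toy rule set, not one from the note; under the normal-play convention, a position is winning for the player to move iff some rewrite leads to a losing position.

```python
from functools import lru_cache

# Toy terminating string rewriting system (assumed example)
RULES = [("aa", "b"), ("b", "")]

def moves(t):
    """All terms reachable from t in one rewrite step."""
    for l, r in RULES:
        i = t.find(l)
        while i != -1:
            yield t[:i] + r + t[i + len(l):]
            i = t.find(l, i + 1)

@lru_cache(maxsize=None)
def first_player_wins(t):
    # winning iff some move reaches a losing position;
    # termination of RULES guarantees the recursion bottoms out
    return any(not first_player_wins(s) for s in moves(t))
```

From "aa" the game lasts exactly two moves (aa → b → empty), so the second player makes the last move and `first_player_wins("aa")` is False, while `first_player_wins("b")` is True.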

An Extensional Böhm Model
Paula Severi, Fer-Jan de Vries

We show the existence of an infinitary confluent and normalising extension of the finite extensional lambda calculus with beta and eta. Besides infinite beta reductions also infinite eta reductions are possible in this extension, and terms without head normal form can be reduced to bottom. As corollaries we obtain a simple, syntax based construction of an extensional Böhm model of the finite lambda calculus; and a simple, syntax based proof that two lambda terms have the same semantics in this model if and only if they have the same eta-Böhm tree if and only if they are observationally equivalent with respect to beta normal forms. The confluence proof reduces confluence of beta, bottom and eta via infinitary commutation and postponement arguments to confluence of beta and bottom and confluence of eta.
We give counterexamples against confluence of similar extensions based on the identification of the terms without weak head normal form and the terms without top normal form (root-active terms), respectively.

A Weak Calculus with Explicit Operators for Pattern Matching and Substitution
Julien Forest

In this paper we propose a Weak Lambda Calculus called λPw having explicit operators for Pattern Matching and Substitution. This formalism is able to specify functions defined by cases via pattern matching constructors, as done by most modern functional programming languages such as OCAML. We show the main properties enjoyed by λPw, namely subject reduction, confluence and strong normalization.

Tradeoffs in the Intensional Representation of Lambda Terms
Chuck Liang, Gopalan Nadathur

Higher-order representations of objects such as programs, specifications and proofs are important to many metaprogramming and symbolic computation tasks. Systems that support such representations often depend on the implementation of an intensional view of the terms of suitable typed lambda calculi. Refined lambda calculus notations have been proposed that can be used in realizing such implementations. There are, however, choices in the actual deployment of such notations whose practical consequences are not well understood. Towards addressing this lacuna, the impact of three specific ideas is examined: the de Bruijn representation of bound variables, the explicit encoding of substitutions in terms and the annotation of terms to indicate their independence on external abstractions. Qualitative assessments are complemented by experiments over actual computations. The empirical study is based on λProlog programs executed using suitable variants of a low level, abstract machine based implementation of this language.

Improving Symbolic Model Checking by Rewriting Temporal Logic Formulae
David Déharbe, Anamaria Martins Moreira, Christophe Ringeissen

A factor in the complexity of conventional algorithms for model checking Computation Tree Logic (CTL) is the size of the formulae, and, more precisely, the number of fixpoint operators. This paper addresses the following questions: given a CTL formula f, is there an equivalent formula with fewer fixpoint operators? and how may term rewriting techniques be used to find it? Moreover, for some sublogics of CTL, e.g. the sublogic NF-CTL (no fixpoint computation tree logic), more efficient verification procedures are available. This paper also addresses the problem of testing whether or not an expression belongs to NF-CTL, and provides support in choosing the most efficient among the available verification algorithms. In this direction, we propose a rewrite system modulo AC, and discuss its implementation in ELAN, showing how this rewriting process can be plugged into a formal verification tool.

Conditions for Efficiency Improvement by Tree Transducer Composition
Janis Voigtländer

We study the question of efficiency improvement or deterioration for a semantic-preserving program transformation technique based on macro tree transducer composition. By annotating functional programs to reflect the internal property "computation time" explicitly in the computed output, and by manipulating such annotations, we formally prove syntactic conditions under which the composed program is guaranteed to be more efficient than the original program, with respect to call-by-need reduction to normal form. The developed criteria can be checked automatically, and thus are suitable for integration into an optimizing functional compiler.

Rewriting Strategies for Instruction Selection
Martin Bravenboer, Eelco Visser

Instruction selection (mapping IR trees to machine instructions) can be expressed by means of rewrite rules. Typically, such sets of rewrite rules are highly ambiguous. Therefore, standard rewriting engines based on fixed, exhaustive strategies are not appropriate for the execution of instruction selection. Code generator generators use special purpose implementations employing dynamic programming. In this paper we show how rewriting strategies for instruction selection can be encoded concisely in Stratego, a language for program transformation based on the paradigm of programmable rewriting strategies. This embedding obviates the need for a language dedicated to code generation, and makes it easy to combine code generation with other optimizations.

Probabilistic Rewrite Strategies. Applications to ELAN
Olivier Bournez, Claude Kirchner

Recently, rule-based languages have focused on the use of rewriting as a modeling tool, which results in making specifications executable. To extend the modeling capabilities of rule-based languages, we explore the possibility of making the rule applications subject to probabilistic choices.
We propose an extension of the ELAN strategy language to deal with randomized systems. We argue through several examples that we propose indeed a natural setting to model systems with randomized choices. This leads us to interesting new problems, and we address the generalization of the usual concepts in abstract reduction systems to randomized systems.

Loops of Superexponential Lengths in One-Rule String Rewriting
Alfons Geser

Loops are the most frequent cause of non-termination in string rewriting. In the general case, non-terminating, non-looping string rewriting systems exist, and the uniform termination problem is undecidable. For rewriting with only one string rewriting rule, it is unknown whether non-terminating, non-looping systems exist and whether uniform termination is decidable. If, in the one-rule case, non-termination is equivalent to the existence of loops, as McNaughton conjectures, then a decision procedure for the existence of loops also solves the uniform termination problem. As the existence of loops of bounded length is decidable, this raises the question of how long shortest loops can be. We show that string rewriting rules exist whose shortest loops have superexponential lengths in the size of the rule.
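
A loop is a derivation s →+ u s v: the starting string reappears as a factor, so the derivation can be repeated forever. A bounded search for such loops in a one-rule system is easy to sketch in Python (illustrative only; it can examine small cases but says nothing about the superexponential lower bound):

```python
def rewrite_once(s, lhs, rhs):
    """All strings obtained from s by one application of the rule lhs -> rhs."""
    out, i = [], s.find(lhs)
    while i != -1:
        out.append(s[:i] + rhs + s[i + len(lhs):])
        i = s.find(lhs, i + 1)
    return out

def find_loop(lhs, rhs, max_steps=20):
    """Search for a loop lhs ->+ u lhs v, up to max_steps rewrite steps."""
    frontier = [lhs]
    for step in range(1, max_steps + 1):
        nxt = []
        for s in frontier:
            for t in rewrite_once(s, lhs, rhs):
                if lhs in t:              # t = u lhs v: loop of length `step`
                    return step, t
                nxt.append(t)
        frontier = nxt
    return None                           # no loop found within the bound

print(find_loop("a", "ab"))   # (1, 'ab'): the rule a -> ab loops immediately
print(find_loop("ab", "ba"))  # None: the rule ab -> ba terminates
```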

Recursive Derivational Length Bounds for Confluent Term Rewrite Systems
Elias Tahhan-Bittar

Let F be a signature and R a term rewrite system on ground terms of F. We define the concepts of a context-free potential redex in a term and of bounded confluent terms. We recursively bound the lengths of derivations of a bounded confluent term t by a function of the lengths of derivations of the context-free potential redexes of this term. We define the concept of an inner redex, and we apply the recursive bounds that we obtain to prove that, whenever R is a confluent overlay term rewrite system, the derivational length bound for arbitrary terms is an iteration of the derivational length bound for inner redexes.

Termination of (Canonical) Context-Sensitive Rewriting
Salvador Lucas

Context-sensitive rewriting (CSR) is a restriction of rewriting which forbids reductions on selected arguments of functions. A replacement map discriminates, for each symbol of the signature, the argument positions on which replacements are allowed. If the replacement restrictions are less restrictive than those expressed by the so-called canonical replacement map, then CSR can be used for computing (infinite) normal forms of terms. Termination of such canonical CSR is desirable when using CSR for these purposes. Existing transformations for proving termination of CSR fulfill a number of new properties when used for proving termination of canonical CSR.

Atomic Set Constraints with Projection
Witold Charatonik, Jean-Marc Talbot

We investigate a class of set constraints defined as atomic set constraints augmented with projection. This class subsumes some already studied classes such as atomic set constraints with left-hand side projection and INES constraints. All these classes enjoy the nice property that satisfiability can be tested in cubic time. This is in contrast to several other classes of set constraints, such as definite set constraints and positive set constraints, for which satisfiability ranges from DEXPTIME-complete to NEXPTIME-complete. However, these latter classes allow set operators such as intersection or union, which is not the case for the class studied here. In the case of atomic set constraints with projection one might expect that satisfiability remains polynomial. Unfortunately, we show that the satisfiability problem for this class is no longer polynomial, but coNP-hard. Furthermore, we devise a PSPACE algorithm to solve this satisfiability problem.

Currying Second-Order Unification Problems
Jordi Levy, Mateu Villaret

The Curry form of a term, like f(a, b), allows us to write it, using just a single binary function symbol, as @(@(f,a),b). Using this technique we prove that the signature is not relevant in second-order unification, and conclude that one binary symbol is enough.
By currying variable applications, like X(a), as @(X,a), we can transform second-order terms into first-order terms, but we have to add beta-reduction as a theory. This is roughly what is done in explicit unification. We prove that by currying only constant applications we can reduce second-order unification to second-order unification with just one binary function symbol. Both problems are already known to be undecidable, but applying the same idea to context unification, for which decidability is still unknown, we reduce the problem to context unification with just one binary function symbol.
We also discuss the difficulties of applying the same ideas to third- or higher-order unification.
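
The basic transformation is easy to sketch. Assuming terms are (symbol, arguments) pairs, currying every application into the single binary symbol @ looks like this (a sketch of the idea, not the paper's formal translation):

```python
def curry(term):
    """Curry f(t1,...,tn) into @(...@(f, t1')..., tn') with one binary symbol '@'."""
    f, args = term
    result = (f, [])                 # the bare symbol becomes a constant
    for a in args:
        result = ("@", [result, curry(a)])
    return result

t = ("f", [("a", []), ("b", [])])    # f(a, b)
print(curry(t))                      # ('@', [('@', [('f', []), ('a', [])]), ('b', [])])
```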

A Decidable Variant of Higher Order Matching
Daniel J. Dougherty, Tomasz Wierzbicki

A lambda term is k-duplicating if every occurrence of a lambda abstractor binds at most k variable occurrences. We prove that the problem of higher order matching where solutions are required to be k-duplicating (but with no constraints on the problem instance itself) is decidable. We also show that the problem of higher order matching in the affine lambda calculus (where both the problem instance and the solutions are constrained to be 1-duplicating) is in NP, generalizing de Groote's result for the linear lambda calculus [4].

Combining Decision Procedures for Positive Theories Sharing Constructors
Franz Baader, Cesare Tinelli

This paper addresses the following combination problem: given two equational theories E1 and E2 whose positive theories are decidable, how can one obtain a decision procedure for the positive theory of E1 ∪ E2? For theories over disjoint signatures, this problem was solved by Baader and Schulz in 1995. This paper is a first step towards extending this result to the case of theories sharing constructors. Since there is a close connection between positive theories and unification problems, this also extends to the non-disjoint case the work on combining decision procedures for unification modulo equational theories.

JITty: A Rewriter with Strategy Annotations
Jaco van de Pol

We demonstrate JITty, a simple rewrite implementation with strategy annotations, along the lines of the Just-In-Time rewrite strategy, explained and justified in [4]. Our tool has the following distinguishing features:

Autowrite: A Tool for Checking Properties of Term Rewriting Systems
Irène Durand

Huet and Lévy [6] showed that for the class of orthogonal term rewriting systems (TRSs) every term not in normal form contains a needed redex (i.e., a redex contracted in every normalizing rewrite sequence) and that repeated contraction of needed redexes results in a normal form if it exists. However, neededness is in general undecidable. In order to obtain a decidable approximation to neededness Huet and Lévy introduced the subclass of strongly sequential TRSs and showed that strong sequentiality is a decidable property of orthogonal TRSs.

TTSLI: An Implementation of Tree-Tuple Synchronized Languages
Benoit Lecland, Pierre Réty

Tree-Tuple Synchronized Languages were first introduced by means of Tree-Tuple Synchronized Grammars (TTSG) [3], and have recently been reformulated by means of (so-called) Constraint Systems (CS), which allowed more properties to be proved [2,1]. A number of applications to rewriting and to concurrency have been presented (see [5] for a survey).

in2 : A Graphical Interpreter for Interaction Nets
Sylvain Lippi

in2 can be considered an attractive and didactic tool for approaching the interaction net paradigm. But it is also a C implementation of the core of a real programming language, featuring a user-friendly graphical syntax and efficient, garbage-collector-free execution.

2001

Universal Interaction Systems with Only Two Agents
Denis Béchet

In the framework of interaction nets [6], Yves Lafont has proved [8] that every interaction system can be simulated by a system composed of 3 symbols named γ, δ and ε. One may wonder whether it is possible to find a similar universal system with fewer symbols. In this paper, we show a way to simulate every interaction system with a specific interaction system consisting of only 2 symbols. By transitivity, we prove that we can find a universal interaction system with only 2 agents. Moreover, we show how to find such a system where agents have no more than 3 auxiliary ports.

General Recursion on Second Order Term Algebras
Alessandro Berarducci, Corrado Böhm

Extensions of the simply typed lambda calculus have been used as a metalanguage to represent "higher order term algebras", such as, for instance, formulas of the predicate calculus. In this representation bound variables of the object language are represented by bound variables of the metalanguage. This choice has various advantages but makes the notion of "recursive definition" on higher order term algebras more subtle than the corresponding notion on first order term algebras. Despeyroux, Pfenning and Schürmann pointed out the problems that arise in the proof of a canonical form theorem when one combines higher order representations with primitive recursion.
In this paper we consider a stronger scheme of recursion and we prove that it captures all partial recursive functions on second order term algebras. We illustrate the system by considering typed programs to reduce to normal form terms of the untyped lambda calculus, encoded as elements of a second order term algebra. First order encodings based on de Bruijn indexes are also considered. The examples also show that a version of the intersection type disciplines can be helpful in some cases to prove the existence of a canonical form. Finally we consider interpretations of our typed systems in the pure lambda calculus and a new gödelization of the pure lambda calculus.

Beta Reduction Constraints
Manuel Bodirsky, Katrin Erk, Alexander Koller, Joachim Niehren

The constraint language for lambda structures (CLLS) can model lambda terms that are known only partially. In this paper, we introduce beta reduction constraints to describe beta reduction steps between partially known lambda terms. We show that beta reduction constraints can be expressed in an extension of CLLS by group parallelism. We then extend a known semi-decision procedure for CLLS to also deal with group parallelism and thus with beta-reduction constraints.

From Higher-Order to First-Order Rewriting
Eduardo Bonelli, Delia Kesner, Alejandro Ríos

We show how higher-order rewriting may be encoded into first-order rewriting modulo an equational theory ε. We obtain a characterization of the class of higher-order rewriting systems which can be encoded by first-order rewriting modulo an empty theory (that is, ε = ∅). This class includes, of course, the λ-calculus. Our technique does not rely on a particular substitution calculus but on a set of abstract properties to be verified by the substitution calculus used in the translation.

Combining Pattern E-Unification Algorithms
Alexandre Boudet, Évelyne Contejean

We present an algorithm for unification of higher-order patterns modulo combinations of disjoint first-order equational theories. This algorithm is highly non-deterministic, in the spirit of those by Schmidt-Schauß [20] and Baader-Schulz [1] in the first-order case. We redefine the properties required for elementary pattern unification algorithms of pure problems in this context, then we show that some theories of interest have elementary unification algorithms fitting our requirements. This provides a unification algorithm for patterns modulo the combination of theories such as the free theory, commutativity, one-sided distributivity, associativity-commutativity and some of its extensions, including Abelian groups.

Matching Power
Horatiu Cirstea, Claude Kirchner, Luigi Liquori

In this paper we give a simple and uniform presentation of the rewriting calculus, also called Rho Calculus. In addition to its simplicity, this formulation explicitly allows us to encode complex structures such as lists, sets, and objects. We provide extensive examples of the calculus, and we focus on its ability to represent some object oriented calculi, namely the Lambda Calculus of Objects of Fisher, Honsell, and Mitchell, and the Object Calculus of Abadi and Cardelli. Furthermore, the calculus allows us to get object oriented constructions unreachable in other calculi. In sum, we intend to show that, because of its matching ability, the Rho Calculus represents a lingua franca for naturally encoding many paradigms of computation. This highlights the capability of the rewriting-calculus-based language ELAN to be used as a logical as well as a powerful semantical framework.

Dependency Pairs for Equational Rewriting
Jürgen Giesl, Deepak Kapur

The dependency pair technique of Arts and Giesl [1,2,3] for termination proofs of term rewrite systems (TRSs) is extended to rewriting modulo equations. Up to now, such an extension was only known in the special case of AC-rewriting [15,17]. In contrast to that, the proposed technique works for arbitrary non-collapsing equations (satisfying a certain linearity condition). With the proposed approach, it is now possible to perform automated termination proofs for many systems where this was not possible before. In other words, the power of dependency pairs can now also be used for rewriting modulo equations.

Termination Proofs by Context-Dependent Interpretations
Dieter Hofbauer

Proving termination of a rewrite system by an interpretation over the natural numbers directly implies an upper bound on the derivational complexity of the system. In this way, however, the derivation height of terms is often heavily overestimated.
Here we present a generalization of termination proofs by interpretations that can avoid this drawback of the traditional approach. A number of simple examples illustrate how to achieve tight or even optimal bounds on the derivation height. The method is general enough to capture cases where simplification orderings fail.

Uniform Normalisation beyond Orthogonality
Zurab Khasidashvili, Mizuhito Ogawa, Vincent van Oostrom

A rewrite system is called uniformly normalising if all its steps are perpetual, i.e. are such that if s → t and s has an infinite reduction, then t has one too. For such systems termination (SN) is equivalent to normalisation (WN). A well-known fact is uniform normalisation of orthogonal non-erasing term rewrite systems, e.g. the λI-calculus. In the present paper both restrictions are analysed. Orthogonality is seen to pertain to the linear part and non-erasingness to the non-linear part of rewrite steps. Based on this analysis, a modular proof method for uniform normalisation is presented which allows one to go beyond orthogonality. The method is shown applicable to biclosed first- and second-order term rewrite systems as well as to a λ-calculus with explicit substitutions.

Verifying Orientability of Rewrite Rules Using the Knuth-Bendix Order
Konstantin Korovin, Andrei Voronkov

We consider two decision problems related to the Knuth-Bendix order (KBO). The first problem is orientability: given a system of rewrite rules R, does there exist some KBO which orients every ground instance of every rewrite rule in R? The second problem is whether a given KBO orients a rewrite rule. This problem can also be reformulated as the problem of solving a single ordering constraint for the KBO. We prove that both problems can be solved in polynomial time. The algorithm builds upon an algorithm for solving systems of homogeneous linear inequalities over integers. Also, we show that if a system is orientable using a real-valued KBO, then it is also orientable using an integer-valued KBO.
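
For intuition, a much-simplified ground-term KBO can be sketched in Python: compare total weights first, break ties by head-symbol precedence, then compare arguments lexicographically. (The paper's orientability question is whether weights making all rules decrease exist at all, which reduces to linear inequalities over the weight variables; the variable condition and admissibility checks are omitted here.)

```python
def weight(term, w):
    """Total symbol weight of a ground term (f, [args])."""
    f, args = term
    return w[f] + sum(weight(a, w) for a in args)

def kbo_greater(s, t, w, prec):
    """Simplified KBO on ground terms: weight, then precedence, then arguments."""
    ws, wt = weight(s, w), weight(t, w)
    if ws != wt:
        return ws > wt
    if prec[s[0]] != prec[t[0]]:
        return prec[s[0]] > prec[t[0]]
    for a, b in zip(s[1], t[1]):
        if a != b:
            return kbo_greater(a, b, w, prec)
    return False

# Orient f(a) -> a (all weights and precedences are illustrative choices).
w, prec = {"f": 1, "a": 1}, {"f": 1, "a": 0}
print(kbo_greater(("f", [("a", [])]), ("a", []), w, prec))  # True
```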

Relating Accumulative and Non-accumulative Functional Programs
Armin Kühnemann, Robert Glück, Kazuhiko Kakehi

We study the problem of transforming functional programs that make intensive use of append functions (like inefficient list reversal) into programs that use accumulating parameters instead (like efficient list reversal). We give an (automatic) transformation algorithm for this problem and identify a class of functional programs, namely restricted 2-modular tree transducers, to which it can be applied. Moreover, since we obtain macro tree transducers as the transformation result and since we also give the inverse transformation algorithm, we have a new characterization of the class of functions induced by macro tree transducers.

Context Unification and Traversal Equations
Jordi Levy, Mateu Villaret

Context unification was originally defined by H. Comon in ICALP'92, as the problem of finding a unifier for a set of equations containing first-order variables and context variables. These context variables have arguments, and can be instantiated by contexts. In other words, they are second-order variables that are restricted to be instantiated by linear terms (a linear term is a λ-expression λx1...λxn.t where every xi occurs exactly once in t).
In this paper, we prove that, if the so-called rank-bound conjecture is true, then the context unification problem is decidable. This is done by reducing context unification to the solvability of traversal equations (a kind of word unification modulo certain permutations) and then reducing traversal equations to word equations with regular constraints.

Weakly Regular Relations and Applications
Sébastien Limet, Pierre Réty, Helmut Seidl

A new class of tree-tuple languages is introduced: the weakly regular relations. It is an extension of the regular case (regular relations) and a restriction of tree-tuple synchronized languages, that has all usual nice properties, except closure under complement. Two applications are presented: to unification modulo a rewrite system, and to one-step rewriting.

On the Parallel Complexity of Tree Automata
Markus Lohrey

We determine the parallel complexity of several (uniform) membership problems for recognizable tree languages. Furthermore we show that the word problem for a fixed finitely presented algebra is in DLOGTIME-uniform NC1.

Transfinite Rewriting Semantics for Term Rewriting Systems
Salvador Lucas

We provide some new results concerning the use of transfinite rewriting for giving semantics to rewrite systems. We especially (but not only) consider the computation of possibly infinite constructor terms by transfinite rewriting, due to their interest in many programming languages. We reconsider the problem of compressing transfinite rewrite sequences into shorter (possibly finite) ones. We also investigate the role that (finitary) confluence plays in transfinite rewriting. We consider different (quite standard) rewriting semantics (mappings from input terms to sets of reducts obtained by transfinite rewriting) in a unified framework and investigate their algebraic structure. Such a framework is used to formulate, connect, and approximate different properties of TRSs.

Goal-Directed E-Unification
Christopher Lynch, Barbara Morawska

We give a general goal-directed method for solving the E-unification problem. Our inference system is a generalization of the inference rules for Syntactic Theories, except that our inference system is proved complete for any equational theory. We also show how to easily modify our inference system into a more restricted inference system for Syntactic Theories, and show that our completeness techniques prove completeness there also.

The Unification Problem for Confluent Right-Ground Term Rewriting Systems
Michio Oyamaguchi, Yoshikatsu Ohta

The unification problem for term rewriting systems (TRSs) is the problem of deciding, for a given TRS R and two terms M and N, whether there exists a substitution θ such that Mθ and Nθ are congruent modulo R (i.e., Mθ ↔*R Nθ). In this paper, the unification problem for confluent right-ground TRSs is shown to be decidable. To show this, the notion of minimal terms is introduced and a new unification algorithm obtaining a substitution whose range is in minimal terms is proposed. Our result extends the decidability of unification for canonical (i.e., confluent and terminating) right-ground TRSs given by Hullot (1980) in the sense that the termination condition can be omitted. It is also exemplified that Hullot's narrowing technique does not work in this case. Our result is compared with the undecidability of the word (and also unification) problem for terminating right-ground TRSs.

On Termination of Higher-Order Rewriting
Femke van Raamsdonk

We discuss the termination methods using the higher-order recursive path ordering and the general scheme for higher-order rewriting systems and combinatory reduction systems.

Matching with Free Function Symbols - A Simple Extension of Matching
Christophe Ringeissen

Matching is a solving process which is crucial in declarative (rule-based) programming languages. In order to apply rules, one has to match the left-hand side of a rule with the term to be rewritten. In several declarative programming languages, programs involve operators that may also satisfy some structural axioms. Therefore, their evaluation mechanism must implement powerful matching algorithms working modulo equational theories. In this paper, we show the existence of an equational theory where matching is decidable (resp. finitary) but matching in presence of additional (free) operators is undecidable (resp. infinitary). The interest of this result is to easily prove the existence of a frontier between matching and matching with free operators.

Deriving Focused Calculi for Transitive Relations
Georg Struth

We propose a new method for deriving focused ordered resolution calculi, exemplified by chaining calculi for transitive relations. Previously, inference rules were postulated and a posteriori verified in semantic completeness proofs. We derive them from the theory axioms. Completeness of our calculi then follows from correctness of this synthesis. Our method clearly separates deductive and procedural aspects: relating ordered chaining to Knuth-Bendix completion for transitive relations provides the semantic background that drives the synthesis towards its goal. This yields a more restrictive and transparent chaining calculus. The method also supports the development of approximate focused calculi and a modular approach to theory hierarchies.

A Formalised First-Order Confluence Proof for the λ-Calculus Using One-Sorted Variable Names
René Vestergaard, James Brotherston

We present the titular proof development, which has been implemented in Isabelle/HOL. As a first, the proof is conducted exclusively by the primitive induction principles of the standard syntax and the considered reduction relations: the naive way, so to speak. Curiously, the Barendregt Variable Convention takes on a central technical role in the proof. We also show (i) that our presentation coincides with Curry's and Hindley's when terms are considered equal up to α and (ii) that the confluence properties of all considered calculi are equivalent.

A Normal Form for Church-Rosser Language Systems
Jens R. Woinowski

In this paper the context-splittable normal form for rewriting systems defining Church-Rosser languages is introduced. Context-splittable rewriting rules look like rules of context-sensitive grammars with swapped sides. To be more precise, they have the form uvw → uxw with u, v, w being words, v being nonempty and x being a single letter or the empty word. It is proved that this normal form can be achieved for each Church-Rosser language and that the construction is effective.
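
The shape uvw → uxw can be checked mechanically. A naive checker (it strips the maximal common prefix u and suffix w, which is a sketch of the definition rather than a faithful decision procedure for all corner cases):

```python
def is_context_splittable(lhs, rhs):
    """Does lhs -> rhs have the form u v w -> u x w with v nonempty, |x| <= 1?"""
    i = 0                                     # longest common prefix u
    while i < min(len(lhs), len(rhs)) and lhs[i] == rhs[i]:
        i += 1
    j = 0                                     # longest common suffix w
    while j < min(len(lhs), len(rhs)) - i and lhs[len(lhs)-1-j] == rhs[len(rhs)-1-j]:
        j += 1
    v, x = lhs[i:len(lhs)-j], rhs[i:len(rhs)-j]
    return len(v) >= 1 and len(x) <= 1

print(is_context_splittable("aba", "aca"))  # True:  u="a", v="b", x="c", w="a"
print(is_context_splittable("ab", "ba"))    # False: the replaced part is too long
```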

Confluence and Termination of Simply Typed Term Rewriting Systems
Toshiyuki Yamada

We propose simply typed term rewriting systems (STTRSs), which extend first-order rewriting by allowing higher-order functions. We study a simple proof method for confluence which employs a characterization of the diamond property of a parallel reduction. By an application of the proof method, we obtain a new confluence result for orthogonal conditional STTRSs. We also discuss a semantic method for proving termination of STTRSs based on monotone interpretation.

Parallel Evaluation of Interaction Nets with MPINE
Jorge Sousa Pinto

We describe the MPINE tool, a multi-threaded evaluator for Interaction Nets. The evaluator is an implementation of the present author's Abstract Machine for Interaction Nets [5] and uses POSIX threads to achieve concurrent execution. When running on a multi-processor machine (say an SMP architecture), parallel execution is achieved effortlessly, allowing for desktop parallelism on commonly available machines.

Stratego: A Language for Program Transformation Based on Rewriting Strategies
Eelco Visser

Program transformation is used in many areas of software engineering. Examples include compilation, optimization, synthesis, refactoring, migration, normalization and improvement [15]. Rewrite rules are a natural formalism for expressing single program transformations. However, using a standard strategy for normalizing a program with a set of rewrite rules is not adequate for implementing program transformation systems. It may be necessary to apply a rule only in some phase of a transformation, to apply rules in some order, or to apply a rule only to part of a program. These restrictions may be necessary to avoid non-termination or to choose a specific path in a non-confluent rewrite system.
Stratego is a language for the specification of program transformation systems based on the paradigm of rewriting strategies. It supports the separation of strategies from transformation rules, thus allowing careful control over the application of these rules. As a result of this separation, transformation rules are reusable in multiple different transformations and generic strategies capturing patterns of control can be described independently of the transformation rules they apply. Such strategies can even be formulated independently of the object language by means of the generic term traversal capabilities of Stratego.
In this short paper I give a description of version 0.5 of the Stratego system, discussing the features of the language (Section 2), the library (Section 3), the compiler (Section 4) and some of the applications that have been built (Section 5). Stratego is available as free software under the GNU General Public License from http://www.stratego-language.org.

2000

Absolute Explicit Unification
Nikolaj Bjørner, César A. Muñoz

This paper presents a system for explicit substitutions in Pure Type Systems (PTS). The system allows one to solve type checking, type inhabitation, higher-order unification, and type inference for PTS using purely first-order machinery. A novel feature of our system is that it combines substitutions and variable declarations; as a side effect, this allows type checking of let-bindings. Our treatment of meta-variables is also explicit, so that the instantiation of meta-variables is internalized in the calculus. This produces a confluent λ-calculus with distinguished holes and explicit substitutions that is insensitive to α-conversion and allows the system to be embedded directly into rewriting logic.

Termination and Confluence of Higher-Order Rewrite Systems
Frédéric Blanqui

In the last twenty years, several approaches to higher-order rewriting have been proposed, among which Klop's Combinatory Rewrite Systems (CRSs), Nipkow's Higher-order Rewrite Systems (HRSs) and Jouannaud and Okada's higher-order algebraic specification languages, of which only the last one considers typed terms. The latter approach has been extended by Jouannaud, Okada and the present author into Inductive Data Type Systems (IDTSs). In this paper, we extend IDTSs with the CRS higher-order pattern-matching mechanism, resulting in simply-typed CRSs. Then, we show how the termination criterion developed for IDTSs with first-order pattern-matching, called the General Schema, can be extended so as to prove the strong normalization of IDTSs with higher-order pattern-matching. Next, we compare the unified approach with HRSs. We first prove that the extended General Schema can also be applied to HRSs. Second, we show how Nipkow's higher-order critical pair analysis technique for proving local confluence can be applied to IDTSs.

A de Bruijn Notation for Higher-Order Rewriting
Eduardo Bonelli, Delia Kesner, Alejandro Ríos

We propose a formalism for higher-order rewriting in de Bruijn notation. This notation is used not only for terms (as usually done in the literature) but also for metaterms, which are the syntactical objects used to express general higher-order rewrite systems. We give formal translations from higher-order rewriting with names to higher-order rewriting with de Bruijn indices, and vice versa. These translations can be viewed as an interface in programming languages based on higher-order rewrite systems, and they are also used to show some properties, namely, that both formalisms are operationally equivalent, and that confluence is preserved when translating one formalism into the other.

Rewriting Techniques in Theoretical Physics
Évelyne Contejean, Antoine Coste, Benjamin Monate

This paper presents a general method for studying some quotients of the special linear group SL2 over the integers, which are of fundamental interest in the field of statistical physics. Our method automatically helps in validating some conjectures due to physicists, such as conjectures stating that a set of equations completely describes a finite given quotient of SL2. In a first step, we show that in the cases we are interested in, the usual presentation of finitely generated groups with some constant generators and a binary concatenation can be turned into an equivalent one with unary generators. In a second step, when the completion of the transformed set of equations terminates, we show how to compute directly the associated normal forms automaton. According to the presence of loops, we are able to decide the finiteness of the quotient, and to compute its cardinality. When the quotient is infinite, the automaton gives some hints on what kind of equations are needed in order to ensure the finiteness of the quotient.

Normal Forms and Reduction for Theories of Binary Relations
Daniel J. Dougherty, Claudio Gutiérrez

We consider equational theories of binary relations, in a language expressing composition, converse, and lattice operations. We treat the equations valid in the standard model of sets and also define a hierarchy of equational axiomatisations stratifying the standard theory. By working directly with a presentation of relation-expressions as graphs we are able to define a notion of reduction which is confluent and strongly normalising, in sharp contrast to traditional treatments based on first-order terms. As consequences we obtain unique normal forms and decidability of the equality problem for each theory. In particular we show a non-deterministic polynomial-time upper bound for the complexity of the decision problems.

Parallelism Constraints
Katrin Erk, Joachim Niehren

Parallelism constraints are logical descriptions of trees. They are as expressive as context unification, i.e. second-order linear unification. We present a semi-decision procedure enumerating all 'most general unifiers' of a parallelism constraint and prove it sound and complete. In contrast to all known procedures for context unification, the presented procedure terminates for the important fragment of dominance constraints and performs reasonably well in a recent application to underspecified natural language semantics.

Linear Higher-Order Matching Is NP-Complete
Philippe de Groote

We consider the problem of higher-order matching restricted to the set of linear λ-terms (i.e., λ-terms where each abstraction λx. M is such that there is exactly one free occurrence of x in M). We prove that this problem is decidable by showing that it belongs to NP. Then we prove that this problem is in fact NP-complete. Finally, we discuss some heuristics for a practical algorithm.
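
Linearity itself is a simple syntactic property. A check over a tuple encoding of λ-terms (the matching algorithm is of course far more involved; this only illustrates the restriction):

```python
def free_occurrences(term, x):
    """Count free occurrences of variable x in a lambda term."""
    tag = term[0]
    if tag == "var":
        return 1 if term[1] == x else 0
    if tag == "app":
        return free_occurrences(term[1], x) + free_occurrences(term[2], x)
    # tag == "lam": an inner binder for x shadows the outer x
    return 0 if term[1] == x else free_occurrences(term[2], x)

def is_linear(term):
    """Every abstraction lam x. M must bind exactly one free occurrence of x."""
    tag = term[0]
    if tag == "var":
        return True
    if tag == "app":
        return is_linear(term[1]) and is_linear(term[2])
    return free_occurrences(term[2], term[1]) == 1 and is_linear(term[2])

identity = ("lam", "x", ("var", "x"))
dup = ("lam", "x", ("app", ("var", "x"), ("var", "x")))
print(is_linear(identity), is_linear(dup))  # True False
```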

Standardization and Confluence for a Lambda Calculus with Generalized Applications
Felix Joachimski, Ralph Matthes

As a minimal environment for the study of permutative reductions an extension ΛJ of the untyped λ-calculus is considered. In this non-terminating system with non-trivial critical pairs, confluence is established by studying triangle properties that allow permutative reductions to be treated modularly and could be extended to more complex term systems with permutations. Standardization is shown by means of an inductive definition of standard reduction that closely follows the inductive term structure and captures the intuitive notion of standardness even for permutative reductions.

Linear Second-Order Unification and Context Unification with Tree-Regular Constraints
Jordi Levy, Mateu Villaret

Linear Second-Order Unification and Context Unification are closely related problems. However, their equivalence was never formally proved. Context unification is a restriction of linear second-order unification. Here we prove that linear second-order unification can be reduced to context unification with tree-regular constraints.
Decidability of context unification is still an open question. We comment on the possibility that linear second-order unification is decidable if context unification is, and on how to get rid of the tree-regular constraints. This is done by reducing rank-bounded tree-regular constraints to word-regular constraints.

Word Problems and Confluence Problems for Restricted Semi-Thue Systems
Markus Lohrey

We investigate word problems and confluence problems for the following four classes of terminating semi-Thue systems: length-reducing systems, weight-reducing systems, length-lexicographic systems, and weight-lexicographic systems. For each of these four classes we determine the complexity of several variants of the word problem and confluence problem. Finally we show that the variable membership problem for quasi context-sensitive grammars is EXPSPACE-complete.

The Explicit Representability of Implicit Generalizations
Reinhard Pichler

In [9], implicit generalizations over some Herbrand universe H were introduced as constructs of the form I = t/t1 ∨ ... ∨ tm with the intended meaning that I represents all H-ground instances of t that are not instances of any term ti on the right-hand side. More generally, we can also consider disjunctions I = I1 ∨ ... ∨ In of implicit generalizations, where I contains all ground terms from H that are contained in at least one of the implicit generalizations Ij. Implicit generalizations have applications to many areas of Computer Science. For the actual work, the so-called finite explicit representability problem plays an important role, i.e.: Given a disjunction of implicit generalizations I = I1 ∨ ... ∨ In, do there exist terms r1,...,rl, s.t. the ground terms represented by I coincide with the union of the H-ground instances of the terms rj? In this paper, we prove the coNP-completeness of this decision problem.

On the Word Problem for Combinators
Richard Statman

In 1936 Alonzo Church observed that the "word problem" for combinators is undecidable. He used his student Kleene's representation of partial recursive functions as lambda terms. This illustrates very well the point that "word problems" are good problems in the sense that a solution either way - decidable or undecidable - can give useful information. In particular, this undecidability proof shows us how to program arbitrary partial recursive functions as combinators.
I never thought that this result was the end of the story for combinators. In particular, it leaves open the possibility that the unsolvable problem can be approximated by solvable ones. It also says nothing about word problems for interesting fragments i.e., sets of combinators not combinatorially complete.
Perhaps the most famous subproblem is the problem for S terms. Recently, Waldmann has made significant progress on this problem. Previously, we solved the word problem for the Lark, a relative of S. Similar solutions can be given for the Owl (S*) and Turing's bird U. Familiar decidable fragments include linear combinators and various sorts of typed combinators. Here we would like to consider several fragments of much greater scope. We shall present several theorems and an open problem.

An Algebra of Resolution
Georg Struth

We propose an algebraic reconstruction of resolution as Knuth-Bendix completion. The basic idea is to model propositional ordered Horn resolution and resolution as rewrite-based solutions to the uniform word problems for semilattices and distributive lattices. Completion for non-symmetric transitive relations and a variant of symmetrization normalize and simplify the presentation. The procedural content of resolution, its refutational completeness and redundancy elimination techniques thereby reduce to standard algebraic completion techniques. The reconstruction is analogous to that of the Buchberger algorithm by equational completion.

Deriving Theory Superposition Calculi from Convergent Term Rewriting Systems
Jürgen Stuber

We show how to derive refutationally complete ground superposition calculi systematically from convergent term rewriting systems for equational theories, in order to make automated theorem proving in these theories more effective. In particular we consider abelian groups and commutative rings. These are difficult for automated theorem provers, since their axioms of associativity, commutativity, distributivity and the inverse law can generate many variations of the same equation. For these theories ordering restrictions can be strengthened so that inferences apply only to maximal summands, and superpositions into the inverse law that move summands from one side of an equation to the other can be replaced by an isolation rule that isolates the maximal terms on one side. Additional inferences arise from superpositions of extended clauses, but we can show that most of these are redundant. In particular, none are needed in the case of abelian groups, and at most one for any pair of ground clauses in the case of commutative rings.

Right-Linear Finite Path Overlapping Term Rewriting Systems Effectively Preserve Recognizability
Toshinori Takai, Yuichi Kaji, Hiroyuki Seki

Right-linear finite path overlapping TRS are shown to effectively preserve recognizability. The class of right-linear finite path overlapping TRS properly includes the class of linear generalized semi-monadic TRS and the class of inverse left-linear growing TRS, which are known to effectively preserve recognizability. Approximations by inverse right-linear finite path overlapping TRS are also discussed.

System Description: The Dependency Pair Method
Thomas Arts

The dependency pair method refers to the approach for proving (innermost) termination of term rewriting systems by showing that no infinite chain of so-called dependency pairs exists (for an overview of the method see [AG00]). The method generates inequalities that should be satisfied by a suitable well-founded ordering. Well-known techniques for finding simplification orderings (such as path orderings or polynomial interpretations) may be used to find such an ordering. The key point is that, even if the TRS is not simply terminating, the dependency pair method often generates a set of inequalities that can be satisfied by a simplification ordering and can thereby prove termination of the TRS.
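The construction of the dependency pairs themselves can be sketched directly. The following illustration (my own tuple encoding of terms, not taken from [AG00]) marks the root of a rule's left-hand side and of each defined-symbol subterm of its right-hand side:

```python
# Terms (a hypothetical encoding): variables are strings, applications
# are tuples (symbol, arg1, ..., argn); constants are 1-tuples.

def subterms(t):
    yield t
    if isinstance(t, tuple):
        for a in t[1:]:
            yield from subterms(a)

def dependency_pairs(rules):
    """For every rule l -> r and every subterm t of r whose root is a
    defined symbol, emit the dependency pair (l#, t#)."""
    defined = {l[0] for l, _ in rules}
    mark = lambda t: (t[0] + '#',) + t[1:]
    return [(mark(l), mark(t))
            for l, r in rules
            for t in subterms(r)
            if isinstance(t, tuple) and t[0] in defined]

# add(0, y) -> y ;  add(s(x), y) -> s(add(x, y))
rules = [(('add', ('0',), 'y'), 'y'),
         (('add', ('s', 'x'), 'y'), ('s', ('add', 'x', 'y')))]
# The single dependency pair is  add#(s(x), y) -> add#(x, y).
```

Termination then follows if no infinite chain can be built from these pairs.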

REM (Reduce Elan Machine): Core of the New ELAN Compiler
Pierre-Étienne Moreau

ELAN is a powerful language and environment for specifying and prototyping deduction systems in a language based on rewrite rules controlled by strategies. It offers a natural and simple logical framework for the combination of the computation and deduction paradigms. It supports the design of theorem provers, logic programming languages, constraint solvers and decision procedures.

TALP: A Tool for the Termination Analysis of Logic Programs
Enno Ohlebusch, Claus Claves, Claude Marché

In the last decade, the automatic termination analysis of logic programs has been receiving increasing attention. Among other methods, techniques have been proposed that transform a well-moded logic program into a term rewriting system (TRS) so that termination of the TRS implies termination of the logic program under Prolog's selection rule. In [Ohl99] it has been shown that the two-stage transformation obtained by combining the transformations of [GW93] into deterministic conditional TRSs (CTRSs) with a further transformation into TRSs [CR93] yields the transformation proposed in [AZ96], and that these three transformations are equally powerful. In most cases simplification orderings are not sufficient to prove termination of the TRSs obtained by the two-stage transformation. However, if one uses the dependency pair method [AG00] in combination with polynomial interpretations instead, then most of the examples described in the literature can automatically be proven terminating. Based on these observations, we have implemented a tool for proving termination of logic programs automatically. This tool consists of a front-end which implements the two-stage transformation and a back-end, the CiME system [CiM], for proving termination of the generated TRS. Experiments show that our tool can compete with other tools [DSV99] based on sophisticated norm-based approaches.

1999

Solved Forms for Path Ordering Constraints
Robert Nieuwenhuis, José Miguel Rivero

A usual technique in symbolic constraint solving is to apply transformation rules until a solved form is reached for which the problem becomes simple. Ordering constraints are well-known to be reducible to (a disjunction of) solved forms, but unfortunately no polynomial algorithm deciding the satisfiability of these solved forms is known.
Here we deal with a different notion of solved form, where fundamental properties of orderings like transitivity and monotonicity are taken into account. This leads to a new family of constraint solving algorithms for the full recursive path ordering with status (RPOS), and hence as well for other path orderings like LPO, MPO, KNS and RDO, and for all possible total precedences and signatures. Apart from simplicity and elegance from the theoretical point of view, the main contribution of these algorithms is on efficiency in practice. Since guessing is minimized, and, in particular, no linear orderings between the subterms are guessed, a practical improvement in performance of several orders of magnitude over previous algorithms is obtained, as shown by our experiments.
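For a concrete reference point on the family of orderings involved, here is a standard textbook formulation of one member, the lexicographic path ordering (LPO), for a total precedence; the tuple encoding of terms is hypothetical, and the snippet is unrelated to the solved-form algorithms themselves:

```python
# Terms: variables are strings, applications are tuples (f, arg1, ...).

def occurs(x, t):
    """Does variable x occur in term t?"""
    return t == x if isinstance(t, str) else any(occurs(x, a) for a in t[1:])

def lpo_gt(s, t, prec):
    """s >_lpo t under the precedence prec (dict: symbol -> rank)."""
    if isinstance(s, str):                       # a variable dominates nothing
        return False
    if isinstance(t, str):                       # s > x iff x occurs in s
        return occurs(t, s)
    # case 1: some argument of s already equals or dominates t
    if any(si == t or lpo_gt(si, t, prec) for si in s[1:]):
        return True
    # cases 2 and 3 both require s to dominate every argument of t
    if not all(lpo_gt(s, ti, prec) for ti in t[1:]):
        return False
    if prec[s[0]] > prec[t[0]]:                  # case 2: bigger head symbol
        return True
    if s[0] == t[0]:                             # case 3: compare arguments
        for si, ti in zip(s[1:], t[1:]):         # lexicographically
            if si != ti:
                return lpo_gt(si, ti, prec)
    return False
```

For instance, LPO orients the associativity rule f(f(x, y), z) -> f(x, f(y, z)) left to right.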

Jeopardy
Nachum Dershowitz, Subrata Mitra

We consider functions defined by ground-convergent left-linear rewrite systems. By restricting the depth of left sides and disallowing defined symbols at the top of right sides, we obtain an algorithm for function inversion.
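For intuition only: on small inputs, a naive bounded search already inverts a function given by such rules. The example below uses the usual rewrite definition of addition on unary numerals (rendered over integers) and is emphatically not the paper's inversion algorithm:

```python
# add is defined by the rules  add(0, y) -> y ;  add(s(x), y) -> s(add(x, y)),
# rendered here with ordinary integers standing in for unary numerals.

def add(x, y):
    return y if x == 0 else 1 + add(x - 1, y)

def invert_add(result, bound=10):
    """All pairs (x, y) with add(x, y) == result, searched up to `bound`."""
    return [(x, y) for x in range(bound) for y in range(bound)
            if add(x, y) == result]

# invert_add(2) yields the three preimages (0, 2), (1, 1), (2, 0).
```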

Strategic Pattern Matching
Eelco Visser

Stratego is a language for the specification of transformation rules and strategies for applying them. The basic actions of transformations are matching and building instantiations of first-order term patterns. The language supports concise formulation of generic and data type-specific term traversals. One of the unusual features of Stratego is the separation of scope from matching, allowing sharing of variables through traversals. The combination of first-order patterns with strategies forms an expressive formalism for pattern matching. In this paper we discuss three examples of strategic pattern matching: (1) Contextual rules allow matching and replacement of a pattern at an arbitrary depth of a subterm of the root pattern. (2) Recursive patterns can be used to characterize concisely the structure of languages that form a restriction of a larger language. (3) Overlays serve to hide the representation of a language in another (more generic) language. These techniques are illustrated by means of specifications in Stratego.

On the Strong Normalisation of Natural Deduction with Permutation-Conversions
Philippe de Groote

We present a modular proof of the strong normalisation of intuitionistic logic with permutation-conversions. This proof is based on the notions of negative translation and CPS-simulation.

Normalisation in Weakly Orthogonal Rewriting
Vincent van Oostrom

A rewrite sequence is said to be outermost-fair if every outermost redex occurrence is eventually eliminated. Outermost-fair rewriting is known to be (head-)normalising for almost orthogonal rewrite systems. We study (head-)normalisation for the larger class of weakly orthogonal rewrite systems. (Infinitary) normalisation is established and a counterexample against head-normalisation is given.

Strong Normalization of Proof Nets Modulo Structural Congruences
Roberto Di Cosmo, Stefano Guerrini

This paper proposes a notion of reduction for the proof nets of Linear Logic modulo an equivalence relation on the contraction links, which essentially amounts to considering contraction as an associative commutative binary operator that can float freely in and out of proof net boxes. The need for such a system comes, on one side, from the desire to make proof nets an even more parallel syntax for Linear Logic, and on the other side from the application of proof nets to λ-calculus with or without explicit substitutions, which needs a notion of reduction more flexible than those present in the literature. The main result of the paper is that this relaxed notion of rewriting is still strongly normalizing.

Undecidability of the exists*forall* Part of the Theory of Ground Term Algebra Modulo an AC Symbol
Jerzy Marcinkowski

We show that the ∃*∀* part of the equational theory modulo an AC symbol is undecidable. This solves open problem 25 from the RTA list ([DJK91], [DJK93], [DJK95]).

Deciding the Satisfiability of Quantifier free Formulae on One-Step Rewriting
Anne-Cécile Caron, Franck Seynhaeve, Sophie Tison, Marc Tommasi

We consider quantifier free formulae of a first order theory without functions and with predicates "x rewrites to y in one step" for given rewrite systems. Variables are interpreted in the set of finite trees. The full theory is undecidable [Tre96] and recent results [STT97], [Mar97], [Vor97] have strengthened the undecidability result to formulae with small prefixes (∃*∀*) and very restricted classes of rewriting systems (e.g. linear, shallow and convergent in [STTT98]). Decidability of the positive existential fragment has been shown in [NPR97]. We give a decision procedure for positive and negative existential formulae in the case when the rewrite systems are quasi-shallow, that is, all variables in the rewrite rules occur at depth one. Our result extends to formulae with equalities and membership relations of the form x ∈ L where L is a recognizable set of terms.

A New Result about the Decidability of the Existential One-Step Rewriting Theory
Sébastien Limet, Pierre Réty

We give a decision procedure for the whole existential fragment of the one-step rewriting first-order theory, in the case where rewrite systems are linear, non-left-left-overlapping (i.e. without critical pairs), and non-ε-left-right-overlapping (i.e. no left-hand side overlaps on top with the right-hand side of the same rewrite rule). The procedure is defined by means of tree-tuple synchronized grammars.

A Fully Syntactic AC-RPO
Albert Rubio

We present the first fully syntactic (i.e., non-interpretation-based) AC-compatible recursive path ordering (RPO). It is simple, and hence easy to implement, and its behaviour is as intuitive as that of the standard RPO. The ordering is AC-total, and defined uniformly for both ground and non-ground terms, as well as for partial precedences. More importantly, it is the first one that can deal incrementally with partial precedences, an aspect that is essential, together with its intuitive behaviour, for interactive applications like Knuth-Bendix completion.

Theory Path Orderings
Jürgen Stuber

We introduce the notion of a theory path ordering (TPO), which simplifies the construction of term orderings for superposition theorem proving in algebraic theories. To achieve refutational completeness of such calculi we need total, E-compatible and E-antisymmetric simplification quasi-orderings. The construction of a TPO takes as its ingredients a status function for interpreted function symbols and a precedence that makes the interpreted function symbols minimal. The properties of the ordering then follow from related properties of the status function. Theory path orderings generalize associative path orderings.

A Characterisation of Multiply Recursive Functions with Higman's Lemma
Hélène Touzet

We prove that string rewriting systems which reduce by Higman's lemma exhaust the multiply recursive functions. This result provides a full characterisation of the expressiveness of Higman's lemma when applied to rewriting theory. The underlying argument of our construction is to connect the order type and the derivation length via the Hardy hierarchy.

Deciding the Word Problem in the Union of Equational Theories Sharing Constructors
Franz Baader, Cesare Tinelli

The main contribution of this paper is a new method for combining decision procedures for the word problem in equational theories sharing "constructors." The notion of constructors adopted in this paper has a nice algebraic definition and is more general than a related notion introduced in previous work on the combination problem.

Normalization via Rewrite Closures
Leo Bachmair, C. R. Ramakrishnan, I. V. Ramakrishnan, Ashish Tiwari

We present an abstract completion-based method for finding normal forms of terms with respect to given rewrite systems. The method uses the concept of a rewrite closure, which is a generalization of the idea of a congruence closure. Our results generalize previous results on congruence closure-based normalization methods. The description of known methods within our formalism also allows a better understanding of these procedures.

Test Sets for the Universal and Existential Closure of Regular Tree Languages
Dieter Hofbauer, Maria Huber

Finite test sets are a useful tool for deciding the membership problem for the universal closure of a given tree language, that is, for deciding whether a term has all its ground instances in the given language. A uniform test set for the universal closure must serve the following purpose: In order to decide membership of a term, it is sufficient to check whether all its test set instances belong to the underlying language. A possible application, and our main motivation, is ground reducibility, an essential concept for many approaches to inductive reasoning. Ground reducibility modulo some rewrite system is membership in the universal closure of the set of reducible ground terms. Here, test sets always exist, and several algorithmic approaches are known. The resulting sets, however, are often unnecessarily large.
In this paper we consider regular languages and linear closure operators. We prove that universal as well as existential closure, defined analogously, preserve regularity. By relating test sets to tree automata and to appropriate congruence relations, we show how to characterize, how to compute, and how to minimize ground and non-ground test sets. In particular, optimal solutions now replace previous ad hoc approximations for the ground reducibility problem.

The Maude System
Manuel Clavel, Francisco Durán, Steven Eker, Patrick Lincoln, Narciso Martí-Oliet, José Meseguer, Jose F. Quesada

TODO

TOY : A Multiparadigm Declarative System
Francisco Javier López-Fraguas, Jaime Sánchez-Hernández

TOY is the concrete implementation of CRWL, a wide theoretical framework for declarative programming whose basis is a constructor based rewriting logic with lazy non-deterministic functions as the core notion. Other aspects of CRWL supported by TOY are: polymorphic types; HO features; equality and disequality constraints over terms and linear constraints over real numbers; goal solving by needed narrowing combined with constraint solving. The implementation is based on a compilation of TOY programs into Prolog.

UNIMOK: A System for Combining Equational Unification Algorithms
Stephan Kepser, Jörn Richts

TODO

LR2 : A Laboratory for Rapid Term Graph Rewriting
Rakesh M. Verma, Shalitha Senanayake

TODO

Decidability for Left-Linear Growing Term Rewriting Systems
Takashi Nagaya, Yoshihito Toyama

A term rewriting system is called growing if each variable occurring in both the left-hand side and the right-hand side of a rewrite rule occurs at depth zero or one in the left-hand side. Jacquemard showed that the reachability and the sequentiality of linear (i.e., left- and right-linear) growing term rewriting systems are decidable. In this paper we show that Jacquemard's result can be extended to left-linear growing rewriting systems that may have right-non-linear rewrite rules. This implies that the reachability and the joinability of some class of right-linear term rewriting systems are decidable, which improves the results for right-ground term rewriting systems by Oyamaguchi. Our result extends the class of left-linear term rewriting systems having a decidable call-by-need normalizing strategy. Moreover, we prove that the termination property is decidable for almost orthogonal growing term rewriting systems.
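The growing condition itself is mechanical to check. A minimal sketch, assuming a hypothetical tuple encoding of terms in which variables are strings:

```python
# Terms (a hypothetical encoding): variables are strings, applications
# are tuples (symbol, arg1, ..., argn).

def vars_at_depth(t, d=0):
    """Yield (variable, depth) for every variable occurrence in t."""
    if isinstance(t, str):
        yield t, d
    else:
        for a in t[1:]:
            yield from vars_at_depth(a, d + 1)

def variables(t):
    return {x for x, _ in vars_at_depth(t)}

def is_growing(rules):
    """Every variable shared by both sides of a rule must occur only at
    depth 0 or 1 in the left-hand side."""
    for l, r in rules:
        shared = variables(l) & variables(r)
        if any(x in shared and d > 1 for x, d in vars_at_depth(l)):
            return False
    return True

# f(x, g(y)) -> h(x) is growing (shared x at depth 1, y not shared);
# f(g(x)) -> h(x) is not (shared x at depth 2 on the left).
```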

Transforming Context-Sensitive Rewrite Systems
Jürgen Giesl, Aart Middeldorp

We present two new transformation techniques for proving termination of context-sensitive rewriting. Our first method is simple, sound, and more powerful than previously suggested transformations. However, it is not complete, i.e., there are terminating context-sensitive rewrite systems that are transformed into non-terminating term rewrite systems. The second method that we present in this paper is both sound and complete. This latter result can be interpreted as stating that from a termination perspective there is no reason to study context-sensitive rewriting.

Context-Sensitive AC-Rewriting
Maria C. F. Ferreira, A. L. Ribeiro

Context-sensitive rewriting was introduced in [7] and consists of syntactical restrictions imposed on a Term Rewriting System indicating how reductions can be performed. Context-sensitive rewriting is thus a restriction of the usual rewrite relation which reduces the reduction space and allows for a finer control of the reductions of a term. In this paper we extend the concept of context-sensitive rewriting to the framework of rewriting modulo an associative-commutative theory in two ways, by restricting reductions and by restricting AC-steps, and we then study this new relation with respect to the property of termination.

The Calculus of Algebraic Constructions
Frédéric Blanqui, Jean-Pierre Jouannaud, Mitsuhiro Okada

This paper is concerned with the foundations of the Calculus of Algebraic Constructions (CAC), an extension of the Calculus of Constructions by inductive data types. CAC generalizes inductive types equipped with higher-order primitive recursion, by providing definitions of functions by pattern-matching which capture recursor definitions for arbitrary non-dependent and non-polymorphic inductive types satisfying a strict positivity condition. CAC also generalizes the first-order framework of abstract data types by providing dependent types and higher-order rewrite rules.

HOL-λσ : An Intentional First-Order Expression of Higher-Order Logic
Gilles Dowek, Thérèse Hardin, Claude Kirchner

We propose a first-order presentation of higher-order logic based on explicit substitutions. It is intentionally equivalent to the usual presentation of higher-order logic based on λ-calculus, i.e. a proposition can be proved without the extensionality axioms in one theory if and only if it can in the other. The Extended Narrowing and Resolution first-order proof-search method can be applied to this theory. This allows higher-order resolution to be simulated step by step and furthermore leaves room for further optimizations and extensions.

A Rewrite System Associated with Quadratic Pisot Units
Christiane Frougny, Jacques Sakarovitch

The main point is to show that the rewrite system made up by the relations that generate γθ, though non-confluent, behaves as if it were confluent.

Fast Rewriting of Symmetric Polynomials
Manfred Göbel

This note presents a fast version of the classical algorithm to represent any symmetric function in a unique way as a polynomial in the elementary symmetric polynomials by using power sums of variables. We analyze the worst case complexity for both algorithms, the original and the fast version, and confirm our results by empirical run-time experiments. Our main result is a fast algorithm with a polynomial worst case complexity w.r.t. the total degree of the input polynomial compared to the classical algorithm with its exponential worst case complexity.
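The link between power sums and elementary symmetric polynomials exploited here is classical: Newton's identity e_k = (1/k) Σ_{i=1..k} (-1)^{i-1} e_{k-i} p_i. The following numeric illustration implements that identity itself, not the paper's rewriting algorithm:

```python
from fractions import Fraction

def elementary_from_power_sums(xs):
    """Recover e_0..e_n for the values xs from their power sums
    p_i = sum_j xs[j]**i, via Newton's identities."""
    n = len(xs)
    p = [sum(Fraction(x) ** i for x in xs) for i in range(n + 1)]
    e = [Fraction(1)]                                  # e_0 = 1
    for k in range(1, n + 1):
        s = sum((-1) ** (i - 1) * e[k - i] * p[i] for i in range(1, k + 1))
        e.append(s / k)                                # Newton's identity
    return e

# For xs = [1, 2, 3]: e_1 = 1+2+3 = 6, e_2 = 1*2+1*3+2*3 = 11, e_3 = 6.
```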

On Implementation of Tree Synchronized Languages
Frédéric Saubion, Igor Stéphan

Tree languages have been extensively studied and have many applications related to the rewriting framework, such as order-sorted specifications, higher-order matching or unification. In this paper, we focus on the implementation of such languages and, inspired by the Definite Clause Grammars that allow word grammars to be written as Horn clauses in a Prolog environment, we propose to build a similar framework for particular tree languages (TTSGs), which introduce a notion of synchronization between production rules. Our main idea is to define a proof-theoretical semantics for grammars and thus to move from syntactical tree manipulations to logical deduction. This is achieved by a sequent calculus proof system which can be refined and translated into Prolog Horn clauses. This work provides a scheme to build goal-directed procedures for the recognition of tree languages.

1998

Simultaneous Critical Pairs and Church-Rosser Property
Satoshi Okui

We introduce simultaneous critical pairs, which account for the simultaneous overlapping of several rewrite rules. Based on this, we introduce a new CR-criterion widely applicable to arbitrary left-linear term rewriting systems. Our result extends the well-known criteria given by Huet (1980), Toyama (1988), and van Oostrom (1997) and is incomparable with other well-known criteria for left-linear systems.

Church-Rosser Theorems for Abstract Reduction Modulo an Equivalence Relation
Enno Ohlebusch

A very powerful method for proving the Church-Rosser property for abstract rewriting systems has been developed by van Oostrom. In this paper, his technique is extended in two ways to abstract rewriting modulo an equivalence relation. It is shown that known Church-Rosser theorems can be viewed as special cases of the new criteria. Moreover, applications of the new criteria yield several new results.

Automatic Monoids Versus Monoids with Finite Convergent Presentations
Friedrich Otto, Andrea Sattler-Klein, Klaus Madlener

Due to their many nice properties groups with automatic structure (automatic groups) have received a lot of attention in the literature. The multiplication of an automatic group can be realized through finite automata based on a regular set of (not necessarily unique) representatives for the group, and hence, each automatic group has a tractable word problem and low derivational complexity. Consequently it has been asked whether corresponding results also hold for monoids with automatic structure. Here we show that there exist finitely presented monoids with automatic structure that cannot be presented through finite and convergent string-rewriting systems, thus answering a question in the negative that is still open for the class of automatic groups. Secondly, we present an automatic monoid that has an exponential derivational complexity, which establishes another difference to the class of automatic groups. In fact, both our example monoids are bi-automatic. In addition, it follows from the first of our examples that a monoid which is given through a finite, noetherian, and weakly confluent string-rewriting system need not have finite derivation type.

Decidable and Undecidable Second-Order Unification Problems
Jordi Levy

There is a close relationship between word unification and second-order unification. This similarity has been exploited for instance for proving decidability of monadic second-order unification. Word unification can easily be decided by transformation rules (similar to the ones applied in higher-order unification procedures) when variables are restricted to occur at most twice. Hence a well-known open question was the decidability of second-order unification under this same restriction. Here we answer this question negatively by reducing simultaneous rigid E-unification to second-order unification. This reduction, together with an inverse reduction found by Degtyarev and Voronkov, establishes an equivalence between the two unification problems.
Our reduction is in some sense reversible, providing decidability results for cases when simultaneous rigid E-unification is decidable. This happens, for example, for one-variable problems where the variable occurs at most twice (because rigid E-unification is decidable for just one equation). We also prove decidability when no variable occurs more than once, hence significantly narrowing the gap between decidable and undecidable second-order unification problems with variable occurrence restrictions.

On the Exponent of Periodicity of Minimal Solutions of Context Equations
Manfred Schmidt-Schauß, Klaus U. Schulz

Context unification is a generalisation of string unification where words are generalized to terms with one hole. Though decidability of string unification was proved by Makanin, the decidability of context unification is currently an open question. This paper provides a step towards understanding the complexity of context unification and the structure of unifiers. It is shown that if a context unification problem of size d is unifiable, then there is also a unifier with an exponent of periodicity in O(2^{1.07d}). We also prove NP-hardness for restricted cases of the context unification problem and compare the complexity of general context unification with that of general string unification.

Unification in Extensions of Shallow Equational Theories
Florent Jacquemard, Christoph Meyer, Christoph Weidenbach

We show that unification in certain extensions of shallow equational theories is decidable. Our extensions generalize the known classes of shallow or standard equational theories. In order to prove decidability of unification in the extensions, a class of Horn clause sets called sorted shallow equational theories is introduced. This class is a natural extension of tree automata with equality constraints between brother subterms as well as shallow sort theories. We show that saturation under sorted superposition is effective on sorted shallow equational theories. So-called semi-linear equational theories can be effectively transformed into equivalent sorted shallow equational theories and generalize the classes of shallow and standard equational theories.

Unification and Matching in Process Algebras
Qing Guo, Paliath Narendran, Sandeep K. Shukla

We consider the compatibility checking problem for a simple fragment of CCS, called BCCSP [12], using equational unification techniques. Two high-level specifications given as two process algebraic terms with free variables are said to be compatible modulo some equivalence relation if a substitution on the free variables can make the resulting terms equivalent modulo that relation. We formulate this compatibility checking problem (modulo an equivalence relation) as a unification problem in the equational theory of the corresponding equivalence relation. We use van Glabbeek's equational axiomatizations [12] for some interesting process algebraic relations. Specifically, we consider equational axiomatizations for bisimulation equivalence and trace equivalence and establish complexity lower and upper bounds for the corresponding unification and matching problems. We also show some special cases for which efficient algorithmic solutions exist.

E-Unification for Subsystems of S4
Renate A. Schmidt

This paper is concerned with the unification problem in the path logics associated by the optimised functional translation method with the propositional modal logics K, KD, KT, KD4, S4 and S5. It presents improved unification algorithms for certain forms of the right identity and associativity laws. The algorithms employ mutation rules, which have the advantage that terms are worked off from the outside inward, making paramodulating into terms superfluous.

Solving Disequations Modulo Some Class of Rewrite Systems
Sébastien Limet, Pierre Réty

This paper gives a procedure for solving disequations modulo equational theories, and for deciding the existence of solutions. For this, we assume that the equational theory is specified by a confluent and constructor-based rewrite system, and that four additional restrictions are satisfied. The procedure represents the possibly infinite set of solutions by means of a grammar, and decides the existence of solutions by means of an emptiness test. As a consequence, checking whether a linear equality is an inductive theorem is decidable, assuming in addition sufficient completeness.

Normalization of S-Terms is Decidable
Johannes Waldmann

The combinator S has the reduction rule S x y z ↝ x z (y z). We investigate properties of ground terms built from S alone. We show that it is decidable whether such an S-term has a normal form. The decision procedure makes use of rational tree languages. We also exemplify and summarize other properties of S-terms and hint at open questions.
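
The reduction rule above is easy to experiment with. The following is a minimal illustrative sketch (not the paper's decision procedure, which uses rational tree languages): ground S-terms are encoded as "S" or nested application pairs, with a leftmost-outermost step function and a fuel-bounded normalizer, so the loop is only a semi-decision for normalization.

```python
def step(t):
    """Leftmost-outermost one-step reduction; None if t is in normal form.
    A ground S-term is "S" or a pair (function, argument)."""
    if t == "S":
        return None
    f, z = t
    # A redex at the root has the shape (((S, x), y), z).
    if isinstance(f, tuple) and isinstance(f[0], tuple) and f[0][0] == "S":
        (_, x), y = f
        return ((x, z), (y, z))          # S x y z -> x z (y z)
    for i, sub in enumerate((f, z)):
        s = step(sub)
        if s is not None:
            return (s, z) if i == 0 else (f, s)
    return None

def normalize(t, fuel=10_000):
    """Iterate `step`; the fuel bound makes this only a semi-decision,
    whereas the paper shows normalization of S-terms is decidable."""
    while fuel:
        s = step(t)
        if s is None:
            return t
        t, fuel = s, fuel - 1
    return None

S = "S"
# Example: S S S S  ->  S S (S S), which is normal (S needs three arguments).
```
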

Decidable Approximations of Sets of Descendants and Sets of Normal Forms
Thomas Genet

We present here decidable approximations of sets of descendants and sets of normal forms of Term Rewriting Systems, based on specific tree automata techniques. In the context of rewriting logic, a Term Rewriting System is a program, and a normal form is a result of the program. Thus, approximations of sets of descendants and sets of normal forms provide tools for analysing a few properties of programs: we show how to compute a superset of results, to prove the sufficient completeness property, or to find a criterion for proving termination under a specific strategy, the sequential reduction strategy. The main technical contribution of the paper is the construction of an approximation automaton which recognises a superset of the set of normal forms of terms in a set E, w.r.t. a Term Rewriting System R.

Algorithms and Reductions for Rewriting Problems
Rakesh M. Verma, Michaël Rusinowitch, Denis Lugiez

In this paper we initiate a study of polynomial-time reductions for some basic decision problems of rewrite systems. We then give, for the first time, a polynomial-time algorithm for the unique-normal-form property of ground systems. Next we prove undecidability of these problems for a fixed string rewriting system using our reductions. Finally, we prove partial decidability results for confluence of commutative semi-Thue systems. The confluence and unique-normal-form properties are shown EXPSPACE-hard for commutative semi-Thue systems. We also show that there is a family of string rewrite systems for which the word problem is trivially decidable but confluence undecidable, and we exhibit a linear equational theory with decidable word problem but undecidable linear equational matching.

The Decidability of Simultaneous Rigid E-Unification with One Variable
Anatoli Degtyarev, Yuri Gurevich, Paliath Narendran, Margus Veanes, Andrei Voronkov

We show that simultaneous rigid E-unification, or SREU for short, is decidable and in fact EXPTIME-complete in the case of one variable. This result implies that the ∀*∃∀* fragment of intuitionistic logic with equality is decidable. Together with a previous result regarding the undecidability of the ∃∃-fragment, we obtain a complete classification of decidability of the prenex fragment of intuitionistic logic with equality, in terms of the quantifier prefix. It is also proved that SREU with one variable and a constant bound on the number of rigid equations is P-complete.

Ordering Constraints over Feature Trees Expressed in Second-Order Monadic Logic
Martin Müller, Joachim Niehren

The system FT≤ of ordering constraints over feature trees has been introduced as an extension of the system FT of equality constraints over feature trees. We investigate decidability and complexity questions for fragments of the first-order theory of FT≤. It is well-known that the first-order theory of FT is decidable and that several of its fragments can be decided in quasi-linear time, including the satisfiability problem of FT and its entailment problem with existential quantification φ⊧∃x1..xnφ'. Much less is known about the first-order theory of FT≤. The satisfiability problem of FT≤ can be decided in cubic time, as can its entailment problem without existential quantification. Our main result is that the entailment problem of FT≤ with existential quantifiers is decidable but PSPACE-hard. Our decidability proof is based on a new technique whereby feature constraints are expressed in the second-order monadic logic with countably many successors SωS. We thereby reduce the entailment problem of FT≤ with existential quantification to Rabin's famous theorem on tree automata.

Co-definite Set Constraints
Witold Charatonik, Andreas Podelski

In this paper, we introduce the class of co-definite set constraints. This is a natural subclass of set constraints which, when satisfiable, have a greatest solution. It is practically motivated by the set-based analysis of logic programs with the greatest-model semantics. We present an algorithm solving co-definite set constraints and show that their satisfiability problem is DEXPTIME-complete.

Modularity of Termination Using Dependency Pairs
Thomas Arts, Jürgen Giesl

The framework of dependency pairs allows automated termination and innermost termination proofs for many TRSs where such proofs were not possible before. In this paper we present a refinement of this framework in order to prove termination in a modular way. Our modularity results significantly increase the class of term rewriting systems where termination, respectively innermost termination, can be proved automatically. Moreover, the modular approach to dependency pairs yields new modularity criteria which extend previous results in this area. In particular, existing results for modularity of innermost termination can easily be obtained as direct consequences of our new criteria.
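
The dependency pairs underlying this framework can be computed mechanically. The sketch below uses our own encoding (a term is a variable, as a string, or a tuple (f, arg1, ..., argn), with marked symbols written f#); it is illustrative, not the paper's notation: for each rule l → r and each subterm of r rooted in a defined symbol, emit a pair with both roots marked.

```python
def subterms(t):
    """Yield t and all its subterms; variables are plain strings."""
    yield t
    if not isinstance(t, str):
        for a in t[1:]:
            yield from subterms(a)

def dependency_pairs(rules):
    """Rules are (lhs, rhs) pairs of terms; defined symbols are the
    root symbols of left-hand sides."""
    defined = {l[0] for l, _ in rules}
    mark = lambda t: (t[0] + "#",) + t[1:]
    return [(mark(l), mark(s))
            for l, r in rules
            for s in subterms(r)
            if not isinstance(s, str) and s[0] in defined]

# plus(0, y) -> y   and   plus(s(x), y) -> s(plus(x, y))
PLUS = [(("plus", ("0",), "y"), "y"),
        (("plus", ("s", "x"), "y"), ("s", ("plus", "x", "y")))]
```

For PLUS this yields the single dependency pair plus#(s(x), y) → plus#(x, y), reflecting the one recursive call.
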

Termination of Associative-Commutative Rewriting by Dependency Pairs
Claude Marché, Xavier Urbain

A new criterion for termination of rewriting has been described by Arts and Giesl in 1997. We show how this criterion can be generalized to rewriting modulo associativity and commutativity. We also show how one can build weak AC-compatible reduction orderings which may be used in this criterion.

Termination Transformation by Tree Lifting Ordering
Takahito Aoto, Yoshihito Toyama

An extension of a modular termination result for term rewriting systems (TRSs, for short) by A. Middeldorp (1989) is presented. We obtain it by adapting the dummy elimination transformation of M. C. F. Ferreira and H. Zantema (1995) in the presence of a non-collapsing, non-duplicating, terminating TRS whose function symbols are all to be eliminated. We propose a tree lifting ordering induced by a reduction order and a set G of function symbols, and use this ordering to transform a TRS R into R'; termination of R' implies that of R ∪ S for any non-collapsing, non-duplicating, terminating TRS S whose function symbols are contained in G, provided that for any l → r in R (1) the root symbol of r is in G whenever that of l is in G; and (2) no variable appears directly below a symbol from G in l when G contains a constant. Because of conditions (1) and (2), our technique covers only a part of the dummy elimination technique; however, even when S is empty, there are cases in which our technique has an advantage over dummy elimination.

Towards Automated Termination Proofs through "Freezing"
Hongwei Xi

We present a transformation technique called freezing to facilitate automatic termination proofs for left-linear term rewriting systems. The significant merits of this technique lie in its simplicity, its amenability to automation and its effectiveness, especially, when combined with other well-known methods such as recursive path orderings and polynomial interpretations. We prove that applying the freezing technique to a left-linear term rewriting system always terminates. We also show that many interesting TRSs in the literature can be handled with the help of freezing while they elude a lot of other approaches aiming for generating termination proofs automatically for term rewriting systems. We have mechanically verified all the left-linear examples presented in this paper.

Higher-Order Rewriting and Partial Evaluation
Olivier Danvy, Kristoffer Høgsbro Rose

We demonstrate the usefulness of higher-order rewriting techniques for specializing programs, i.e., for partial evaluation. More precisely, we demonstrate how casting program specializers as combinatory reduction systems (CRSs) makes it possible to formalize the corresponding program transformations as meta-reductions, i.e., reductions in the internal "substitution calculus." For partial-evaluation problems, this means that instead of having to prove on a case-by-case basis that one's "two-level functions" operate properly, one can concisely formalize them as a combinatory reduction system and obtain as a corollary that static reduction does not go wrong and yields a well-formed residual program.
We have found that the CRS substitution calculus provides an adequate expressive power to formalize partial evaluation: it provides sufficient termination strength while avoiding the need for additional restrictions such as types that would complicate the description unnecessarily (for our purpose).
In addition, partial evaluation provides a number of examples of higher-order rewriting where being higher order is a central (rather than an occasional or merely exotic) property. We illustrate this by demonstrating how standard but non-trivial partial-evaluation examples are handled with higher-order rewriting.

SN Combinators and Partial Combinatory Algebras
Yohji Akama

We introduce an intersection typing system for combinatory logic. We prove soundness and completeness for the class of partial combinatory algebras. We derive that a term of combinatory logic is typeable iff it is SN (strongly normalizing). Let F be the class of non-empty filters which consist of types. Then F is an extensional, non-total partial combinatory algebra. Furthermore, it is a fully abstract model with respect to the set of SN closed terms of combinatory logic. By means of F, we can solve Bethke-Klop's question: "find a suitable representation of the finally collapsed partial combinatory algebra of P". Here, P is the partial combinatory algebra whose carrier is the set of closed SN terms of combinatory logic modulo the inherent equality. Our solution is the following: the finally collapsed partial combinatory algebra of P is representable in F. To be more precise, it is isomorphically embeddable into F.

Coupling Saturation-Based Provers by Exchanging Positive/Negative Information
Dirk Fuchs

We examine different possibilities of coupling saturation-based theorem provers by exchanging positive/negative information. Positive information is given by facts that should be employed for proving a proof goal; negative information is represented by facts that do not appear to be useful. We introduce a basic model for cooperative theorem proving employing both kinds of information. We present theoretical results regarding the exchange of positive/negative information as well as practical methods that allow for a gain in efficiency in comparison with sequential provers. Finally, we report on experimental studies conducted in the areas of unfailing completion and superposition.

An On-line Problem Database
Nachum Dershowitz, Ralf Treinen

TODO

Well-Behaved Search and the Robbins Problem (Invited Talk)
William McCune

The Robbins problem was solved in October 1996 [7] by the equational theorem prover EQP [6]. Although the solution was automatic in the sense that the user of the program did not know a solution, it was not a simple matter of giving the conjecture and pushing a button. The user made many computer runs, observed the output, adjusted the search parameters, and made more computer runs. The goal of this kind of iteration is to achieve a well-behaved search. Several of the searches were successful.
The purpose of this presentation is to convey some of the methods that have led to well-behaved searches in our experiments and to speculate on automating the achievement of well-behaved search. First, I give some background on the Robbins problem and its solution.

1997

Goal-Directed Completion Using SOUR Graphs
Christopher Lynch

We give the first goal-directed version of the Knuth-Bendix completion procedure. Our procedure is based on Basic Completion and SOUR Graphs. There are two phases to the procedure. The first phase, which runs in polynomial time, compiles the equations and the goal into a constrained tree automaton representing the completed system, and a set of constraints representing goal solutions. The second phase starts with the goal solutions and works its way back to the original equations, solving constraints along the way.

Shostak's Congruence Closure as Completion
Deepak Kapur

Shostak's congruence closure algorithm is demystified, using the framework of ground completion on (possibly nonterminating, non-reduced) rewrite rules. In particular, the canonical rewriting relation induced on ground terms by a given set of ground equations is precisely constructed. The main idea is to extend the signature of the original input to include new constant symbols for nonconstant subterms appearing in the input. A byproduct of this approach is (i) an algorithm for associating a confluent rewriting system with possibly nonterminating ground rewrite rules, and (ii) a new quadratic algorithm for computing a canonical rewriting system from ground equations.
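
The flattening idea can be sketched in a few lines. The following is an illustrative union-find-based congruence closure, not Kapur's exact completion procedure: every non-constant subterm (f, args) is named by a fresh constant, and merging re-canonicalizes the flat definitions until a fixpoint.

```python
import itertools

class CC:
    def __init__(self):
        self.parent = {}
        self.defs = {}                       # (f, arg-constants) -> naming constant
        self.fresh = itertools.count()

    def find(self, c):
        self.parent.setdefault(c, c)
        while self.parent[c] != c:
            c = self.parent[c]
        return c

    def name(self, t):
        """Flatten term t (a constant string or a (f, arg, ...) tuple)
        to a canonical constant, introducing fresh constants as needed."""
        if isinstance(t, str):
            return self.find(t)
        key = (t[0], tuple(self.name(a) for a in t[1:]))
        if key not in self.defs:
            self.defs[key] = f"c{next(self.fresh)}"
        return self.find(self.defs[key])

    def merge(self, s, t):
        a, b = self.name(s), self.name(t)
        if a != b:
            self.parent[a] = b
            changed = True
            while changed:                   # propagate new congruences
                changed = False
                canon = {}
                for (f, args), c in list(self.defs.items()):
                    k = (f, tuple(self.find(x) for x in args))
                    if k in canon and self.find(canon[k]) != self.find(c):
                        self.parent[self.find(c)] = self.find(canon[k])
                        changed = True
                    else:
                        canon.setdefault(k, c)

    def equal(self, s, t):
        return self.name(s) == self.name(t)
```

For example, from f(f(a)) = a and f(f(f(a))) = a the closure derives f(a) = a, the classic congruence closure test case.
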

Conditional Equational Specifications of Data Types with Partial Operations for Inductive Theorem Proving
Ulrich Kühler, Claus-Peter Wirth

We propose a specification language for the formalization of data types with partial or non-terminating operations as part of a rewrite-based framework for inductive theorem proving. The language requires constructors for designating data items and admits positive/negative conditional equations as axioms in specifications. The (total algebra) semantics for such specifications is based on so-called data models. We develop admissibility conditions that guarantee the unique existence of a distinguished data model. Since admissibility of a specification requires confluence of the induced rewrite relation, we provide an effectively testable confluence criterion which does not presuppose termination.

Cross-Sections for Finitely Presented Monoids with Decidable Word Problems
Friedrich Otto, Masashi Katsura, Yuji Kobayashi

A finitely presented monoid has a decidable word problem if and only if it has a recursive cross-section if and only if it can be presented by some left-recursive convergent string-rewriting system. However, regular cross-sections or even context-free cross-sections do not suffice. This is shown by presenting examples of finitely presented monoids with decidable word problems that do not admit regular cross-sections, and that, hence, cannot be presented by left-regular convergent string-rewriting systems. Also, examples of finitely presented monoids with decidable word problems are presented that do not even admit context-free cross-sections. On the other hand, it is shown that each finitely presented monoid with a decidable word problem has a finite presentation that admits a cross-section which is a Church-Rosser language.

New Undecidability Results for Finitely Presented Monoids
Andrea Sattler-Klein

For finitely presented monoids the finiteness problem, the free monoid problem, the trivial monoid problem, the group problem and the problem of commutativity are undecidable in general. On the other hand, these problems are all decidable for the class of finite presentations involving complete string rewriting systems. Thus the question arises whether these problems are also decidable for the class of finite presentations describing monoids which have decidable word problems. In this paper we present some results from the author's Doctoral Dissertation [Sa96] which answer this question in the negative and show that each of these problems is undecidable for the class of finitely presented monoids with decidable word problems admitting regular complete presentations as well as for the class of finitely presented monoids with tractable word problems.

On the Property of Preserving Regularity for String-Rewriting Systems
Friedrich Otto

Some undecidability results concerning the property of preserving regularity are presented that strengthen corresponding results of Gilleron and Tison (1995). In particular, it is shown that it is undecidable in general whether a finite, length-reducing, and confluent string-rewriting system yields a regular set of normal forms for each regular language.

Rewrite Systems for Natural, Integral, and Rational Arithmetic
Évelyne Contejean, Claude Marché, Landy Rabehasaina

We give algebraic presentations of the sets of natural numbers, integers, and rational numbers by convergent rewrite systems which moreover allow efficient computations of arithmetical expressions. We then use such systems in the general normalised completion algorithm, in order to compute Gröbner bases of polynomial ideals over Q.
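
As a toy counterpart to such presentations, the textbook unary system for natural-number addition and multiplication is already convergent, so expressions can be evaluated by normalization alone. The sketch below is our own illustration (the paper's systems use far more efficient representations): a tiny innermost rewriter over terms encoded as tuples, with variables as strings.

```python
# Convergent unary rules: x ranges over terms, ("0",) is zero, ("s", t) is t+1.
RULES = [
    (("plus", ("0",), "y"), "y"),                               # 0 + y -> y
    (("plus", ("s", "x"), "y"), ("s", ("plus", "x", "y"))),     # s(x) + y -> s(x + y)
    (("times", ("0",), "y"), ("0",)),                           # 0 * y -> 0
    (("times", ("s", "x"), "y"), ("plus", "y", ("times", "x", "y"))),
]

def match(pat, term, sub):
    """Extend substitution sub so that pat matches term, or return None."""
    if isinstance(pat, str):                                    # variable
        if pat in sub:
            return sub if sub[pat] == term else None
        sub[pat] = term
        return sub
    if isinstance(term, str) or pat[0] != term[0] or len(pat) != len(term):
        return None
    for p, t in zip(pat[1:], term[1:]):
        if match(p, t, sub) is None:
            return None
    return sub

def apply(sub, t):
    if isinstance(t, str):
        return sub[t]
    return (t[0],) + tuple(apply(sub, a) for a in t[1:])

def normalize(t):
    """Innermost normalization; terminates because the system is convergent."""
    t = (t[0],) + tuple(normalize(a) for a in t[1:])
    for lhs, rhs in RULES:
        sub = match(lhs, t, {})
        if sub is not None:
            return normalize(apply(sub, rhs))
    return t

def num(n):
    return ("0",) if n == 0 else ("s", num(n - 1))
```

For instance, normalize(("plus", num(2), num(2))) rewrites to num(4) purely by the rules above.
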

D-Bases for Polynomial Ideals over Commutative Noetherian Rings
Leo Bachmair, Ashish Tiwari

We present a completion-like procedure for constructing D-bases for polynomial ideals over commutative Noetherian rings with unit. The procedure is described at an abstract level, by transition rules. Its termination is proved under certain assumptions about the strategy that controls the application of the transition rules. Correctness is established by proof simplification techniques.

On the Word Problem for Free Lattices
Georg Struth

We prove completeness of a rewrite-based algorithm for the word problem in the variety of lattices and discuss the method of non-symmetric completion with regard to this variety.

A Total, Ground Path Ordering for Proving Termination of AC-Rewrite Systems
Deepak Kapur, G. Sivakumar

A new path ordering for showing termination of associative-commutative (AC) rewrite systems is defined. If the precedence relation on function symbols is total, the ordering is total on ground terms, but unlike the ordering proposed by Rubio and Nieuwenhuis, this ordering can orient the distributivity property in the proper direction. The ordering is defined in a natural way using recursive path ordering with status as the underlying basis. This settles a longstanding problem in termination orderings for AC rewrite systems. The ordering can be used to define an ordering on nonground terms.

Proving Innermost Normalisation Automatically
Thomas Arts, Jürgen Giesl

We present a technique to prove innermost normalisation of term rewriting systems (TRSs) automatically. In contrast to previous methods, our technique is able to prove innermost normalisation of TRSs that are not terminating.
Our technique can also be used for termination proofs of all TRSs where innermost normalisation implies termination, such as non-overlapping TRSs or locally confluent overlay systems. In this way, termination of many (also non-simply terminating) TRSs can be verified automatically.

Termination of Context-Sensitive Rewriting
Hans Zantema

Context-sensitive term rewriting is a kind of term rewriting in which reduction is not allowed inside some fixed arguments of some function symbols. We introduce two new techniques for proving termination of context-sensitive rewriting. The first one is a modification of the technique of interpretation in a well-founded order, the second one is implied by a transformation in which context-sensitive termination of the original system can be concluded from termination of the transformed one. In combination with purely automatic techniques for proving ordinary termination, the latter technique is purely automatic too.

A New Parallel Closed Condition for Church-Rosser of Left-Linear Term Rewriting Systems
Michio Oyamaguchi, Yoshikatsu Ohta

G. Huet (1980) showed that a left-linear term rewriting system (TRS) is Church-Rosser (CR) if P -|→ Q for every critical pair ⟨P, Q⟩, where P -|→ Q is a parallel reduction from P to Q. But it remains open whether it is CR when Q -|→ P for every critical pair ⟨P, Q⟩. In this paper, we give a partial solution to this problem: a left-linear TRS is CR if Q -|→W P for every critical pair ⟨P, Q⟩, where Q -|→W P is a parallel reduction with the set W of redex occurrences satisfying the condition that if the critical pair is generated from two rules overlapping at an occurrence u, then the length |w| ≤ |u| for every w ∈ W. Furthermore, a left-linear TRS is CR if Q -|→W P or P -|→ε Q for every critical pair ⟨P, Q⟩, where W satisfies the same condition as above and P -|→ε Q is a reduction whose redex occurrence is ε (i.e., the root).
As a corollary, any left-linear TRS is CR if P = Q, P -|→W Q or Q -|→W P for every critical pair ⟨P, Q⟩, so that we have a critical pair completion procedure for left-linear TRSs which needs no regard for termination, in contrast with the Knuth-Bendix procedure.

Innocuous Constructor-Sharing Combinations
Nachum Dershowitz

We investigate conditions under which confluence and/or termination are preserved for constructor-sharing and hierarchical combinations of rewrite systems, one of which is left-linear and convergent.

Scott's Conjecture is True, Position Sensitive Weights
Samuel M. H. W. Perlo-Freeman, Péter Pröhle

The classification of total reduction orderings for strings over a 2-letter alphabet w.r.t. monoid presentations with 2 generators was published by U. Martin (see [9]) and relied on the hypothetical truth of Scott's conjecture, which was three years old in 1996.

A Complete Axiomatisation for the Inclusion of Series-Parallel Partial Orders
Denis Béchet, Philippe de Groote, Christian Retoré

Series-parallel orders are defined as the least class of partial orders containing the one-element order and closed by ordinal sum and disjoint union. From this inductive definition, it is almost immediate that any series-parallel order may be represented by an algebraic expression, which is unique up to the associativity of ordinal sum and to the associativity and commutativity of disjoint union. In this paper, we introduce a rewrite system acting on these algebraic expressions that axiomatises completely the sub-ordering relation for the class of series-parallel orders.

Undecidability of the First Order Theory of One-Step Right Ground Rewriting
Jerzy Marcinkowski

The problem of decidability of the first order theory of one-step rewriting was stated in [CCD93]. One can find the problem on the lists of open problems in rewriting in [DJK93] and [DJK95]. In 1995 Ralf Treinen proved that the theory is undecidable.
In this paper we show that the theorem of Treinen is an easy consequence of a powerful tool first used to establish undecidability results in the theory of logic programming [MP92]. Then we use the tool to give strong refinements of the result of Treinen:
We show that (the ∃** part of) the first order theory of one-step rewriting is undecidable for linear Noetherian rewriting systems.
Then we prove what we consider to be quite a striking result: (the ∃** part of) the first order theory of one-step rewriting is undecidable even if the rewriting system is Noetherian and right-ground. To our knowledge, this is the first known undecidable property of right-ground systems.

The First-Order Theory of One Step Rewriting in Linear Noetherian Systems is Undecidable
Sergei G. Vorobyov

We construct a finite, linear, finitely terminating rewrite rule system with an undecidable theory of one-step rewriting.

Solving Linear Diophantine Equations Using the Geometric Structure of the Solution Space
Ana Paula Tomás, Miguel Filgueiras

In the development of algorithms for finding the minimal solutions of systems of linear Diophantine equations, little use has been made (to our knowledge) of the results by Stanley on the geometric properties of the solution space. Building upon these results, we present a new algorithm, and we suggest the use of geometric properties of the solution space for finding bounds on the search for solutions and for obtaining a qualitative evaluation of the difficulty of solving a given system.
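
For a single homogeneous equation a·x = b·y, the minimal (Hilbert-basis) solutions can even be found by brute force using the classical bound attributed to Huet, that every component x_i of a minimal solution is at most max(b) and every y_j at most max(a). The sketch below is our own baseline illustration; the paper's algorithm exploits the geometry of the solution space instead of enumeration.

```python
from itertools import product

def minimal_solutions(a, b):
    """Minimal non-zero non-negative solutions (x, y) of a.x = b.y,
    searched within Huet's bounds x_i <= max(b), y_j <= max(a)."""
    cands = []
    for xs in product(range(max(b) + 1), repeat=len(a)):
        for ys in product(range(max(a) + 1), repeat=len(b)):
            v = xs + ys
            if any(v) and sum(c * x for c, x in zip(a, xs)) == \
                          sum(c * y for c, y in zip(b, ys)):
                cands.append(v)
    # Keep only solutions not componentwise-dominated by another solution.
    return [v for v in cands
            if not any(w != v and all(wi <= vi for wi, vi in zip(w, v))
                       for w in cands)]
```

For example, 2x = 3y has the single minimal solution x = 3, y = 2; every other solution is a multiple of it.
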

A Criterion for Intractability of E-unification with Free Function Symbols and Its Relevance for Combination Algorithms
Klaus U. Schulz

All applications of equational unification in the area of term rewriting and theorem proving require algorithms for general E-unification, i.e., E-unification with free function symbols. On this background, the complexity of general E-unification algorithms has been investigated for a large number of equational theories. For most of the relevant cases, the problem of deciding solvability of general E-unification problems was found to be NP-hard. We offer a partial explanation. A criterion is given that characterizes a large class K of equational theories E where general E-unification is always NP-hard. We show that all regular equational theories E that contain a commutative or an associative function symbol belong to K. Other examples of equational theories in K concern non-regular cases as well.
The combination algorithm described in [BS92] can be used to reduce solvability of general E-unification problems to solvability of E-unification and free (Robinson) unification problems with linear constant restrictions. We show that for E ∈ K there exists no polynomial optimization of this combination algorithm for deciding solvability of general E-unification problems, unless P=NP. This supports the conjecture that for E ∈ K there is no polynomial algorithm for combining E-unification with constants with free unification.

Effective Reduction and Conversion Strategies for Combinators
Richard Statman

We imagine that we are computing with combinators or lambda terms and that successive terms are related by one-step reduction or expansion. We have a term on our screen and we want to click on a redex to be reduced (or expanded) to obtain the next screenful. The choice of redex is determined by a strategy for achieving our ends. Such a strategy must be effective in the sense of being computable. Memory is a serious constraint, since only one term fits on our screen at a time.
We shall prove the following results concerning effective reduction and conversion strategies for combinators. These results constitute answers to certain questions which have appeared in the literature (with one exception). We believe that these questions are interesting because they get right to the heart of the memory problems inherent in one-step reduction and expansion. These questions appear to be even more subtle for lambda terms where each one-step reduction can create a much more radical change of structure than for combinators.
(1) There is an effective one-step cofinal reduction strategy (answering a question of Barendregt [2] 13.6.6). This is in Section 2.
(2) There is no effective confluence function, but there is an effective one-step confluence strategy (answering a question of Isles reported in [1]). This is in Section 3.
(3) There is an effective one-step enumeration strategy (answering an obvious question). This is in Section 4.
(4) There is an effective one-step Church-Rosser conversion strategy ("almost" answering a question of Bergstra and Klop [3]). This is in Section 5.

Finite Family Developments
Vincent van Oostrom

Associate to a rewrite system R having rules l → r its labelled version Rω, having rules l^{m+1} → r^m for any natural number m ∈ ω. These rules roughly express that a left-hand side l carrying labels all larger than m can be replaced by its right-hand side r carrying labels all smaller than or equal to m. A rewrite system R enjoys finite family developments (FFD) if Rω is terminating. We show that the class of higher-order pattern rewrite systems enjoys FFD, extending earlier results for the lambda calculus and first-order term rewrite systems.

Prototyping Combination of Unification Algorithms with the ELAN Rule-Based Programming Language
Christophe Ringeissen

The implementation of the general non-deterministic method now provides a convenient and useful ELAN platform for tackling some non-disjoint cases [5] (only one additional non-deterministic step) and for realizing other combination algorithms based on the same techniques but in different contexts: satisfiability, the word problem, matching [6], free amalgamation [1], ... Most of them can be viewed as a way to decrease the non-determinism inherent in the general method.

The Invariant Package of MAS
Manfred Göbel

TODO

Opal: A System for Computing Noncommutative Gröbner Bases
Edward L. Green, Lenwood S. Heath, Benjamin J. Keller

TODO

TRAM: An Abstract Machine for Order-Sorted Conditioned Term Rewriting Systems
Kazuhiro Ogata, Koichi Ohhara, Kokichi Futatsugi

TODO

1996

Fine-Grained Concurrent Completion
Claude Kirchner, Christopher Lynch, Christelle Scharff

We present a concurrent completion procedure based on the use of a SOUR graph as data structure. The procedure has the following characteristics: it is asynchronous, there is no need for a global memory or global control, equations are stored in a SOUR graph with maximal structure sharing, and each vertex is a process representing a term. Therefore, the parallelism is at the term level. Each edge is a communication link, representing a (subterm, ordering, unification or rewrite) relation between terms. Completion is performed on the graph as local graph transformations through cooperation between processes. We show that this concurrent completion procedure is sound and complete with respect to the sequential one, provided that the information is locally time-stamped in order to detect out-of-date information.

AC-Complete Unification and its Application to Theorem Proving
Alexandre Boudet, Évelyne Contejean, Claude Marché

The inefficiency of AC-completion is mainly due to the doubly exponential number of AC-unifiers and thereby of critical pairs generated. We present AC-complete E-unification, a new technique whose goal is to reduce the number of AC-critical pairs inferred by performing unification in an extension E of AC (e.g. ACU, Abelian groups, Boolean rings, ...) in the process of normalized completion [24, 25]. The idea is to represent complete sets of AC-unifiers by (smaller) sets of E-unifiers. Not only do the theories E used for unification have exponentially fewer most general unifiers than AC, but one can also remove from a complete set of E-unifiers those solutions which have no E-instance that is an AC-unifier.

Superposition Theorem Proving for Abelian Groups Represented as Integer Modules
Jürgen Stuber

We define a superposition calculus specialized for abelian groups represented as integer modules, and show its refutational completeness. This makes it possible to substantially reduce the number of inferences compared to a standard superposition prover which applies the axioms directly. Specifically, equational literals are simplified so that only the maximal term of the sums is on the left-hand side. Only certain minimal superpositions need to be considered; other superpositions which a standard prover would consider become redundant. This not only reduces the number of inferences, but also reduces the size of the AC-unification problems which are generated. That is, AC-unification is not necessary at the top of a term, only below some non-AC symbol. Further, we consider situations where the axioms give rise to variable overlaps and develop techniques to avoid these explosive cases where possible.

Symideal Gröbner Bases
Manfred Göbel

This paper presents a completion technique for a set of polynomials in K[X1,..., Xn] which is closed under addition and under multiplication with symmetric polynomials as well as a solution for the corresponding membership problem. Our algorithmic approach is based on a generalization of a novel rewriting technique for the computation of bases for rings of permutation-invariant polynomials.

Termination of Constructor Systems
Thomas Arts, Jürgen Giesl

We present a method to prove termination of constructor systems automatically. Our approach takes advantage of the special form of these rewrite systems: for constructor systems it suffices to compare so-called dependency pairs [Art96] instead of the left- and right-hand sides of rules. Unfortunately, standard techniques for the generation of well-founded orderings cannot be directly used for the automation of the dependency pair approach. To solve this problem we have developed a transformation technique which enables the application of known synthesis methods for well-founded orderings to prove that dependency pairs are decreasing. In this way termination of many (also non-simply terminating) constructor systems can be proved fully automatically.
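As an illustration of the idea (a minimal sketch, not the authors' implementation; the term encoding and symbol names are ours), dependency pairs can be computed by collecting, for each rule, the subterms of the right-hand side whose root is a defined symbol:

```python
# Minimal sketch of dependency-pair extraction for a first-order TRS.
# Terms: a variable is a string, an application is a tuple (symbol, arg1, ...).

def subterms(t):
    yield t
    if isinstance(t, tuple):
        for a in t[1:]:
            yield from subterms(a)

def dependency_pairs(rules):
    defined = {l[0] for l, _ in rules}  # defined symbols = roots of left-hand sides
    pairs = []
    for l, r in rules:
        for s in subterms(r):
            if isinstance(s, tuple) and s[0] in defined:
                # mark the root symbols (f becomes F, conventionally written f#)
                pairs.append(((l[0].upper(),) + l[1:],
                              (s[0].upper(),) + s[1:]))
    return pairs

# plus(zero, y) -> y ;  plus(s(x), y) -> s(plus(x, y))
rules = [(('plus', ('zero',), 'y'), 'y'),
         (('plus', ('s', 'x'), 'y'), ('s', ('plus', 'x', 'y')))]
print(dependency_pairs(rules))  # [(('PLUS', ('s', 'x'), 'y'), ('PLUS', 'x', 'y'))]
```

For addition over the constructors zero/s, the only dependency pair is PLUS(s(x), y) → PLUS(x, y); termination follows once such pairs can be shown decreasing in a well-founded ordering.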

Dummy Elimination in Equational Rewriting
Maria C. F. Ferreira

In [5] we introduced the concept of dummy elimination in term rewriting: a transformation on terms which eliminates function symbols simplifying the rewrite rules and making, in general, the task of proving termination easier. Here we consider the more general setting of rewriting modulo an equational theory; we show that, in contrast with most techniques developed for proving termination of rewrite systems, dummy elimination remains valid in the presence of equational theories. Furthermore using the same proof technique, the soundness of a family of transformations (containing dummy elimination) can be shown. This work was motivated by an application in the area of Process Algebra.

On Proving Termination by Innermost Termination
Bernhard Gramlich

We present a new approach for proving termination of rewrite systems by innermost termination. From the resulting abstract criterion we derive concrete conditions, based on critical peak properties, under which innermost termination implies termination (and confluence). Finally, we show how to apply the main results for providing new sufficient conditions for the modularity of termination.

A Recursive Path Ordering for Higher-Order Terms in eta-Long beta-Normal Form
Jean-Pierre Jouannaud, Albert Rubio

This paper extends the termination proof techniques based on rewrite orderings to a higher-order setting, by defining a recursive path ordering for simply typed higher-order terms in η-long β-normal form. This ordering is powerful enough to show termination of several complex examples.
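For readers unfamiliar with the first-order base case that the paper lifts to higher order, here is a hedged sketch of the lexicographic variant of a recursive path ordering (the term encoding and precedence are our own illustration):

```python
def occurs(x, t):
    """Does variable x occur in term t? Variables are strings,
    applications are tuples (symbol, arg1, ...)."""
    return t == x or (isinstance(t, tuple) and any(occurs(x, a) for a in t[1:]))

def lpo_gt(s, t, prec):
    """s > t in the lexicographic path ordering induced by
    prec: a dict mapping symbols to integers (higher = bigger)."""
    if isinstance(s, str):                       # a variable is never greater
        return False
    if isinstance(t, str):                       # s > x iff x occurs in s
        return occurs(t, s)
    f, ss, g, ts = s[0], s[1:], t[0], t[1:]
    if any(si == t or lpo_gt(si, t, prec) for si in ss):
        return True                              # some argument of s covers t
    if prec[f] > prec[g]:
        return all(lpo_gt(s, tj, prec) for tj in ts)
    if f == g:                                   # equal heads: compare args lexicographically
        for si, ti in zip(ss, ts):
            if si != ti:
                return lpo_gt(si, ti, prec) and all(lpo_gt(s, tj, prec) for tj in ts)
    return False

prec = {'plus': 2, 's': 1, 'zero': 0}
# orient both rules of addition: plus(zero,y) -> y and plus(s(x),y) -> s(plus(x,y))
print(lpo_gt(('plus', ('zero',), 'y'), 'y', prec))                         # True
print(lpo_gt(('plus', ('s', 'x'), 'y'), ('s', ('plus', 'x', 'y')), prec))  # True
```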

Higher-Order Superposition for Dependent Types
Roberto Virga

We describe a proof of the Critical Pair Lemma for Plotkin's LF calculus [4]. Our approach basically follows the one used by Nipkow [12] for the simply-typed case, though substantial modifications and some additional theoretical machinery are needed to ensure well-typedness of rewriting in this richer type system. We conclude the paper presenting some significant applications of the theory.

Higher-Order Narrowing with Definitional Trees
Michael Hanus, Christian Prehofer

Functional logic languages with a sound and complete operational semantics are mainly based on narrowing. Due to the huge search space of simple narrowing, steadily improved narrowing strategies have been developed in the past. Needed narrowing is currently the best narrowing strategy for first-order functional logic programs due to its optimality properties w.r.t. the length of derivations and the number of computed solutions. In this paper, we extend the needed narrowing strategy to higher-order functions and λ-terms as data structures. By the use of definitional trees, our strategy computes only incomparable solutions. Thus, it is the first calculus for higher-order functional logic programming which provides for such an optimality result. Since we allow higher-order logical variables denoting λ-terms, applications go beyond current functional and logic programming languages.

A Compiler for Nondeterministic Term Rewriting Systems
Marian Vittek

This work presents the design and the implementation of a compiler for the ELAN specification language. The language is based on rewriting logic and permits, in particular, the combination of computations where deterministic evaluations are mixed with a nondeterministic search for solutions. The implementation combines compilation methods of term rewriting systems, functional and logic programming languages, and some new original techniques, producing efficient code in an imperative language. The efficiency of the compiler is demonstrated by experimental results.

Combinatory Reduction Systems with Explicit Substitution that Preserve Strong Normalisation
Roel Bloo, Kristoffer Høgsbro Rose

We generalise the notion of explicit substitution from the λ-calculus to higher order rewriting, realised by combinatory reduction systems (CRSs). For every confluent CRS, R, we construct an explicit substitution variant, Rx, which we prove confluent.
We identify a large subset of the CRSs, the structure-preserving CRSs, and show for any structure-preserving CRS R that Rx preserves strong normalisation of R.
We believe that this is a significant first step towards providing a methodology for reasoning about the operational properties of higher-order rewriting in general, and higher-order program transformations in particular, since confluence ensures correctness of such transformations and preservation of strong normalisation ensures that the transformations are always safe, in both cases independently of the reduction strategy used.

Confluence Properties of Extensional and Non-Extensional λ-Calculi with Explicit Substitutions (Extended Abstract)
Delia Kesner

This paper studies confluence properties of extensional and non-extensional λ-calculi with explicit substitutions, where extensionality is interpreted by η-expansion. For that, we propose a general scheme for explicit substitutions which describes those abstract properties that are sufficient to guarantee confluence. Our general scheme makes it possible to treat at the same time many well-known calculi such as λσ and λυ, or some other new calculi that we propose in this paper. We also show, for those calculi not fitting the general scheme that can be translated to another one fitting the scheme, such as λs, how to reason about confluence properties of their extensional and non-extensional versions.

On the Power of Simple Diagrams
Roberto Di Cosmo

In this paper we focus on a set of abstract lemmas that are easy to apply and turn out to be quite valuable in order to establish confluence and/or normalization modularly, especially when adding rewriting rules for extensional equalities to various calculi. We show the usefulness of the lemmas by applying them to various systems, ranging from simply typed lambda calculus to higher order lambda calculi, for which we can establish systematically confluence and/or normalization (or decidability of equality) in a simple way. Many results are new, but we also discuss systems for which our technique allows us to provide a much simpler proof than what can be found in the literature.

Coherence for Sharing Proof Nets
Stefano Guerrini, Simone Martini, Andrea Masini

Sharing graphs are a way of representing linear logic proof-nets in such a way that their reduction never duplicates a redex. In their usual presentations, they present a problem of coherence: if the proof-net N reduces by standard cut-elimination to N', then, by reducing the sharing graph of N we do not obtain the sharing graph of N'. We solve this problem by changing the way the information is coded into sharing graphs and introducing a new reduction rule (absorption). The rewriting system is confluent and terminating.

Modularity of Termination in Term Graph Rewriting
M. R. K. Krishna Rao

Term rewriting is generally implemented using graph rewriting for efficiency reasons. Graph rewriting allows sharing of common structures thereby saving both time and space. This implementation is sound in the sense that computation of a normal form of a graph yields a normal form of the corresponding term. In this paper, we study modularity of termination of the graph reduction. Unlike in the case of term rewriting, termination is modular in graph rewriting for a large class of systems. Our results generalize the results of Plump [14] and Kurihara and Ohuchi [10].

Confluence of Terminating Conditional Rewrite Systems Revisited
Bernhard Gramlich, Claus-Peter Wirth

We present a new and powerful criterion for confluence of terminating (terms in) join conditional term rewriting systems. This criterion is based on a certain joinability property for shared parallel critical peaks and requires the systems to be neither decreasing nor left-linear nor normal, but only terminating.

Compositional Term Rewriting: An Algebraic Proof of Toyama's Theorem
Christoph Lüth

This article proposes a compositional semantics for term rewriting systems, i.e. a semantics preserving structuring operations such as the disjoint union. The semantics is based on the categorical construct of a monad, adapting the treatment of universal algebra in category theory to term rewriting systems.
As an example, the preservation of confluence under the disjoint union of two term rewriting systems is shown, obtaining an algebraic proof of Toyama's theorem, generalised slightly to term rewriting systems introducing variables on the right-hand side of the rules.

The First-Order Theory of One-Step Rewriting is Undecidable
Ralf Treinen

The theory of one-step rewriting for a given rewrite system R and signature Σ is the first-order theory of the following structure: Its universe consists of all Σ-ground terms, and its only predicate is the relation "x rewrites to y in one step by R". The structure contains no function symbols and no equality. We show that there is no algorithm deciding the ∃*∀*-fragment of this theory for an arbitrary rewrite system. The proof uses both non-linear and non-shallow rewrite rules.

An Algorithm for Distributive Unification
Manfred Schmidt-Schauß

The purpose of this paper is to describe a decision algorithm for unifiability of equations w.r.t. the equational theory of two distributive axioms: x*(y+z)=x*y+x*z and (x+y)*z=x*z+y*z. The algorithm is described as a set of non-deterministic transformation rules. The equations given as input are eventually transformed into a conjunction of two further problems: One is an AC1-unification-problem with linear constant restrictions. The other one is a second-order unification problem that can be transformed into a word-unification problem and then can be decided using Makanin's decision algorithm. Since the algorithm terminates, this is a solution for an open problem in the field of unification.
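The two axioms, oriented left to right, form a terminating rewrite system that pushes products below sums; a small sketch of that normalization step (not of the decision algorithm itself, and with our own term encoding):

```python
# Distributivity oriented as rewrite rules: x*(y+z) -> x*y + x*z
# and (x+y)*z -> x*z + y*z.  Terms: ('op', left, right) or a variable string.

def distribute(t):
    if isinstance(t, str):
        return t
    op, l, r = t[0], distribute(t[1]), distribute(t[2])
    if op == '*':
        if isinstance(l, tuple) and l[0] == '+':      # (x+y)*z -> x*z + y*z
            return distribute(('+', ('*', l[1], r), ('*', l[2], r)))
        if isinstance(r, tuple) and r[0] == '+':      # x*(y+z) -> x*y + x*z
            return distribute(('+', ('*', l, r[1]), ('*', l, r[2])))
    return (op, l, r)

print(distribute(('*', 'x', ('+', 'y', 'z'))))
# ('+', ('*', 'x', 'y'), ('*', 'x', 'z'))
```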

On the Termination Problem for One-Rule Semi-Thue Systems
Géraud Sénizergues

We solve the u-termination and the termination problems for the one-rule semi-Thue systems S of the form 0p1q → v, (p,q ∈ N-{0}, v ∈ {0,1}*). We obtain a structure theorem about a monoid that we call the termination-monoid of S. As a consequence, for every fixed system S of the above form, the termination problem has a linear time-complexity.

Efficient Second-Order Matching
Régis Curien, Zhenyu Qian, Hui Shi

The standard second-order matching algorithm by Huet may be expansive in matching a flexible-rigid pair. On one hand, many fresh free variables may need to be introduced; on the other hand, attempts are made to match the heading free variable on the flexible side with every "top layer" on the rigid side and every argument of the heading free variable with every subterm covered by the "top layer". We propose a new second-order matching algorithm, which introduces no fresh free variables and just considers some selected "top layers", arguments of the heading free variable and subterms covered by the corresponding "top layers". A first implementation shows that the new algorithm is more efficient both in time and space than the standard one for a great number of matching problems.

Linear Second-Order Unification
Jordi Levy

We present a new class of second-order unification problems, which we have called linear. We deal with completely general second-order typed unification problems, but we restrict the set of unifiers under consideration: they instantiate free variables by linear terms, i.e. terms where any λ-abstractions bind one and only one occurrence of a bound variable. Linear second-order unification properly extends context unification studied by Comon and Schmidt-Schauß. We describe a sound and complete procedure for this class of unification problems and we prove termination for three different subcases of them. One of these subcases is obtained requiring Comon's condition, another corresponds to Schmidt-Schauß's condition, (both studied previously for the case of context unification, and applied here to a larger class of problems), and the third one is original, namely that free variables occur at most twice.

Unification of Higher-Order Patterns in a Simply Typed Lambda-Calculus with Finite Products and Terminal Type
Roland Fettig, Bernd Löchner

We develop a higher-order unification algorithm for a restricted class of simply typed lambda terms with function space and product type constructors. It is based on an inference system manipulating so-called higher-order product-patterns, which is proven to be sound and complete. Allowing tuple constructors in lambda binders provides elegant notations. We show that our algorithm terminates on each input and produces a most general unifier if one exists. The approach also extends smoothly to a calculus with terminal type.

Decidable Approximations of Term Rewriting Systems
Florent Jacquemard

A linear term rewriting system R is growing when, for every rule l→r∈R, each variable which is shared by l and r occurs at depth one in l. We show that the set of ground terms having a normal form w.r.t. a growing rewrite system is recognized by a finite tree automaton. This implies in particular that reachability and sequentiality of growing rewrite systems are decidable. Moreover, the word problem is decidable for related equational theories. We prove that our conditions are actually necessary: relaxing them yields undecidability of reachability.
Concerning sequentiality, the result may be restated in terms of approximations of term rewriting systems. An approximation of a system R is a renaming of the variables in the right hand sides which yields a growing rewrite system. This gives the decidability of a new sufficient condition for sequentiality of left-linear rewrite systems, which encompasses known decidable properties such as strong, NV and NVNF sequentiality.
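The growing condition itself is easy to decide syntactically. As a hedged sketch for linear rules (term encoding ours): a rule passes if every variable shared by both sides occurs only at depth one in the left-hand side.

```python
def var_depths(t, d=0, out=None):
    """Map each variable (a string) to the set of depths at which it
    occurs in term t; applications are tuples (symbol, arg1, ...)."""
    if out is None:
        out = {}
    if isinstance(t, str):
        out.setdefault(t, set()).add(d)
    else:
        for a in t[1:]:
            var_depths(a, d + 1, out)
    return out

def is_growing(rules):
    """Check the growing condition for a (linear) rewrite system."""
    for l, r in rules:
        shared = set(var_depths(l)) & set(var_depths(r))
        if any(var_depths(l)[x] != {1} for x in shared):
            return False
    return True

print(is_growing([(('f', 'x', ('g', 'y')), ('h', 'x'))]))  # True: x is at depth 1 in l
print(is_growing([(('f', ('g', 'x')), ('g', 'x'))]))       # False: x is at depth 2 in l
```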

Semantics and Strong Sequentiality of Priority Term Rewriting Systems
Masahiko Sakai, Yoshihito Toyama

This paper gives an operational semantics of priority term rewriting systems (PRS) by using conditional systems, whose reduction is decidable and stable under substitution. We also define the class of strong sequential PRSs and show that this class is decidable. Moreover, we show that the index rewriting of strong sequential PRSs gives a normalizing strategy.

Higher-Order Families
Vincent van Oostrom

A redex family is a set of redexes which are 'created in the same way'. Families specify which redexes should be shared in any so-called optimal implementation of a rewriting system. We formalise the notion of family for orthogonal higher-order term rewriting systems (OHRSs). To support our formalisation of the intuitive concept of family, we actually provide three conceptually different formalisations, via labelling, extraction and zigzag, and show them to be equivalent. This generalises the results known from the literature and gives a firm theoretical basis for the optimal implementation of OHRSs.

A New Proof Manager and Graphic Interface for Larch Prover
Frédéric Voisin

We present PLP, a proof management system and graphic interface for the "Larch Prover" (LP). The system provides additional support for interactive use of LP, by letting the user control the order in which goals are proved. We offer improved ways to investigate, compare and communicate proofs by allowing independent attempts at proving a goal, a better access to the information associated with goals and an additional script mechanism. All the features are accessible through a graphic system that makes the proof structure accessible to the user.

ReDuX 1.5: New Facets of Rewriting
Reinhard Bündgen, Carsten Sinz, Jochen Walter

TODO

CiME: Completion Modulo E
Évelyne Contejean, Claude Marché

TODO

Distributed Larch Prover (DLP): An Experiment in Parallelizing a Rewrite-Rule Based Prover
Mark T. Vandevoorde, Deepak Kapur

The Distributed Larch Prover, DLP, is a distributed and parallel version of LP, an interactive prover. DLP helps users find proofs by creating and managing many proof attempts that run in parallel. Parallel attempts may work independently on different subgoals of an inference method, and they may compete by using different inference methods to prove the same goal. DLP runs on a network of workstations.

EPIC: An Equational Language -Abstract Machine Supporting Tools-
H. R. Walters, J. F. Th. Kamperman

TODO

SPIKE-AC: A System for Proofs by Induction in Associative-Commutative Theories
Narjes Berregeb, Adel Bouhoula, Michaël Rusinowitch

Automated verification problems often involve Associative-Commutative (AC) operators, which are hard to handle since they cause combinatorial explosion. Many provers simply consider AC axioms as additional properties kept in a library, adding a burden to the proof search control. Term rewriting modulo AC is a basic approach to remedy this problem. Based on it, we have developed techniques to treat AC operators properly, which have been integrated in SPIKE, an automatic theorem prover for theories expressed by conditional equations [4]. The resulting system SPIKE-AC, written in Caml Light, has demonstrated that induction proofs become more natural and require less interaction. In contrast with other inductive completion methods [6, 9, 11], which need AC unification to compute critical pairs, our method does not need AC unification, which is doubly exponential [10]; only AC matching is required. Another advantage is that our system refutes all false conjectures, under reasonable assumptions on the given theories. Experiments have shown that in the presence of AC operators, less input from the user is needed compared with related systems such as LP [8], NQTHM [5] and PVS [12].

On Gaining Efficiency in Completion-Based Theorem Proving
Thomas Hillenbrand, Arnim Buch, Roland Fettig

Gaining efficiency in completion-based theorem proving requires improvements on three levels: fast inference step execution, careful aggregation into an inference machine, and sophisticated control strategies, all combined with space-saving representation of derived facts. We introduce the new Waldmeister prover, which shows an increase in overall system performance of more than one order of magnitude as compared with standard techniques.

1995

Modularity of Completeness Revisited
Massimo Marchiori

One of the key results in the field of modularity for Term Rewriting Systems is the modularity of completeness for left-linear TRSs established by Toyama, Klop and Barendregt in [TKB89]. The proof, however, is quite long and involved. In this paper, a new proof of this basic result is given which is both short and easy, employing the powerful technique of 'pile and delete' already used with success in proving the modularity of UN. Moreover, the same proof is shown to extend the result in [TKB89], proving modularity of termination for left-linear TRSs that are consistent with respect to reduction.

Automatic Termination Proofs With Transformation Orderings
Joachim Steinbach

Transformation orderings are a powerful tool for proving termination of term rewriting systems. However, it is rather hard to establish their applicability to a given rule system. We introduce an algorithm which automatically generates transformation orderings for many non-trivial systems including Hercules & hydra and sorting algorithms.

A Termination Ordering for Higher Order Rewrite Systems
Olav Lysne, Javier Piris

We present an extension of the recursive path ordering for the purpose of showing termination of higher order rewrite systems. Keeping close to the general path ordering of Dershowitz and Hoot, we demonstrate sufficient properties of the termination functions for our method to apply. Thereby we describe a class of different orderings. Finally we compare our method to previously published extensions of the recursive path ordering into the higher order setting.

A Complete Characterization of Termination of 0p1q → 1r0s
Hans Zantema, Alfons Geser

We characterize termination of one-rule string rewriting systems of the form 0p1q → 1r0s for every choice of positive integers p, q, r, and s. For the simply terminating cases, we give the precise complexity of derivation lengths.
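For the instance p = q = r = s = 1 the rule 01 → 10 just swaps adjacent letters, so it terminates and the derivation length equals the number of 01-inversions in the start word. A quick illustrative sketch (the bounded leftmost-rewriting helper is our own):

```python
def rewrite_to_nf(w, lhs, rhs, limit=10_000):
    """Leftmost string rewriting with a single rule lhs -> rhs,
    with a step bound as a safeguard against non-termination."""
    steps = 0
    while lhs in w and steps < limit:
        i = w.index(lhs)                    # leftmost redex
        w = w[:i] + rhs + w[i + len(lhs):]
        steps += 1
    return w, steps

nf, n = rewrite_to_nf("010101", "01", "10")
print(nf, n)  # 111000 6  (six 01-inversions in the start word)
```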

On Narrowing, Refutation Proofs and Constraints
Robert Nieuwenhuis

We develop a proof technique for dealing with narrowing and refutational theorem proving in a uniform way, clarifying the exact relationship between the existing results in both fields and allowing us to obtain several new results. Refinements of narrowing (basic, LSE, etc.) are instances of the technique, but are also defined here for arbitrary (possibly ordering and/or equality constrained or not yet convergent or saturated) Horn clauses, and shown compatible with simplification and other redundancy notions. By narrowing modulo equational theories like AC, compact representations of solutions, expressed by AC-equality constraints, can be obtained. Computing AC-unifiers is only needed at the end if one wants to "compress" such a constraint into its (doubly exponentially many) concrete substitutions.

Completion for Multiple Reduction Orderings
Masahito Kurihara, Hisashi Kondo, Azuma Ohuchi

We present a completion procedure (called MKB) which works with multiple reduction orderings. Given equations and a set of reduction orderings, the procedure simulates a computation performed by parallel processes, each of which executes the standard Knuth-Bendix completion procedure (KB) with one of the given orderings. To gain efficiency, however, we develop new inference rules working on objects called nodes, which are data structures consisting of a pair s : t of terms associated with the information showing which processes contain the rule s → t (or t → s) and which processes contain the equation s ↔ t. The idea is based on the observation that some of the inferences made in the processes are closely related, so we can design inference rules that simulate multiple KB inferences in several processes all in a single operation. Our experiments show that MKB is significantly more efficient than the naive simulation of parallel execution of KB procedures when the number of reduction orderings is large enough.
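The node data structure can be sketched as follows (a toy illustration with our own names; real reduction orderings replace the string comparisons used here): a shared equation s : t carries three sets of process indices, and an orient step updates all simulated KB processes in one pass.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    s: str
    t: str
    fwd: set = field(default_factory=set)  # processes holding the rule s -> t
    bwd: set = field(default_factory=set)  # processes holding the rule t -> s
    eqn: set = field(default_factory=set)  # processes still holding s = t

def orient(node, orderings):
    """One MKB-style inference: orient s = t in every process at once."""
    for i in list(node.eqn):
        gt = orderings[i]
        if gt(node.s, node.t):
            node.fwd.add(i); node.eqn.discard(i)
        elif gt(node.t, node.s):
            node.bwd.add(i); node.eqn.discard(i)

n = Node("f(x)", "g(x)", eqn={0, 1})
orderings = {0: lambda a, b: a < b,   # toy stand-ins for reduction orderings
             1: lambda a, b: a > b}
orient(n, orderings)
print(sorted(n.fwd), sorted(n.bwd), sorted(n.eqn))  # [0] [1] []
```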

Towards an Efficient Construction of Test Sets for Deciding Ground Reducibility
Klaus Schmid, Roland Fettig

We propose a method for constructing test sets for deciding whether a term is ground reducible w.r.t. an arbitrary, many-sorted, unconditional term rewriting system. Our approach is based on a suitable characterization of such test sets using a certain notion of transnormality. It generates very small test sets and shows some promise to be an important step towards a practicable implementation.

δo!ε = 1 - Optimizing Optimal λ-Calculus Implementations
Andrea Asperti

In [As94], a correspondence between Lamping-Gonthier's operators for Optimal Reduction of the lambda-calculus [Lam90, GAL92a] and the operations associated with the comonad "!" of Linear Logic was established. In this paper, we put this analogy at work, adding new rewriting rules directly suggested by the categorical equations of the comonad. These rules produce an impressive improvement of the performance of the reduction system, and provide a first step towards the solution of the well known and crucial problem of accumulation of control operators.

Substitution Tree Indexing
Peter Graf

Sophisticated maintenance and retrieval of first-order predicate calculus terms is a major key to efficient automated reasoning. We present a new indexing technique which accelerates the speed of the basic retrieval operations, such as finding complementary literals in resolution theorem proving or finding critical pairs during completion. Subsumption and reduction are also supported. Moreover, the new technique provides maintenance and efficient retrieval not only of terms but also of idempotent substitutions. Substitution trees achieve maximal search speed paired with minimal memory requirements in various experiments and outperform traditional techniques such as path indexing, discrimination tree indexing, and abstraction trees by combining their advantages and adding some new features.

Concurrent Garbage Collection for Concurrent Rewriting
Ilies Alouini

We describe an algorithm that achieves garbage collection when performing concurrent rewriting. We show how this algorithm follows the implementation model of concurrent graph rewriting. This model has been studied and directly implemented on MIMD machines, where nodes of the graph are distributed over a set of processors. A distinguishing feature of our algorithm is that it collects garbage concurrently with the rewriting process. Furthermore, our garbage collection algorithm never blocks the process of rewriting; in particular, it does not involve synchronisation primitives. In contrast to a classical garbage collection algorithm reclaiming unused blocks of memory, the presented algorithm collects active nodes of the graph (i.e. nodes are viewed as processes). Finally, we present various experimental results based on our implementation (RECO) of concurrent rewriting using the concurrent garbage collection algorithm and show that significant speed-ups can be obtained when computing normal forms of terms. Keywords: Concurrent rewriting, Graph rewriting, MIMD architectures, Concurrent garbage collection algorithms.

Lazy Rewriting and Eager Machinery
J. F. Th. Kamperman, H. R. Walters

We define Lazy Term Rewriting Systems and show that they can be realized by local adaptations of an eager implementation of conventional term rewriting systems. The overhead of lazy evaluation is only incurred when lazy evaluation is actually performed.
Our method is modelled by a transformation of term rewriting systems, which concisely expresses the intricate interaction between pattern matching and lazy evaluation. The method easily extends to term graph rewriting. We consider only left-linear, confluent term rewriting systems, but we do not require orthogonality.
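The gist of incurring laziness only on demand can be sketched with explicit thunks in an otherwise eager language (our own toy illustration, not the authors' transformation of term rewriting systems):

```python
class Thunk:
    """A suspended computation, forced at most once (eager host, lazy argument)."""
    def __init__(self, fn):
        self.fn, self.value, self.forced = fn, None, False
    def force(self):
        if not self.forced:
            self.value, self.forced = self.fn(), True
        return self.value

def if_then_else(c, t, e):
    """Lazy positions are thunked: only the taken branch is ever evaluated."""
    return t.force() if c else e.force()

print(if_then_else(True, Thunk(lambda: 1), Thunk(lambda: 1 // 0)))  # 1, division never runs
```

The overhead (allocating and forcing thunks) is paid only where lazy evaluation actually happens, mirroring the abstract's claim that laziness can be added by local adaptations of an eager implementation.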

A Rewrite Mechanism for Logic Programs with Negation
Siva Anantharaman, Gilles Richard

Pure logic programs can be interpreted as rewrite programs, executable with a version of the Knuth-Bendix completion procedure called linear completion. The main advantage here is in avoiding many of the loops inherent in the resolution approach: for most productive loops, linear completion yields a finite set of answers and a finite set of rewrite rules (involving just one predicate), from which all the remaining answers can be deduced. And this 'program synthesizing' aspect can be easily combined with other loop-avoiding techniques using 'marked' literals and substitutions. It is thus natural to ask how much of the rewrite mechanism carries through for deducing negative information from pure programs, and more generally for any normal logic program. In this paper we show that such an extension can be built in a natural way, with ideas from the Clark completion for normal logic programs and the domain of constrained rewriting. The correctness and completeness of this extended mechanism are proved w.r.t. the 3-valued declarative semantics of Kunen for normal programs. We also point out how the semantics of a normal program can in a certain sense be 'parametrized', in terms of the 'meta-reduction' rule set of our approach.

Level-Confluence of Conditional Rewrite Systems with Extra Variables in Right-Hand Sides
Taro Suzuki, Aart Middeldorp, Tetsuo Ida

Level-confluence is an important property of conditional term rewriting systems that allow extra variables in the rewrite rules because it guarantees the completeness of narrowing for such systems. In this paper we present a syntactic condition ensuring level-confluence for orthogonal, not necessarily terminating, conditional term rewriting systems that have extra variables in the right-hand sides of the rewrite rules. To this end we generalize the parallel moves lemma. Our result bears practical significance since the class of systems that fall within its scope can be viewed as a computational model for functional logic programming languages with local definitions, such as let-expressions and where-constructs.

A Polynomial Algorithm Testing Partial Confluence of Basic Semi-Thue Systems
Géraud Sénizergues

We give a polynomial algorithm solving the problem "is S partially confluent on the rational set R?" for finite, basic, noetherian semi-Thue systems. The algorithm is obtained by a polynomial reduction of this problem to the equivalence-problem for deterministic 2-tape finite automata, which has been shown to be polynomially decidable in [Fri-Gre82].

Problems in Rewriting Applied to Categorical Concepts by the Example of a Computational Comonad
Wolfgang Gehrke

We present a canonical system for comonads which can be extended to the notion of a computational comonad [BG92] where the crucial point is to find an appropriate representation. These canonical systems are checked with the help of the Larch Prover [GG91] exploiting a method by G. Huet [Hue90a] to represent typing within an untyped rewriting system. The resulting decision procedures are implemented in the programming language Elf [Pfe89] since typing is directly supported by this language. Finally we outline an incomplete attempt to solve the problem which could be used as a benchmark for rewriting tools.

Relating Two Categorial Models of Term Rewriting
Andrea Corradini, Fabio Gadducci, Ugo Montanari

In recent years there has been a growing interest in categorical models for term rewriting systems (trs's). In our opinion, very interesting are those associating to each trs a cat-enriched structure: a category whose hom-sets are categories. Interpreting rewriting steps as morphisms in hom-categories, these models provide rewriting systems with a concurrent semantics in a clean algebraic way. In this paper we provide a unified presentation of two models recently proposed in the literature by José Meseguer [Mes90, Mes92, MOM93] and John Stell [Ste92, Ste94], respectively, pursuing a critical analysis of both of them. More precisely, we show why they are to a certain extent unsatisfactory in providing a concurrent semantics for rewriting systems. It turns out that the derivation space of Meseguer's Rewriting Logic associated with each term (i.e., the set of coinitial computations) fails in general to form a prime algebraic domain: a condition that is generally considered as expressing a directly implementable model of concurrency for distributed systems (see [Win89]). On the contrary, the resulting derivation space in Stell's model is actually a prime algebraic domain, but too few computations are identified: only disjoint concurrency can be expressed, limiting the degree of parallelism described by the model.

Towards a Domain Theory for Termination Proofs
Stefan Kahrs

We present a general framework for termination proofs for Higher-Order Rewrite Systems. The method is tailor-made for having simple proofs showing the termination of enriched λ-calculi.

Infinitary Lambda Calculi and Böhm Models
Richard Kennaway, Jan Willem Klop, M. Ronan Sleep, Fer-Jan de Vries

In a previous paper we have established the theory of transfinite reduction for orthogonal term rewriting systems. In this paper we perform the same task for the lambda calculus. This results in several new Böhm-like models of the lambda calculus, and new descriptions of existing models.

Proving the Genericity Lemma by Leftmost Reduction is Simple
Jan Kuper

The Genericity Lemma is one of the most important motivations to take, in the untyped lambda calculus, the notion of solvability as a formal representation of the informal notion of undefinedness. We generalise solvability towards typed lambda calculi, and we call this generalisation: usability. We then prove the Genericity Lemma for un-usable terms. The technique of the proof is based on leftmost reduction, which strongly simplifies the standard proof.

(Head-) Normalization of Typeable Rewrite Systems
Steffen van Bakel, Maribel Fernández

In this paper we study normalization properties of rewrite systems that are typeable using intersection types with ω and with sorts. We prove two normalization properties of typeable systems. On the one hand, for all systems that satisfy a variant of the Jouannaud-Okada Recursion Scheme, every term typeable with a type that is not ω is head-normalizable. On the other hand, non-Curryfied terms that are typeable with a type that does not contain ω are normalizable.

Explicit Substitutions with de Bruijn's Levels
Pierre Lescanne, Jocelyne Rouyer-Degli

TODO

A Restricted Form of Higher-Order Rewriting Applied to an HDL Semantics
Richard J. Boulton

An algorithm for a restricted form of higher-order matching is described. The intended usage is for rewrite rules that use function-valued variables in place of some unknown term structure. The matching algorithm instantiates these variables with suitable λ-abstractions when given the term to be rewritten. Each argument of one of the variables is expected to match some unique substructure. Multiple solutions are avoided by making fixed choices when alternative ways to match arise. The algorithm was motivated by correctness proofs of designs written in a hardware description language. The feature of the language's semantics that necessitates the higher-order rewriting is described.

Rewrite Systems for Integer Arithmetic
H. R. Walters, Hans Zantema

We present three term rewrite systems for integer arithmetic with addition, multiplication, and, in two cases, subtraction. All systems are ground confluent and terminating; termination is proved by semantic labelling and recursive path order.
The first system represents numbers by successor and predecessor. In the second, which defines non-negative integers only, digits are represented as unary operators. In the third, digits are represented as constants. The first and the second system are complete; the second and the third system have logarithmic space and time complexity, and are parameterized for an arbitrary radix (binary, decimal, or other radices). Choosing the largest machine-representable single-precision integer as radix results in unbounded arithmetic with machine efficiency.
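To illustrate the flavor of the first, successor/predecessor-based system, here is a small Python sketch (our own toy encoding, not the authors' rewrite system): terms are normalized by repeatedly applying rules such as s(p(x)) → x, x + s(y) → s(x + y) and x * s(y) → x*y + x.

```python
# A toy normalizer for a successor/predecessor system (hypothetical
# encoding): 'z' is 0, ('s', t) / ('p', t) are successor / predecessor,
# and ('+'|'-'|'*', a, b) are the arithmetic operators.
def norm(t):
    if t == 'z':
        return 'z'
    f = t[0]
    if f in ('s', 'p'):
        a = norm(t[1])
        inv = 'p' if f == 's' else 's'
        if isinstance(a, tuple) and a[0] == inv:
            return a[1]                              # s(p(x)) -> x,  p(s(x)) -> x
        return (f, a)
    x, y = norm(t[1]), norm(t[2])
    if f == '+':
        if y == 'z':
            return x                                 # x + 0 -> x
        return norm((y[0], ('+', x, y[1])))          # x + s(y) -> s(x + y), dually for p
    if f == '-':
        if y == 'z':
            return x                                 # x - 0 -> x
        inv = 'p' if y[0] == 's' else 's'
        return norm((inv, ('-', x, y[1])))           # x - s(y) -> p(x - y), dually for p
    # only '*' reaches this point
    if y == 'z':
        return 'z'                                   # x * 0 -> 0
    if y[0] == 's':
        return norm(('+', ('*', x, y[1]), x))        # x * s(y) -> x*y + x
    return norm(('-', ('*', x, y[1]), x))            # x * p(y) -> x*y - x

def enc(n):                                          # integer -> term
    if n == 0:
        return 'z'
    return ('s', enc(n - 1)) if n > 0 else ('p', enc(n + 1))

def dec(t):                                          # normal form -> integer
    return 0 if t == 'z' else (1 if t[0] == 's' else -1) + dec(t[1])
```

For example, `dec(norm(('*', enc(3), enc(-2))))` evaluates to -6; every normal form is a pure stack of `s`'s or `p`'s over `z`.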

General Solution of Systems of Linear Diophantine Equations and Inequations
Habib Abdulrab, Marianne Maksimenko

Given a system gs of linear diophantine equations and inequations of the form Li #i Mi, i=1,...,n, where #i ∈ {=,<,>,≠,≥,≤} we compute a finite set S of numerical and parametric solutions describing the set of all the solutions of gs (i.e. its general solution). Our representation of the general solution gives direct and simple functions generating the set of all the solutions: this is obtained by giving all the nonnegative natural values to the integer variables of the right hand-side of the parametric solutions, without any linear combination. In particular, unlike the usual representation based on minimal solutions, our representation of the general solution is nonambiguous: given any solution s of gs, it can be deduced from a unique numerical or parametric solution of S.
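For contrast with the paper's finite symbolic description, a brute-force Python sketch of the solution sets in question (all encodings here are ours and purely illustrative): it merely enumerates the bounded nonnegative solutions of a system Li #i Mi, where the paper instead computes a finite set of numerical and parametric solutions covering all of them.

```python
import itertools
import operator

OPS = {'=': operator.eq, '<': operator.lt, '>': operator.gt,
       '!=': operator.ne, '>=': operator.ge, '<=': operator.le}

def bounded_solutions(constraints, nvars, bound):
    """Enumerate nonnegative solutions of a system L_i #_i M_i with all
    variables <= bound.  Each side is (coefficients, constant), read as
    sum(c * x) + constant.  Brute force only -- no parametric description."""
    def value(side, xs):
        coeffs, const = side
        return sum(c * x for c, x in zip(coeffs, xs)) + const
    return [xs for xs in itertools.product(range(bound + 1), repeat=nvars)
            if all(OPS[op](value(l, xs), value(r, xs))
                   for l, op, r in constraints)]

# The system  2x + 3y = 12  together with  x < y:
system = [(([2, 3], 0), '=', ([0, 0], 12)),
          (([1, 0], 0), '<', ([0, 1], 0))]
```

Here `bounded_solutions(system, 2, 12)` returns `[(0, 4)]`: of the three solutions of the equation, only one also satisfies the inequation.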

Combination of Constraint Solving Techniques: An Algebraic Point of View
Franz Baader, Klaus U. Schulz

In a previous paper we have introduced a method that allows one to combine decision procedures for unifiability in disjoint equational theories. Lately, it has turned out that the prerequisite for this method to apply - namely that unification with so-called linear constant restrictions is decidable in the single theories - is equivalent to requiring decidability of the positive fragment of the first order theory of the equational theories. Thus, the combination method can also be seen as a tool for combining decision procedures for positive theories of free algebras defined by equational theories. Complementing this logical point of view, the present paper isolates an abstract algebraic property of free algebras - called combinability - that clarifies why our combination method applies to such algebras. We use this algebraic point of view to introduce a new proof method that depends on abstract notions and results from universal algebra, as opposed to technical manipulations of terms (such as ordered rewriting, abstraction functions, etc.). With this proof method, the previous combination results for unification can easily be extended to the case of constraint solvers that also take relational constraints (such as ordering constraints) into account. Background information from universal algebra about free structures is given to clarify the algebraic meaning of our results.

Some Independent Results for Equational Unification
Friedrich Otto, Paliath Narendran, Daniel J. Dougherty

For finite convergent term-rewriting systems the equational unification problem is shown to be recursively independent of the equational matching problem, the word matching problem, and the (simultaneous) 2nd-order equational matching problem. We also present some new decidability results for simultaneous equational unification and 2nd-order equational matching.

Regular Substitution Sets: A Means of Controlling E-Unification
Jochen Burghardt

A method for selecting solution constructors in narrowing is presented. The method is based on a sort discipline that describes regular sets of ground constructor terms as sorts. It is extended to cope with regular sets of ground substitutions, thus allowing different sorts to be computed for terms with different variable bindings. An algorithm for computing signatures of equationally defined functions is given that allows potentially infinite overloading. Applications to formal program development are sketched.

DISCOUNT: A System for Distributed Equational Deduction
Jürgen Avenhaus, Jörg Denzinger, Matthias Fuchs

TODO

ASTRE: Towards a Fully Automated Program Transformation System
Françoise Bellegarde

Astre has achieved its initial goal: it is fully automatic for deforestation and two-loop fusion of functional programs. Other fully automatic deforestation algorithms do not include two-loop fusion; they reject all deforestations that necessitate laws, whereas Astre can perform most of them using additional synthesis. Moreover, they do not extend easily to include laws or other strategies. The limitation of Astre is the termination obligation on the input rewrite system. Astre also does not easily process a large number of rules: the present prototype has been used on inputs of up to 500 rules, and at this size it becomes intractable to perform all the syntheses. We plan to automate the derecursion tactical and the automatic insertion of laws [3] in the near future.

Parallel ReDuX → PaReDuX
Reinhard Bündgen, Manfred Göbel, Wolfgang Küchlin

TODO

STORM: A Many-to-One Associative-Commutative Matcher
Ta Chen, Siva Anantharaman

TODO

LEMMA: A System for Automated Synthesis of Recursive Programs in Equational Theories
Jacques Chazarain, Serge Muller

TODO

Generating Polynomial Orderings for Termination Proofs
Jürgen Giesl

Most systems for the automation of termination proofs using polynomial orderings are only semi-automatic, i.e. the "right" polynomial ordering has to be given by the user. We show that a variation of Lankford's partial derivative technique leads to an easier and slightly more powerful method than most other semi-automatic approaches. Based on this technique we develop a method for the automated synthesis of a suitable polynomial ordering.
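To see what such an ordering certifies, here is a minimal sketch (rule and interpretation are our own, purely illustrative): a candidate polynomial interpretation orients a rule if the interpreted left-hand side strictly exceeds the right-hand side for all arguments above some bound.

```python
import itertools

# Hypothetical rule: associativity oriented left-to-right,
#     a(a(x, y), z) -> a(x, a(y, z)),
# with the candidate interpretation [a](X, Y) = 2X + Y over integers >= 1.
def A(X, Y):
    return 2 * X + Y

def lhs(x, y, z):
    return A(A(x, y), z)        # [lhs] = 4x + 2y + z

def rhs(x, y, z):
    return A(x, A(y, z))        # [rhs] = 2x + 2y + z

# Checking [lhs] > [rhs] on a finite grid is only evidence; the partial
# derivative technique establishes it symbolically ([lhs] - [rhs] = 2x,
# which is positive for all x >= 1).
grid = itertools.product(range(1, 8), repeat=3)
assert all(lhs(x, y, z) > rhs(x, y, z) for x, y, z in grid)
```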

Disguising Recursively Chained Rewrite Rules as Equational Theorems, as Implemented in the Prover EFTTP Mark 2
M. Randall Holmes

TODO

Prototyping Completion with Constraints Using Computational Systems
Hélène Kirchner, Pierre-Étienne Moreau

We use computational systems to express a completion with constraints procedure that gives priority to simplifications. Computational systems are rewrite theories enriched by strategies. The implementation of completion in ELAN, an interpreter of computational systems, is especially convenient for experimenting with different simplification strategies, thanks to the powerful strategy language of ELAN.

Guiding Term Reduction Through a Neural Network: Some Preliminary Results for Group Theory
Alberto Paccanaro

Some experiments have been carried out to build neural networks which, given a term belonging to an equational theory, suggest which rewrite rules of the completed TRS for that theory are the best choice at each reduction step, so as to minimize the number of reductions needed to reach the normal form. For group theory, a net was built with an accuracy of 61%; moreover, in 71% of the cases the same net could correctly suggest a rule applicable to the term.

Studying Quasigroup Identities by Rewriting Techniques: Problems and First Results
Mark E. Stickel, Hantao Zhang

Finite quasigroups in the form of Latin squares have been extensively studied in design theory. Some quasigroups satisfy constraints in the form of equations, called quasigroup identities. In this note, we propose some questions concerning quasigroup identities that can sometimes be answered by rewriting techniques.
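The objects involved are easy to check by brute force, as this Python sketch shows (table and identity are our own example): a finite quasigroup is a Latin square, and an identity is a universally quantified equation verified over the whole multiplication table.

```python
# The multiplication table of a small quasigroup: x . y = (x - y) mod 5.
n = 5
mul = [[(x - y) % n for y in range(n)] for x in range(n)]

def is_latin_square(M):
    """Every row and every column is a permutation of 0..n-1."""
    k = len(M)
    rows = all(sorted(row) == list(range(k)) for row in M)
    cols = all(sorted(M[x][y] for x in range(k)) == list(range(k))
               for y in range(k))
    return rows and cols

def satisfies_identity(M):
    """Check the quasigroup identity x . (x . y) = y over the whole table."""
    k = len(M)
    return all(M[x][M[x][y]] == y for x in range(k) for y in range(k))

assert is_latin_square(mul) and satisfies_identity(mul)
```

The identity holds here because x - (x - y) = y modulo 5; the interesting questions in the note concern which identities force or forbid Latin squares of a given order.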

Problems in Rewriting III
Nachum Dershowitz, Jean-Pierre Jouannaud, Jan Willem Klop

We presented lists of open problems in the theory of rewriting in the proceedings of the previous two conferences [36; 37]. We continue with that tradition this year. We give references to solutions to eleven problems from the previous lists, report on progress on several others, provide a few reformulations of old problems, and include ten new problems.

1993

Redundancy Criteria for Constrained Completion
Christopher Lynch, Wayne Snyder

We study the problem of completion in the case of equations with constraints consisting of first-order formulae over equations, disequations, and an irreducibility predicate. We present several inference systems which show in a very precise way how to take advantage of redundancy notions in the context of constrained equational reasoning. A notable feature of these systems is the variety of tradeoffs they present for removing redundant instances of the equations involved in an inference. This combines in one consistent framework almost all practical critical pair criteria, including the notion of Basic Completion. In addition strict improvements of currently known criteria are developed.

Bi-rewriting, a Term Rewriting Technique for Monotonic Order Relations
Jordi Levy, Jaume Agustí-Cullell

We propose an extension of rewriting techniques to derive inclusion relations a ⊆ b between terms built from monotonic operators. Instead of using only a rewriting relation → and rewriting a to b, we use two rewriting relations →⊆ and →⊇ and seek a common expression c such that a →⊆* c and b →⊇* c. Each component of the bi-rewriting system (→⊆, →⊇) is allowed to be a subset of the corresponding inclusion ⊆ or ⊇. In order to assure the decidability and completeness of the proof procedure we study the commutativity of →⊆ and →⊇. We also extend the existing techniques of rewriting modulo equalities to bi-rewriting modulo a set of inclusions. We present the canonical bi-rewriting system corresponding to the theory of non-distributive lattices.

A Case Study of Completion Modulo Distributivity and Abelian Groups
Hantao Zhang

We propose an approach for building equational theories with the objective of improving the performance of the completion procedure, even though there exist canonical rewrite systems for these theories. As a test case of our approach, we show how to build the free Abelian groups and distributivity laws in the completion procedure. The empirical results of our experiment on proving many identities in alternative rings show clearly that the gain of this approach is substantial. More than 30 identities which are valid in any alternative ring are taken from the book "Rings that are nearly associative" by K. A. Zhevlakov et al., and include the Moufang identities and the skew-symmetry of the Kleinfeld function. The proofs of these identities are obtained by Herky, a descendent of RRL and a high-performance rewriting-based theorem prover.

A Semantic Approach to Order-Sorted Rewriting
Andreas Werner

Order-sorted rewriting builds a nice framework to handle partially defined functions and subtypes (see [Smolka & al 87]). In the previous works about order-sorted rewriting the term rewriting system needs to be sort decreasing in order to be able to prove a critical pair lemma and Birkhoff's completeness theorem. However, this approach is too restrictive.
Therefore, we generalize well-sorted terms to semantically well-sorted terms and well-sorted substitutions to a kind of semantically well-sorted substitutions. Semantically well-sorted terms with respect to a set of equations E are terms that denote well-defined elements in every algebra satisfying E.
We prove a critical pair lemma and Birkhoff's completeness theorem for so-called range-unique signatures and arbitrary order-sorted rewriting systems. A transformation is given which allows one to obtain an equivalent range-unique signature from each non-range-unique one. We also show some decidability results.

Distributing Equational Theorem Proving
Jürgen Avenhaus, Jörg Denzinger

In this paper we show that distributing the theorem proving task to several experts is a promising idea. We describe the team work method which allows the experts to compete for a while and then to cooperate. In the cooperation phase the best results derived in the competition phase are collected and the less important results are forgotten. We describe some useful experts and explain in detail how they work together. We establish fairness criteria and so prove the distributed system to be both complete and correct. We have implemented our system and show by non-trivial examples that drastic speed-ups are possible for a cooperating team of experts compared to the time needed by the best expert in the team.

On the Correctness of a Distributed Memory Gröbner basis Algorithm
Soumen Chakrabarti, Katherine A. Yelick

We present an asynchronous MIMD algorithm for Gröbner basis computation. The algorithm is based on the well-known sequential algorithm of Buchberger. Two factors make the correctness of our algorithm nontrivial: the nondeterminism that is inherent with asynchronous parallelism, and the distribution of data structures which leads to inconsistent views of the global state of the system. We demonstrate that by describing the algorithm as a nondeterministic sequential algorithm, and presenting the optimized parallel algorithm through a series of refinements to that algorithm, the algorithm is easier to understand and the correctness proof becomes manageable. The proof does, however, rely on algebraic properties of the polynomials in the computation, and does not follow directly from the proof of Buchberger's algorithm.
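The sequential baseline being parallelized is Buchberger's algorithm. A minimal Python sketch of that baseline (dictionary representation of polynomials in two variables, lex order with x > y; all names are ours, not the paper's code): repeatedly reduce S-polynomials of critical pairs and add the nonzero remainders to the basis.

```python
from fractions import Fraction as Q
from itertools import combinations

# Polynomials in x, y as {(i, j): coeff}, meaning coeff * x^i * y^j;
# monomials are compared lexicographically with x > y.
def lead(p):
    m = max(p)
    return m, p[m]

def sub(p, q):
    r = dict(p)
    for m, c in q.items():
        r[m] = r.get(m, Q(0)) - c
        if r[m] == 0:
            del r[m]
    return r

def shift(p, m, c):                      # p * c * x^m
    return {tuple(a + b for a, b in zip(mm, m)): cc * c for mm, cc in p.items()}

def remainder(p, G):                     # multivariate division by the set G
    p, r = dict(p), {}
    while p:
        mp, cp = lead(p)
        for g in G:
            mg, cg = lead(g)
            if all(a >= b for a, b in zip(mp, mg)):
                p = sub(p, shift(g, tuple(a - b for a, b in zip(mp, mg)), cp / cg))
                break
        else:                            # leading monomial irreducible: move it out
            r[mp] = cp
            del p[mp]
    return r

def spoly(f, g):                         # S-polynomial: cancel the leading terms
    mf, cf = lead(f)
    mg, cg = lead(g)
    l = tuple(max(a, b) for a, b in zip(mf, mg))
    return sub(shift(f, tuple(a - b for a, b in zip(l, mf)), 1 / cf),
               shift(g, tuple(a - b for a, b in zip(l, mg)), 1 / cg))

def buchberger(F):
    """Sequential Buchberger loop: add reduced S-polynomials until every
    critical pair reduces to zero."""
    G = [dict(f) for f in F]
    pairs = list(combinations(range(len(G)), 2))
    while pairs:
        i, j = pairs.pop()
        r = remainder(spoly(G[i], G[j]), G)
        if r:
            G.append(r)
            pairs += [(k, len(G) - 1) for k in range(len(G) - 1)]
    return G

# Ideal generated by x^2 - y and x^3 - x; y^2 - y lies in it.
F = [{(2, 0): Q(1), (0, 1): Q(-1)}, {(3, 0): Q(1), (1, 0): Q(-1)}]
G = buchberger(F)
assert remainder({(0, 2): Q(1), (0, 1): Q(-1)}, G) == {}
```

The nondeterminism the paper exploits is visible in `pairs.pop()`: any fair pair-selection strategy yields a correct basis, which is what makes an asynchronous parallel version plausible yet nontrivial to prove correct.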

Improving Transformation Systems for General E-Unification
Max Moser

In this paper we motivate and present a new and improved transformation system for general E-unification. It can be seen as a modification of the original transformation system by Gallier and Snyder refined by ordinary unification and basic paramodulation. We present a short proof of completeness. Besides completeness we can also show an important property of the transformation system which is not known for the original system: independence of the selection rule. This motivates the abstraction of transformation sequences to equational proof trees thus obtaining static proof objects which facilitates finding further refinements of the procedure.

Equational and Membership Constraints for Finite Trees
Joachim Niehren, Andreas Podelski, Ralf Treinen

We present a new constraint system with equational and membership constraints over infinite trees. It provides for complete and correct satisfiability and entailment tests and is therefore suitable for the use in concurrent constraint programming systems which are based on cyclic data structures.
Our set defining devices are greatest fixpoint solutions of regular systems of equations with a deterministic form of union. As the main technical particularity of the algorithms we present a novel memorization technique. We believe that both satisfiability and entailment tests can be implemented in an efficient and incremental manner.

Regular Path Expressions in Feature Logic
Rolf Backofen

We examine the existential fragment of a feature logic, which is extended by regular path expressions. A regular path expression is a subterm relation, where the allowed paths for the subterms are restricted by a regular language. We will prove that satisfiability is decidable. This is achieved by setting up a quasi-terminating rewrite system.

Some Lambda Calculi with Categorial Sums and Products
Daniel J. Dougherty

We consider the simply typed λ-calculus with primitive recursion operators and types corresponding to categorical products and coproducts. The standard equations corresponding to extensionality and to surjectivity of pairing and its dual are oriented as expansion rules. Strong normalization and ground (base-type) confluence are proved for the full calculus; full confluence is proved for the calculus omitting the rule for strong sums. In the latter case, fixed-point constructors may be added while retaining confluence.

Paths, Computations and Labels in the Lambda-Calculus
Andrea Asperti, Cosimo Laneve

We provide a new characterization of Lévy's redex-families in the λ-calculus [11] as suitable paths in the initial term of the derivation. The idea is that redexes in a same family are created by "contraction" (via β-reduction) of a unique common path in the initial term. This fact gives new evidence about the "common nature" of redexes in a same family, and about the possibility of sharing their reduction. From this point of view, our characterization underlies all recent works on optimal graph reduction techniques for the λ-calculus [9,6,7,1], providing an original and intuitive understanding of optimal implementations.
As an easy by-product, we prove that neither overlining nor underlining are required in Lévy's labelling.

Confluence and Superdevelopments
Femke van Raamsdonk

In this paper a short proof is presented for confluence of a quite general class of reduction systems, containing λ-calculus and term rewrite systems: the orthogonal combinatory reduction systems. Combinatory reduction systems (CRSs for short) were introduced by Klop, generalizing an idea of Aczel. In CRSs, the usual first-order term rewriting format is extended with binding structures for variables. This permits expressing, besides first-order term rewriting, also λ-calculus, extensions of λ-calculus and proof normalizations. Confluence will be proved for orthogonal CRSs, that is, for those CRSs having left-linear rules and no critical pairs. The proof proceeds along the lines of the proof of Tait and Martin-Löf for confluence of λ-calculus, but uses a different notion of 'parallel reduction', as employed by Aczel. It gives rise to an extended notion of development, called 'superdevelopment'. A superdevelopment is a reduction sequence in which, besides redexes that descend from the initial term, also some redexes that are created during reduction may be contracted. For the case of λ-calculus, all superdevelopments are proved to be finite. A link with the confluence proof is provided by proving that superdevelopments characterize exactly Aczel's notion of 'parallel reduction' used to obtain confluence.

Relating Graph and Term Rewriting via Böhm Models
Zena M. Ariola

Dealing properly with sharing is important for expressing some of the common compiler optimizations, such as common subexpression elimination, lifting of free expressions and removal of invariants from a loop, as source-to-source transformations. Graph rewriting is a suitable vehicle to accommodate these concerns. In [4] we have presented a term model for graph rewriting systems (GRSs) without interfering rules, and shown the partial correctness of the aforementioned optimizations. In this paper we define a different model for GRSs, which allows us to prove total correctness of those optimizations. Differently from [4], we will discard sharing from our observations and introduce more restrictions on the rules. We will introduce the notion of Böhm tree for GRSs, and show that in a system without interfering rules and without non-left-linear rules (orthogonal GRSs), Böhm tree equivalence defines a congruence. Total correctness then follows in a straightforward way from showing that if a program M contains less sharing than a program N, then both M and N have the same Böhm tree.
We will also show that orthogonal GRSs are a correct implementation of orthogonal TRSs. The basic idea of the proof is to show that the behavior of a graph can be deduced from its finite approximations, that is, graph rewriting is a continuous operation. Our approach differs from that of other researchers [6, 9], which is based on infinite rewriting.

Topics in Termination
Nachum Dershowitz, Charles Hoot

We generalize the various path orderings and the conditions under which they work, and describe an implementation of this general ordering. We look at methods for proving termination of orthogonal systems and give a new solution to a problem of Zantema's.

Total Termination of Term Rewriting
Maria C. F. Ferreira, Hans Zantema

We investigate proving termination of term rewriting systems by interpretation of terms in a compositional way in a total well-founded order. This kind of termination is called total termination. On one hand it is more restrictive than simple termination, on the other it generalizes most of the usual techniques for proving termination. For total termination it turns out that below ε0 the only orders of interest are built from the natural numbers by lexicographic product and the multiset construction. By examples we show that both constructions are essential. For a wide class of term rewriting systems we prove that total termination is a modular property. Most of our techniques are based on ordinal arithmetic.
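One of the two constructions mentioned, the multiset extension of a well-founded order (Dershowitz-Manna), can be sketched in a few lines of Python (a generic sketch, not the paper's formalism):

```python
from collections import Counter

def multiset_gt(M, N, gt=lambda a, b: a > b):
    """Dershowitz-Manna multiset extension of a strict order `gt`:
    M > N iff the multisets differ and every element that N gains is
    dominated by some element that M loses.  The extension of a
    well-founded order is again well-founded."""
    M, N = Counter(M), Counter(N)
    only_M, only_N = M - N, N - M
    if not only_M and not only_N:
        return False                    # equal multisets are incomparable
    return all(any(gt(m, n) for m in only_M) for n in only_N)

# Replacing one 3 by arbitrarily many smaller elements decreases the multiset:
assert multiset_gt([3, 3, 1], [3, 2, 2, 2, 2, 1])
assert not multiset_gt([1, 2, 2], [2, 1, 2])     # same multiset, reordered
```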

Simple Termination is Difficult
Aart Middeldorp, Bernhard Gramlich

A terminating term rewriting system is called simply terminating if its termination can be shown by means of a simplification ordering, an ordering with the property that a term is always bigger than its proper subterms. Almost all methods for proving termination yield, when applicable, simple termination. We show that simple termination is an undecidable property, even for one-rule systems. This contradicts a result by Jouannaud and Kirchner. The proof is based on the ingenious construction of Dauchet who showed the undecidability of termination for one-rule systems.

Optimal Normalization in Orthogonal Term Rewriting Systems
Zurab Khasidashvili

We design a normalizing strategy for orthogonal term rewriting systems (OTRSs), which is a generalization of the call-by-need strategy of Huet-Lévy [4]. The redexes contracted in our strategy are essential in the sense that they have "descendants" under any reduction of a given term. There is an essential redex in any term not in normal form. We further show that contraction of the innermost essential redexes gives an optimal reduction to normal form, if it exists. We classify OTRSs depending on possible kinds of redex creation as non-creating, persistent, inside-creating, non-left-absorbing, etc. All these classes are decidable. TRSs in these classes are sequential, but they do not need to be strongly sequential. For non-creating and persistent OTRSs, we show that our optimal strategy is efficient as well.

A Graph Reduction Approach to Incremental Term Rewriting (Preliminary Report)
John Field

Our concern is incremental term rewriting: efficient normalization of a sequence of terms that are related to one another by some set of disjoint subterm replacements. Such sequences of similar terms arise frequently in practical applications of term rewriting systems. Previous approaches to this problem [9, 10] have applied only to a limited class of reduction systems and rewriting strategies. In this paper, we present a new algorithm, INC_ℛ, for carrying out incremental term rewriting in an arbitrary left-linear term rewriting system possessing a non-parallel normalizing rewriting strategy ℛ. This algorithm is based on a novel variant of graph rewriting.

Generating Tables for Bottom-Up Matching
Ernst Lippe

Matching forms a bottleneck in most implementations of rewrite systems. Bottom-up matching is a very fast form of matching. This paper presents a new approach to bottom-up matching that replaces match-sets by their unifiers. In this way it is possible to define a subsumption ordering on the states. A new algorithm is presented that uses the subsumption graph to compute the bottom-up tables. Its time complexity per table entry is O(rank × wd), where rank is the maximum arity of a function symbol and wd is the maximum number of immediate predecessors in the subsumption graph.
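For readers unfamiliar with match-sets, here is a naive Python sketch of the idea the tables cache (pattern and names are ours): at each node of the subject term we compute the set of pattern fragments that match there, bottom-up; a table-driven matcher precomputes the transitions over these sets instead of recomputing them.

```python
PAT = ('f', ('g', '*'), '*')               # pattern f(g(*), *); '*' is a wildcard

def fragments(p, acc=None):
    """All subpatterns of a pattern: the 'states' a bottom-up matcher
    distinguishes."""
    acc = set() if acc is None else acc
    acc.add(p)
    if isinstance(p, tuple):
        for c in p[1:]:
            fragments(c, acc)
    return acc

FRAGS = fragments(PAT)

def match_set(t):
    """The set of pattern fragments matching at the root of subject term t."""
    kids = [match_set(c) for c in t[1:]]
    s = {'*'}                              # the wildcard matches everywhere
    for p in FRAGS:
        if (isinstance(p, tuple) and p[0] == t[0] and len(p) == len(t)
                and all(p[i + 1] in kids[i] for i in range(len(kids)))):
            s.add(p)
    return s

subject = ('f', ('g', ('a',)), ('b',))     # the term f(g(a), b)
assert PAT in match_set(subject)           # the full pattern matches at the root
```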

Combination Techniques and Decision Problems for Disunification
Franz Baader, Klaus U. Schulz

Former work on combination techniques was concerned with combining unification algorithms for disjoint equational theories E1,..., En in order to obtain a unification algorithm for the union E1 ∪ ... ∪ En of the theories. Here we show that variants of this method may be applied to disunification as well. Solvability of disunification problems in the free algebra of the combined theory E1 ∪ ... ∪ En is shown to be decidable if solvability of disunification problems with linear constant restrictions in the free algebras of the theories Ei is decidable. In order to decide ground solvability (i.e., solvability in the initial algebra) of disunification problems in E1 ∪ ... ∪ En we have to consider a new kind of subproblem for the particular theories Ei, namely solvability (in the free algebra) of disunification problems with linear constant restriction under the additional constraint that values of variables are not Ei-equivalent to variables. The correspondence between ground solvability and this new kind of solvability holds (1) if one theory Ei is the free theory with at least one function symbol and one constant, or (2) if the initial algebras of all theories Ei are infinite. Our results can be used to show that the existential fragment of the theory of the (ground) term algebra modulo associativity of a finite number of function symbols is decidable; the same result follows for function symbols which are associative and commutative, or associative, commutative and idempotent.

The Negation Elimination from Syntactic Equational Formulas is Decidable
Mohamed Tajine

In this paper we introduce the property of finitely generated sets in an algebra. This property generalizes several notions of rewriting and logic programming; for example, unification and disunification are specific cases of this notion. We use this property to formalize the problem of negation elimination in syntactic equational formulae (i.e. first-order formulae whose only predicate is syntactic equality) and we prove that this problem is decidable.

Encompassment Properties and Automata with Constraints
Anne-Cécile Caron, Jean-Luc Coquidé, Max Dauchet

We introduce a class of tree automata with constraints which gives an algebraic and algorithmic framework in order to extend the theorem of decidability of inductive reducibility. We use automata with equality constraints in order to solve encompassment constraints, and we combine such automata in order to solve every first-order formula built up from the unary predicates "x encompasses t", denoted by encomp_t(x).

Recursively Defined Tree Transductions
Jean-Claude Raoult

We give an equational definition for relations over trees, show that they can be described by rational expressions, and give sufficient restrictions on the generated relations to ensure the rationality of their domain and range, and their stability under inverse, composition and substitution. In this way we get "rational tree transductions", extending to the case of trees the well-known rational transductions over words.

AC Complement Problems: Satisfiability and Negation Elimination
Maribel Fernández

We show that negation elimination is decidable for linear complement problems interpreted modulo AC, where AC is a set of associativity and commutativity axioms. For this, we present a system of rewrite rules that transforms any linear complement problem into a simple formula, and we give a test for deciding whether a simple formula is satisfiable modulo AC or not. This test serves as a basis for the development of a negation elimination algorithm.

A Precedence-Based Total AC-Compatible Ordering
Albert Rubio, Robert Nieuwenhuis

TODO
An important difference w.r.t. their work is that our ordering is not based on polynomial interpretations, but on a total (arbitrary) precedence on the function symbols, like in LPO or RPO (this solves an open question posed e.g. by Bachmair [Bac91]).
A second difference is that we define an extension to terms with variables, which makes the ordering applicable in practice for complete theorem proving strategies with built-in AC-unification and for orienting non-ground rewrite systems.
Our ordering is defined in a simple way by means of rewrite rules, and can be easily (and efficiently) implemented, since its main component is RPO.

Extension of the Associative Path Ordering to a Chain of Associative Commutative Symbols
Catherine Delor, Laurence Puel

In this paper, we give a generalization of the associative path ordering. This ordering has been introduced by Bachmair and Plaisted [5] and is a restricted variant of the recursive path ordering which can be used for proving the termination of associative-commutative term rewriting systems. This ordering requires strong conditions on the precedence on the alphabet. In this article, we treat the case of a precedence which contains a chain of AC symbols. We also introduce some unary symbols comparable with AC symbols.

Polynomial Time Termination and Constraint Satisfaction Tests
David A. Plaisted

We show that the termination of ground term-rewriting systems is decidable in polynomial time. This result is extended to ground rational term-rewriting systems. We apply this result to show that the problem of determining whether there exists a simplification ordering over a possibly extended signature, satisfying a set of strict inequalities between terms, is decidable in polynomial time. As a simple consequence, it is decidable in polynomial time whether there exists a simplification ordering which shows that a ground term rewriting system terminates.

Linear Interpretations by Counting Patterns
Ursula Martin

We introduce a new family of well-founded monotonic orderings on terms, constructed by counting certain patterns in terms called zig-zags. These extend the familiar Knuth-Bendix orderings, providing in general continuum-many distinct new orderings with a given choice of Knuth-Bendix weight.
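The baseline being extended is the Knuth-Bendix weight comparison, sketched here in Python over a hypothetical signature (symbols, weights and terms are ours, purely illustrative):

```python
# Terms are (symbol, [subterms]) or a variable name (a string); every
# variable occurrence gets the minimal weight w0.
def weight(t, w, w0=1):
    """Knuth-Bendix weight: the sum of symbol weights over all symbol
    and variable occurrences in the term."""
    if isinstance(t, str):                 # variable
        return w0
    f, args = t
    return w[f] + sum(weight(a, w, w0) for a in args)

w = {'f': 1, 'g': 0, 'a': 1}               # hypothetical weight assignment
lhs = ('f', [('f', ['x'])])                # f(f(x)), weight 1 + 1 + 1 = 3
rhs = ('g', [('f', ['x'])])                # g(f(x)), weight 0 + 1 + 1 = 2
assert weight(lhs, w) > weight(rhs, w)     # so KBO orients f(f(x)) -> g(f(x))
```

The orderings in the paper refine exactly the tie-breaking situation: when two terms have equal weight, counting zig-zag patterns yields continuum-many further ways to separate them.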

Some Undecidable Termination Problems for Semi-Thue Systems (Abstract)
Géraud Sénizergues

We show that the uniform termination problem is undecidable for length-preserving semi-Thue systems having 10 rules. We then give an explicit uniformly-terminating semi-Thue system T having 9 rules which is "universal with respect to termination problems" in some sense. It follows that there exists a fixed rule (u0, v0) such that T ∪ {(u0, v0)} has 10 rules and an undecidable termination problem.
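A semi-Thue system rewrites words by replacing factors according to string rules. A minimal sketch of one-step and many-step rewriting, assuming a simple leftmost strategy; the rule set below is an illustrative example, not one of the 9- or 10-rule systems constructed in the paper:

```python
def rewrite_once(word, rules):
    """Apply the first applicable rule (u, v) at the leftmost position,
    replacing the factor u by v; return None if the word is irreducible."""
    for i in range(len(word)):
        for u, v in rules:
            if word.startswith(u, i):
                return word[:i] + v + word[i + len(u):]
    return None

def normalize(word, rules, max_steps=1000):
    """Rewrite until irreducible (may not halt for non-terminating systems,
    hence the step bound)."""
    for _ in range(max_steps):
        nxt = rewrite_once(word, rules)
        if nxt is None:
            return word
        word = nxt
    raise RuntimeError("step bound exceeded; system may be non-terminating")

# A length-preserving, uniformly terminating system: "ba" -> "ab" sorts a word.
rules = [("ba", "ab")]
print(normalize("bbaba", rules))  # -> "aabbb"
```

Uniform termination asks whether *every* word normalizes under *every* rewriting strategy; the result above shows this question becomes undecidable already at 10 length-preserving rules.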

Saturation of First-Order (Constrained) Clauses with the Saturate System
Pilar Nivela, Robert Nieuwenhuis

TODO

MERILL: An Equational Reasoning System in Standard ML
Brian Matthews

TODO

Reduce the Redex → ReDuX
Reinhard Bündgen

The ReDuX system is a workbench for programming and experimenting with term rewriting systems. It is focused on the implementation of completion procedures, with special emphasis on inductive completion. From the programmer's point of view, ReDuX provides a large library of data types and algorithms (over 450) which allows for high-level programming. The experimentalist also finds a collection of ready-to-run programs (see Table 1, and [WB91]). ReDuX has been developed as an extension of the TC- and IC-systems [Küc82a, Bün87] and has been used as a research tool over the last years. For the last two years it has also been employed as a tutorial system for courses on term rewriting systems at the University of Tübingen.

AGG - An Implementation of Algebraic Graph Rewriting
Michael Löwe, Martin Beyer

TODO

Smaran: A Congruence-Closure Based System for Equational Computations
Rakesh M. Verma

TODO

LAMBDALG: Higher Order Algebraic Specification Language
Yexuan Gui, Mitsuhiro Okada

TODO

More Problems in Rewriting
Nachum Dershowitz, Jean-Pierre Jouannaud, Jan Willem Klop

Two years ago, in the proceedings of the previous conference, we presented a list of open problems in the theory of rewriting [Dershowitz et al., 1991a]. This time, we report on progress made during the intervening time, and then list some new problems. (A few additional questions on the subject appear in the back of [Diekert, 1990].) We also mention a couple of long-standing open problems which have recently been answered. The last section contains a partisan list of interesting areas for future research. A new, comprehensive survey of the field is [Klop, 1992].
Please send any contributions by electronic or ordinary mail to any of us. We hope to continue periodically publicizing new problems and solutions to old ones. We thank all the individuals who contributed questions, updates and solutions.

1991

Transfinite Reductions in Orthogonal Term Rewriting Systems (Extended Abstract)
Richard Kennaway, Jan Willem Klop, M. Ronan Sleep, Fer-Jan de Vries

Strongly convergent reduction is the fundamental notion of reduction in infinitary orthogonal term rewriting systems (OTRSs). For these we prove the Transfinite Parallel Moves Lemma and the Compressing Lemma. Strongness is necessary as shown by counterexamples. Normal forms, which we allow to be infinite, are unique, in contrast to ω-normal forms. Strongly converging fair reductions result in normal forms.
In general OTRSs the infinite Church-Rosser Property fails for strongly converging reductions. However for Böhm reduction (as in Lambda Calculus, subterms without head normal forms may be replaced by ⊥) the infinite Church-Rosser property does hold. The infinite Church-Rosser Property for non-unifiable OTRSs follows. The top-terminating OTRSs of Dershowitz et al. are examples of non-unifiable OTRSs.

Redex Capturing in Term Graph Rewriting (Concise Version)
William M. Farmer, Ronald J. Watro

Term graphs are a natural generalization of terms in which structure sharing is allowed. Structure sharing makes term graph rewriting a time- and space-efficient method for implementing term rewrite systems. Certain structure sharing schemes can lead to a situation in which a term graph component is rewritten to another component that contains the original. This phenomenon, called redex capturing, introduces cycles into the term graph which is being rewritten - even when the graph and the rule themselves do not contain cycles. In some applications, redex capturing is undesirable, such as in contexts where garbage collectors require that graphs be acyclic. In other applications, for example in the use of the fixed-point combinator Y, redex capturing acts as a rewriting optimization. We show, using results about infinite rewritings of trees, that term graph rewriting with arbitrary structure sharing (including redex capturing) is sound for left-linear term rewrite systems.

Rewriting, and Equational Unification: the Higher-Order Cases
David A. Wolfram

We give here a general definition of term rewriting in the simply typed lambda-calculus, and use it to define higher-order forms of term rewriting systems, and equational unification and their properties. This provides a basis for generalizing the first- and restricted higher-order results for these concepts. As examples, we generalize Plotkin's criteria for building-in equational theories, and show that pure third-order equational matching is undecidable. This approach simplifies computations in applications involving lexical scoping and equations. We discuss open problems and summarize future research directions.

Adding Algebraic Rewriting to the Untyped Lambda Calculus (Extended Abstract)
Daniel J. Dougherty

We investigate the system obtained by adding an algebraic rewriting system R to the untyped lambda calculus. On certain classes of terms, called here "stable", we prove that the resulting calculus is confluent if R is confluent, and terminating if R is terminating. The termination result has the corresponding theorems for several typed calculi as corollaries. The proof of the confluence result yields a general method for proving confluence of typed β reduction plus rewriting; we sketch the application to the polymorphic calculus Fω.

Incremental Termination Proofs and the Length of Derivations
Frank Drewes, Clemens Lautemann

Incremental termination proofs, a concept similar to termination proofs by quasi-commuting orderings, are investigated. In particular, we show how an incremental termination proof for a term rewriting system T can be used to derive upper bounds on the length of derivations in T. A number of examples show that our results can be applied to yield (sharp) low-degree polynomial complexity bounds.

Time Bounded Rewrite Systems and Termination Proofs by Generalized Embedding
Dieter Hofbauer

It is shown that term rewriting systems with primitive recursively bounded derivation heights can be simulated by rewriting systems that have termination proofs using generalized embedding, a very restricted class of simplification orderings. As a corollary we obtain a characterization of the class of relations computable by rewrite systems having primitive recursively bounded derivation heights using recent results on termination proofs by multiset path orderings.

Detecting Redundant Narrowing Derivations by the LSE-SL Reducibility Test
Stefan Krischer, Alexander Bockmayr

Rewriting and narrowing provide a nice theoretical framework for the integration of logic and functional programming. For practical applications however, narrowing is still much too inefficient. In this paper we show how reducibility tests can be used to detect redundant narrowing derivations. We introduce a new narrowing strategy, LSE-SL left-to-right basic normal narrowing, prove its completeness for arbitrary canonical term rewriting systems, and demonstrate how it increases the efficiency of the narrowing process.

Unification, Weak Unification, Upper Bound, Lower Bound, and Generalization Problems
Franz Baader

We introduce E-unification, weak E-unification, E-upper bound, E-lower bound, and E-generalization problems, and the corresponding notions of unification, weak unification, upper bound, lower bound, and generalization type of an equational theory. When defining instantiation preorders on solutions of these problems, one can compare substitutions w.r.t. their behaviour on all variables or on finite sets of variables. We shall study the effect which these different instantiation preorders have on the existence of most general or most specific solutions of E-unification, weak E-unification, and E-generalization problems. In addition, we shall elucidate the subtle difference between most general unifiers and coequalizers, and we shall consider generalization in the class of commutative theories.

AC Unification Through Order-Sorted AC1 Unification
Eric Domenjoud

We design in this paper a new algorithm to perform unification modulo Associativity and Commutativity. This problem is known to be NP-complete, and none of the solutions proposed until now is very satisfying because of the huge number of minimal unifiers of some equations. Unlike many authors, we did not try to speed up computations by optimizing some parts of the algorithm, but we tried to design an extension of the algebra in which unification would be less complex. This goal is achieved by adding axioms expressing the existence of an identity for every AC-operator and working in the AC1 theory. In order to get a conservative extension of the quotient algebra, we work in an order-sorted framework, and thus have to deal with order-sorted unification in a collapsing theory.

Narrowing Directed by a Graph of Terms
Jacques Chabin, Pierre Réty

Narrowing provides a complete procedure to solve equations modulo confluent and terminating rewriting systems. But it seldom terminates. This paper presents a method to improve the termination. The idea consists in using a finite graph of terms built from the rewriting system and the equation to be solved, which helps one to know the narrowing derivations possibly leading to solutions. Thus, the other derivations are not computed. This method is proved complete. An example is given and some improvements are proposed.

Adding Homomorphisms to Commutative/Monoidal Theories or How Algebra Can Help in Equational Unification
Franz Baader, Werner Nutt

In this paper we consider the class of theories for which solving unification problems is equivalent to solving systems of linear equations over a semiring. This class has been introduced by the authors independently of each other as commutative theories (Baader) and monoidal theories (Nutt). The class encompasses important examples like the theories of abelian monoids, idempotent abelian monoids, and abelian groups.
We identify a large subclass of commutative/monoidal theories that are of unification type zero by studying equations over the corresponding semiring. As a second result, we show with methods from linear algebra that unitary and finitary commutative/monoidal theories do not change their unification type when they are augmented by a finite monoid of homomorphisms, and how algorithms for the extended theory can be obtained from algorithms for the basic theory. The two results illustrate how using algebraic machinery can lead to general results and elegant proofs in unification theory.

Undecidable Properties of Syntactic Theories
Francis Klay

Since we are looking for unification algorithms for a large enough class of equational theories, we are interested in syntactic theories because they have a nice decomposition property which provides a very simple unification procedure. A presentation is said to be resolvent if any equational theorem can be proved using at most one equality step at the top position. A theory which has a finite and resolvent presentation is called syntactic. In this paper we settle open problems about syntactic theories: unifiability in syntactic theories is not decidable, and resolventness of a presentation and syntacticness of a theory are not even semidecidable. Therefore we claim that the condition of syntacticness is too weak to get unification algorithms directly.

Goal Directed Strategies for Paramodulation
Wayne Snyder, Christopher Lynch

It is well-known that the set of support strategy is incomplete in paramodulation theorem provers if paramodulation into variables is forbidden. In this paper, we present a paramodulation calculus for which the combination of these two restrictions is complete, based on a lazy form of the paramodulation rule which delays parts of the unification step. The refutational completeness of this method is proved by transforming proofs given by other paramodulation strategies into set of support proofs using this new inference rule. Finally, we consider the completeness of various refinements of the method, and conclude by discussing related work and future directions.

Minimal Solutions of Linear Diophantine Systems: Bounds and Algorithms
Loic Pottier

We give new bounds and algorithms for minimal solutions of linear diophantine systems. These bounds are simply exponential, while previous known bounds were, at least until recently, doubly exponential.
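With a bound on the size of minimal solutions in hand, the minimal (componentwise-smallest) nonzero solutions of a homogeneous linear Diophantine equation can be found by exhaustive search. The equation and bound below are illustrative assumptions, not the bounds proved in the paper:

```python
from itertools import product

def minimal_solutions(coeffs_lhs, coeffs_rhs, bound):
    """Brute-force the minimal nonzero nonnegative solutions of
    sum(a_i * x_i) == sum(b_j * y_j), all components <= bound.
    Complete only if `bound` dominates every minimal solution."""
    n = len(coeffs_lhs) + len(coeffs_rhs)
    sols = []
    for vec in product(range(bound + 1), repeat=n):
        if not any(vec):  # skip the trivial all-zero solution
            continue
        xs, ys = vec[:len(coeffs_lhs)], vec[len(coeffs_lhs):]
        if sum(a * x for a, x in zip(coeffs_lhs, xs)) == \
           sum(b * y for b, y in zip(coeffs_rhs, ys)):
            sols.append(vec)
    # keep only solutions minimal w.r.t. the componentwise ordering
    return [s for s in sols
            if not any(t != s and all(ti <= si for ti, si in zip(t, s))
                       for t in sols)]

# x + 2y = 2z: the minimal solutions are (0,1,1) and (2,0,1)
print(minimal_solutions([1, 2], [2], 2))  # -> [(0, 1, 1), (2, 0, 1)]
```

Such minimal-solution sets are exactly what AC-unification algorithms enumerate, which is why sharper bounds matter in practice: the search space is exponential in the bound.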

Proofs in Parameterized Specification
Hélène Kirchner

Theorem proving in parameterized specifications has strong connections with inductive theorem proving. An equational theorem holds in the generic theory of the parameterized specification if and only if it holds in the so-called generic algebra. Provided persistency holds, for any specification morphism the translated equality holds in the initial algebra of the instantiated specification. Using a notion of generic ground reducibility, a persistency proof can be reduced to a proof of a protected enrichment. Effective tools for these proofs are studied in this paper.

Completeness of Combinations of Constructor Systems
Aart Middeldorp, Yoshihito Toyama

A term rewriting system is called complete if it is both confluent and strongly normalizing. Barendregt and Klop showed that the disjoint union of complete term rewriting systems does not need to be complete. In other words, completeness is not a modular property of term rewriting systems. Toyama, Klop and Barendregt showed that completeness is a modular property of left-linear TRS's. In this paper we show that it is sufficient to impose the constructor discipline for obtaining the modularity of completeness. This result is a simple consequence of a quite powerful divide and conquer technique for establishing completeness of such constructor systems. Our approach is not limited to systems which are composed of disjoint parts. The importance of our method is that we may decompose a given constructor system into parts which possibly share function symbols and rewrite rules in order to infer completeness. We obtain a similar technique for semi-completeness, i.e. the combination of confluence and weak normalization.

Modular Higher-Order E-Unification
Tobias Nipkow, Zhenyu Qian

The combination of higher-order and first-order unification algorithms is studied. We present algorithms to compute a complete set of unifiers of two simply typed λ-terms w.r.t. the union of α, β and η conversion and a first-order equational theory E. The algorithms are extensions of Huet's work and assume that a complete unification algorithm for E is given. Our completeness proofs require E to be at least regular.

On Confluence for Weakly Normalizing Systems
Pierre-Louis Curien, Giorgio Ghelli

We present a general, abstract method to show confluence of weakly normalizing systems. The technique consists in constructing an interpretation of the source system into a target system which is already confluent. If the interpretation satisfies certain simple conditions, then the source system is confluent. The method has been used implicitly in a number of applications, but does not seem to have been presented so far in its generality. We present, as digressions, two other methods for proving confluence.

Program Transformation and Rewriting
Françoise Bellegarde

We present a basis for program transformation using term rewriting tools. A specification is expressed hierarchically by successive enrichments as a signature and a set of equations. A term can be computed by rewriting. Transformations come from applying a partial unfailing completion procedure to the original set of equations augmented by inductive theorems and a definition of a new function symbol following diverse heuristics. Moreover, the system must provide tools to prove inductive properties; to verify that enrichment produces neither junk nor confusion; and to check for ground confluence and termination. These properties are related to the correctness of the transformation.

An Efficient Representation of Arithmetic for Term Rewriting
Dave Cohen, Phil Watson

We give a locally confluent set of rewrite rules for integer (positive and negative) arithmetic using the familiar system of place notation. We are unable to prove its termination at present, but we strongly conjecture that rewriting with this system terminates and give our reasons. We show that every term has a normal form and so the rewrite system is normalising.
We justify our choice of representation in terms of both space efficiency and speed of rewriting.
Finally we give several examples of the use of our system.
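Place-notation arithmetic by rewriting can be pictured as local carry-propagation steps on digit lists. The sketch below is an illustrative analogue in base 10 for natural numbers only, not the authors' rule set (which also covers negative integers and whose termination is the open conjecture):

```python
# A number is a little-endian digit list; after a digit-wise addition,
# digits may temporarily exceed 9, and each such place is a "redex".

def add_digitwise(xs, ys):
    """Digit-wise sum, possibly with out-of-range digits."""
    n = max(len(xs), len(ys))
    xs = xs + [0] * (n - len(xs))
    ys = ys + [0] * (n - len(ys))
    return [a + b for a, b in zip(xs, ys)]

def normalize(ds):
    """Apply the carry rule  d -> (d mod 10), carry d // 10 to the next
    place, until every digit is in 0..9 (clearly terminating here)."""
    ds = list(ds)
    i = 0
    while i < len(ds):
        if ds[i] > 9:
            carry, ds[i] = divmod(ds[i], 10)
            if i + 1 == len(ds):
                ds.append(0)
            ds[i + 1] += carry
        else:
            i += 1
    return ds

# 479 + 846 = 1325, as little-endian digit lists
print(normalize(add_digitwise([9, 7, 4], [6, 4, 8])))  # -> [5, 2, 3, 1]
```

The space-efficiency argument in the abstract is visible here: place notation needs O(log n) symbols per number, whereas successor-style unary terms need O(n).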

Query Optimization Using Rewrite Rules
Sieger van Denneheuvel, Karen L. Kwast, Gerard R. Renardel de Lavalette, Edith Spaan

In the literature on query optimization, the normal form for relational algebra expressions consisting of Projection, Selection and Join is well known. In this paper we extend this normal form with Calculation and Union and define a corresponding language UPCSJL. In addition we show how the normal form can be used for query optimization.
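Normalizing relational algebra expressions is itself a rewriting process. A hedged sketch of one classic rewrite, selection pushdown through a join; the expression encoding and the crude `mentions` helper are hypothetical illustrations, not the UPCSJL normal form from the paper:

```python
# Expressions are nested tuples: ("select", (rel_name, pred), e),
# ("join", e1, e2), ("rel", name); each predicate names the relation
# whose attributes it touches.

def push_selections(expr):
    """Rewrite select-over-join into join-over-select where possible."""
    if expr[0] == "select":
        _, (rel_name, pred), sub = expr
        sub = push_selections(sub)
        if sub[0] == "join":
            left, right = sub[1], sub[2]
            if mentions(left, rel_name):
                return ("join", ("select", (rel_name, pred), left), right)
            if mentions(right, rel_name):
                return ("join", left, ("select", (rel_name, pred), right))
        return ("select", (rel_name, pred), sub)
    if expr[0] == "join":
        return ("join", push_selections(expr[1]), push_selections(expr[2]))
    return expr

def mentions(expr, rel_name):
    # Crude check: does this subtree use the relation? (Illustrative only.)
    return rel_name in repr(expr)

q = ("select", ("R", "a > 3"), ("join", ("rel", "R"), ("rel", "S")))
print(push_selections(q))
# -> ('join', ('select', ('R', 'a > 3'), ('rel', 'R')), ('rel', 'S'))
```

Filtering before joining shrinks the intermediate result, which is the standard payoff of driving expressions toward such a normal form.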

Boolean Algebra Admits No Convergent Term Rewriting System
Rolf Socher-Ambrosius

Although there exists a normal form for the theory of Boolean Algebra w.r.t. associativity and commutativity, the so-called set of prime implicants, there does not exist a convergent equational term rewriting system for the theory of Boolean algebra modulo AC. The result seems well known, but no formal proof exists as yet. In this paper a formal proof of this fact is given.

Decidability of Confluence and Termination of Monadic Term Rewriting Systems
Kai Salomaa

Term rewriting systems where the right-hand sides of rewrite rules have height at most one are said to be monadic. These systems are a generalization of the well known monadic Thue systems. We show that termination is decidable for right-linear monadic systems but undecidable if the rules are only assumed to be left-linear. Using the Peterson-Stickel algorithm we show that confluence is decidable for right-linear monadic term rewriting systems. It is known that ground confluence is undecidable for both left-linear and right-linear monadic systems. We consider partial results for deciding ground confluence of linear monadic systems.

Bottom-Up Tree Pushdown Automata and Rewrite Systems
Jean-Luc Coquidé, Max Dauchet, Rémi Gilleron, Sándor Vágvölgyi

Studying connections between term rewrite systems and bottom-up tree pushdown automata (tpda), we complete and generalize results of Gallier, Book and K. Salomaa. We define the notion of tail reduction free rewrite systems (trf rewrite systems). Using the decidability of inductive reducibility (Plaisted), we prove the decidability of the trf property. Monadic rewrite systems of Book, Gallier and K. Salomaa become an obvious particular case of trf rewrite systems. We define also semi-monadic rewrite systems which generalize monadic systems but keep their fair properties. We discuss different notions of bottom-up tree pushdown automata, that can be seen as the algorithmic aspect of classes of problems specified by trf rewrite systems. Especially, we associate a deterministic tpda with any left-linear trf rewrite system.

On Relationship Between Term Rewriting Systems and Regular Tree Languages
Gregory Kucherov

The paper presents a new result on the relationship between term rewriting systems (TRSs) and regular tree languages. Important consequences (concerning, in particular, a problem of ground-reducibility) are discussed.

The Equivalence of Boundary and Confluent Graph Grammars on Graph Languages of Bounded Degree
Franz-Josef Brandenburg

Let B-edNCE and C-edNCE denote the families of graph languages generated by boundary and by confluent edNCE graph grammars, respectively. Boundary means that two nonterminals are never adjacent, and confluent means that rewriting steps are order independent. By definition, boundary graph grammars are confluent, so that B-edNCE ⊆ C-edNCE. Engelfriet et al. [8] have shown that this inclusion is proper, in general, using certain graph languages of unbounded degree as a witness. We prove that equality holds on graph languages of bounded degree, i.e., B-edNCE_deg = C-edNCE_deg, where the subscript "deg" refers to graph languages of bounded degree. Thus, for bounded degree, boundary graph grammars are the operator normal form of confluent graph grammars and e.g., the characterization results obtained independently for B-edNCE and C-edNCE can be merged. Our result confirms boundary and confluent graph grammars as notions for context-free graph grammars.

Left-to-Right Tree Pattern Matching
Albert Gräf

We propose a new technique to construct left-to-right matching automata for trees. Our method is based on the novel concept of prefix unification which is used to compute a certain closure of the pattern set. From the closure a kind of deterministic matching automaton can be derived immediately. We also point out how to perform the construction incrementally which makes our approach suitable for applications in which pattern sets change dynamically, such as in the Knuth-Bendix completion algorithm.
Our method, like most others, is restricted to linear patterns (the case of non-linear matching can be handled as usual by checking the consistency of variable bindings in a separate pass following the matching phase).
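The problem being automated can be stated very compactly; a naive top-down matcher for linear patterns, as a baseline that the automaton construction improves on (term encoding is an illustrative assumption, not the paper's):

```python
# Terms are (symbol, children) tuples; variables are bare strings.

def match(pattern, term, binding=None):
    """Return a substitution if the linear pattern matches the term at the
    root, else None. Linearity is assumed: repeated variables are simply
    overwritten rather than checked for consistency."""
    binding = dict(binding or {})
    if isinstance(pattern, str):          # a variable matches anything
        binding[pattern] = term
        return binding
    psym, pargs = pattern
    tsym, targs = term
    if psym != tsym or len(pargs) != len(targs):
        return None
    for p, t in zip(pargs, targs):
        binding = match(p, t, binding)
        if binding is None:
            return None
    return binding

# pattern f(x, g(y)) against term f(a, g(b))
pat = ("f", ["x", ("g", ["y"])])
trm = ("f", [("a", []), ("g", [("b", [])])])
print(match(pat, trm))  # -> {'x': ('a', []), 'y': ('b', [])}
```

This matcher re-examines symbols when many patterns overlap; the left-to-right automata of the paper share that work by scanning the subject once, much as string-matching automata do for words.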

Incremental Techniques for Efficient Normalization of Nonlinear Rewrite Systems
R. Ramesh, I. V. Ramakrishnan

In [8] we described a nonlinear pattern-matching algorithm with the best known worst-case and optimal average-case time complexity. In this paper we first report on some experiments conducted on our algorithm. Based on these experiments we believe that our algorithm is useful in speeding up normalization of nonlinear rewrite systems even when it has a small number (≥5) of rewrite rules.
In order to find matches quickly our algorithm operates in two phases. In the first phase it scans the subject tree to collect some "information" which is then used to find matches quickly in the second phase. Scanning can become very expensive especially for large subject trees. However the normalization process is "incremental" in nature. After each reduction step the subject tree is altered and this modified tree is usually not completely different from the old tree. Hence it should be possible to avoid scanning and searching the entire tree for new matches. We describe general techniques to exploit the incremental nature of normalization to speed up each reduction step. Specifically, using these techniques we show that the search for new matches in the subject tree following a replacement can be made independent of the size of the subject tree.

On Fairness of Completion-Based Theorem Proving Strategies
Maria Paola Bonacina, Jieh Hsiang

Fairness is an important concept that has emerged recently in theorem proving, in particular in the area of completion-based theorem proving. Fairness is a required property for the search plan of the given strategy. Intuitively, fairness of a search plan guarantees the generation of a successful derivation if the inference mechanism of the strategy indicates that there is one. Thus, the completeness of the inference rules and the fairness of the search plan form the completeness of a theorem proving strategy. A search plan which exhausts the entire search space is obviously fair, albeit grossly inefficient. Therefore, the problem is to reconcile fairness and efficiency. This problem becomes even more intricate in the presence of contraction inference rules - rules that remove data from the data set.
The known definitions of fairness for completion-based methods are designed to ensure the confluence of the resulting system. Thus, a search plan which is fair according to these definitions may force the prover to perform deductions completely irrelevant to proving the intended theorem. In a theorem proving strategy, on the other hand, one is usually only interested in proving a specific theorem. Therefore the notion of fairness should be defined accordingly. In this paper we present a target-oriented definition of fairness for completion, which takes into account the theorem to be proved and therefore does not require computing all the critical pairs. If the inference rules are complete and the search plan is fair with respect to our definition, then the strategy is complete. Our framework also contains notions of redundancy and contraction. We conclude by comparing our definition of fairness and the related concepts of redundancy and contraction with those in related works.

Proving Equational and Inductive Theorems by Completion and Embedding Techniques
Jürgen Avenhaus

The Knuth-Bendix completion procedure can be used to transform an equational system into a convergent rewrite system. This makes it possible to prove equational and inductive theorems. The main drawback of this technique is that in many cases the completion diverges and so produces an infinite rewrite system. We discuss a method to embed the given specification into a bigger one such that the extended specification allows a finite "parameterized" description of an infinite rewrite system of the base specification. The main emphasis is on proving the correctness of the approach. Examples show that in many cases the Knuth-Bendix completion in the extended specification stops with a finite rewrite system though it diverges in the base specification. This indeed makes it possible to prove equational and inductive theorems in the base specification.

Divergence Phenomena during Completion
Andrea Sattler-Klein

We will show how any primitive recursive function may be encoded in a finite canonical string rewriting system. Using these encodings for every primitive recursive function f (and even for every recursively enumerable set C) a finite string rewriting system R and a noetherian ordering > may be constructed such that completion of R with respect to > will generate a divergence sequence that encodes explicitly the input/output behaviour of f (or the set C, respectively). Furthermore, we will show by an example that if completion of a set R with respect to a noetherian ordering > diverges, then there need not exist any rule that causes infinitely many other ones by overlapping.

Simulating Buchberger's Algorithm by Knuth-Bendix Completion
Reinhard Bündgen

We present a canonical term rewriting system whose initial model is isomorphic to GF(q)[x1,...,xn]. Using this set of rewrite rules and additional ground equations specifying an ideal we can simulate Buchberger's algorithm for polynomials over finite fields using Knuth-Bendix term completion modulo AC. In order to simplify our proofs we exhibit a critical pair criterion which transforms critical pairs into simpler ones.

On Proving Properties of Completion Strategies
Miki Hermann

We develop methods for proving the fairness and correctness properties of rule-based completion strategies by means of process logic. The concepts of these properties are formulated generally within process logic and then made concrete in rewrite system theory based on transition rules. We develop in parallel the notions of success and failure of a completion strategy, which are needed to support the proofs of the cited properties. Finally we show the necessity of another property, called justice, in the analysis of completion strategies.

On Ground AC-Completion
Claude Marché

We prove that a canonical set of rules for an equational theory defined by a finite set of ground axioms plus the associativity and commutativity of any number of operators must be finite.
As a corollary, we show that ground AC-completion, when using a total AC-simplification ordering and an appropriate control, must terminate.
Using a recent result of Narendran and Rusinowitch (in this volume), this implies that the word problem for such a theory is decidable.

Any Ground Associative-Commutative Theory Has a Finite Canonical System
Paliath Narendran, Michaël Rusinowitch

We show that theories presented by a set of ground equations with several associative-commutative (AC) symbols always admit a finite canonical system. This result is obtained through the construction of a reduction ordering which is AC-compatible and total on the set of congruence classes generated by the associativity and commutativity axioms. As far as we know, this is the first ordering with such properties, when several AC function symbols and free function symbols are allowed. Such an ordering is also a fundamental tool for deriving complete theorem proving strategies with built-in associative commutative unification.

A Narrowing-Based Theorem Prover
Ulrich Fraus

This work presents a theorem prover for inductive proofs within an equational theory which supports the verification of universally quantified equations. This system, called TIP, is based on a modification of the well-known narrowing algorithm. Particulars of the implementation are stated and practical experiences are summarized.

ANIGRAF: An Interactive System for the Animation of Graph Rewriting Systems with Priorities
Michel Billaud

TODO

EMMY: A Refutational Theorem Prover for First-Order Logic with Equations
Aline Deruyver

Emmy is an implementation in Quintus Prolog of a refutational theorem prover for first-order logic with equations based on a superposition calculus. This paper describes the structure and the functionalities of this theorem prover.

The Tecton Proof System
Raj Agarwal, David R. Musser, Deepak Kapur, Xumin Nie

TODO

Open Problems in Rewriting
Nachum Dershowitz, Jean-Pierre Jouannaud, Jan Willem Klop

TODO

1989

Characterization of Unification Type Zero
Franz Baader

In the literature several methods have hitherto been used to show that an equational theory has unification type zero. These methods depend on conditions which are candidates for alternative characterizations of unification type zero. In this paper we consider the logical connection between these conditions on the abstract level of partially ordered sets. Not all of them are really equivalent to type zero.
The conditions may be regarded as tools which can be used to determine the unification type of given theories. They are also helpful in understanding what makes a theory of type zero.

Proof Normalization for Resolution and Paramodulation
Leo Bachmair

We prove the refutation completeness of restricted versions of resolution and paramodulation for first-order predicate logic with equality. Furthermore, we show that these inference rules can be combined with various deletion and simplification rules, such as rewriting, without compromising refutation completeness. The techniques employed in the completeness proofs are based on proof normalization and proof orderings.

Complete Sets of Reductions Modulo Associativity, Commutativity and Identity
Timothy B. Baird, Gerald E. Peterson, Ralph W. Wilkerson

We describe the theory and implementation of a process which finds complete sets of reductions modulo equational theories which contain one or more associative and commutative operators with identity (ACI theories). We emphasize those features which distinguish this process from the similar one which works modulo associativity and commutativity. A primary difference is that for some rules in ACI complete sets, restrictions are required on the substitutions allowed when the rules are applied. Without these restrictions, termination cannot be guaranteed. We exhibit six examples of ACI complete sets that were generated by an implementation.

Completion-Time Optimization of Rewrite-Time Goal Solving
Hubert Bertling, Harald Ganzinger

Completion can be seen as a process that transforms proofs in an initially given equational theory to rewrite proofs in final rewrite rules. Rewrite proofs are the normal forms of proofs under these proof transformations. The purpose of this paper is to provide a framework in which one may further restrict the normal forms of proofs which completion is required to construct, thereby further decreasing the complexity of reduction and goal solving in completed systems.

Computing Ground Reducibility and Inductively Complete Positions
Reinhard Bündgen, Wolfgang Küchlin

We provide the extended ground-reducibility test which is essential for induction with term-rewriting systems based on [Küc89]: Given a term, determine at which sets of positions it is ground-reducible by which subsets of rules. The core of our method is a new parallel cover algorithm based on recursive decomposition. From this we obtain a separation algorithm which determines constructors and defined function symbols in a term-algebra presented by a rewrite system. We then reduce our main problem of extended ground-reducibility to separation and cover. Furthermore, using the knowledge of algebra separation, we refine the bounds of [JK86] for the size of ground reduction test-sets. Both our cover algorithm and our extended ground-reducibility test are engineered to adapt to the actual problem structure, i.e., to allow lower-than-worst-case bounds for test-sets on well-conditioned problems, including well-conditioned subproblems of difficult cases.

Inductive Proofs by Specification Transformation
Hubert Comon-Lundh

We show how to transform equational specifications with relations between constructors (or without constructors) into order-sorted equational specifications where every function symbol is either a free constructor or a completely defined function.
This method allows us to reduce the problem of inductive proofs in equational theories to Huet and Hullot's proofs by consistency [HH82]. In particular, it is no longer necessary to use the so-called "inductive reducibility test", which is the most expensive part of the Jouannaud and Kounalis algorithm [JK86].

Narrowing and Unification in Functional Programming - An Evaluation Mechanism for Absolute Set Abstraction
John Darlington, Yike Guo

The expressive power of logic programming may be achieved within a functional programming framework by extending the functional language with the ability to evaluate absolute set abstractions. By absolute set abstraction, logical variables are introduced into functional languages as first class objects. Their set-valued interpretations are implicitly defined by the constraining equations. Narrowing and unification can be used to solve these constraining equations to produce the satisfying instantiations of the logic variables. In this paper, we study an execution mechanism for evaluating absolute set abstraction in a first-order (non-strict) functional programming setting. First, we investigate the semantics of absolute set abstraction. It is shown that absolute set abstraction is no more than a set-valued expression involving the evaluation of function inversion. Functional equality is defined to coincide with the semantics of the continuous and strict equality function in functional programming. This new equality means that well-known techniques for equation solving can be adopted as a proper mechanism for solving the constraining equations which are the key to the evaluation of absolute set abstraction. The main result of this paper lies in the study of a particular narrowing strategy, called lazy pattern driven narrowing, which is proved to be complete and optimal for evaluating absolute set abstraction in the sense that a complete set of minimal solutions of the constraining equations can be generated by a semantic unification procedure based on this narrowing strategy. This indicates that a mechanism for equation solving can be developed within a functional programming context, producing a more expressive language.

Simulation of Turing Machines by a Left-Linear Rewrite Rule
Max Dauchet

We prove in this paper that for every Turing machine there exists a left-linear, variable preserving and non-overlapping rewrite rule that simulates its behaviour. The main corollary is the undecidability of termination for such a rule. If we suppose that the left-hand side can be unified with only one subterm of the right-hand side, then termination is decidable.

Higher-order Unification with Dependent Function Types
Conal Elliott

Roughly fifteen years ago, Huet developed a complete semidecision algorithm for unification in the simply typed λ-calculus (λ→). In spite of the undecidability of this problem, his algorithm is quite usable in practice. Since then, many important applications have come about in such areas as theorem proving, type inference, program transformation, and machine learning.
Another development is the discovery that by enriching λ→ to include dependent function types, the resulting calculus (λΠ) forms the basis of a very elegant and expressive Logical Framework, encompassing the syntax, rules, and proofs for a wide class of logics.
This paper presents an algorithm in the spirit of Huet's, for unification in λΠ. This algorithm gives us the best of both worlds: the automation previously possible in λ→, and the greatly enriched expressive power of λΠ. It can be used to considerable advantage in many of the current applications of Huet's algorithm, and has important new applications as well. These include automated and semi-automated theorem proving in encoded logics, and automatic type inference in a variety of encoded languages.

An Overview of LP, The Larch Prover
Stephen J. Garland, John V. Guttag

TODO

Graph Grammars, A New Paradigm for Implementing Visual Languages
Herbert Göttler

This paper is a report on an ongoing work which started in 1981 and is aiming at a general method which would help to considerably reduce the time necessary to develop a syntax-directed editor for any given diagram technique. The main idea behind the approach is to represent diagrams by (formal) graphs whose nodes are enriched with attributes. Then, any manipulation of a diagram (typically the insertion of an arrow, a box, text, coloring, etc.) can be expressed in terms of the manipulation of its underlying attributed representation graph. The formal description of the manipulation is done by programmed attributed graph grammars.

Termination Proofs and the Length of Derivations (Preliminary Version)
Dieter Hofbauer, Clemens Lautemann

The derivation height of a term t, relative to a set R of rewrite rules, dhR(t), is the length of a longest derivation from t. We investigate in which way certain termination proof methods impose bounds on dhR. In particular we show that, if termination of R can be proved by polynomial interpretation, then dhR is bounded from above by a doubly exponential function, whereas termination proofs by Knuth-Bendix ordering are possible even for systems where dhR cannot be bounded by any primitive recursive function. For both methods, conditions are given which guarantee a singly exponential upper bound on dhR. Moreover, all upper bounds are tight.
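To make the quantity concrete, here is a small, purely illustrative Python sketch (our own toy, not from the paper) that computes the derivation height dhR(t) by exhaustive search, for a terminating string-rewriting system:

```python
# Illustrative only: compute dh_R(t), the length of a longest rewrite
# sequence from t, by exhaustive search. Rules rewrite substrings of a
# string, so this is a string-rewriting system; it must terminate.

def successors(term, rules):
    """All terms reachable from `term` in one rewrite step."""
    out = set()
    for lhs, rhs in rules:
        start = 0
        while (i := term.find(lhs, start)) != -1:
            out.add(term[:i] + rhs + term[i + len(lhs):])
            start = i + 1
    return out

def derivation_height(term, rules, memo=None):
    """dh_R(term): 0 if term is a normal form, else 1 + max over steps."""
    if memo is None:
        memo = {}
    if term not in memo:
        nexts = successors(term, rules)
        memo[term] = 0 if not nexts else 1 + max(
            derivation_height(t, rules, memo) for t in nexts)
    return memo[term]

# The rule ba -> aab terminates but lets derivation lengths grow fast.
print(derivation_height("bba", [("ba", "aab")]))  # → 3
```

Exhaustive search is of course only feasible on tiny inputs; the paper's point is precisely that termination proofs let one bound dhR without enumerating derivations.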

Abstract Rewriting with Concrete Operations
Stéphane Kaplan, Christine Choppy

TODO

On How To Move Mountains 'Associatively and Commutatively'
Mike Lai

In this paper we give another characterization of a set of rules which defines a Church-Rosser reduction on the term algebra specified by some associative and commutative equations. This characterization requires fewer conditions to be satisfied than those previously given in the literature. As a result, when the required conditions are satisfied, the word problem in the term algebra defined by the set of rules and the set of associative and commutative equations can be solved by successive applications of rewriting to the elements in question.
In addition, what makes this approach different from the others is that notions such as AC-compatibility or coherence modulo AC of reductions induced by sets of rules, which are essential in [Pe-S] or [Jo-Ki] respectively, are not required here. Consequently, a proof of correctness of the completion algorithm (given in [Lai 2]) for constructing a desired set of rules based on this approach can be compared directly with that of Huet in [Hu 2]. In fact, it turns out that all we have to do is to replace terms in [Hu 2] by AC-equivalence classes of terms. The main reason is that all the complications due to AC-compatibility or coherence modulo AC simply are not present here.
Finally, we shall discuss how to minimize the unnecessary computation of some critical pairs during the completion.

Generalized Gröbner Bases: Theory and Applications. A Condensation
Dallas Lankford

Zacharias and Trinks proved that it can be constructively determined whether a finite generating set is a generalized Gröbner basis provided ideals are detachable and syzygies are solvable in the coefficient ring. We develop an abstract rewriting characterization of generalized Gröbner bases and use it to give new proofs of the Spear-Zacharias and Trinks theorems for testing and constructing generalized Gröbner bases. In addition, we use the abstract rewriting characterization to generalize Ayoub's binary approach for testing and constructing Gröbner bases over polynomial rings with Euclidean coefficient rings to arbitrary principal ideal coefficient domains. This also shows that Spear-Zacharias' and Trinks' approach specializes to Ayoub's approach, which was not known before.

A Local Termination Property for Term Rewriting Systems
Dana May Latch, Ron Sigal

We describe a desirable property, local termination, of rewrite systems which provide an operational semantics for formal functional programming (FFP) languages, and we give a multiset ordering which can be used to show that the property holds.

An Equational Logic Sampler
George F. McNulty

TODO

Modular Aspects of Properties of Term Rewriting Systems Related to Normal Forms
Aart Middeldorp

In this paper we prove that the property of having unique normal forms is preserved under disjoint union. We show that two related properties do not exhibit this kind of modularity.

Priority Rewriting: Semantics, Confluence, and Conditionals
Chilukuri K. Mohan

Priority rewrite systems (PRS) are partially ordered finite sets of rewrite rules; in this paper, two possible alternative definitions for rewriting with PRS are examined. A logical semantics for priority rewriting is described, using equational formulas obtained from the rules, and inequations which must be assumed to permit rewriting with rules of lower priority. Towards the goal of using PRS to define data type and function specifications, restrictions are given that ensure confluence and encourage modularity. Finally, the relation between priority and conditional rewriting is studied, and a natural combination of these mechanisms is proposed.

Negation with Logical Variables in Conditional Rewriting
Chilukuri K. Mohan, Mandayam K. Srivas

We give a general formalism for conditional rewriting, with systems containing conditional rules whose antecedents contain literals to be shown satisfiable and/or unsatisfiable. We explore semantic issues, addressing when the associated operational rewriting mechanism is sound and complete. We then give restrictions on the formalism which enable us to construct useful and meaningful specifications using the proposed operational mechanism.

Algebraic Semantics and Complexity of Term Rewriting Systems
Tohru Naoi, Yasuyoshi Inagaki

The present paper studies the semantics of linear and non-overlapping TRSs. To treat possibly non-terminating reduction, the limit of such a reduction is formalized using Scott's order-theoretic approach. An interpretation of the function symbols of a TRS as a continuous algebra, namely, continuous functions on a cpo, is given, and universality properties of this interpretation are discussed. Also a measure for computational complexity of possibly non-terminating reduction is proposed. The space of complexity forms a cpo and function symbols can be interpreted as monotone functions on it.

Optimization by Non-Deterministic, Lazy Rewriting
Sanjai Narain

Given a set S and a condition C we address the problem of determining which members of S satisfy C. One useful approach is to set up the generation of S as a tree, where each node represents a subset of S. If from the information available at a node, we can determine that no members of the subset it represents satisfy C, then the subtree rooted at it can be pruned, i.e. its generation suppressed. Thus, large subsets of S can be quickly eliminated from consideration. We show how such a tree can be simulated by interpretation of non-deterministic rewrite rules, and its pruning simulated by lazy evaluation.
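The generate-and-prune idea can be illustrated outside the rewrite-rule setting. The following Python sketch (our own toy example, not the paper's non-deterministic rewriting encoding) enumerates the subsets of S satisfying a condition C — here, summing to a target — with lazy generators, pruning every subtree whose partial state shows it cannot contain a solution:

```python
# Generate the tree of subsets lazily; each recursive call is a node
# representing the subset of solutions extending `chosen`. A subtree is
# pruned (never generated) once target < 0, since all items are positive.

def solutions(items, target, chosen=()):
    if target == 0:
        yield chosen                      # this node satisfies C
        return
    if not items or target < 0:
        return                            # prune: no member below here
    first, rest = items[0], items[1:]
    yield from solutions(rest, target - first, chosen + (first,))  # take
    yield from solutions(rest, target, chosen)                     # skip

print(list(solutions((5, 3, 2, 7), 10)))  # → [(5, 3, 2), (3, 7)]
```

Because the generator is lazy, pruned subtrees are never materialized — the analogue of suppressing the generation of a subtree in the tree-structured enumeration of S.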

Combining Matching Algorithms: The Rectangular Case
Tobias Nipkow

The problem of combining matching algorithms for equational theories with disjoint signatures is studied. It is shown that the combined matching problem is in general undecidable but that it becomes decidable if all theories are regular. For the case of regular theories an efficient combination algorithm is developed. As part of that development we present a simple algorithm for solving the word problem in the combination of arbitrary equational theories with disjoint signatures.

Restrictions of Congruence Generated by Finite Canonical String-Rewriting Systems
Friedrich Otto

Let Σ1 be a subalphabet of Σ2, and let R1 and R2 be finite string-rewriting systems on Σ1 and Σ2, respectively. If the congruence ↔*R1 generated by R1 and the congruence ↔*R2 generated by R2 coincide on Σ1*, then R1 can be seen as representing the restriction of the system R2 to the subalphabet Σ1. Is this property decidable? This question is investigated for several classes of finite canonical string-rewriting systems.

Embedding with Patterns and Associated Recursive Path Ordering
Laurence Puel

TODO

Rewriting Techniques for Program Synthesis
Uday S. Reddy

We present here a completion-like procedure for program synthesis from specifications. A specification is expressed as a set of equations and the program is a Noetherian set of rewrite rules that is efficient for computation. We show that the optimizations applicable for proving inductive theorems are useful for program synthesis. This improves on the use of general completion procedure for program synthesis, reported by Dershowitz, in that it generates fewer rules and terminates more often. However, there is a qualitative difference between this procedure and completion, as superposition is used not for eliminating critical overlaps but to find a complete set of cases for an inductive theorem.

Transforming Strongly Sequential Rewrite Systems with Constructors for Efficient Parallel Execution
R. C. Sekar, Shaunak Pawagi, I. V. Ramakrishnan

Strongly sequential systems, developed by Huet and Lévy [2], have formed the basis of equational programming languages. Experience with such languages so far suggests that even complex equational programs are based only on strongly sequential systems with constructors. However, these programs are not readily amenable to efficient parallel execution. This paper introduces a class of strongly sequential systems called path sequential systems. Equational programs based on path sequential systems are more natural for parallel evaluation. An algorithm for transforming any strongly sequential system with constructors into an equivalent path sequential system is described.

Efficient Ground Completion: An O(n log n) Algorithm for Generating Reduced Sets of Ground Rewrite Rules Equivalent to a Set of Ground Equations E
Wayne Snyder

We give a fast method for generating reduced sets of rewrite rules equivalent to a given set of ground equations. Since, as we show, reduced ground rewrite systems are in fact canonical, this is essentially an efficient Knuth-Bendix procedure for the ground case. The method runs in O(n log n), where n is the number of occurrences of symbols in E. We also show how our method provides a precise characterization of the (finite) collection of all reduced sets of rewrite rules equivalent to a given ground set of equations E, and prove that our algorithm is complete in that it can enumerate every member of this collection. Finally, we show how to modify the method so that it takes as input E and a total precedence ordering on the symbols in E, and returns a reduced rewrite system contained in the lexicographic path ordering generated by the precedence.
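For intuition, here is a deliberately naive Python sketch of completion on ground equations (quadratic interreduction, nowhere near the paper's O(n log n) method): equations are oriented by a total size-then-lexicographic order on ground terms and interreduced until the rules are mutually irreducible:

```python
# Naive ground completion sketch. Ground terms are nested tuples:
# ('f', ('a',)) stands for f(a). Rules are oriented bigger -> smaller
# under a total order, so rewriting terminates.

def order_key(t):
    # total order on ground terms: size first, then structure
    size = 1 + sum(order_key(s)[0] for s in t[1:])
    return (size, t)

def rewrite(t, rules):
    # innermost normalisation of a ground term
    t = (t[0],) + tuple(rewrite(s, rules) for s in t[1:])
    for lhs, rhs in rules:
        if t == lhs:
            return rewrite(rhs, rules)
    return t

def ground_complete(equations):
    rules = []
    pending = list(equations)
    while pending:
        s, t = pending.pop()
        s, t = rewrite(s, rules), rewrite(t, rules)
        if s == t:
            continue
        if order_key(s) < order_key(t):
            s, t = t, s                      # orient bigger -> smaller
        # interreduce: rules touched by the new one go back to pending
        keep = []
        for l, r in rules:
            if rewrite(l, [(s, t)]) != l or rewrite(r, [(s, t)]) != r:
                pending.append((l, r))
            else:
                keep.append((l, r))
        rules = keep + [(s, t)]
    return rules

a, b, c = ('a',), ('b',), ('c',)
f = lambda x: ('f', x)
E = [(f(a), b), (f(b), c), (a, b)]
R = ground_complete(E)
# all five terms lie in one congruence class, so they share a normal form
print({rewrite(t, R) for t in [f(a), f(b), a, b, c]})  # → {('a',)}
```

The resulting rule set is reduced (no rule's sides are rewritable by the others), which, as the abstract notes, already makes a ground system canonical; the paper's contribution is achieving this with an O(n log n) algorithm rather than repeated renormalisation.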

Extensions and Comparison of Simplification Orderings
Joachim Steinbach

The effective calculation with term rewriting systems presumes termination. Orderings on terms are able to guarantee termination. This report deals with some of those term orderings: Several path and decomposition orderings and the Knuth-Bendix ordering. We pursue three aims:

Classes of Equational Programs that Compile into Efficient Machine Code
Robert Strandh

Huet and Lévy [HL79] showed that, if an equational program E is strongly sequential, there exists an automaton that, given a term in the language L(E), finds a redex in that term.
The most serious problem with their approach becomes evident when one tries to use their result in a programming system. Once a redex has been found, it must be replaced by a term built from the structure of the right-hand side corresponding to the redex, and from parts of the old term. Then, the reduction process must be restarted so that other redexes can be found. With their approach, a large part of the term tree may have to be rescanned.
Hoffmann and O'Donnell [HO82a] improved the situation by defining the class of strongly left-sequential programs. For this class, a particularly simple reduction algorithm exists. A stack is used to hold information about the state of the reduction process. When a redex has been found and replaced by the corresponding right-hand side, the stack holds all the relevant information needed to restart the reduction process in a well defined state such that no unnecessary rescanning of the term is done.
However, it turns out that the approach of Hoffmann and O'Donnell is unnecessarily restrictive. In this paper, we define a new class of equational programs, called forward branching programs. This class is much larger than the class of strongly left-sequential programs. Together with a new reduction algorithm, briefly discussed in this paper, our approach allows us to use the hardware stack to hold reduction information in a way similar to the way a block-structured programming language uses the stack to hold local variables. In effect, our approach allows us to use innermost stabilization, while preserving the overall outermost reduction strategy.

Fair Termination is Decidable for Ground Systems
Sophie Tison

Summing up, we have reduced the problem of fair termination to the emptiness of the intersection of two constructible and recognizable forests. Since the family of recognizable forests is closed under intersection and emptiness is decidable for this family, fair termination is decidable.

Termination for the Direct Sum of Left-Linear Term Rewriting Systems (Preliminary Draft)
Yoshihito Toyama, Jan Willem Klop, Hendrik Pieter Barendregt

The direct sum of two term rewriting systems is the union of systems having disjoint sets of function symbols. It is shown that two term rewriting systems are both left-linear and complete if and only if their direct sum is so.

Conditional Rewrite Rule Systems with Built-In Arithmetic and Induction
Sergei G. Vorobyov

Conditional rewriting systems, conditions being the formulae of decidable theories, are investigated. A practical search space-free decision procedure for the related class of unquantified logical theories is described. The procedure is based on cooperating conditional reductions, case splittings and decision algorithms, and is able to perform certain forms of inductive inferences. Completeness and termination of the procedure are proved.

Consider Only General Superpositions in Completion Procedures
Hantao Zhang, Deepak Kapur

Superposition or critical pair computation is one of the key operations in the Knuth-Bendix completion procedure and its extensions. We propose a practical technique which can save computation of some critical pairs where the most general unifiers used to generate these critical pairs are less general than the most general unifiers used to generate other joinable critical pairs. Consequently, there is no need to superpose identical subterms at different positions in a rule more than once and there is also no need to superpose symmetric subterms in a rule more than once. The combination of this technique with other critical pair criteria proposed in the literature is also discussed. The technique has been integrated in the completion procedures for ordinary term rewriting systems as well as term rewriting systems with associative-commutative operators implemented in RRL, Rewrite Rule Laboratory. Performance of the completion procedures with and without this technique is compared on a number of examples.

Solving Systems of Linear Diophantine Equations and Word Equations
Habib Abdulrab, Jean-Pierre Pécuchet

TODO

SbReve2: A Term Rewriting Laboratory with (AC-) Unfailing Completion
Siva Anantharaman, Jieh Hsiang, Jalel Mzali

TODO

THEOPOGLES - An efficient Theorem Prover based on Rewrite-Techniques
Jürgen Avenhaus, Jörg Denzinger, Jürgen Müller

TODO

COMTES - An Experimental Environment for the Completion of Term Rewriting Systems
Jürgen Avenhaus, Klaus Madlener, Joachim Steinbach

TODO

ASSPEGIQUE: An Integrated Specification Environment
Michel Bidoit, Francis Capy, Christine Choppy

TODO

KBlab: An Equational Theorem Prover for the Macintosh
Maria Paola Bonacina, Giancarlo Sanna

TODO

Fast Knuth-Bendix Completion: Summary
Jim Christian

TODO

Compilation of Ground Term Rewriting Systems and Applications (DEMO)
Max Dauchet, Aline Deruyver

We present an algorithm (based on tree automata techniques) which "compiles" ground term rewriting systems so as to solve the reachability problem in time linear in the size of the terms. The name of the software is VALERIAN: V = Verification, A = Algebraic, L = Logic, E = Equation, R = Rewrite, I = Inference, A = (tree) Automata, N = Normal (form). Valerian is a comics character who travels across time and space.

An Overview of Rewrite Rule Laboratory (RRL)
Deepak Kapur, Hantao Zhang

TODO

InvX: An Automatic Function Inverter
Hessam Khoshnevisan, K. M. Sephton

We have implemented InvX, a system that will mechanically generate inverses for a substantial class of functions. By applying an extended set of function-level axioms at compile-time, expressions for the inverses are transformed so that no unification is required at run-time in many cases. This makes the cost of their execution comparable with that of reduction-based semantics. We have also described the incorporation of InvX in the FLAGSHIP programming environment. InvX has been used successfully on a wide range of examples.

A Parallel Implementation of Rewriting and Narrowing
Naomi Lindenstrauss

A parallel implementation of rewriting and narrowing is described. This implementation is written in Flat Concurrent Prolog, but the ideas involved are applicable to any system where processes are capable of creating other processes and communicating with each other. Using FCP enables one to write very short programs, virtually no longer than the verbal description of the algorithms. Running programs under the FCP interpreter and using facilities provided by it, one can compare the efficiencies of various strategies. Theoretical results about the efficiency of strategies in certain cases are also mentioned.

Morphocompletion for One-Relation Monoids
John Pedersen

TODO

1987

Optimizing Equational Programs
Robert Strandh

Equational programming [HO82b] involves replacing subterms in a term according to a set of equations or rewrite rules. Each time an equation is applied to the term, the subterm that matches the left hand side of the equation is replaced by the corresponding right hand side. In that process several nodes of the term tree are created. Some of these nodes may later turn out to be useless, and will be reclaimed.
This paper discusses important relationships between two equational programs. In particular we define the term mutual confluence and show that two equational programs with the mutual confluence property have the same output behavior under very general assumptions about the reduction strategy. As an application of our result, we discuss source-to-source transformations of an equational program E into an equational program F. Our transformations are used as part of a compiler to improve the execution time of E by avoiding the creation of too many nodes in the reduction process. We show that our transformations indeed give E and F the mutual confluence property, thus preserving the output behavior of E when transformed into F.
Preserving the output behavior is more general than preserving just normal forms, in that we allow for infinite computations where we output stable parts of the term, i.e., parts that can never change as a result of further reductions.

A Compiler for Conditional Term Rewriting Systems
Stéphane Kaplan

In this paper, we present a compiler for conditional term rewriting systems. With respect to traditional interpreters, the gain in execution time that we obtain is of several orders of magnitude. We discuss several optimizations, among which a method to share code in the premises of the conditional rules, well-adapted to algebraic specifications.

How to Choose Weights in the Knuth-Bendix Ordering
Ursula Martin

Knuth and Bendix proposed a very versatile technique for ordering terms, based upon assigning weights to operators and then to terms by adding up the weights of the operators they contain. Our purpose in this paper is twofold. First we give some examples to indicate the flexibility of the method. Then we give a simple and practical algorithm, based on the simplex algorithm, for determining whether or not a set of rules can be ordered by a Knuth-Bendix ordering.
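The ordering whose weights are being chosen can be sketched as follows. This Python fragment is an illustration, not the paper's simplex-based algorithm: it compares two terms under a Knuth-Bendix ordering with weights and precedence already given, and it simplifies the standard definition by assigning every variable weight 1 and omitting the special case for a unary symbol of weight zero:

```python
# Simplified KBO: s > t iff the variable condition holds and either
# weight(s) > weight(t), or the weights are equal and precedence /
# lexicographic comparison of arguments decides. Variables are strings;
# applications are tuples ('op', arg, ...).
from collections import Counter

def weight(t, w):
    if isinstance(t, str):               # variable weight fixed at 1
        return 1
    return w[t[0]] + sum(weight(s, w) for s in t[1:])

def varcount(t):
    if isinstance(t, str):
        return Counter([t])
    c = Counter()
    for s in t[1:]:
        c += varcount(s)
    return c

def kbo_greater(s, t, w, prec):
    vs, vt = varcount(s), varcount(t)
    if any(vs[x] < vt[x] for x in vt):   # variable condition
        return False
    ws, wt = weight(s, w), weight(t, w)
    if ws != wt:
        return ws > wt
    if isinstance(s, str) or isinstance(t, str):
        return False
    if s[0] != t[0]:
        return prec[s[0]] > prec[t[0]]
    for a, b in zip(s[1:], t[1:]):       # lexicographic tiebreak
        if a != b:
            return kbo_greater(a, b, w, prec)
    return False

# Classic example: associativity orients as (x*y)*z -> x*(y*z),
# decided by the lexicographic comparison of the first arguments.
w, prec = {'*': 1}, {'*': 1}
lhs = ('*', ('*', 'x', 'y'), 'z')
rhs = ('*', 'x', ('*', 'y', 'z'))
print(kbo_greater(lhs, rhs, w, prec))   # → True
```

The paper's problem is the converse one: given the rules, find weights making every rule's left side greater — the weight constraints are linear inequalities, which is why a simplex-style algorithm applies.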

Detecting Looping Simplifications
Paul Walton Purdom Jr.

A generalization of tree matching and unification algorithms is presented. Given the equation s=t, this algorithm can often quickly determine that the rewrite rule s→t leads to an infinite sequence of "simplifications". The rule t→s can be tested in the same way. Rules leading to infinite simplifications should not be included in a rewrite system. In general, the problem of deciding whether a set of rewrite rules leads to infinite simplifications is undecidable. The algorithm used for this problem is a cross between a unification algorithm for terms with overlapping variables and a matching algorithm. In the simplest case it attempts to find a position a and substitutions σM and σU such that σMσUs = σUt/a. In other words, is there a substitution σU such that in the rule σUs → σUt the left side matches a subpart of the right side? The same basic algorithm can be used to test more complex cases of looping involving the interaction of several rules, but it is limited to those cases where each application of a rule occurs inside the previous rule application. Experiments suggest that the simplest form of the algorithm is about 80 percent effective in eliminating bad orientations of rules. The algorithm never rules out a good orientation of a rule, and so it is most useful when one wants to consider all possible rule orientations.
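A restricted version of the simplest case admits a short illustration. The following Python sketch (ours — it covers only plain matching with an identity σU, not the combined matching/unification the paper describes) flags a rule as surely looping when its left-hand side matches, as a pattern, some subterm of its right-hand side, since then every application of the rule creates a fresh redex:

```python
# Terms: variables are strings, applications are tuples ('f', arg, ...).

def match(pattern, term, subst=None):
    """Return a substitution making pattern equal to term, or None."""
    subst = dict(subst or {})
    if isinstance(pattern, str):                 # pattern variable
        if pattern in subst:
            return subst if subst[pattern] == term else None
        subst[pattern] = term
        return subst
    if (isinstance(term, str) or pattern[0] != term[0]
            or len(pattern) != len(term)):
        return None
    for p, t in zip(pattern[1:], term[1:]):
        subst = match(p, t, subst)
        if subst is None:
            return None
    return subst

def subterms(t):
    yield t
    if not isinstance(t, str):
        for s in t[1:]:
            yield from subterms(s)

def surely_loops(lhs, rhs):
    """True if lhs matches some subterm of rhs: the rule cannot terminate."""
    return any(match(lhs, s) is not None for s in subterms(rhs))

# f(x) -> g(f(x)) loops; x + 0 -> x does not.
print(surely_loops(('f', 'x'), ('g', ('f', 'x'))))   # → True
print(surely_loops(('+', 'x', ('0',)), 'x'))         # → False
```

The paper's test is stronger: by also searching for a unifier σU before matching, it catches rules that loop only after instantiation, and chains of rules applied inside one another.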

Combinatorial Hypermap Rewriting
Eric Sopena

Combinatorial hypermaps may be viewed as topological representations of hypergraphs. In this paper, we introduce a hypermap rewriting model, based on a purely combinatorial formulation of the rewriting mechanism. We illustrate this definition by providing a hypermap grammar which generates the set of all connected planar maps. We also investigate a special kind of hypermap grammars, the H-grammars, for which we give a Pumping Theorem that illuminates the combinatorial structure of the generated hypermap languages.

The Word Problem for Finitely Presented Monoids and Finite Canonical Rewriting Systems
Craig C. Squier, Friedrich Otto

The main purpose of this paper is to describe a negative answer to the following question:
Does every finitely presented monoid with a decidable word problem have a presentation (Σ;R) where R is a finite canonical rewriting system?
To obtain this answer a certain homological finiteness condition for monoids is considered. If M is a monoid that can be presented by a finite canonical rewriting system, then M is an (FP)3-monoid. Since there are well-known examples of finitely presented groups that have an easily decidable word problem but do not meet this condition, this implies that there are finitely presented monoids (and groups) with decidable word problem that cannot be presented by finite canonical rewriting systems.

Term Rewriting Systems with Priorities
Jos C. M. Baeten, Jan A. Bergstra, Jan Willem Klop

Term rewriting systems with rules of different priority are introduced. The semantics are explained in detail and several examples are discussed.

A Gap Between Linear and Non Linear Term-Rewriting Systems (1)
Max Dauchet, Francesco de Comité

We prove that every recursively enumerable set of terms is generated by a non-linear t.r.s. in 2 passes. Conversely, every set of terms generated by a linear t.r.s. in a finite number of passes is recognizable.

Code Generator Generation Based on Template-Driven Target Term Rewriting
Annie Despland, Monique Mazaud, Raymond Rakotozafy

A major problem in deriving a compiler from a formal definition is the production of efficient object code. In this context, we propose a solution to the problem of code generator generation.
Our approach is based on a target machine description where the basic concepts used (access modes, access classes and instructions) are bottom-up hierarchically described by tree-patterns. These tree-patterns are written in an abstract tree language which is also used to write the intermediate program representation (input to the code generator).
The first phase of code production is based on access mode template-driven rewritings in which the program intermediate representation is progressively transformed into its "canonical form". The result is that each program instruction is reduced to a sequence of elementary instructions, each of these elementary instructions representing an instance of an instruction pattern.
The local and global optimization phases, as well as the storage management phase, may be realized by multipass rewriting and attribute evaluation of the canonical form.
In the last phase of code production, each pattern instruction instance of the updated intermediate form is replaced by the corresponding instance of the associated pattern code.

Descendants of Regular Language in a Class of Rewriting Systems: Algorithm and Complexity of an Automata Construction
M. Benois

Recent work on public-key encryption for secure network communication [7] has brought back the following problem: given a regular set R on A*, defined by a non-deterministic finite automaton with n states, and a rewriting system T, how can we construct an automaton that recognizes the set of descendants of R, Δ*(R), when this language is regular [1]? Algorithms were given by Book and Otto [6] and by Sakarovitch and the author [3] for very particular classes of systems, with complexity O(n⁴) in [6] and O(n³) in [3]. Here we give a strong extension of these algorithms to a large class of systems; moreover, the complexity of our algorithm does not depend on the length of the words of T and is at most O(n⁶).

Groups Presented by Certain Classes of Finite Length-Reducing String-Rewriting Systems
Klaus Madlener, Friedrich Otto

TODO

Some Results about Confluence on a Given Congruence Class
Friedrich Otto

It is undecidable in general whether or not a term-rewriting system is confluent on a given congruence class. This result is shown to hold even when the term-rewriting systems under consideration contain unary function symbols only, and all their rules are length-reducing. On the other hand, for certain subclasses of these systems confluence on a given congruence class is decidable.

Ground Confluence
Richard Göbel

TODO

Structured Contextual Rewriting
Zhenyu Qian

In this paper, we develop a mechanism, which we call a structured contextual system (SCS for short), to deal with some non-finitely-based algebraic specifications. A sufficient condition for confluence and termination of this kind of system is also considered, based on a generalization of the approach by O'Donnell.

Schematization of Infinite Sets of Rewrite Rules. Application to the Divergence of Completion Processes
Hélène Kirchner

This study was originally motivated by the divergence problem of the completion procedure for term rewriting systems [17,18,11]. The practical interest of a completion procedure is limited by the fact that it can generate infinite sets of rewrite rules. Moreover the uniqueness of the result of the completion procedure, given a fixed ordering for orienting equations, implies that it cannot be expected to find another completion strategy for which the completion terminates. [...]
The goal of that paper is neither to discover recurrence relations in a set of rules generated by completion, nor to predict a priori the divergence of completion, but rather to propose a formalism to deal with the problem of divergence, namely the definition of meta-rules. This paper is an attempt to give answers for different questions:
1) given an infinite set of rules, the first problem is to find a finite set of schemas, here called meta-rules, where some variables, called meta-variables, may have infinite sets of possible values. [...]
2) The main problem is to be able to use meta-rules for deciding the validity or satisfiability of an equation in the equational theory defined by the infinite set of rules. [...]
3) Of course a preliminary definition of how to use meta-rules must be given. [...]
4) Meta-rules can be used to solve equations, with a narrowing-like process. [...]

Completion for Rewriting Modulo a Congruence
Leo Bachmair, Nachum Dershowitz

We present completion methods for rewriting modulo a congruence, generalizing previous methods by Peterson and Stickel (1981) and Jouannaud and Kirchner (1986). We formalize our methods as equational inference systems and describe techniques for reasoning about such systems.

On Equational Theories, Unification and Decidability
Hans-Jürgen Bürckert, Alexander Herold, Manfred Schmidt-Schauß

The following classes of equational theories, which are important in unification theory, are presented: permutative, finite, Noetherian, simple, almost collapse free, collapse free, regular, and Ω-free theories. The relationships between the particular theories are shown and the connection between these classes and the unification hierarchy is pointed out. We give an equational theory that always has a minimal set of unifiers for single equations, but for which there exists a system of two equations that has no minimal set of unifiers. This example suggests that the definition of the unification type of an equational theory has to be changed. Furthermore we study the conditions under which minimal sets of unifiers always exist.
Decidability results about the membership of equational theories in the classes above are presented. It is proved that Noetherianness, simplicity, almost collapse freeness and Ω-freeness are undecidable. We show that it is not possible to decide where a given equational theory resides in the unification hierarchy and where in the matching hierarchy.

A General Complete E-Unification Procedure
Jean H. Gallier, Wayne Snyder

In this paper, a general unification procedure that enumerates a complete set of E-unifiers of two terms, for any arbitrary set E of equations, is presented. It is more efficient than the brute-force approach using paramodulation, because many redundant E-unifiers arising by rewriting at or below variable occurrences are pruned out by our procedure, while a complete set is still retained. This procedure can be viewed as a nondeterministic implementation of a generalization of the Martelli-Montanari method of transformations on systems of terms [13], which has its roots in Herbrand's thesis [7]. Remarkably, only two new transformations need to be added to the transformations used for standard unification. This approach differs from previous work based on transformations because, rather than following the Martelli-Montanari approach using multi-equations [13] closely, as in Kirchner [10,11], we introduce transformations dealing directly with rewrite rules.
As an example of the flexibility of this approach, we apply it to the problem of higher-order unification, and find an improved version of Huet's procedure [8]. Our major new result is the presentation and justification of a method for enumerating (relatively minimal) complete sets of unifiers modulo arbitrary sets of equations.

Improving Basic Narrowing Techniques
Pierre Réty

In this paper, we propose a new and complete method based on narrowing for solving equations in equational theories. It is a combination of basic narrowing and narrowing with eager reduction; this combination is not obvious, because the naive combination of the two is not complete. We show that the method is more efficient than existing methods in many cases and, to that end, establish commutation properties of narrowing. It provides an algorithm that has been implemented as an extension of the REVE software.

Strategy-Controlled Reduction and Narrowing
Peter Padawitz

The inference rules "reduction" and "narrowing" are generalized from terms and equations, respectively, to arbitrary atomic formulas. Both rules are parameterized by strategies to control the selection of redexes. Church-Rosser properties of the underlying Horn clause specification are shown to ensure both completeness and strategy independence of reduction. "Uniformity" turns out to be the crucial property of those reduction strategies which serve as complete narrowing strategies. A characterization of uniformity (and hence completeness) of leftmost-outermost narrowing is presented.

Algorithmic Complexity of Term Rewriting Systems
Christine Choppy, Stéphane Kaplan, Michèle Soria

For the class of regular term rewriting systems, we have provided ways of obtaining asymptotic evaluations of the cost series. The user does not need to actually manipulate formal series, since our results are given in the form of ready-to-use formulae. These results depend solely on physical characteristics of the system, which are easily obtainable: the number of variables and of constructors in the left-hand sides, and the occurrences of derived operators in the right-hand sides. The average cost is then constant, polynomial, or exponential, according to the position of the singularity of the expressions Qi(N(z)) closest to the origin.

Optimal Speedups for Parallel Pattern Matching in Trees
R. Ramesh, I. V. Ramakrishnan

Tree pattern matching is a fundamental operation that is used in a number of programming tasks such as code optimization in compilers, symbolic computation, automatic theorem proving and term rewriting. An important special case of this operation is linear tree pattern matching in which an instance of any variable in the pattern occurs at most once. If n and m are the number of nodes in the subject and pattern tree respectively and if no restriction is placed on the structure of the trees, then the fastest known sequential algorithm for linear tree pattern matching requires O(nm) time in the worst case.
In this paper we present a parallel algorithm for linear tree pattern matching on a PRAM (parallel random access machine) model. Our algorithm exhibits optimal speedup, in the sense that its processor-time product matches the worst-case time complexity of the fastest sequential algorithm.
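The sequential baseline the parallel algorithm speeds up can be sketched directly: try the linear pattern at every node of the subject, which is O(nm) in the worst case. The sketch below is illustrative only, not the paper's PRAM algorithm; the tuple-based term representation is an assumption of the example.

```python
# A naive sequential linear tree pattern matcher (illustrative sketch).
# Terms are tuples: ('var', name) for pattern variables, else (symbol, [children]).

def match_at(pattern, subject, bindings):
    """Match `pattern` against the subtree rooted at `subject`.

    Linearity (each variable occurs at most once in the pattern) means
    bindings can never conflict, so a variable always matches.
    """
    if pattern[0] == 'var':
        bindings[pattern[1]] = subject
        return True
    symbol, kids = pattern
    if subject[0] != symbol or len(subject[1]) != len(kids):
        return False
    return all(match_at(p, s, bindings) for p, s in zip(kids, subject[1]))

def match_everywhere(pattern, subject):
    """Try the pattern at every node: O(n*m) worst case for n subject
    nodes and m pattern nodes."""
    hits = []
    def walk(node):
        bindings = {}
        if match_at(pattern, node, bindings):
            hits.append((node, bindings))
        for child in node[1]:
            walk(child)
    walk(subject)
    return hits

# Pattern g(X) against subject f(g(a), g(b)): matches at both g-subtrees.
subject = ('f', [('g', [('a', [])]), ('g', [('b', [])])])
pattern = ('g', [('var', 'X')])
print(len(match_everywhere(pattern, subject)))  # 2
```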

1985

Contextual Rewriting
Hantao Zhang, Jean-Luc Rémy

TODO

Deciding Algebraic Properties of Monoids Presented by Finite Church-Rosser Thue Systems
Friedrich Otto

TODO

Two Applications of Equational Theories to Database Theory
Stavros S. Cosmadakis, Paris C. Kanellakis

Databases and equational theorem proving are well developed and seemingly unrelated areas of Computer Science research. We provide two natural links between these fields and demonstrate how equational theorem proving can provide useful tools for a variety of database tasks.
Our first application is a novel way of formulating functional and inclusion dependencies (the most common database constraints) using equations. The central computational problem of dependency implication is directly reduced to equational reasoning. Mathematical techniques from universal algebra provide new proof procedures and better lower bounds for dependency implication. The use of REVE, a general purpose transformer of equations into term rewriting systems, is illustrated on nontrivial sets of functional and inclusion dependencies.
Our second application demonstrates that the uniform word problem for lattices is equivalent to implication of dependencies expressing transitive closure, together with functional dependencies. This natural generalization of functional dependencies, which is not expressible using conventional database theory formulations, has a natural inference system and an efficient decision procedure.

An Experiment in Partial Evaluation: The Generation of a Compiler Generator
Neil D. Jones, Peter Sestoft, Harald Søndergaard

It has been known for several years that in theory the program transformation principle called partial evaluation or mixed computation can be used for compiling and compiler generation (given an interpreter for the language to be implemented), and even for the generation of a compiler generator. The present paper describes an experimental partial evaluator able to generate stand-alone compilers and compiler generators. As far as we know, such generations had not been done in practice prior to summer 1984. Partial evaluation of a subject program with respect to some of its input parameters results in a residual program. By definition, running the residual program on any remaining input yields the same result as running the original subject program on all of its input. Thus a residual program can be considered a specialization of the subject program to known, fixed values of some of its parameters. A partial evaluator is a program that performs partial evaluation given a subject program and fixed values for some of the program's parameters.

NARROWER: A New Algorithm for Unification and Its Application to Logic Programming
Pierre Réty, Claude Kirchner, Hélène Kirchner, Pierre Lescanne

TODO

Solving Type Equations by Graph Rewriting
Hassan Aït-Kaci

I have described a syntactic calculus of partially ordered structures and its application to computation. A syntax of record-like terms and a "type subsumption" ordering were defined and shown to form a lattice structure. A simple "type-as-set" interpretation of these term structures extends this lattice to a distributive one, and in the case of finitary terms, to a complete Brouwerian lattice. As a result, a method for solving systems of type equations by iterated rewriting of type symbols was proposed which defines an operational semantics for KBL - a Knowledge Base Language. It was shown that a KBL program can be seen as a system of equations. Thanks to the lattice properties of finite structures, a system of equations admits a least fixed-point solution. The particular order of computation of KBL, the "fan-out computation order", which rewrites symbols closer to the root first, was formally defined and shown to be maximal. Unfortunately, the complete "correctness" of KBL is not yet established. That is, it is not known at this point whether the normal form of a term is equal to the fixed-point solution. However, as steps in this direction, two technical lemmas were conjectured from which a proof of the correctness would follow as a corollary.

Path of Subterms Ordering and Recursive Decomposition Ordering Revisited
Michaël Rusinowitch

The relationship between several simplification orderings - PSO, RPO, and RDO - is investigated. RDO is improved in order to deal with more pairs of terms, and made more efficient and easier to handle by removing useless computations.

Associative Path Orderings
Leo Bachmair, David A. Plaisted

In this paper we introduce a new class of orderings - associative path orderings - for proving termination of associative commutative term rewriting systems. These orderings are based on the concept of simplification orderings and extend the well-known recursive path orderings to E-congruence classes, where E is an equational theory consisting of associativity and commutativity axioms. The associative path ordering is similar to another termination ordering for proving AC termination, described in Dershowitz, et al. (83), which is also based on the idea of transforming terms. Our ordering is conceptually simpler, however, since any term is transformed into a single term, whereas in Dershowitz, et al. (83) the transform of a term is a multiset of terms. More important yet, we show how to lift our ordering to non-ground terms, which is essential for applications of the Knuth-Bendix completion method but was not possible with the previous ordering.
Associative path orderings require less expertise than polynomial orderings. They are applicable to term rewriting systems for which a precedence ordering on the set of operator symbols can be defined that satisfies a certain condition, the associative pair condition. The precedence ordering can often be derived from the structure of the reduction rules. We include termination proofs for various term rewriting systems (for rings, boolean algebra, etc.) and, in addition, point out ways of dealing with equational theories for which the associative pair condition does not hold.

A Procedure for Automatically Proving the Termination of a Set of Rewrite Rules
David Detlefs, Randy Forgaard

TODO

PETRIREVE: Proving Petri Net Properties with Rewriting Systems
Christine Choppy, Colette Johnen

We present here an approach using rewriting systems for analysing and proving properties on Petri nets. This approach is implemented in the system PETRIREVE. By establishing a link between the graphic Petri net design and simulation system PETRIPOTE and the term rewriting system generator REVE, PETRIREVE provides an environment for the design and verification of Petri nets. Representing Petri nets by rewriting systems allows easy and direct proofs of the behaviour correctness of the net to be carried out, without having to build the marking graph or to search for net invariants.

Fairness in Term Rewriting Systems
Sara Porat, Nissim Francez

The notion of a fair derivation in a term-rewriting system is introduced, whereby every rewrite rule enabled infinitely often along a derivation is applied infinitely often. A term-rewriting system is fairly terminating iff all its fair derivations are finite. The paper poses the following question: is it decidable, for an arbitrary ground term rewriting system, whether it fairly terminates or not? A positive answer is given for several subcases; the general case remains open.

Two Results in Term Rewriting Theorem Proving
Jieh Hsiang

Two results are presented in this paper. (1) We extend the term rewriting approach to first order theorem proving, as described in [HsD83], to the theory of first order predicate calculus with equality. Consequently, we have shown that the term rewriting method can be as powerful as paramodulation and resolution combined. Possible improvements in efficiency are also discussed.
(2) In [KaN84], Kapur & Narendran proposed a method similar to [HsD83]. Motivated by the Kapur-Narendran method, we introduce a notion of splitting for theorem proving in first order predicate calculus. The splitting strategy provides a better utilization of the reduction mechanism of term rewriting systems than the N-strategy in [HsD83], although it generates more critical pairs. Comparisons and the relation between the splitting strategy, Kapur-Narendran method, and the N-strategy are also given.
We conjecture that our way of dealing with first order theories with full equality can be extended to the splitting and the Kapur-Narendran methods as well.
Due to the lack of space, we only give a sketch of the proofs of the completeness of the two theorem proving methods. They will be provided in detail in a longer version of the paper.

Handling Function Definitions through Innermost Superposition and Rewriting
Laurent Fribourg

This paper presents the operating principles of SLOG, a logic interpreter of equational clauses (Horn clauses where the only predicate is '='). SLOG is based on an oriented form of paramodulation called superposition. Superposition is a complete inference rule for first-order logic with equality. SLOG uses only a strong restriction of superposition (innermost superposition) which is still complete for a large class of programs. Besides superposition, SLOG uses rewriting which provides eager evaluation and handling of negative knowledge. Rewriting combined with superposition improves terminability and control of equational logic programs.

An Ideal-Theoretic Approach to Word Problems and Unification Problems over Finitely Presented Commutative Algebras
Abdelilah Kandri-Rody, Deepak Kapur, Paliath Narendran

A new approach based on computing the Gröbner basis of polynomial ideals is developed for solving word problems and unification problems for finitely presented commutative algebras. This approach is simpler and more efficient than the approaches based on generalizations of the Knuth-Bendix completion procedure to handle associative and commutative operators. It is shown that (i) the word problem over a finitely presented commutative ring with unity is equivalent to the polynomial equivalence problem modulo a polynomial ideal over the integers, (ii) the unification problem for linear forms is decidable for finitely presented commutative rings with unity, (iii) the word problem and unification problem for finitely presented boolean polynomial rings are co-NP-complete and co-NP-hard respectively, and (iv) the set of all unifiers of two forms over a finitely presented abelian group can be computed in polynomial time. Examples and results of algorithms based on the Gröbner basis computation are also reported.
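Result (i) can be illustrated with an off-the-shelf Gröbner basis package. The sketch below uses sympy (an assumption; any Gröbner basis implementation would serve), and sympy computes over the rationals rather than the integers, so it only illustrates the idea of deciding word equality by reduction modulo the ideal of relations, not the paper's exact setting.

```python
# Word problem for a presented commutative ring, via Groebner bases
# (illustrative sketch over the rationals; the paper works over the integers).
from sympy import groebner, symbols

x, y = symbols('x y')

# Present a commutative ring by the relations x^2 = y and y^2 = x.
G = groebner([x**2 - y, y**2 - x], x, y, order='lex')

def equal_words(p, q):
    """Two ring expressions are equal in the presented ring iff their
    difference reduces to zero modulo the Groebner basis of the relations."""
    _, remainder = G.reduce(p - q)
    return remainder == 0

print(equal_words(x**4, x))  # True: x^4 = (x^2)^2 = y^2 = x
print(equal_words(x**3, y))  # False: x^3 = x*y, and x*y is not y here
```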

Combining Unification Algorithms for Confined Regular Equational Theories
Katherine A. Yelick

This paper presents a method for combining equational unification algorithms to handle terms containing "mixed" sets of function symbols. For example, given one algorithm for unifying associative-commutative operators, and another for unifying commutative operators, our algorithm provides a method for unifying terms containing both kinds of operators. We restrict our attention to a class of equational theories which we call confined regular theories. The algorithm is proven to terminate with a complete and correct set of E-unifiers. An implementation has been done as part of a larger system for reasoning about equational theories.

An Algebraic Approach to Unification Under Associativity and Commutativity
Albrecht Fortenbach

TODO

Unification Problems with One-Sided Distributivity
Stefan Arnborg, Erik Tidén

We show that unification in the equational theory defined by the one-sided distributivity law x×(y+z)=x×y+x×z is decidable and that unification is undecidable if the laws of associativity x+(y+z)=(x+y)+z and unit element 1×x=x×1=x are added. Unification under one-sided distributivity with unit element is shown to be as hard as Markov's problem, whereas unification under two-sided distributivity, with or without unit element, is NP-hard. A quadratic-time unification algorithm for one-sided distributivity, which may prove interesting since available universal unification procedures fail to provide a decision procedure for this theory, is outlined. The study of these problems is motivated by possible applications in circuit synthesis and by the need for gaining insight into the problem of combining theories with overlapping sets of operator symbols.
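For background, orienting the law left to right gives the single rewrite rule x×(y+z) → x×y + x×z. The small normalizer below illustrates what "one-sided" means; it is an assumption-laden sketch of the theory's rewrite presentation, not the paper's unification algorithm.

```python
# Normalization under the one-sided rule x*(y+z) -> x*y + x*z (sketch).

def distribute(term):
    """Normal form under x*(y+z) -> x*y + x*z. Terms are variable names
    (strings) or (op, left, right) tuples. The rule is one-sided: a
    product of the shape (y+z)*x is left alone."""
    if isinstance(term, str):
        return term
    op, left, right = term
    left, right = distribute(left), distribute(right)
    if op == '*' and isinstance(right, tuple) and right[0] == '+':
        # Apply the rule, then renormalize the two new products.
        return distribute(('+', ('*', left, right[1]), ('*', left, right[2])))
    return (op, left, right)

print(distribute(('*', 'a', ('+', 'b', 'c'))))
# ('+', ('*', 'a', 'b'), ('*', 'a', 'c'))
print(distribute(('*', ('+', 'b', 'c'), 'a')))  # unchanged: one-sided only
```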

Fast Many-to-One Matching Algorithms
Paul Walton Purdom Jr., Cynthia A. Brown

Matching is a fundamental operation in any system that uses rewriting. In most applications it is necessary to match a single subject against many patterns, in an attempt to find subexpressions of the subject that are matched by some pattern. In this paper we describe a many-to-one, bottom-up matching algorithm. The algorithm takes advantage of common subexpressions in the patterns to reduce the amount of work needed to match the entire set of patterns against the subject.
The algorithm uses a compact data structure called a compressed dag to represent the set of patterns. All the variables are represented by a single pattern node. Variable usages are disambiguated by a variable map within the pattern nodes. Two expressions that differ only in the names of their variables are represented by the same node in the compressed dag. A single compressed dag contains all the patterns in the system.
The matching proceeds bottom-up from the leaves of the subject and set of patterns to the roots. The subject keeps track of the variable bindings needed to perform the match. A hashing method is used to speedily retrieve specific nodes of the compressed dag.
The algorithm has been implemented as a part of a Knuth-Bendix completion procedure. In comparison with the standard matching algorithm previously used, the new algorithm reduced the number of calls to the basic match routine to about 1/5 of their former number. It promises to be an effective tool in producing more efficient rewriting systems.
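A toy version of the bottom-up scheme can make the idea concrete (hypothetical code, not the authors' implementation): every subject node is labelled with the set of pattern subterms it matches, so a subexpression shared between patterns is tested once per node, and, as in the paper, a single shared wildcard node stands in for all variables.

```python
# Many-to-one bottom-up matching with shared pattern subterms (toy sketch).
# Subject nodes are (symbol, [children]); VAR is the single wildcard node.

VAR = ('?',)

def pattern_subterms(pattern):
    """Collect the distinct subterms of a pattern (children before parents)."""
    seen = {}
    def walk(p):
        key = repr(p)
        if key in seen:
            return
        if p != VAR:
            for child in p[1]:
                walk(child)
        seen[key] = p
    walk(pattern)
    return list(seen.values())

def bottom_up_match(patterns, subject):
    """Map each pattern to the list of subject nodes it matches."""
    shared = []
    for p in patterns:
        shared.extend(pattern_subterms(p))
    hits = {repr(p): [] for p in patterns}
    labels = {}  # id(node) -> set of repr() of matched pattern subterms

    def walk(node):
        for child in node[1]:
            walk(child)
        matched = {repr(VAR)}  # the wildcard matches every node
        for p in shared:
            if p == VAR:
                continue
            symbol, kids = p
            if (node[0] == symbol and len(node[1]) == len(kids)
                    and all(repr(k) in labels[id(c)]
                            for k, c in zip(kids, node[1]))):
                matched.add(repr(p))
        labels[id(node)] = matched
        for p in patterns:
            if repr(p) in matched:
                hits[repr(p)].append(node)

    walk(subject)
    return hits

# Patterns f(?, g(?)) and g(?) share the subpattern g(?).
subject = ('f', [('a', []), ('g', [('b', [])])])
p1 = ('f', [VAR, ('g', [VAR])])
p2 = ('g', [VAR])
hits = bottom_up_match([p1, p2], subject)
print(len(hits[repr(p1)]), len(hits[repr(p2)]))  # 1 1
```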

Complexity of Matching Problems
Dan Benanav, Deepak Kapur, Paliath Narendran

We show that the associative-commutative matching problem is NP-complete; more precisely, the matching problem for terms in which some function symbols are uninterpreted and others are both associative and commutative, is NP-complete. It turns out that the similar problems of associative matching and commutative matching are also NP-complete. However, if every variable appears at most once in a term being matched, then the associative-commutative matching problem is shown to have an upper bound of O(|s|*|t|^3), where |s| and |t| are respectively the sizes of the pattern s and the subject t.
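The combinatorics behind the hardness shows up even in a naive matcher: under commutativity, every ordering of a commutative symbol's arguments may have to be tried. The sketch below (illustrative only, covering commutative matching of a pattern against a ground subject) makes that explicit; it is exponential exactly where the paper proves NP-completeness.

```python
# Naive commutative matching by permuting arguments (illustrative sketch).
from itertools import permutations

COMMUTATIVE = {'+', '*'}

def c_match(pattern, subject, bindings):
    """Yield every extension of `bindings` matching pattern to subject.
    Pattern nodes are ('var', name) or (symbol, [children]); subject is ground."""
    if pattern[0] == 'var':
        name = pattern[1]
        if name in bindings:
            if bindings[name] == subject:
                yield bindings
            return
        yield {**bindings, name: subject}
        return
    symbol, kids = pattern
    if subject[0] != symbol or len(subject[1]) != len(kids):
        return
    # For commutative symbols, try every ordering of the subject's arguments.
    orders = (permutations(subject[1]) if symbol in COMMUTATIVE
              else [subject[1]])
    for args in orders:
        partial = [bindings]
        for p, s in zip(kids, args):
            partial = [b2 for b in partial for b2 in c_match(p, s, b)]
        yield from partial

# x + y against a + b: two matches under commutativity.
subject = ('+', [('a', []), ('b', [])])
pattern = ('+', [('var', 'x'), ('var', 'y')])
print(len(list(c_match(pattern, subject, {}))))  # 2
```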

The Set of Unifiers in Typed Lambda-Calculus as Regular Expression
Marek Zaionc

The main problem for mechanization of theorem proving in the typed λ-calculus is the problem of determining whether two terms of a fixed type with free variables have a common instance. This unification problem is undecidable for orders ≥ 2 (see Huet (1973) and Goldfarb (1981)). Huet's (1975) paper contains the description of a semi-decision algorithm which, in the case of success, returns a substitution called the most general unifier. This algorithm produces a matching tree, in which the most general unifiers are represented by the final branches. The most interesting case is when the matching tree is infinite and contains an infinite number of most general unifiers. I have noticed that some infinite matching trees have an interesting property: on every infinite branch there is a node which occurs earlier in the tree, meaning that certain fragments of the tree are repeated. In this paper I consider a class of unification problems for which the set of unifiers can be represented by a regular expression built over a finite alphabet. This alphabet consists of symbols called elementary substitutions, and every word is interpreted as a composition of elementary substitutions.


Last updated on 26 October 2016.